Well, what now?
After a stupendous run in the last 12 months, Nvidia is now the third most valuable company in the U.S.
1. Microsoft Corporation with a market cap of $2,988 billion
2. Apple Inc. with a market cap of $2,612 billion
3. Nvidia Corporation with a market cap of $2,183 billion
4. Amazon.com, Inc. with a market cap of $1,802 billion
5. Alphabet Inc. with a market cap of $1,640 billion
Will Nvidia’s run continue? My answer is Yes. To wit:
+ 10-for-1 stock split coming.
+ New faster chips coming. Blackwell.
+ Huge demand, especially from data centers.
+ Jensen. There’s no CEO with the charisma and vision of Jensen Huang.
Says Friday’s Wall Street Journal:
“The ride is far from over,” it says, referring to Nvidia’s continued success in selling its older chips while anticipating the release of its new AI product line. … Wall Street expects Nvidia’s annual data center revenue to exceed $100 billion over the next year, up from nearly $60 billion in the previous year.
Here’s the Wall Street Journal’s chart.
If you’re not thoroughly bored with my enthusiasm for Nvidia…
Here’s The Economist’s latest piece:
Can Nvidia be dethroned? Meet the startups vying for its crown
A new generation of AI chips is on the way
“He who controls the GPUs, controls the universe.” This spin on a famous line from “Dune”, a science-fiction classic, is commonly heard these days. Access to GPUs, and in particular those made by Nvidia, the leading supplier, is vital for any company that wants to be taken seriously in artificial intelligence (AI). Analysts talk of companies being “GPU-rich” or “GPU-poor”, depending on how many of the chips they have. Tech bosses boast of their giant stockpiles. Nvidia’s dominance has pushed its market value above $2trn. On May 22nd it reported that its sales for the quarter ending in April grew by 262%, year on year (see chart).
GPUs do the computational heavy lifting needed to train and operate large AI models. Yet, oddly, this is not what they were designed for. The initials stand for “graphics processing unit”, because such chips were originally designed to process video-game graphics. It turned out that, fortunately for Nvidia, they could be repurposed for AI workloads.
Might it be better to design specialist AI chips from scratch? That is what many companies, small and large, are now doing in a bid to topple Nvidia. Dedicated AI chips promise to make building and running AI models faster, cheaper or both. Any firm that can mount a credible threat to the reigning champion will have no shortage of customers, who dislike its lofty prices and limited supplies.
Ordinary processing chips, like those found inside laptop and desktop computers, are in essence designed to do one thing after another. GPUs, by contrast, contain several thousand processing engines, or “cores”, which let them run thousands of versions of the same simple task (like drawing part of a scene) at the same time. Running AI models similarly involves running lots of copies of the same task in parallel. Figuring out how to rewrite AI code to run on GPUs was one of the factors that triggered the current AI boom.
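To make The Economist’s point about parallelism concrete, here’s a minimal sketch in Python with PyTorch (my choice of tool, not anything from the article; it assumes PyTorch is installed and falls back to the CPU if no Nvidia GPU is present):

```python
# A minimal sketch: an ordinary processor works one element after another,
# while a GPU applies the same simple operation to millions of elements at
# once across its thousands of cores.
import torch

n = 1_000_000
a = torch.rand(n)
b = torch.rand(n)

# One-after-another style (what an ordinary processor is built for):
slow = torch.empty(n)
for i in range(1000):          # only the first 1,000 elements, to keep it quick
    slow[i] = a[i] * b[i]

# Data-parallel style (what a GPU is built for): the same multiply, everywhere at once.
device = "cuda" if torch.cuda.is_available() else "cpu"
fast = a.to(device) * b.to(device)
print(fast.shape, fast.device)
```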
Yet GPUs have their limitations, particularly when it comes to the speed with which data can be shuffled on and off them. Modern AI models run on large numbers of interconnected GPUs and memory chips. Moving data quickly between them is central to performance. When training very large AI models, some GPU cores may be idle as much as half of the time as they wait for data. Andrew Feldman, the boss of Cerebras, a startup based in Sunnyvale, California, likens it to the gridlock in a grocery store on the day before Thanksgiving. “Everybody’s in a queue, so there are blockages in the parking lot, there are blockages in the aisles, blockages at the checkout. That’s exactly what’s happening with a GPU.”
Cerebras’s response is to put 900,000 cores, plus lots of memory, onto a single, enormous chip, to reduce the complexity of connecting up multiple chips and piping data between them. Its CS-3 chip is the largest in the world by a factor of 50. “Our chip is the size of a dinner plate—a GPU is the size of a postage stamp,” says Mr Feldman. On-chip connections between cores operate hundreds of times faster than connections between separate GPUs, Cerebras claims, while its approach reduces energy consumption by more than half, for a given level of performance, compared with Nvidia’s most powerful GPU offering.
Groq, another startup, is taking a different approach. Its AI chips, called language processing units (LPUs), are optimised to run large language models (LLMs) particularly quickly. In addition to containing their own memory, these chips also act as routers, passing data among the interconnected LPUs. Clever routing software eliminates the variation in latency, or time spent waiting for data, allowing the whole system to run in lockstep. This greatly boosts efficiency, and thus speed: Groq says its LPUs can run big LLMs ten times faster than existing systems.
Yet another approach is that taken by MatX, also based in California. GPUs contain features and circuitry that provide flexibility for graphics, but are not needed for LLMs, says Reiner Pope, one of the firm’s co-founders. The GPU-like chip his firm is working on gets rid of such unnecessary cruft, boosting performance by doing fewer things better.
Other startups in this area include Hailo, based in Israel; Taalas, based in Toronto; Tenstorrent, an American firm using the open-source RISC-V architecture to build AI chips; and Graphcore, a British company that is thought to be about to sell itself to SoftBank, a Japanese conglomerate. Big tech firms are also building AI chips. Google has developed its own “tensor processing units” (TPUs), which it makes available as a cloud-computing service. (It unveiled its latest version on May 14th.) Amazon, Meta and Microsoft have also made custom chips for cloud-based AI; OpenAI is planning to do so as well. AMD and Intel, two big incumbent chipmakers, make GPU-like chips already.
One danger for the newcomers is that their efforts at specialisation could go too far. Designing a chip typically takes two or three years, says Christos Kozyrakis, a computer scientist at Stanford University, which is “a huge amount of time” given how quickly AI models are improving. The opportunity, he says, is that the startups could end up with a chip that is better at running future models than Nvidia’s less specialised GPUs are. The risk is that they specialise in the wrong thing.
Having previously worked at Google, which developed the currently dominant “transformer” architecture used in LLMs, Mr Pope of MatX is confident that his firm has “a somewhat good crystal ball”. And if a new approach comes along—“state-space models” are the latest thing—its chip is versatile enough to adapt, he says. Mr Feldman says all modern AI is still just “sparse linear algebra” under the hood, which Cerebras’s chip can do very quickly.
Is greatness a transitory experience?
Another challenge is that Nvidia’s software layer for programming its GPUs, known as CUDA, is a de facto industry standard, despite being notoriously fiddly to use. “Software is king,” says Mr Kozyrakis of Stanford, and Nvidia has a significant advantage, having built up its software ecosystem over many years. AI-chip startups will succeed only if they can persuade programmers to rejig their code to run on their new chips. They offer software toolkits to do this, and provide compatibility with the major machine-learning frameworks. But tweaking software to optimise performance on a new architecture is a difficult and complex business—yet another reason Nvidia is hard to dislodge.
The biggest customers for AI chips, and the systems built around them, include model-builders (such as OpenAI, Anthropic and Mistral) and tech giants (such as Amazon, Meta, Microsoft and Google). It may make sense for such companies to acquire an AI-chip startup, and keep its technology to themselves, in the hope of besting the competition. Instead of trying to compete with Nvidia, chip startups could position themselves as acquisition targets.
Mr Pope says MatX is targeting the “top tier” of the market, which suggests that it hopes to sell its chips—if not the whole company—to the likes of OpenAI, Google or Anthropic, whose AI models are the most advanced. “We would be happy with many kinds of exit,” he says, “but we think there is a sustainable business here as a standalone company.” That remains to be seen. Cerebras, for its part, is said to be preparing for an initial public offering. So far none of the startups has made a dent in Nvidia’s dominant position. Plenty of people are hoping that one of them will. ■
I am seriously worried about Google
+ Perplexity and ChatGPT/GPT-4o are much better search engines for research and information.
+ Amazon is a much better search engine for finding the products you want to buy.
And Google? Have you seen anything new and exciting come out of it recently? I’ve basically given it up for Perplexity. I go to Perplexity first. My son Michael does too.
How Exercise Strengthens Your Brain
I’m obsessive about playing aggressive singles tennis every day. Will I live longer, stay healthier and be more alert? The studies say I will. But that’s not the motivation. My real motivation is that it simply makes me feel good.
Segue to today’s New York Times.
Physical activity improves cognitive and mental health in all sorts of ways. Here’s why, and how to reap the benefits:
The brain benefits really start to emerge, though, when we work out consistently over time. Studies show that people who work out several times a week have higher cognitive test scores, on average, than people who are more sedentary. Other research has found that a person’s cognition tends to improve after participating in a new aerobic exercise program for several months. …
Physical activity also benefits mood. People who work out regularly report having better mental health than people who are sedentary. And exercise programs can be effective at treating people’s depression, leading some psychiatrists and therapists to prescribe physical activity. The Centers for Disease Control and Prevention’s recommendation of 150 minutes of moderate aerobic activity or 75 minutes of vigorous aerobic activity per week is a good benchmark.
Perhaps most remarkable, exercise offers protection against neurodegenerative diseases. “Physical activity is one of the health behaviors that’s shown to be the most beneficial for cognitive function and reducing risk of Alzheimer’s and dementia,” said Michelle Voss, an associate professor of psychological and brain sciences at the University of Iowa.
What kind of exercise is best for your brain?
The experts emphasized that any exercise is good, and the type of activity doesn’t seem to matter, though most of the research has involved aerobic exercise. But, they added, higher-intensity workouts do appear to confer a bigger benefit for the brain.
Improving your overall cardiovascular fitness level also appears to be key. “It’s dose-dependent,” Dr. Wrann said. “The more you can improve your cardiorespiratory fitness, the better the benefits are.”
For the full article, click here.
Late to the party
I’m sure your kids and grandkids know about this book series. But I didn’t. I read Diary of a Wimpy Kid – No Brainer in one sitting this weekend. It’s hysterical. I can’t recommend it more strongly. Your kids and grandkids will love it.
Check out how many “Wimpy Kid” books the author Jeff Kinney has sold. That’s a minimum of $500 million in royalties.
Why teachers drink
Q. Name the four seasons
A. Salt, pepper, mustard and vinegar
Q. How is dew formed?
A. The sun shines down on the leaves and makes them perspire
Q. What guarantees may a mortgage company insist on?
A. If you are buying a house they will insist that you are well endowed
Q. In a democratic society, how important are elections?
A. Very important. Sex can only happen when a male gets an election
Q. What are steroids?
A. Things for keeping carpets still on the stairs
Q. What happens to a boy when he reaches puberty?
A. He says goodbye to his boyhood and looks forward to adultery
Q. Name a major disease associated with cigarettes
A. Premature death
Next time on a plane
Keep your seat belt fastened. From Australia:
Bet the farm, courtesy Charlie Munger
“Experience tends to confirm a long-held notion that being prepared, on a few occasions in a lifetime, to act promptly in scale, in doing some simple and logical thing, will often dramatically improve the financial results of that lifetime.”
“A few major opportunities, clearly recognizable as such, will usually come to one who continuously searches and waits, with a curious mind that loves diagnosis involving multiple variables. And then all that is required is a willingness to bet heavily when the odds are extremely favorable, using resources available as a result of prudence and patience in the past.”
In short, when you know you have an edge, you should bet heavily. As Charlie put it: “They don’t teach most people that in business school. It’s insane. Of course you’ve got to bet heavily on your best bets.”
Nvidia is my best ever bet. (Other than my own business — the one I started and sold.)
Played tennis today. Will play early tomorrow morning.
— Harry Newton