
I’m not prone to hype. But I’ve seen the future. It’s Nvidia (again). I’m guessing the stock goes higher.

I’m not a software engineer. But I’ve been using computers in my businesses and in my personal life for sixty years. I’ve designed systems and programmed the machines.

I’m aware of the immense productivity benefits to me and my business friends. So I understand a smidgen of what Jensen Huang said last night in his keynote to Nvidia’s 2024 GTC Developer Conference. 11,000 people attended in person, and thousands more (like me) watched remotely. It was mesmerizing.

I sort of understand Jensen’s vision for computing, and I know I don’t fully grasp it. Jensen talked about how Nvidia has made computing one million times faster over the last ten years. He calls it accelerated computing. He said Nvidia would push that to one billion. “It’s coming.”

Want a comparison? My latest Lenovo ThinkPad X1 powered by a miserable Intel chip is only 10% faster than the one I bought two years ago. Intel should be ashamed of itself.

Big difference between ten percent and one million. Now you know why every self-respecting company in the world is clamoring to get its hands on Nvidia chips and Nvidia computing frameworks.

Image Credit: Tom’s Hardware

For example, Mark Zuckerberg says Meta (Facebook) is spending billions on Nvidia AI chips. By the end of 2024, the company’s computing infrastructure will include 350,000 H100 graphics cards, each of which can cost upwards of $50,000, though I bet Mark gets a discount.

Yesterday Jensen introduced a new, faster AI chip called Blackwell, aimed initially at the data center, the fastest-growing real estate. In a side quip, Jensen killed the return to the office by saying, “Remote work is here to stay.”


Image Credit: Tom’s Hardware

The present H100 chip has 80 billion transistors. That’s the one (on the right) we’re using today. The new B200 Blackwell has 208 billion transistors. By contrast, the Intel i7 chip in the laptop I’m writing this blog on has about two billion transistors. It’s the fastest I can get in Lenovo’s popular X1 Carbon series.

We have actually come a long way. Intel’s first microprocessor, the 4004, released in 1971, had 2,300 transistors. The 4004 was the first complete CPU on a single chip, and it ran five times faster than other designs that used aluminum gates.

Nvidia’s press release announcing Blackwell includes endorsements from Alphabet and Google CEO Sundar Pichai, Amazon CEO Andy Jassy, Dell CEO Michael Dell, Google DeepMind CEO Demis Hassabis, Meta CEO Mark Zuckerberg, Microsoft CEO Satya Nadella, OpenAI CEO Sam Altman, Oracle Chairman Larry Ellison, and Tesla and xAI CEO Elon Musk.

Blackwell is being adopted by every major global cloud services provider, pioneering AI companies, system and server vendors, and regional cloud service providers and telcos all around the world.

“The whole industry is gearing up for Blackwell,” Huang said, predicting it would be the most successful launch in the company’s history.

Generative AI changes the way applications are written, Huang said.

Rather than writing software, he explained, companies will assemble AI models, give them missions, provide examples of work products, and review their plans and intermediate results.

These packages – NVIDIA NIMs – are built from NVIDIA’s accelerated computing libraries and generative AI models, Huang explained.

“How do we build software in the future? It is unlikely that you’ll write it from scratch or write a whole bunch of Python code or anything like that,” Huang said. “It is very likely that you assemble a team of AIs.”
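To make that idea concrete, here is a minimal sketch, mine rather than Nvidia’s, of what “assembling a team of AIs” can look like in practice. A NIM is a containerized, pre-trained model that serves an OpenAI-compatible API, so giving it a “mission” is just a short script. The endpoint URL and model name below are illustrative assumptions, not anything Jensen showed on stage.

```python
# A hypothetical sketch, not Nvidia's own example: send a "mission" to a NIM
# microservice running locally. NIM containers expose an OpenAI-compatible
# chat API; the URL and model name here are illustrative assumptions.
import requests

NIM_URL = "http://localhost:8000/v1/chat/completions"  # assumed local NIM endpoint

payload = {
    "model": "meta/llama3-8b-instruct",  # example model id; use whichever NIM you deploy
    "messages": [
        {"role": "system", "content": "You are one agent on a small team of AIs."},
        {
            "role": "user",
            "content": (
                "Mission: draft a launch plan for a weekly investing newsletter. "
                "Return a numbered plan I can review before you write anything."
            ),
        },
    ],
    "temperature": 0.2,
}

response = requests.post(NIM_URL, json=payload, timeout=60)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```

In a real “team of AIs,” you would run several of these agents, each with its own mission, and have a coordinating agent review their intermediate results before anything ships.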

Specific things that excited me personally about Jensen’s nearly two-hour presentation —

+ Digital Twins. You build a factory or a plane or whatever. That’s physical. Then you build a digital twin of it in software. Want to make changes, improvements? Test them first in software. Imagine the cost savings from not having to move equipment around or buy new equipment that turns out to be useless.

+ Robotics. Amazon is using them in its warehouses. Productivity gains are huge. That’s why you can pick a needle out of Amazon’s gigantic warehouses and get it delivered tomorrow.

+ Self-driving cars. They actually do drive themselves. Next year’s Mercedes will have Nvidia’s chips and software.

I loved this conversation with the car:

What’s that building on the left?

The Opera House.

What’s it playing?

Can you get me two tickets in the orchestra tonight?

They’ll be waiting for you at the box office.

(Imagine how this will work for fast-food drive-throughs, like Starbucks, Dunkin’ Donuts, Dairy Queen, etc.)

+ Biotech. Nvidia chips and software are helping discover new drugs in simulation, long before anything has to be tested on humans. You could get a new drug in 18 months, instead of the 5+ years it takes today.

NVDA closed last night at $884, down 9% from its $974 52-week high. It’s still my biggest holding — by far.

I’m tempted to buy even more. I hope it falls 10% today.

I’ve never seen anything like this. Nor have I ever been as excited by a company.

You can watch Jensen here:

You can read a little more in this press release from Nvidia:

‘We Created a Processor for the Generative AI Era,’ NVIDIA CEO Says
Kicking off the biggest GTC conference yet, NVIDIA founder and CEO Jensen Huang unveils NVIDIA Blackwell, NIM microservices, Omniverse Cloud APIs and more.

Click here.

Here’s a story from Tom’s Hardware:

Nvidia’s next-gen AI GPU is 4X faster than Hopper: Blackwell B200 GPU delivers up to 20 petaflops of compute and other massive improvements
The dual-die B200 GPU has 4X the AI training performance and 30X the inference performance of its predecessor.

Click here.

Last night, Richard Grigonis, technology editor of this blog, emailed me:

The whole presentation was mind-blowing.

Interesting that the switching fabric they use is InfiniBand, a heavy-duty, high-bandwidth interconnect fabric that was used in mainframes and high-end fault-resilient PCs. I used to write about it, and they’ve upgraded it tremendously.

The “digital twin” idea is interesting as well. Design something and understand (simulate) it in digital form before you build it or do anything to the existing physical thing. His most extreme example is climate and extreme weather. How can we predict weather at regional scales, at high resolution, to keep people out of harm’s way and save $150 billion a year? The answer is “Earth-2,” a digital twin of the Earth — a detailed simulation — plus a generative AI model called CorrDiff and the FourCastNet weather forecasting model, to accurately predict weather at a 2-kilometer resolution.

For years I said that the only way to accurately predict market movements and advance our knowledge of economic phenomena would be to create a simulation of the whole world economy. Well, soon that will be possible.

Philosophers like Bostrom believe that we are all living in some advanced civilization’s computer simulation of an entire universe! Doesn’t sound so preposterous now.

The NIMs (NVIDIA Inference Microservices), pre-trained AI models you can download and run on a workstation, in the cloud, or in a corporate data center, are also the logical extrapolation of the “AI agent” concept. You end up with a hierarchy of them. If you were starting Telecom Library (the publishing company that Harry started in the 1970s) now, you’d have a team of NIM chatbots directed by a super-AI NIM that would act as your version of Muriel (CFO), Rose (COO) and Gerry (co-CEO and Sales VP).

And applying the digital twin idea to robotics via the Omniverse physical-world model, to help robots learn about “autonomous” systems (like people), is scary. And streaming Omniverse to the Apple Vision Pro is just the first step in developing what will eventually be an actual Star Trek-type holodeck.

And most programmers (and just about everybody else) will be out of a job.

This is too mind-boggling. I have to lie down.

Jensen made some references to my favorite Taylor Swift. He’s clearly become a Swiftie for the AI Generation.

My favorite Taylor Swift song:

This was recorded nine years ago. It’s had 3.4 billion views. That’s just under half the entire world’s population.

I wonder how many Nvidia chips YouTube uses. I bet it’s oodles.

That’s it for me. It’s 3:03 AM. I have to be on the tennis court in less than four hours.

By the way, if you know of some friends who might be interested in my enthusiasms, send them to today’s website. Sign up at the top of the left-hand column. Click here.

We’re doing OK: Wall Street is guessing $1,200 soon.

Nice to get so turned on I stay up half the night. My doctors are most disapproving of my sleep habits. They believe everyone my age (81) needs 7-8 hours of sleep a night.

They’re probably right. But, then, they don’t own Nvidia stock.

See you soon. — Harry Newton