
Nvidia CEO Jensen Huang unveiled the AI chipmaker’s highly anticipated new processor on Monday, saying tech giants like Microsoft and Google are already eagerly awaiting its arrival.

Huang made the announcement during the company’s closely watched GPU Technology Conference, or GTC, which has been dubbed the “Woodstock of AI” by employees and analysts alike. The annual conference in San Jose, California, came as Nvidia has led the way in a Wall Street frenzy for AI stocks to start 2024 — blowing past sky-high earnings expectations, becoming the first chipmaker to reach a $2 trillion market cap, and soaring past companies including Amazon to become the third-most valuable company in the world.

Nvidia’s rise has been fueled by its $40,000 H100 chips, which power the so-called large language models needed to run generative AI chatbots like OpenAI’s ChatGPT. The chips are known as GPUs, or graphics processing units.

On Monday, Huang unveiled Nvidia’s next-generation GPU “Blackwell,” named after mathematician David Blackwell, the first Black scholar inducted into the National Academy of Sciences. The Blackwell chip is made up of 208 billion transistors and will be able to handle AI models and queries more quickly than its predecessors, Huang said. The Blackwell chips succeed Nvidia’s hugely in-demand H100 chip, which was named for the computer scientist Grace Hopper. Huang called Hopper “the most advanced GPU in the world in production today.”

“Hopper is fantastic, but we need bigger GPUs,” he said. “So ladies and gentlemen, I’d like to introduce you to a very big GPU.”

Microsoft, Google parent Alphabet, and Oracle are among the tech giants preparing for Blackwell, Huang said. Microsoft and Google are two of Nvidia’s largest customers for its H100 chips.

Nvidia stock was largely flat Monday, but it’s up more than 83% so far this year and more than 241% over the last 12 months.

Read more: Is Nvidia stock in a bubble that will burst? Wall Street can’t make up its mind

During his keynote address at the Nvidia conference Monday, Huang announced new partnerships with computer software makers Cadence, Ansys, and Synopsys. Cadence, Huang said, is building a supercomputer with Nvidia’s GPUs. And Nvidia’s AI foundry is working with SAP, ServiceNow, and Snowflake, Huang said.

Huang also shouted out Dell founder and CEO Michael Dell in the audience, whose company is partnering with Nvidia. Dell is expanding its AI offerings to customers, including new enterprise data storage with Nvidia’s AI infrastructure.

“Every company will need to build AI factories,” Huang said. “And it turns out that Michael is here, and he’s happy to take your order.”

Huang also announced that Nvidia is creating a digital model of the Earth to predict weather patterns with its new generative AI model, CorrDiff, which can generate 12.5 times higher-resolution images than current models.

And Huang said that Nvidia’s computing platform Omniverse now streams to Apple’s Vision Pro headset, and that Chinese EV-maker BYD is adopting Nvidia’s next-generation computer Thor.

Huang wrapped up his two-hour-long keynote accompanied by two robots, the orange and green Star Wars BD droids, which he said are powered by Nvidia’s Jetson computers, and which learned to walk with Nvidia’s Isaac Sim.

Throughout the week at GTC, Nvidia researchers and executives will be joined by power players in the AI industry, including Brad Lightcap, chief operating officer of OpenAI, and Arthur Mensch, chief executive of French OpenAI rival Mistral AI, to deliver sessions on topics ranging from innovation to ethics.

Microsoft and Meta are Nvidia’s largest customers for its H100 chip, with both tech giants spending $9 billion on the chips in 2023. Alphabet, Amazon, and Oracle were also top spenders on the chips last year.

But the frenzy over Nvidia’s H100 chips has prompted worries about shortages, and rivals eager to stay ahead in the AI race have begun building their own versions of the chips. Amazon has worked on two chips called Inferentia and Trainium, while Google has been working on its Tensor Processing Units.
