Following Nvidia's (NVDA) incredible quarter and strong guidance raise, we think shares of the AI-chip powerhouse can increase another 14% from their record highs in the next six to nine months. That's generally our time horizon for a Club price target, which we're boosting on Nvidia to $450 per share from $300. We're keeping our 2 rating on the stock, which indicates we would want to wait for a pullback before buying more. No kidding, right? Nvidia closed Wednesday at $305 per share ahead of the amazing after-the-bell financials that pushed shares up as much as nearly 29% to Thursday's all-time intraday high of $394.80 each. It's almost in the $1 trillion market cap club. Jim Cramer, a supporter of Nvidia since at least 2017, recently designated it the Club's second own-it-don't-trade-it stock. (Apple was the first.) Jim even renamed his dog Nvidia. Our new $450-per-share price target on Nvidia is about 45 times full-year fiscal 2025 (or calendar year 2024) earnings estimates. Nvidia has an unusual financial calendar and on Wednesday evening reported results for its fiscal 2024 first quarter. While 45 times isn't cheap on a valuation basis, at a little over two times the current valuation of the S&P 500, it's only slightly above the 40 times average valuation that investors have placed on the stock over the past five years. In our view, it's more than justified when factoring in the runway for growth that Nvidia has in front of it. That's what we are seeing Thursday: this latest round of upward estimate revisions serves as a reminder that Nvidia, more often than not, has proven cheaper (or more valuable) than initially believed. Analysts have been consistently too conservative about the potential of Nvidia's disruptive nature, which is now on full display as the company stands as the undisputed leader in chips that run artificial intelligence.
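The price-target math above is straightforward multiple-times-earnings arithmetic. As a minimal sketch, using only the numbers in this article (a $450 target at roughly 45 times fiscal 2025 earnings; the helper function is ours, not an official model), the target implies analysts expect roughly $10 in earnings per share:

```python
# Back-of-the-envelope check of the price-target arithmetic.
# The $450 target and ~45x multiple come from the article; the
# helper below is purely illustrative, not a valuation model.
def implied_eps(price_target: float, pe_multiple: float) -> float:
    """Earnings per share implied by a target price and P/E multiple."""
    return price_target / pe_multiple

eps = implied_eps(450, 45)
print(f"Implied fiscal 2025 EPS: ${eps:.2f}")  # Implied fiscal 2025 EPS: $10.00
```

The same identity runs the other way: if estimates rise, the same 45x multiple supports a higher target, which is why upward estimate revisions matter so much here.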
NVDA 5Y mountain Nvidia’s 5-year performance Jim has been singing the praises of Nvidia CEO Jensen Huang for years — not to mention covering many of the graphics processing unit (GPU) technologies already in place that enabled the company to capitalize on the explosion of AI into the consumer consciousness when ChatGPT went viral this year. On the post-earnings call Wednesday evening, management made clear that they see things getting even better later this calendar year. While they don’t officially release guidance beyond the current quarter, the team said that demand for generative AI and large language models has extended “our Data Center visibility out a few quarters and we have procured substantially higher supply for the second half of the year.” Put simply, management appears to be indicating that earnings in the second half of the year stand to be even greater than in the first half. The demand they’re talking about is broad-based, coming from consumer internet companies, cloud service providers, enterprise customers, and even AI-based start-ups. Keep in mind, Nvidia’s first-ever data-center central processing unit (CPU) is coming out later this year, with management noting that “at this week’s International Supercomputing Conference in Germany, the University of Bristol announced a new supercomputer based on the Nvidia Grace CPU Superchip, which is six times more energy-efficient than the previous supercomputer.” Energy efficiency is a major selling point. As we saw in 2022, energy represents a large input cost when operating a data center, so anything that can be done to reduce that cost is going to be highly attractive to customers looking to enhance their own profitability. The Omniverse Cloud is also on track to be available in the second half of the year. 
At a higher level, management spoke on the call about the need for the world’s data centers to go through a significant upgrade cycle in order to handle the computing demands of generative AI applications, such as OpenAI’s ChatGPT. (Microsoft, also a Club name, is a major backer of OpenAI and uses the start-up’s tech to power its new AI-enhanced Bing search engine.) “The entire world’s data centers are moving towards accelerated computing,” Huang said Wednesday evening. That’s $1 trillion worth of data center infrastructure that needs to be revamped as it’s nearly entirely CPU based, which as Huang noted means “it’s basically unaccelerated.” However, with generative AI clearly becoming a new standard and GPU-based accelerated computing being so much more energy efficient than unaccelerated CPU-based computing, data center budgets will, as Huang put it, need to shift “very dramatically towards accelerated computing and you’re seeing that now.” As noted in our guide to how the semiconductor industry works, the CPU is basically the brains of a computer, responsible for retrieving instructions/inputs, decoding those instructions, and sending them along in order to have an operation carried out to deliver the desired result. GPUs, on the other hand, are more specialized and are good at taking on many tasks at once. Whereas a CPU will process data sequentially, a GPU will break down a complex problem into many small tasks and perform them at once. Huang went on to say that, as we move forward, the capital expenditure budgets coming from data center customers are going to be focused heavily on generative AI and accelerated computing infrastructure. So, over the next five to 10 years, we stand to see what is now roughly $1 trillion (and growing) worth of data center budgets shift very heavily in Nvidia’s favor as cloud providers look to the company for accelerated computing solutions. In the end, it’s simple, really: all roads lead to Nvidia. 
Any company of note is migrating workloads to the cloud — be it Amazon Web Services (AWS), Microsoft’s Azure or Google Cloud — and the cloud providers all rely on Nvidia to support their products. Why Nvidia? Huang noted on the call that Nvidia’s value proposition, at its core, is that it’s the lowest total cost of ownership solution. Nvidia excels in several areas that make that so. It offers a full-stack data center solution. It’s not just about having the best chips; it’s also about engineering and optimizing software solutions that allow users to maximize their use of the hardware. In fact, on the conference call, Huang called out a networking stack called DOCA and an acceleration library called Magnum IO, commenting that “these two pieces of software are some of the crown jewels of our company.” He added, “Nobody ever talks about it because it’s hard to understand but it makes it possible for us to connect tens of thousands of GPUs.” It’s not just about a single chip: Nvidia excels at maximizing the architecture of the entire data center — the way it’s built from the ground up with all parts working in unison. As Huang put it, “it’s another way of thinking that the computer is the data center or the data center is the computer. It’s not the chip. It’s the data center and it’s never happened like this before, and in this particular environment, your networking operating system, your distributed computing engines, your understanding of the architecture of the networking gear, the switches, and the computing systems, the computing fabric, that entire system is your computer, and that’s what you’re trying to operate, and so in order to get the best performance, you have to understand full stack, you have to understand data center scale, and that’s what accelerated computing is.” Utilization is another major component of Nvidia’s competitive edge. 
As Huang noted, a data center that can do only one thing, even if it can do it incredibly fast, is going to be underutilized. Nvidia’s “universal GPU,” however, is capable of doing many things — again, back to those massive software libraries — providing for much higher utilization rates. Lastly, there’s the company’s data center expertise. On the call, Huang discussed the issues that can arise when building out a data center, noting that for some, a buildout could take up to a year. Nvidia, on the other hand, has managed to perfect the process. Instead of months or a year, he said Nvidia can measure its delivery times in weeks. That’s a major selling point for customers constantly looking to remain on the cutting edge of technology, especially as we enter this new age of AI with so much market share now up for grabs. Bottom line As we look to the future, it’s important to be mindful that while ChatGPT was an eye-opening moment, or an “iPhone moment” as Huang has put it, we’re only at the very beginning. The excitement over ChatGPT isn’t so much about what it can already do but more about it being something of a proof of concept of what is possible. The first-generation iPhone, released 16 years ago as of next month, was nowhere near what we have today. But it showed people what a smartphone could really be. What we have now, to extend the metaphor, is the original first-generation iPhone. If you are going to own, not trade, Nvidia as we plan to, you have to — as impressive as generative AI applications are already — think less about what we have now and more about what this technology will be capable of when we get to the “iPhone 14 versions” of generative AI. That is the really exciting (and somewhat scary) reason to hold on to shares of this AI-enabling juggernaut. (Jim Cramer’s Charitable Trust is long NVDA, MSFT, AMZN, AAPL, GOOGL. See here for a full list of the stocks.) 
As a subscriber to the CNBC Investing Club with Jim Cramer, you will receive a trade alert before Jim makes a trade. Jim waits 45 minutes after sending a trade alert before buying or selling a stock in his charitable trust’s portfolio. If Jim has talked about a stock on CNBC TV, he waits 72 hours after issuing the trade alert before executing the trade. THE ABOVE INVESTING CLUB INFORMATION IS SUBJECT TO OUR TERMS AND CONDITIONS AND PRIVACY POLICY , TOGETHER WITH OUR DISCLAIMER . NO FIDUCIARY OBLIGATION OR DUTY EXISTS, OR IS CREATED, BY VIRTUE OF YOUR RECEIPT OF ANY INFORMATION PROVIDED IN CONNECTION WITH THE INVESTING CLUB. NO SPECIFIC OUTCOME OR PROFIT IS GUARANTEED.
Nvidia CEO Jensen Huang wearing his usual leather jacket.