The time to buy NVDA was 12-24 months ago. The valuation is totally unsustainable, and this has massive bubble written all over it.
Disclosure: I don't hold material amounts of NVDA.
We've done internal modelling on AI demand, and I can see paths in which global demand for compute expands enough to justify the current valuation.
Buy Nvidia? Depends on what is already in your portfolio. However, consider: Huang has understood 21st-century computing for at least 30 years. He is getting physical because he understands that the Word was not the beginning. He will go after the edge and robotics. He is rethinking the datacenter and the laptop. Others will make some progress, but it will be on the margins. He knows his business is not selling chips to LM vendors. LMs delight me to no end, but they are a tiny step. Nvidia knows that, so, repeating myself for emphasis: Physical. Robots. Edge. AI factories.
Yeah - I can see that. Nathan has some numbers on our estimates for compute growth. They are wild.
We aren't sharing them right now.
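(For flavor, here's a toy sketch of the kind of back-of-envelope check such modelling might start from. Every number below is a placeholder invented for illustration; this is emphatically not the internal model mentioned above.)

```python
# Toy sketch: can compute demand "grow into" a given valuation?
# All numbers are invented placeholders, not anyone's actual model.

revenue_now = 120e9     # hypothetical current annual revenue, USD
valuation = 3.5e12      # hypothetical market cap, USD
fair_multiple = 8       # assumed sustainable price/sales multiple
horizon_years = 10

# Revenue needed for the valuation to look "normal" at the horizon.
required_revenue = valuation / fair_multiple
required_cagr = (required_revenue / revenue_now) ** (1 / horizon_years) - 1
print(f"required revenue: ${required_revenue / 1e9:.0f}B")
print(f"required CAGR over {horizon_years} years: {required_cagr:.1%}")

# Compare against a few assumed compute-demand growth paths.
for cagr in (0.10, 0.20, 0.30):
    projected = revenue_now * (1 + cagr) ** horizon_years
    verdict = "meets" if projected >= required_revenue else "misses"
    print(f"at {cagr:.0%}/yr: ${projected / 1e9:.0f}B ({verdict})")
```

The interesting fights are all in the inputs, of course: what multiple counts as fair, what horizon, and how compute demand actually maps to one vendor's revenue.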
I think those who use compute, which I believe includes edge, to examine the physical world as well as web scrapings, will make leaps. Nvidia has shown that orientation for a while. A podcast called "Acquired" has done three episodes on Nvidia over the years. In '97, Nvidia designed and tested the RIVA 128 in software. Nvidia was developer-centric from day one. The name 'CUDA' captures their architecture (Compute Unified Device Architecture). While their competition is fighting the last war, Nvidia is building billion-dollar industries that don't yet exist. Acquired added an interview with Huang last fall: https://www.youtube.com/watch?v=y6NfxiemvHg
As an example of how and where estimates might go, here's some back-of-the-envelope thinking on computing capacity/density, and what to do with it 'out in the real world', from C. Stross in 2012. (Yes, science fiction, but still a useful way to begin to explore the territory...)
"So. What sort of applications can we imagine that might result from this trend?
Let's look at London, a fairly typical large capital city. London has a surface area of approximately 1570 square kilometres, and around 7.5 million inhabitants (not counting outlying commuter towns). Let us assume that our hypothetical low-power processor costs 10 euro cents per unit, in large volumes. To cover London in CPUs roughly as powerful as the brains of the Android tablet I'm reading this talk from, to a density of one per square metre, should therefore cost around €150M in 2040, or €20 per citizen.
To put this in perspective, in 2007 it was estimated that councils in London spent around £150M, or €190M, per year on removing chewing gum from the city streets.
So for the cost of removing chewing gum, a city in 2030 will be able to give every square metre of its streets the processing power of a 2012 tablet computer (or a 2002 workstation, or a 1982 supercomputer)."
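(The arithmetic in the quote holds up; here's a quick sanity check of the figures exactly as stated: 1570 square kilometres, one CPU per square metre, 10 euro cents per unit.)

```python
# Sanity-check the back-of-envelope numbers from the Stross quote.
area_km2 = 1570           # London's surface area
population = 7.5e6        # inhabitants, excluding commuter towns
unit_cost_eur = 0.10      # hypothetical low-power CPU, in volume

cpus = area_km2 * 1e6     # one CPU per square metre (1 km^2 = 1e6 m^2)
total_eur = cpus * unit_cost_eur
print(f"CPUs needed: {cpus:.2e}")                        # 1.57e+09
print(f"total cost: EUR {total_eur / 1e6:.0f}M")         # ~EUR 157M
print(f"per citizen: EUR {total_eur / population:.0f}")  # ~EUR 21
```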
After suggesting some scenarios (here http://www.antipope.org/charlie/blog-static/2012/08/how-low-power-can-you-go.html if you're curious), he concludes with:
"Finally, I'd like to leave you with this question: what socially beneficial uses can you think of for a billion loosely coupled, low power microprocessors and their associated sensors? Because in 20 years time, buying and deploying such a network will be cheap enough for city planners to consider it routine. The logical end-point of Moore's Law and Koomey's Law is a computer for every square metre of land area on this planet — within our lifetimes. And, speaking as a science fiction writer, trying to get my head around the implications of this technology for our lives is giving me a headache. We've lived through the personal computing revolution, and the internet, and now the advent of convergent wireless devices — smartphones and tablets. Ubiquitous programmable sensors will, I think, be the next big step, and I wouldn't be surprised if their impact is as big as all the earlier computing technologies combined."
No, as I don't buy stock, but picks and shovels for the gold rush/tulipmania sounds about right. CUDA is vendor lock-in at present. You can also see Nvidia nerfing gaming GPUs to keep VRAM for the datacenter; that is not going to go well either. I don't think it's a bubble yet; there is latent demand. But how long the demand lasts depends on how many people actually want to train a foundation model. TSMC/Taiwan is the wildcard here, I reckon.
Measured observations.
I have NVDA in my portfolio and I'm not selling, so yes, that makes me a buyer. (I'm not taking the gain to invest elsewhere for a better return, taxes be damned.)
Off the cuff: near term it's a bubble (and good luck to those trying to navigate it directly...). As for whether it develops into a proper boom, the relevant broad technological parallel might be 'electrification' more so than railroads, with Nvidia mapping to something like electric motors (?), which might lead to casting Microsoft as General Electric (?), etc.
Is there an opportunity for them to vertically integrate in biotech using the data from Illumina? Another thing I've been thinking about: 40% of the revenue is actually inference. Please correct me if I'm wrong, but I believe they said this in the last conference call. If the edge moves to proprietary chips like Apple's and Qualcomm's, is that the biggest risk? Training will grow significantly because of the move to physical and the order-of-magnitude increase in data rates.