> By the way, if AI turns out to be a bubble and we’re much further from AGI than Silicon Valley thinks, what lasting value gets built? You could tell a story about how the dot-com bubble paved the way for all the value the internet has generated. What’s the equivalent for this AI buildout?
Now, I have no idea what the hell I'm talking about here, but what about biotech?
It's not doing so hot right now, but my impression is that one of the big problems in biotech is that we don't really understand a lot of biological systems, especially in complicated organisms like humans. So development is heavily weighted towards experimental work, but bio experiments are a huge pain in the ass, and even when you do hit on a process that works, it probably fails half the time you run it. What we really need is the equivalent of Ansys or LTSpice for biology, so you can validate designs in silico and iterate on much faster cycles. Of course, biological systems are much, much more complex, so this would take a lot of compute. I'm not sure how far raw FLOPs get you, though - see neuroscience, where we still can't simulate the 302-neuron worm (C. elegans). The other route is what DeepMind has been doing - learning models of these systems rather than just running physics sims directly - since building the simulations by hand does not seem tractable.
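To make the "lots of FLOPs, still no worm" point concrete, here's a toy sketch of the naive approach: a leaky integrate-and-fire network with 302 point neurons and made-up random connectivity. Everything in it (parameters, weights, inputs) is invented for illustration; a real C. elegans model would need gap junctions, neuromodulation, body mechanics and more, which is exactly why even 302 neurons hasn't been cracked.

```python
# Toy leaky integrate-and-fire network: 302 point neurons with random,
# made-up connectivity. This is NOT a model of C. elegans -- it's the
# strawman "just simulate the neurons" version, to show how little of
# the real problem raw FLOPs capture.
import numpy as np

rng = np.random.default_rng(0)
N = 302                                   # C. elegans neuron count
dt, steps = 0.1, 1000                     # 0.1 ms steps, 100 ms total
tau, v_rest, v_thresh, v_reset = 10.0, -65.0, -50.0, -65.0

# Sparse random weights: ~10% connectivity, arbitrary scale.
W = rng.normal(0.0, 0.2, (N, N)) * (rng.random((N, N)) < 0.1)

v = np.full(N, v_rest)
spikes = np.zeros((steps, N), dtype=bool)

for t in range(steps):
    I_ext = rng.normal(2.0, 0.5, N)                      # stand-in "sensory" drive
    I_syn = W @ spikes[t - 1] if t > 0 else np.zeros(N)  # last step's spikes as input
    v += dt / tau * (v_rest - v) + dt * (I_ext + I_syn)
    fired = v >= v_thresh
    spikes[t] = fired
    v[fired] = v_reset

print(f"mean firing rate: {spikes.mean() / (dt * 1e-3):.1f} Hz per neuron")
```

Even this toy version immediately raises the real questions (what's the synapse model, where does the drive come from, how would you validate any of it), and that's before you add chemistry or a body.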
The GPUs depreciate, yes. But if AI collapses, the useful lifespan of existing GPUs is immediately extended: Nvidia probably stops releasing new chips every 6 months, and there is no longer enough demand to run every chip at 70% utilization 24/7.
If demand for current AI collapses, you have a big population of GPUs that are nearly worthless to their current owners and flood secondary markets all at once, and my impression is that biotech, whether startups or academic research, is already a big part of that secondary market. Right now AI applications are very profitable per watt, so compute is expensive and doesn't get applied to bio much. There are surely other, less profitable compute applications that would benefit too, like climate modelling, but biotech seems like the most important one.
Thinking about who shows up to the liquidation auctions for, e.g., Stargate: most of them will be there to repurpose the GPUs (or hoard them, or something). Unlike past bubbles, there is not much value in scrapping H100s in bulk. My impression is that something similar happened with dark fibre after the dot-com bust.
What problems are there? The low-precision operations AI chips are optimized for may not be so useful for high-precision simulation. Data is definitely still a big problem, though there are areas where we have some, like the UK Biobank. Maybe biology is just really, really irreducible, and compute doesn't substitute for experiments. The chips will still be quite expensive to run even when the only cost is power. And H100s are pretty hard to set up in a university closet, but I don't think that matters much; it looks more like a university department leasing cloud compute from whoever picked up the chips cheaply at the liquidation auction.
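As a quick, hedged illustration of the precision point (toy numbers, not tied to any particular chip): naively accumulating a long sum in fp16-style arithmetic silently stalls where fp64 is fine, which is the kind of thing that bites in long-running scientific sims.

```python
# Toy demo: fp16 accumulation stalls once the running total is large
# relative to the addend's precision, while fp64 sums correctly.
# Pure numpy on CPU, just to show the effect.
import numpy as np

n = 100_000
step = np.float16(1e-4)

acc16 = np.float16(0.0)
for _ in range(n):
    acc16 = np.float16(acc16 + step)   # rounds away once acc16 >> step

acc64 = np.float64(0.0)
for _ in range(n):
    acc64 += np.float64(step)

print(f"fp16 naive accumulation: {acc16:.4f}")   # stalls around 0.25
print(f"fp64 accumulation:       {acc64:.4f}")   # ~10.0
```

Mixed-precision tricks and higher-precision accumulators mitigate this, but that's exactly the kind of extra engineering the repurposed-GPU story would need.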
Could someone with actual expertise tell me why I'm wrong?