Why build a nuclear power plant with your data center if you could just get power from the grid and drive up everyone else’s price too? It’s cheaper for the data center operator.
Because eventually the grid electricity costs too much. If you consume more than the rest of the community combined, prices will be astronomical. Then it is profitable to build a power plant. If they are generous, they could even sell some electricity back to the community at a “reasonable” price.
Because eventually the grid electricity costs too much
Grid electricity is only a couple hundred million a year per data center. Building a nuclear power plant costs several billion and isn’t free to operate either, so it’s not pure capex; there’s still opex involved.
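As a rough back-of-the-envelope sketch (every number here is an assumption for illustration, not a sourced figure):

```python
# Naive breakeven between staying on the grid and building your own NPP.
# All inputs are illustrative assumptions, not sourced figures.
grid_bill_per_year = 200e6    # "a couple hundred mill" a year on grid power
npp_build_cost = 6e9          # assumed several-billion construction cost
npp_opex_per_year = 100e6     # assumed running cost of the plant once built

annual_saving = grid_bill_per_year - npp_opex_per_year
breakeven_years = npp_build_cost / annual_saving
print(f"Naive breakeven: ~{breakeven_years:.0f} years")  # ~60 years with these inputs
```

Even with fairly generous assumptions, the plant only pays for itself over decades, which is the payback-horizon problem discussed below.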
If you consume more than the rest of the community combined, prices will be astronomical
Prices rise for everyone, so it becomes the community’s problem as much as the data center’s problem. The US in particular has three grids, so in reality the “community” is either the Western Interconnection, the Eastern Interconnection, or Texas (ERCOT).
Then it is profitable to build a power plant.
Profitable over a decade or more, maybe. The data center isn’t guaranteed to be in operation for that long. You know those ~30-40k USD “graphics” cards they use? The ones a single AI data center would likely have tens of thousands of, often even around 100k? They’re typically used for about 3 years, often less. They become obsolete in that timeframe, unable to compete with newer products in either raw performance or efficiency. That’s up to 3-4 billion dollars of GPUs every 3 years or less, per data center. It would only take a small economic downturn, or people seriously realizing that this bubble will eventually have to pop, and they’ll have to stop running these data centers.
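Putting rough numbers on that turnover (the unit price and fleet size are the figures just mentioned; the annualized split is my own arithmetic):

```python
# GPU fleet turnover cost using the figures above.
gpu_unit_price = 35_000       # roughly mid-range of the ~30-40k USD estimate
fleet_size = 100_000          # large AI data center, per the estimate above
useful_life_years = 3         # assumed replacement cycle

fleet_cost = gpu_unit_price * fleet_size          # ~3.5 billion USD up front
annual_writeoff = fleet_cost / useful_life_years  # ~1.2 billion USD per year
print(f"Fleet: ${fleet_cost / 1e9:.1f}B, annualized: ${annual_writeoff / 1e9:.2f}B/yr")
```

That annual hardware write-off alone is in the same ballpark as the yearly grid bill, which is why a multi-decade power plant investment is a hard sell.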
Nuclear power plants also usually take many years to complete; it took the Finns nearly two decades to get Olkiluoto 3 running. Data centers need to be ready in a few years, because in 5 years the AI craze could be over and they’ll no longer be needed.
AI companies ain’t gonna do shit for electricity generation if they’re not forced to.
In my country, joining the grid or upgrading your circuit breaker has a one-time amperage-based fee (assuming you’re close to the substation, otherwise it gets more expensive). I propose that for companies looking to consume huge amounts of electricity, there should also be a mandatory generation capacity increase fee that could be paid out to a nearby municipal power company that then uses it to build more power plants, or to some level of local government that could then sponsor building a power plant or 10.
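As a sketch of how such a fee could work (the threshold, rate, and function name are all hypothetical; this is not an existing tariff anywhere):

```python
# Hypothetical generation-capacity fee for very large new grid connections.
# The threshold and rate are invented for illustration only.
def generation_capacity_fee(requested_mw: float,
                            threshold_mw: float = 50.0,
                            fee_per_mw: float = 1_000_000.0) -> float:
    """One-time fee on connection requests above a demand threshold,
    earmarked for a municipal utility or local government to build
    matching generation capacity."""
    excess_mw = max(0.0, requested_mw - threshold_mw)
    return excess_mw * fee_per_mw

# A 500 MW data center connection would owe (500 - 50) MW * $1M/MW = $450M.
print(generation_capacity_fee(500.0))  # 450000000.0
```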
Edit: Whoever downvoted me must think that data center operators are going to do anything out of the good of their hearts lol
People on lemmy downvote you just for disagreeing ALL the time, even if you make (as you just did) an informed and thoughtful reply. It’s honestly just as bad as Reddit with the downvote shit