
>let me repeat something you've already quoted

>>> the public is on the hook for $150B of loans to be payed by inflationary pricing.

That framing makes even less sense. Even if we grant that capital spending is inflationary, nobody thinks the public is "on the hook" for it or pays for it "by inflationary pricing". If I buy a box of eggs, it probably drives up the price of eggs by some minute amount in the aggregate, but nobody would characterize that as the public being "on the hook" for it, or say the public is paying for it "by inflationary pricing". The same goes if I buy anything else that's supply constrained, like an apartment or a GPU. Seemingly the only difference between those purchases and whatever Micron is doing is that you don't like Micron and/or the AI bubble, whereas you at least tolerate me buying eggs, apartments, or GPUs, so your whole spiel about "payed by inflationary pricing" is just a roundabout way of saying you don't like Micron's/AI companies' spending. I also disagree with people dropping $30K on Hermès handbags, but I wouldn't characterize buying them as "the public is on the hook for $30K to be payed by inflationary pricing".

>The actual truth, with numbers, just for 2024 and Virginia alone:

"actual truth"? That settles it, then.

On a more substantive note, since you clearly haven't bothered to look into either article to examine its methodology, here are the relevant snippets for your convenience:

>Mike Jacobs, a senior energy manager at UCS, uncovered these additional costs by analyzing last year’s filings from utilities in seven PJM states and identifying 130 projects that will connect private data centers directly to the high-voltage transmission system. Over 95% of the projects identified passed all of their transmission connection costs onto local people’s electricity bills, totaling $4.3 billion in costs previously undistinguished from other, more typical expenses to upgrade and maintain the electricity grid.

and

>The Economist has adapted a model of state-level retail electricity prices from the Lawrence Berkeley National Laboratory to include data centres (see chart 2). We find no association between the increase in bills from 2019 to 2024 and data-centre additions. The state with the most new data centres, Virginia, saw bills rise by less than the model projected. The same went for Georgia. In fact, the model found that higher growth in electricity demand came alongside lower bills, reflecting the fact that a larger load lets a grid spread its fixed costs across more bill-payers.

>chart 2: https://www.economist.com/content-assets/images/20251101_USC...

Comparing the two methodologies, the Economist's seems far more reasonable, because the UCS's is basically guaranteed to come up with a positive number. It just counts how much money was spent on connecting datacenters and assumes household users are paying the entire bill. It doesn't account for the different rates/fees paid by retail/household users, or for the possibility that datacenters could be paying more than their "fair share" of costs through other means (e.g. they might require disproportionately less infrastructure to service, but pay the same transmission rates as everyone else).
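To make the difference concrete, here's a toy sketch (all numbers are made up, not taken from either article) of the two approaches: the UCS-style tally sums every connection cost that lands in the shared rate base, while the Economist-style check asks whether bills actually rose faster in states that added more datacenter load.

    # Made-up numbers; neither dataset comes from the articles.

    # UCS-style tally: sum connection costs folded into the shared rate base.
    projects = [
        {"connection_cost": 40e6, "passed_to_ratepayers": True},
        {"connection_cost": 25e6, "passed_to_ratepayers": True},
        {"connection_cost": 10e6, "passed_to_ratepayers": False},
    ]
    ucs_total = sum(p["connection_cost"] for p in projects if p["passed_to_ratepayers"])
    print(f"UCS-style tally: ${ucs_total / 1e6:.0f}M")  # non-negative by construction

    # Economist-style check: regress the change in household bills on
    # datacenter additions across states (hypothetical per-state pairs of
    # GW of new datacenter load and % bill change, 2019-2024).
    states = {"VA": (4.0, 6.0), "GA": (1.5, 7.0), "OH": (0.8, 9.0), "PA": (0.3, 8.5)}
    xs = [gw for gw, _ in states.values()]
    ys = [pct for _, pct in states.values()]
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / n
    var = sum((x - mean_x) ** 2 for x in xs) / n
    slope = cov / var
    print(f"bill change per GW of new datacenter load: {slope:+.2f} pp/GW")
    # With these made-up numbers the slope is negative: states that added more
    # load saw smaller bill increases, because fixed grid costs are spread
    # over more demand. The tally above can never surface that effect.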





>"Eggs"? "Eggs" are the same as "apartment or GPU"? You display all of the comprehension abilities of an LLM... or the mainstream economist "teaching" it.

How did you miss the part about "anything else supply constrained", which was between "eggs" and "apartments"? Or were you too lazy to look up "egg shortage"? Maybe you should check your own reading comprehension before accusing others of poor reading comprehension.

>Semiconductor capex are huge compared to, eh, "eggs", and they must be payed off as part of pricing.

Yet the aggregate demand of all Americans drives the prices of eggs, apartments, and GPUs. Either something drives inflation or it doesn't. Otherwise you're just critiquing Micron's size rather than its behavior.

>Artificial jump in demand, as from hoarding, makes the capex artificially high and the pricing inflationary.

>Also, hoarding semiconductors (private NPUs like TPUs, Trainiums, etc, stocking on hard to obtain GPUs) reduces competition and via renting, the respective services can extract the inflated capex plus high profits.

1. There's no alternate universe where, absent "hoarding", everyone would be running LLMs on their own GPUs at home. The frontier models require multiple 128+GB GPUs to run, and most people only do a few minutes of inference a day. There's no way the economics of buying work out (a rough back-of-envelope sketch follows after point 2).

2. "AI GPUs sitting idle because it doesn’t have enough power to install them" doesn't imply "hoarding", any more than buying PC parts but not building them because the last part hasn't arrived yet isn't "hoarding". It could simply be that they expected that power be available, but due to supply chain issues they aren't. Moreover hoarding GPUs would be a terrible idea, because they're a rapidly depreciating asset. No cloud provider is trying to corner the market on CPUs by buying them up and renting it back to people, for instance, and CPU is a much easier market to corner because the rate of advancement (and thus rate of depreciation) is much lower.



