I think the idea is that this is sped up significantly.
Also, in real life, there's not a human twisting the dials for "more steam" or "more reactor", that's undoubtedly handled with PIC controllers and software. Humans are just keeping an eye on things and running checklists when the software doesn't respond properly.
Also, most nuclear reactors are run at near full capacity, and load variance is dealt with by other generation types. The reason is that the cost base isn't really fuel but construction and operation, which are essentially fixed, so regardless of energy pricing the reactor is just run at its peak stable output all the time.
This is a process very few nuclear power stations go through: they get started and then stay running for enormous lengths of time, stopping only for refueling or other maintenance.
Also, further down this thread there's an NE talking about how modern plants basically self-regulate based on load, so the reactor itself isn't throttled manually at all, but is instead moderated by cooling water temperature in a natural equilibrium.
However, while a 1 GW nuclear power plant only burns ~$6,000 an hour in fuel, that still adds up. So curtailing nuclear in favor of solar/wind with zero fuel costs is still a net gain, assuming it's not going to cause other issues. Further, the massive build-out of these sources is becoming a serious issue for the nuclear industry.
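To spell out that figure (a rough sketch; the ~$6/MWh fuel cost is just an assumed ballpark, not a number from any particular plant):

```python
# Back-of-the-envelope fuel cost for a 1 GW nuclear plant running flat out.
# The ~$6/MWh fuel cost is an assumed ballpark figure, not a measured number.
plant_output_mw = 1_000            # 1 GW
fuel_cost_per_mwh = 6.0            # assumed $/MWh for fuel

hourly_fuel_cost = plant_output_mw * fuel_cost_per_mwh    # ~$6,000/hour
annual_fuel_cost = hourly_fuel_cost * 24 * 365            # ~$52.6M/year

print(f"Fuel cost per hour: ${hourly_fuel_cost:,.0f}")
print(f"Fuel cost per year: ${annual_fuel_cost:,.0f}")
```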
Solar/wind don't have fuel costs, but they do have maintenance costs, they tend to get replaced fairly often relative to nuclear installations, and none of the above lasts forever. A modern wind turbine is rated for 20 years, but many installations are repowered at half that in favor of installing larger turbines instead of keeping existing turbines that produce less power. Grid-scale solar is in much the same situation: the panels themselves are rated for 25-ish years but are replaced early in favor of something newer and more productive.
The reason for the early replacement is simple, really: land costs a lot of money, and not all land is suitable for solar/wind, so you can't expand your solar/wind farm past a certain point. It therefore makes sense to move to newer, better generation hardware as cost/subsidy availability allows.
Hence you really have to look at total cost over time instead of just static operating costs, or post-subsidy costs, or just cost/MWh.
What you'll find is that solar/wind are indeed pretty cheap, but they're also only intermittently available, so you need base load, and if your goal is clean power you need clean base load. For that job there's nothing even close to nuclear.
Really what we should be doing is yanking 100% of the future oil/gas/coal subsidies and shoving them at standardized reactor/plant designs and uranium mining. Then allow that standardized design to bypass existing nuclear regulators, and spin up new regulators and regulations to fast track those standardized designs being rolled out. That's more or less what France did, and it's served them well so far.
We can probably run with that model until fusion power is a reality, or we find a way to make solar panels far more efficient (commercial panels run around 20% today, multi-junction lab cells have reached ~40%, and the theoretical ceiling for multi-junction concentrator designs is in the 80%+ range, so there's room left to grow), or some other major breakthrough.
First, base load isn't a benefit, it's a downside. The ideal generation has a low fixed cost per kWh and flexible output; right now that's hydro, where you get a limited number of kWh per month but a lot of flexibility about when in the month you produce them. Natural gas turbines fill the same niche at a higher cost per kWh, which opens the door for "base load" generation as long as it's cheap enough.
What's really scaring nuclear is battery-backed solar, as ~5 year construction timelines + a 50 year lifespan mean nuclear needs to compete not just with today's low prices but with solar and battery prices from 5 and 55 years from now. Battery-backed solar is lower risk and similar ROI today, but projecting forward, things keep getting worse for nuclear.
> Grid-scale solar is in much the same situation: the panels themselves are rated for 25-ish years but are replaced early in favor of something newer and more productive.
That never really happens for grid-scale installations. Land costs are basically negligible, unlike roof space, which is more limited.
1 acre of solar farm in a decent location generates ~500,000 kWh per year. Ex: https://en.wikipedia.org/wiki/Springbok_Solar_Farm At even extreme land costs of, say, $1,000/acre per year you're only adding about $0.002/kWh, i.e. two tenths of one cent per kWh.
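A quick sanity check on that land-cost adder, using the numbers above (both the yield and the rent are the assumed figures from this comment, not measured values):

```python
# Land-cost contribution per kWh at the figures assumed above.
annual_yield_kwh_per_acre = 500_000     # assumed yield for a decent site
land_cost_per_acre_year = 1_000.0       # deliberately extreme rent, $/acre/year

adder_per_kwh = land_cost_per_acre_year / annual_yield_kwh_per_acre
print(f"Land cost adder: ${adder_per_kwh:.4f}/kWh "
      f"(~{adder_per_kwh * 100:.1f} cents/kWh)")   # $0.0020/kWh, i.e. 0.2 cents
```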
Old solar farms are almost pure profit. There are literally 30-year-old solar panels still in operation and little reason to replace them any time soon. Sure, old panels are lower efficiency, but not low enough to really matter here.
I mean, if you can have gigawatts of base load online inside of 5 years, that sounds like a good model to me. We're more than 5 years away from viable grid-scale batteries for the sole reason that you have to manufacture them, and currently we aren't even close to having enough capacity to do so. Current battery deployments are like 80% efficient, and they're not designed to run indefinitely, usually only being used for peak loads, and rarely for more than 4 hours at a time. If you have 12+ hours of darkness, well, you're going to need a lot more batteries than you think.
Also, Springbok is a highly ideal site in the western Mojave, and its capacity factor is only 31%, so it's not really generating its nameplate power very often. It's also 1,400 acres of land used, which isn't trivial. Granted, it's mostly worthless desert land in this case, but that won't be the case everywhere. Sure, the up-front cost/MWh of PV is much lower than nuclear ($40/MWh vs $82/MWh), but you also need 12+ hours worth of batteries, which can easily place your total costs at roughly double what nuclear is, which is why you see exactly zero installations operating like that. In your scenario with no base load but batteries, they'd have to. Batteries also piss away roughly 20% of the energy you put into them, so you need higher capacity and more generation to cover the losses.
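Spelling out the overnight sizing implied here (the 12 dark hours and the 80% round-trip efficiency are this comment's own assumptions, not measured values):

```python
# Overnight battery sizing implied by the assumptions in this comment.
load_gw = 1.0              # demand to carry through the night
dark_hours = 12            # assumed hours with no solar output
round_trip_eff = 0.80      # assumed AC round-trip efficiency

delivered_gwh = load_gw * dark_hours             # 12 GWh out of the battery
charged_gwh = delivered_gwh / round_trip_eff     # 15 GWh of generation to fill it

print(f"Battery must deliver {delivered_gwh:.0f} GWh; "
      f"charging it takes {charged_gwh:.0f} GWh of generation "
      f"({charged_gwh / delivered_gwh - 1:.0%} extra)")
```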
It seems that nuclear is mostly curtailed by regulation. That's somewhat a scale problem, because every plant is bespoke; were they standardized, there would likely be less need for regulation, which would drive costs and lead times down significantly. Realistically, nuclear is $6/MWh, which is very, very cheap, comparable to pure PV and less than PV with batteries; the problem is building the plant, which is incredibly expensive, and with a high degree of uncertainty as to how expensive exactly. You could very well build a plant and then have to rebuild it because some regulator found something they didn't like that was inherent to your design (one they'd looked at for years by that point). That's a thing that's actually happened.
What you need to remember is that the basic numbers you're given are often wrong for the things they appear to be useful for. Most of what gets quoted are hybridized numbers that attempt to account for a variety of externalities. When you look into what those externalities are, you'll find that they account for costs that are a big deal in a small set of unusual circumstances, and fail to account for things that are a big deal in a large set of common circumstances. The body politic et al. has substantial influence on this, as do corporate interests. Once you filter through all of that, you more or less arrive at the conclusion I've come to: nuclear base load of at least 50%, PV/wind for the rest (mostly PV), 4 hours of battery capacity, and your grid is stable and cheap for the foreseeable future. Whenever fusion power gets cheap, you can probably move to that, which at this rate will probably be around the time the reactors are reaching EOL, given that we out and out refuse to fund it ($15.6B for renewables, $763M for fusion, most of which is slated for ICF with no plans to generate power).
This is a misunderstanding on your part. A 1 GW solar farm at a 31% capacity factor produces 7.44 GWh per day; having "24h" of batteries would therefore be 24 GWh, and thus over 3 days of output.
"4h" * 1 GW actually represents over half of the daily output of a 1 GW solar farm, which is just what you want to provide power for the half of the day when the sun isn't shining. Except even better, because batteries add flexibility to better follow the demand curve.
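A minimal sketch of the numbers in this correction (plant size and capacity factor as stated above):

```python
# Daily output of a 1 GW solar farm at a 31% capacity factor,
# and what "4h" / "24h" of 1 GW batteries mean relative to that.
nameplate_gw = 1.0
capacity_factor = 0.31

daily_output_gwh = nameplate_gw * 24 * capacity_factor      # 7.44 GWh/day
print(f"Daily output: {daily_output_gwh:.2f} GWh")

for battery_hours in (4, 24):
    battery_gwh = nameplate_gw * battery_hours
    print(f"{battery_hours}h battery = {battery_gwh:.0f} GWh "
          f"= {battery_gwh / daily_output_gwh:.1f} days of output")
```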
> Batteries also piss away roughly 20% of the energy you put into them
No, that 80% efficiency number assumes grid > battery > grid, i.e. AC > DC > AC conversion. However, PV is DC, as are batteries, so PV > grid involves one DC > AC conversion, but PV > battery > grid also only needs a single DC > AC conversion, allowing for effective round-trip efficiency in the 90-95% range depending on battery chemistry etc.
Further, if you store ~50% of the output and lose ~5% of that in the round trip, you only need a few percent more solar, not 20%. More importantly, nuclear only has a 70-90% capacity factor, not 100%. Thus roughly 2.3-2.9 GW of battery-backed solar matches 1 GW of nuclear, and solar scales much better due to flexibility.
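Roughly, with the fractions used here (the stored share and the DC-coupled loss are assumed figures):

```python
# Extra generation needed when only the stored fraction sees round-trip losses,
# plus the solar nameplate implied by the capacity factors above.
stored_fraction = 0.50      # assumed share of output routed through the battery
round_trip_loss = 0.05      # assumed DC-coupled round-trip loss

extra_generation = stored_fraction * round_trip_loss
print(f"Extra solar needed: {extra_generation:.1%}")        # ~2.5%, not 20%

solar_cf = 0.31
for nuclear_cf in (0.70, 0.90):
    ratio = nuclear_cf / solar_cf
    print(f"Solar nameplate to match 1 GW nuclear at {nuclear_cf:.0%} CF: {ratio:.1f} GW")
```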
> Realistically, nuclear is $6/MWh
Unsubsidized nuclear is nowhere close to $6/MWh today. A 1 GW reactor produces 1 GW * 24 h * 365 days * 50 years * ~70-90% capacity factor. If you're lucky that's ~400,000 GWh over its lifetime; at $6/MWh that's only 400,000 * 1,000 * 6 = 2.4 billion dollars, not enough to even cover construction costs. If you were thinking $60/MWh, that might cover construction cost + interest assuming construction isn't a disaster, but not a 500-person workforce, insurance, fuel, maintenance, decommissioning, etc.
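The lifetime arithmetic spelled out (nameplate, lifespan, and capacity factors as assumed above):

```python
# Lifetime output and revenue of a 1 GW reactor at the prices discussed above.
nameplate_mw = 1_000
lifetime_years = 50

for capacity_factor in (0.70, 0.90):
    lifetime_mwh = nameplate_mw * 24 * 365 * lifetime_years * capacity_factor
    print(f"CF {capacity_factor:.0%}: {lifetime_mwh / 1e6:,.0f} TWh over the plant's life")
    for price in (6, 60):                       # $/MWh
        revenue = lifetime_mwh * price
        print(f"  at ${price}/MWh -> ${revenue / 1e9:,.1f}B lifetime revenue")
```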
PS: 5 years is seriously unrealistic for nuclear. The US's most recent reactor didn't take 7 years to build, it was 7 years late, and that was an extension to an existing nuclear power plant. Even China's nuclear builds commonly take 7+ years. Ex: Changjiang (2008-2015), Fangchenggang 3 (2015-2022), Fangchenggang 4 (2016, not finished), etc.
Programmable Industrial Controllers. That can include PID, or it can be far smarter or dumber.
The idea behind them is that you have purpose-built, redundant, single-purpose controllers for everything, so swapping one out is no big deal and doesn't require any know-how. You load the config, plug it in, done. Usually they're very simple, doing jobs like "read this sensor(s) every X milliseconds, make adjustments to this output signal according to Y math formula".
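As a rough illustration of the kind of loop being described (a minimal PID sketch against a toy simulated process; the function names, gains, and plant model are hypothetical, not taken from any real control system):

```python
import time

# Toy stand-ins for real sensor/actuator I/O.
process_value = 20.0                      # simulated process state (e.g. a temperature)

def read_sensor() -> float:
    return process_value

def set_output(signal: float) -> None:
    global process_value
    process_value += 0.01 * signal        # crude first-order plant response

def pid_step(setpoint: float, kp: float, ki: float, kd: float, state: dict, dt: float) -> None:
    """One control iteration: read the sensor, compute the PID correction, drive the output."""
    error = setpoint - read_sensor()
    state["integral"] += error * dt
    derivative = (error - state["prev_error"]) / dt
    set_output(kp * error + ki * state["integral"] + kd * derivative)
    state["prev_error"] = error

state = {"integral": 0.0, "prev_error": 0.0}
for _ in range(500):                      # "read this sensor every X milliseconds"
    pid_step(setpoint=80.0, kp=2.0, ki=0.5, kd=0.05, state=state, dt=0.010)
    time.sleep(0.010)
print(f"Process value after control loop: {read_sensor():.1f}")
```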
A cursory search for Programmable Industrial Controller gave me no hits apart from PLCs. Those would typically contain PID controllers. Could a link be provided to a PIC as meant here?
In an effort to relate to their userbase, big G has made their search engine just search for whatever the fuck instead of what it was asked for; sometimes some combination of quotation marks and verbatim mode will trick it into being useful.