For anybody who is interested in this stuff, I was the CTO of a Canadian geothermal company, and I became convinced that it was folly to dig 4km down to get hot water when we are surrounded by waste heat in large industrial plants that is largely untapped.
By way of example, I was working on a geothermal plant in Alberta that would generate 5MW and cost $60 million (CDN) to build. Similar heat can be found everywhere at surface, especially in Alberta which has a heavy industrial base. Harnessing the heat from industrial sources is exactly the same as a geothermal plant: only the source of heat is different.
For example, I scoped a project on a natural gas pipeline compressor station. It used an RB211 gas turbine to drive the compressor. Capturing the waste heat from the exhaust stack of that engine gave us 7MW of net generation, at a cost of $30 million, and much lower operating expenses to boot.
58% of the energy we generate each year is lost, much of it (upwards of 20%) as industrial waste heat. That's terajoules of electrical generation potential globally, just from the easily captured stuff.
It's not exactly like geothermal energy, because you have counterparty risk in that the cement factory (or whatever) might close. But the tech is identical to geothermal, and there's no exploration risk or digging expense.
I'm surprised that the waste heat is enough to drive a turbine and generate electricity! How much waste heat is actually that hot? Combined cycle natural gas turbines already do this: natural gas combustion drives a turbine, then the "waste heat" drives a steam cycle. Even though that steam cycle is less than 50% efficient, I'm guessing the heat wasted there can't be used, right?
How high of a COP would it take to make a heat pump a feasible way to make lower temperature waste heat a source? Or is this where the thermodynamics breaks down?
You won’t be able to take waste heat economically from a power plant; they’re generally designed to capture more and more heat until diminishing returns take over.
There are a lot of other industrial processes though that don’t do any heat capturing.
Something left out though is that you might have lower capital costs but if you try tacking a power plant on somebody else’s factory, they’re going to want a cut.
Efficiency is dictated by temperature differences, you get a bigger cut for less investment when the temperature difference between heat source and environment is largest.
I once came across a plant in Quebec that was only "profitable" because it had a huge discount on the rate it paid for electricity, only to later generate electricity from its waste heat and sell it back to the government at a highly inflated rate. The government was happy to do this so that the plant wouldn't shut down and workers wouldn't lose their jobs.
What if you just sold the equipment (+ setup services) to the plants directly? They could turn around and reuse the recovered energy themselves, or sell it on down the line, but there would be no coordination/mixed-incentives problem.
Factories aren't in the business of generating electricity. They were already willing to throw that heat away; why take a bunch of capital that you could invest in the business and use it to buy equipment unrelated to your core operations, when someone else will install and manage the waste-heat system for you and give you a 10% cut for no risk and no outlay on your end?
Probably best to sell a kit: a few standard designs, the equipment, project management, and financing (so the plant avoids capital costs), with a deal that shares in the income but also carries contractual obligations on how long the system keeps running.
Then you're the one taking the cut, still an additional cost that needs to be considered when doing a comparison with digging a hole for geothermal energy.
>How high of a COP would it take to make a heat pump a feasible way to make lower temperature waste heat a source? Or is this where the thermodynamics breaks down?
Yeah, there's no free lunch. You could use the low-temperature waste heat to pre-heat the working fluid for a combustion-based system, but using a heat pump to increase the temperature will cancel out the efficiency gain from having a higher temperature for the heat engine.
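A rough sketch of why even the ideal case gains nothing (the temperatures are illustrative assumptions, and both machines are treated as reversible Carnot devices):

```python
# Compare two ways of using waste heat Q available at a low temperature Tw:
#   A) run a Carnot engine directly between Tw and ambient T0
#   B) use a reversible heat pump to lift Q from Tw to Th, then run a
#      Carnot engine between Th and T0
# With ideal machines both paths give identical net work, so a real
# (irreversible) heat pump can only make things worse.
T0 = 293.0   # K, ambient (assumed)
Tw = 350.0   # K, low-grade waste heat (assumed)
Th = 500.0   # K, "upgraded" temperature after the heat pump (assumed)
Q  = 1.0     # one unit of heat available at Tw

w_direct = Q * (1 - T0 / Tw)          # path A

Qh = Q * Th / Tw                      # reversible pump: Qh/Th = Q/Tw
w_pump = Qh - Q                       # work consumed by the heat pump
w_engine = Qh * (1 - T0 / Th)         # work produced by the engine
w_net = w_engine - w_pump             # path B

print(w_direct, w_net)                # both ~0.163 units of work
```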
I feel like it completely misses the point of clean energy. In the future your goal should really be to power those industrial plants 100%. Geothermal can do that. It's basically free energy, and you are just redirecting the heat to do something else in the process, like whatever industrial processes they're doing, before it all gets emitted as heat anyway.
Harvesting an industrial plant's own waste heat isn't going to power the industrial plant, and thus, you still make them rely on burning some other fuel that would otherwise not have to be burned.
Recovering energy from industrial waste heat would still be a good idea, regardless of the source of the outside power. The only question is if there is a different strategy to improve efficiency that would have a better cost/benefit ratio.
There may just be no other reasonable way to recover energy from some processes, and in that case building a heat recovery power plant is probably a good idea.
I meant free in the natural sense, as in, by using geothermal energy you are not creating any more heat than would have been naturally released anyway -- you're just re-routing and extending the pathway for the same exact amount of heat to be released and getting some work done in the process.
As such, in some sense, with geothermal energy there is no notion of "wasted" power as far as the environment goes. Even if you leave your heaters on unnecessarily, you're just letting the same heat out from your living room instead of letting the same amount of heat escape directly from a fumarole. The heat will have been released either way, and whether you leave your heater on or not only changes where the heat gets released from.
Whereas if you were using, say, nuclear power, leaving your heaters on means that you are wasting nuclear fuel and generating heat that wouldn't have been released had you turned your heaters off.
You can only afford to do that if you manage to improve the efficiencies of those plants, which means scavenging or reusing BTUs that escape out the back end.
Yeah, it's unclear whether those 5 and 7MW figures quoted are electricity production or just heat energy.
Not to imply that "just heat energy" is useless -- https://en.wikipedia.org/wiki/District_heating is the term for using waste industrial heat to heat up homes in the surrounding areas. More common in Europe and Canada than the US I believe.
Cogeneration is when the concept is used specifically by electricity plants. It's used mostly in places that have need to generate their own electricity and heating like college and hospital campuses -- by harvesting the waste heat for another localized purpose, it becomes cheaper than relying solely on the grid.
A gas turbine has about 25-30MW thermal in the exhaust gases; typically the gas is at 480C and 90kg/s. At those temperatures we can convert that to electricity at about 20% efficiency. We lose about a megawatt in parasitic load such as pumps.
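For anyone who wants to sanity-check those figures, a rough version of the arithmetic (the exhaust specific heat and the assumed stack-out temperature are my own assumptions, not numbers from the comment above):

```python
# Heat recoverable from a gas turbine exhaust: Q = m_dot * cp * (T_in - T_out)
m_dot = 90.0    # kg/s exhaust mass flow (from the comment)
cp    = 1.1     # kJ/(kg*K), approximate exhaust-gas specific heat (assumed)
t_in  = 480.0   # C, exhaust temperature (from the comment)
t_out = 170.0   # C, assumed stack temperature after heat recovery

q_thermal = m_dot * cp * (t_in - t_out) / 1000.0   # MW thermal
p_gross   = 0.20 * q_thermal                       # ~20% conversion to electricity
p_net     = p_gross - 1.0                          # minus ~1 MW parasitic load

print(f"~{q_thermal:.0f} MW thermal, ~{p_net:.1f} MW net electric")
```

That lands in the 25-30 MW thermal, single-digit-MW electric ballpark described above.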
Thanks for clarifying and giving the conversion numbers!
Something like district heating has much higher infrastructure and coordination costs, but compared to an electricity conversion of only 20%, it's interesting to think about how other applications could have potentially higher efficiencies from that same waste stream.
Anyone know if there are examples of industrial parks that have been designed with waste stream "stacking" in mind? Instead of converting exhaust to electricity at 20% efficiency, could that power production waste heat be sent to something like a kiln or a chemical producer that needs heat as input?
All geothermal projects in Canada will require a direct heat off-take to be viable. In fact the revenue from direct heat use will generally be more than electricity sales. The 5MW plant in Alberta I referenced was built around a proposed industrial park, with the idea being to create a district heat system that upgraded the heat in some facilities and donated it at other facilities.
The best use of heat is as heat, without converting it. The problem is that heat doesn't travel very well, 5-8 km is really the practical limit before insulation costs swamp revenues. And a lot of industrial heat is too far away from a viable user of that heat. If the facility can use the heat internally their process engineers have already incorporated that (through pinch analysis etc).
We have one project now where we're replacing a heat exchanger that is supplying process heat from the exhaust with a power plant, but then using the waste heat from the power plant (still 80% of it left!) to replace the process heat. Overall efficiency will be around 60%.
I believe I’ve heard that a lot of plants only bother recycling heat from material streams that are above 600F.
There are so many things you could run off of even 200F, like drying systems or counterflow heat exchangers. It may be that materials and handling limitations as you go up in temps outweigh the effort to extract.
That big industry that you talk about needs to actually switch to clean energy sources as well. It needs to happen for environmental reasons. But the reasons it is actually happening are more related to cost. There are a few aluminium and steel plants coming online running on hydrogen recently. With clean sources for hydrogen coming online as well, that puts us on a track to cleaning up energy in heavy industries. Right now natural gas is still cheaper but that may not last that long as prices for wind and solar continue to trend down and as there are ongoing improvements in cost of producing hydrogen. Geothermal could be part of the solution for that. But the price has to be right and I have a hunch that there are some challenges there long term.
With geothermal the innovation is not necessarily in building huge plants in geologically unstable areas, which in other places indeed requires drilling quite deep, but in using heat pumps to work with the small temperature differences you get by digging only a few tens to hundreds of meters deep. This is currently quite popular in northern Europe as a way of cheaply heating/cooling houses and reducing cost relative to burning natural gas. The main cost is putting pipes in the ground.
In any case, the main challenge with any energy generation project is cost and amortizing that over its lifetime. There are a few things on the market that have come down in price by orders of magnitude and are on track for more improvements in the next decades. This puts any predictions about cost and profitability at risk. In the case of coal, people came out on the wrong side of the equation and a lot of plants are being shut down ahead of their scheduled end of life. IMHO the same will happen to gas plants in the next decades. That makes investments in large scale geothermal very risky. Which is perhaps why there is far less of it than you would expect given that we've known how to build such plants for a very long time.
The combination of cheap storage and low-cost solar panels or super-efficient 10-20 MW wind turbines is becoming quite hard to compete with, and you see this stuff popping up in formerly very fossil-fuel-friendly places because it makes sense right now from a cost perspective. Even Iceland is investing in offshore wind.
Unlike geothermal heat, industrial heat is too spread out.
The quantity of heat may be comparable, but because of wide area of distribution, it is not practical to tap it. By wide area, I mean the various sources of heat in any industrial plant.
Also, industries would not really prefer to have equipment tapping waste heat at various points in their shops.
Even in the example you give, of a turbine outlet, it is economical and practical only because the heat is concentrated in the exhaust. However, most plants that have turbines do recoup energy from turbine exhausts.
Personally, I think residential solar is less preferable than commercial installations where you can benefit from greater centralization, but that said, residential solar seems to have done fine.
Now on the other hand, solar has fewer moving parts, so the reliability is greater and the distributed nature may be less important.
In solar, there are no naturally occurring concentrated sources of energy. We have to concentrate it in certain ways, or use solar panels.
Residential solar will definitely reduce load and transmission losses. However, governments need a broad strategy to plan and establish solar plants.
For example, it makes sense to have a coherent solar planning policy taking into account the huge consumers of electricity (cities and industries) and planning solar plants near them, reducing the losses in transmission (which can go up to 33% or more). Additionally, if the production and consumption are close by, transmission can be made with lower voltage and higher currents, perhaps even HVDC, to unlock even more efficiencies.
Thinking about this with a software engineering hat on I am breaking out in cold sweat. You attach these waste heat generators to random other industrial machinery and now suddenly you can't do any maintenance or power cycling for fear of disrupting the power supply to whole regions. It also seems organizationally very complicated to have co-located hardware owned by completely unrelated organizations. I guess if it was all nationalized or belonged to one huge multinational it may be less of a concern.
> You attach these waste heat generators to random other industrial machinery and now suddenly you can't do any maintenance or power cycling for fear of disrupting the power supply to whole regions.
It's not that you can't. But you have to coordinate with another party, particularly if you're generating large amounts of energy this way. But the grid is evolving towards increased storage to accommodate renewables, so this will mitigate the impact of shutting generation down to run maintenance on the heat source.
But if we don't want to couple random industrial plants to the supply side of the power grid, I wonder if we couldn't be using this with some self-contained carbon recapture devices. Heat -> power -> less CO₂ in the air. If that kind of tech will ever be workable, that is.
All this to say, when I put my software engineering hat on, I'm starting to get sick thinking of all that capacity being wasted.
The engineering miracle of the electrical grid is exactly this sort of coordination. Think of this as being like solar energy - except it's generating 95% of the time rather than 25% of the time. That's considered baseload electrical, and although it's not awesome if a 10MW plant drops off the grid, the system is capable of handling it.
Then why bother selling the electricity back to the grid? If the wholesale price is less than retail price, it would make more sense to use the power to offset the original consumption.
It seems like the only reason a setup like this would really make sense is as an efficiency improvement for the industrial customer. Maybe financing for these types of upgrades is hard to come by, in which case it kinda makes sense to have a 3rd party own and operate the equipment. But that seems like a short-term niche. As soon as it is proven and de-risked, financing should be quick to follow.
I think that point extends further. This entire idea goes away when efficiency in the tooling is enhanced over time. Sure you can get some extra power back from your waste, but next time you retool, the tech might make that proposition useless.
It's like if you invested in a system to get energy back from your car exhaust, but then the next model year is so much more efficient as to make the tech obsolete.
7MW is pretty small. Any generation gear may trip off the grid at any moment; provided it doesn't do so too frequently or in sync with other failure, this is not normally a problem.
That's roughly the load of an electric train. Eurostar are 16MW.
It's also the sort of problem that batteries providing grid stability services help a lot with.
I’m ignorant on the topic, but seems more like a great cost- & carbon-reduction technology for the industrial plants, rather than a power generation technology for the masses. For what it is, it sounds great though.
It never meets economic hurdle rates for industrials on their own, which is why we're building them as independent power plants. We prefer to sell the electricity back to the host facility, but we often sell into the grid.
Are your advantages, the things that prevent industrials from installing their own cogen, know-how and economies of scale in project management, procurement, fabrication, installation and commissioning? I wouldn't have thought a company like Fortis would ignore 7MW for $30 million * however many compressor stations they have that are near the distribution lines.
Sounds like low-hanging fruit. But not really scalable. A 7MW plant sounds OK for some applications. But a really big geothermal plant might compete with nuclear.
It's a good point, but there are many grid experts who argue that smaller, distributed generation is more robust and reliable than larger, 500MW+ plants. Outside a few areas, most geothermal plants are quite small: I doubt we'll ever see one over 15MW in Canada, for example.
A 7MW plant doesn't sound like much, but if you're cookie-cuttering 9 of them along a line, costs come way down since they're all essentially identical. I did some back-of-the-envelope calculations and if we did all the larger (over 15,000hp) natural gas compressor stations in the US and Canada, we'd be looking at something like 42TWh per year of generation. That compares to 68 TWh for all deployed utility solar (2018 figures, though). That's just gas compressor stations, which are about 15% of the industrial heat generated in the US.
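To get a feel for what that 42TWh figure implies, here is a rough back-calculation; the 95% availability and the implied station count are my own assumptions, not numbers from the comment above:

```python
# What does 42 TWh/yr of generation imply at ~7 MW per compressor station?
p_site_mw    = 7.0
availability = 0.95      # assumed, matching the "generating 95% of the time" figure upthread
hours        = 8760

gwh_per_site = p_site_mw * hours * availability / 1000.0   # ~58 GWh/yr per station
sites        = 42.0 * 1000.0 / gwh_per_site                # ~720 stations
avg_gw       = 42.0 * 1000.0 / hours                       # ~4.8 GW average output

print(f"~{gwh_per_site:.0f} GWh/yr per site, ~{sites:.0f} sites, ~{avg_gw:.1f} GW average")
```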
Responding specifically to the competition with nuclear: you're absolutely correct: the Geysers geothermal complex (which is actually about 22 power plants) is about 1.6GW. That's easily the world's largest though: I don't think there's another one over a gigawatt anywhere. And you need really special conditions for that kind of power - specifically really hot geological conditions.
The problem, as always, is that jerk Carnot and his limit. Most accessible geothermal fluid is <150C except in very geologically active areas. That puts a pretty hard limit on plant efficiency.
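Spelled out, that ceiling looks roughly like this (the condenser temperature and the real-plant range are my assumptions):

```python
# Carnot limit for a ~150C geothermal resource rejecting heat at ~25C ambient.
t_hot  = 150.0 + 273.15   # K, resource temperature
t_cold = 25.0 + 273.15    # K, assumed condenser / ambient temperature

eta_carnot = 1.0 - t_cold / t_hot
print(f"Carnot ceiling: {eta_carnot:.0%}")   # ~30%

# Real binary/ORC plants typically achieve only a third to a half of the
# Carnot limit, i.e. roughly 10-15% thermal-to-electric.
```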
> It used a RB211 gas turbine to drive the compressor. Capturing the waste heat from the exhaust stack of that engine gave us 7MW of net generation, at a cost of $30 million, and much lower operating expenses to boot.
The main issue here is actually such a low-efficiency turbine - an efficient one wouldn't have such easily capturable exhaust heat to start with. One can understand that a gas turbine on a plane, for example, has weight limits so you can't tack on additional turbine stages etc. to increase efficiency, yet for a ground-based one it would only be a matter of the increased capital cost of such a turbine - so that means running an inefficient turbine is cheaper, i.e. the energy/fuel is still very cheap.
My understanding is that the most efficient turbines top out at about 40%. Reciprocating engines are better, but even they can't get much above 50%. There's still lots of heat there even in the most efficient engines.
You're absolutely right about the efficiency being just fine -- especially when you're a pipeline company and you just pull fuel for free out of the pipeline to run the turbine. That's a big reason why these companies haven't built these plants already.
>pull fuel for free out of the pipeline to run the turbine.
Yep. The situation here and globally would be completely different if the price of the emitted carbon were added to the price of fuel, however low one can get it. That in particular would have leapfrogged our civilization's efficiency.
Rankine cycle tops out at 40%, gas turbines are more like 50%, and combined cycle, which is really what's being implemented here, can reach around 60%.
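A quick way to see where the ~60% comes from: the steam cycle runs on the heat the gas turbine rejects, so the efficiencies stack rather than add (the specific component efficiencies below are illustrative assumptions):

```python
# Combined-cycle efficiency: topping (gas) cycle plus a bottoming (steam)
# cycle fed by the topping cycle's rejected heat.
eta_gt = 0.40   # gas turbine efficiency (assumed)
eta_st = 0.35   # steam cycle efficiency on the remaining heat (assumed)

eta_cc = eta_gt + (1.0 - eta_gt) * eta_st
print(f"combined cycle: {eta_cc:.0%}")   # ~61%
```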
When the gas is essentially free, there's not much incentive for efficiency, but they could have powered the compressor with a smaller yet more efficient combined cycle system. If you can economically extract power from a powerplant's waste heat, the original power plant was suboptimally designed.
The article suggests that the trick to high profits is really reaching the depths where temperatures are higher without melting the equipment, because going deeper for higher temperatures dramatically increases the energy generation, as opposed to the current sort of minimum-viable depths and temperatures.
So is it possible that with some significant engineering advances the costs would go down and generation would go up quite a bit? To me the article is suggesting research investment to improve the materials and technology to go deeper less expensively.
Can you provide any insight into the feasibility of that type of research paying off? Do you know of any promising techniques, materials, designs? Or maybe you feel like they are insurmountable challenges? Thanks for any details.
You can go deeper, but in most regions you'll be into impermeable rock by the time you get usable temperatures. You need the rock to be permeable so brine can flow from your injection site to the extraction well.
Companies like Eavor claim a closed loop in impermeable rock will work, but then you're limited by the thermal conductivity of the rock around your pipe. You can only pull out as much heat as can flow through the rock around the pipe. Our simulations on Eavor's system showed they would work for about a couple months before they started dropping off.
Fracking seems like a more promising avenue, and it might work. A ton of political challenges there though.
Interesting, when I studied these a long time ago they talked about mining geothermal energy in these kinds of deep rocks but having to move the "heat mine" after it got cool due to the lack of heat transfer through the solid rock. I got the impression that heat flows into the hot rock are orders of magnitude lower than the heat flows out required to make power generation efficient and useful.
The radiator design is interesting but ultimately it's a heat sink and power output will be limited. Initially I bet the power output is great, and as the area cools they tap it out. Who knows how long it'd last, you'd think the higher the power capacity the faster it would cool down underneath.
So really you need to be able to build new mines cheap enough to justify doing such a hard mine knowing it'll only last for 22 years.
I don't have a good feel for the geology of hot spots, but I remember seeing a paper on hot springs. Typically you have a situation where water seeps down a deep fault and then comes back up. It would seem you are correct: heat transfer via conduction will be really slow. I also know that the geothermal plants at The Geysers in Sonoma County have lower output than they did decades ago because the rocks have cooled somewhat.
So I guess on a log scale it's halfway between! Imagine it's drawing heat from a source that has a 100x smaller cross section and is made of aluminum. Not that that is particularly helpful...
Conduction is sort of dismal. Materials tend to conduct either too well when you don't want them to, or too little when you want them to.
If you want to move heat from point A to point B, usually you want to use a fluid. That's why I tend to assume that for geothermal plants you want to drill into rock where there is a lot of hydrothermal action going on.
Something absolutely weird is Japan has basically zero installed geothermal power.
Yes, I agree, I was surprised they bothered to mention trying to seriously mine heat from hard rock, it seems like running fluid through a bore horizontally barely solves the conductivity issue since heat still has to get to the walls of the bore.
I'm not sure if you are unaware but:
"In 2007, Japan had 535.2 MW of installed electric generating capacity, about 5% of the world total."
Consider that Northern California has about 900MW of geothermal installed, and the state isn't really known for geothermal activity. So what is Japan, which is known for exactly that, doing?
To give California some credit, it is slightly physically larger than Japan, has far more open space without people despite the bay area (my first thought was "would they really be willing to wreck the onsens and such that have been there for a thousand years?"), is on active tectonic plates, and has been pretty focused on this for a while.
The geothermal plant in Basel, Switzerland was AFAIK among the very first production-ready plants using fracking. I remember the news of them turning it on and then a series of magnitude 2-3.5 earthquakes happening. The city was leveled by earthquakes before, so people got scared and it was immediately turned off. The minor damage to the very old housing stock cost insurers more than the geothermal plant itself. It did produce quite a bit of clean electricity though.
The Washington, DC wastewater treatment plant figured out how to generate methane from sewage, which they then burn to drive turbines. They get 10MW of net electricity out of it, which they use themselves (sounds like their electric usage is 30MW). It would seem that a lot of industry installations with waste heat sources likely also use a lot of electricity and could benefit from doing something similar.
I think this is pretty common at sewage treatment plants. Our local plant (EBMUD in West Oakland) does this too. In fact it seems they import additional organic waste to throw into their methane digester, with the resulting generation covering all of the plant's needs.
Even ignoring the energy savings, methane is a potent greenhouse gas so you'd much rather it be burned than just released.
Though I wonder if there isn't an opportunity to sequester a lot of carbon being missed by digesting sludge instead of ... say, dumping it into a deep injection well.
They're a great, innovative company. They're mostly focused on smaller stuff - 150kW and below, although I think they did some larger bespoke projects earlier.
For larger stuff my favourite company is Exergy (www.exergy.it). They resurrected a format of turbine - a radial outflow configuration - that lost to axial turbines in the early 1900s, recognizing that it had special advantages when used with organic fluids rather than steam.
That sounds like a good efficiency measure to use in addition to adding new zero/low carbon energy sources. If you only capture waste heat, the net total is still all coming from carbon-based sources. Maybe it's the low hanging fruit for now, but it's not enough. We need to tap every feasible alternative energy source.
How long will the industrial plants that produce such heat be able to operate? The geothermal plants are being built as we try to address climate change, with many countries aiming for carbon neutrality by 2035. How much room is there for fossil fuel powered plants in that?
That is an interesting question, as a world without large industrial plants will not resemble the world we live in today, in all kinds of ways. Any sensible plan to reduce emissions will need to provide for industry (most industrial processes involve heat), otherwise we're all going back to living in caves.
What led to the Fukushima reactors failing the way they did was that their emergency generators were situated in low-lying, tsunami-reachable areas. Had they simply been relocated or elevated, the whole mess wouldn't have happened.
That sounds a lot like cogeneration. I remember 25 years ago there was a plant near LA that produced steam to process oranges while also generating power.
I think you're right though, there is a lot of untapped waste heat out there.
Geothermal has a problem in that the capital to build a plant is much higher (2-3x) than that of a combined cycle natural gas power plant. Even though the fuel is free, in order to make the economics work you have to project power prices out 50 years. Usually when we scope power plants the proformas only go out 15 years.
And by "make the economics work" I'm not talking about excessive returns. I'm talking about a <10% rate of return, getting the investment back in 10-14 years.
It seems like a natural pivot for Alberta, and we spent a lot of sweat trying to convince the government that it was good for drillers. Good for pipefitters and welders and millwrights too. No dice though, if anything I'd say the government actively sabotaged our company (retracted a $7m grant from the previous administration, for example). Just too green for them.
There is a lot of talent available, and one of the drilling companies (Beaver Drilling) has made themselves the spokesman for the geothermal industry. That's what got things moving with at least the government paying lip service to the industry. I left just as Beaver was stepping in, though.
As for costs: so many of the drilling costs are fixed - material costs for casing, consumables like drill bits and drilling slurry. Labor might be down but it's still several million dollars per hole.
If we took a modest chunk out of our defense spending and spent it on this I would think that would be a significant amount of "free" power. A volcanic TVA of sorts.
This is covering deep geothermal, trying to get into the huge heat source from radioactive decay in the earth’s core.
This is not covering back-yard geothermal for home cooling/heating.
I had no idea such a small amount of the earth’s heat could be harvested to satisfy all of humanity’s energy need (assuming humanity doesn’t start doing stupid things when we have access to unlimited energy...)
In industry we usually refer to "geothermal" as heat from radioactive decay, and "geoexchange" for the backyard stuff, where the source is solar heating.
You're not wrong, that just seems to be the way we get around constant disambiguation. I was the CTO for a Canadian geothermal company.
We also just called it ground source when I briefly worked in the field. Although, in my experience, even though someone shells out $$$ for a "ground source heat pump" in their yard, it won't stop them from telling everyone they have geothermal.
I was curious about the economics of building geothermal plants as a California statewide infrastructure strategy, since I assume it's a pretty good place for geothermal, so I did some quick maths from info I found online. Apparently CA consumes about 7% of the electricity in the US, at 259.9 TWh, therefore averaging a consumption of only 29.65 gigawatts; the capacity is actually 75.9 GW [1]. An energy.gov site [2] says a typical large geothermal plant only costs about $2.5 per watt of capacity to build! Compared to a lot of huge-ticket federal and even statewide budget items such as the $686 billion military budget, eliminating off-peak non-industrial natural gas energy production with socialized electricity seems doable.
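Spelling that arithmetic out (the only assumption I'm adding is mapping the $2.5/W figure directly onto average demand, ignoring peaks and capacity factors):

```python
# Back-of-the-envelope cost to serve California's average load with geothermal.
annual_twh    = 259.9                       # CA annual consumption (from above)
avg_demand_gw = annual_twh * 1000.0 / 8760  # ~29.7 GW average
cost_per_watt = 2.5                         # $/W of capacity (energy.gov figure cited above)

cost_billions = avg_demand_gw * 1e9 * cost_per_watt / 1e9
print(f"~{avg_demand_gw:.1f} GW average, ~${cost_billions:.0f}B to build")
```

Of course you'd need well more than average-demand capacity to cover peaks and downtime, so treat the ~$74B as a floor rather than an estimate.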
Well, assuming we can drill very deep, I would think the supervolcano calderas have massive amounts of energy that we could exploit and ... possibly ... avoid future extinction-level-event eruptions.
I also wonder how deep would be sufficient to "dump" nuclear waste. I would think once you get to a sufficiently viscous level, heavy-element nuclear waste would slowly descend into the core.
People have pondered cooling Yellowstone's magma chamber by surrounding it with geothermal wells.† The result would be 5GW (for comparison that is 1/6th of California's needs) of clean energy at about $0.10/kWH for thousands of years and a significantly reduced chance of a supervolcano eruption event killing millions of people.
It might alter or stop the geysers though, so it isn't going to go anywhere.
I doubt it's about preserving the geysers for tourists... seems more likely that a multibillion-dollar pricetag combined with the relative uncertainty of an eruption is what has cooled enthusiasm for the plan[1]. But that is a lot of power, so it's not impossible to imagine, if the investment paid off in the long term.
Why do volcanoes erupt? Because pressure builds up that is stronger than a weak/thin spot in the crust can take. If we drill a hole then the strength of the crust at the hole is... zero. Doing that in the neighborhood of a supervolcano might not be wise...
If I read the article correctly, Geothermal energy extraction involve pumping cold water in and hot water out. If I wanted to put nuclear waste somewhere underground, it would probably be very far away from the place I pump water from.
I expect the amount of heat one could extract would be absolutely negligible compared to the amount of heat in a supervolcano caldera, so I doubt there would be any effect at all.
It's not clear there's much decay energy in the core. That would depend on whether uranium, thorium, or potassium would dissolve in high pressure liquid iron-nickel.
Uranium and thorium are highly concentrated in the Earth's crust, btw, relative to the rest of the Earth.
If by "core" - you mean to exclude the mantle, then yes, we don't know exactly how much radiogenic [1] heat generation there is. But if we are using the term "core" loosely here to refer to everything under the crust, then a very significant fraction of the heat is generated via radioactive decay.
Depending on where you drill, you might end up in a situation where some layers come into contact with high-pressure water and start swelling, resulting in raised surface levels. The most prominent example I know of is the city of Staufen in Germany.
Only some of the geothermal tech requires injection of fluids along the lines of Fracking. The article suggests for those cases the risks are lower, due to the different pressures and fluids required. Other techniques (closed-loop systems) don't require it at all (but do require advanced drilling tech).
Also FWIW, the induced seismicity from fracking was found to come from injected wastewater associated with post-frac oil production [1]. It's my understanding that the majority of these geothermal systems are injecting about as much as they're pumping up, so stress doesn't build and you don't have that issue.
Something to consider is that even at those depths there are bacteria and living things in the soil. Depriving them of heat, might disrupt another ecosystem that is providing food for things above or below.
I want to believe in geothermal. A recent experiment in Australia went bad; this really saddened me because the underlying model clearly works (Iceland, NZ), but something about the combination of drill, frack, extract, and process to energy just didn't work out, nor did the post-experiment remediation (IIRC).
I also believe that we actually deplete both deep heat and deep coolth. Deep heat: you have to keep expanding the extraction zone or create pumped states in a wider area somehow. Coolth: the London Underground was lovely and air-conditioner cool; it now has 100+ years of soaked-in heat and is significantly hotter than it used to be.
The brine in the permeable rock is heated by radiogenic heat from the earth's interior. It is pumped to surface, power generated, then injected back into the earth at a sufficient distance, so that by the time it flows back to the extraction well it will have been re-heated to the original temperature.
So no, the subsurface will not cool over time (except with some crackpot "technologies" like one mentioned towards the end of the article that rhymes with "never"). It's actually harnessing the radiogenic heat of the earth's geology.
There are some places where power is extracted faster than the geology can support. In those cases electrical generation tapers off. This is happening in Turkey for example. But it's because of greed, not because of anything inherent in the technology.
I was (I think) referring to the Cooper Basin project. It was non-viable on LCOE given transmission costs and the rate of extraction. I can't find the remediation story links; maybe I confused this with other energy/fracking problems, but they do note the fracks made magnitude 3.7 earthquakes, which may have been played up as a risk-side problem.
Thanks. I thought it was that some prior radiogenic heat was being consumed and wasn't being refreshed by deep brine flow. Maybe that's what you said: the rate of extraction can exceed the rate of renewal?
Does this contribute towards global warming? In the literal sense: taking heat out of the core and into the surface-level environment presumably increases temps?
Can somebody explain to me the limits of geothermal exploitation? It seems to be a limited resource to me. Even more limited than fossil fuels: those will replenish in a few million years, but heat from the earth’s core will be gone till the end of time. No? Let alone the effects on earthquakes or changing the magnetic field and those kinds of things.
There is probably a natural nuclear reactor in the Earth’s core. Think about it: when the earth was formed, the heaviest elements like uranium would naturally have concentrated at the core.
Therefore geothermal is potentially an unlimited resource, at least until the reactor runs out of fuel, but I would guess that's millions if not billions of years away.
Actually the heat arises from the natural decay of radioactive elements within the Earth. In the far future it will be gone!
It's true that in the early history of the Earth there would have been heating from gravitational contraction of the protoplanetary disc. But as soon as the Earth swept its orbit of material this source of heating disappeared.
If you used the heat from the earth's core to run power plants, presumably by piping in water and running a steam turbine, wouldn't that eventually heat up the atmosphere, since you'd be pumping heat out of the core faster than it would escape naturally?
It’s no different than generating heat in the surface environment any other way, such as in a nuclear reactor or by burning coal or gas.
Waste heat radiates into space fairly readily, especially at night, so at the rate we currently use energy it’s not a significant factor compared to the incredible amount of heat that radiates down on us from the sun.
No. There's always heat coming to earth and radiating back to space. Amount of greenhouse gases in the atmosphere determine equilibrium temperature. Adding heat doesn't change equilibrium temperature.
No offense but I have a hard time believing this. Isn't pumping heat from the center of the earth to the atmosphere more or less the same as having a larger amount of incoming solar energy? And certainly it would be hard to argue that more energy from the sun wouldn't heat the planet. Wouldn't the equilibrium temperature be both a function of energy in and radiation rate (which is also a function of temperature?)
> The Earth receives 174,000 terawatts (TW) of incoming solar radiation (insolation) at the upper atmosphere [1].
> The flow of heat from Earth's interior to the surface is estimated at 47±2 terawatts (TW) [2].
Much of that heat flux happens in the ocean, where the crust is thinner.
> The Earth's crust ranges from 5–70 kilometres in depth and is the outermost layer. The thin parts are the oceanic crust, which underlie the ocean basins (5–10 km) [3]
Earth's radius is 6,371 km, the core's radius is 3,485 km, and the world's deepest borehole is 12 km [4]; they are not going to pump energy directly from the core.
World energy consumption is 18 TW [5].
We would live in a different world if the balance were different: no seasons or climate zones if internal Earth energy were comparable to Sun energy, and tropics near power plants if human consumption were comparable to Sun energy.
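Putting the three numbers quoted above side by side:

```python
# Relative scale of the heat flows quoted above, in terawatts.
solar_in   = 174_000.0   # TW, incoming solar radiation at the upper atmosphere
geothermal = 47.0        # TW, heat flow from Earth's interior to the surface
human_use  = 18.0        # TW, world energy consumption

print(f"geothermal / solar: {geothermal / solar_in:.3%}")   # ~0.027%
print(f"human use  / solar: {human_use / solar_in:.3%}")    # ~0.010%
```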
> Wouldn't the equilibrium temperature be both a function of energy in and radiation rate (which is also a function of temperature?)
Exactly, and since we're not changing the energy in (that energy is going to reach us whether we pump it up or not), the equilibrium does not change.
Where else can all that generated energy go other than up?
But it would otherwise go "up" at a slower rate - so heat is added more slowly but over a longer period. Pumping heat up effectively compresses this time horizon, resulting in a difference in the rate of heat added. Now if that geothermal energy were replacing nuclear or fossil fuel use, it might be a wash. But as soon as that energy replaces hydro, wind, or solar (which don't add new heat to the atmosphere - they just use heat already there), then we would increase the rate of heat energy being added to the atmosphere.
I looked it up again, and it seems like we're both a bit correct.
My explanation would be correct if the earth is in equilibrium and there is no primordial heat left and all heat comes from radioactive decay, but surprisingly (to me at least) that is not the case. About 1/2 of the heat flow from the core to the surface comes from primordial heat.
I'd say solar will also add some heat to the atmosphere, because panels reflect less sunlight than the surface.
Of course, in the big scheme of things this is all pretty irrelevant unless we keep on exponentially increasing our energy usage. The greenhouse effect of CO2 is a few orders of magnitude higher.
Surface energy radiates from the Earth rapidly, something like a quartic function of temperature if I'm remembering correctly. We could add a lot of energy to the surface and suffer negligible effects.
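For reference, the "quartic" recollection is the Stefan-Boltzmann law; a tiny sketch of the scale involved (the 288 K mean surface temperature and unit emissivity are my assumptions, and real outgoing radiation leaves from higher, colder layers, so this is order-of-magnitude only):

```python
# Stefan-Boltzmann: radiated flux scales with the 4th power of temperature.
SIGMA      = 5.67e-8   # W/(m^2 K^4)
T_SURFACE  = 288.0     # K, rough global mean surface temperature (assumed)
EARTH_AREA = 5.1e14    # m^2

flux_w_m2 = SIGMA * T_SURFACE**4              # ~390 W/m^2
total_tw  = flux_w_m2 * EARTH_AREA / 1e12     # ~200,000 TW radiated by the surface

print(f"~{flux_w_m2:.0f} W/m^2, ~{total_tw:,.0f} TW vs ~18 TW of human energy use")
```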
Well, to be exact, you should model any regular injection of heat as a perturbation in the heat influx, which, in the case of humanity's total energy usage, comes to roughly 0.01% of insolation -- and we'd generate the same amount (if not more) of heat by burning oil anyway!
Heat flux from inside the Earth is limited by the surface area of the Earth. Adding a geothermal plant essentially increases the available surface area, which would increase the flux.
However, it would be a very small increase to a very small source of heat. Underground heat in total is only about 0.03% of total energy budget at the surface.
There was a company called Quaise in this field I read about in The Engine's portfolio, though I don't know enough about this subject matter to make much of a comment.