mrb's blog

Bitcoin Mining is Not Wasteful

Keywords: bitcoin decentralization network security

The security of the Bitcoin block chain fundamentally depends on its proof-of-work, based on calculating SHA256 hashes. The network of miners uses its SHA256 computational power to vote on which block chain to trust in order to confirm transactions. Miners are incentivized to do this by being rewarded with transaction fees as well as newly mined coins. Because computing power cannot be faked, voting cannot be cheated. This is, in essence, what makes Bitcoin's existence possible.

However, some say this SHA256 proof-of-work used in mining consumes too much energy and is a "huge waste" or "unsustainable".

I strongly disagree. Here is why:

Argument 1: Miners currently use only approximately 0.0012% of the energy consumed by the world. Most are forced to use hydroelectric power (zero carbon footprint!) because cheap renewable energy is a necessity to win in the ultra-competitive mining industry.

Argument 2: Even in the future, economic modeling predicts that if Bitcoin's market capitalization reaches $1 trillion, miners will still not account for more than 0.74% of the energy consumed by the world. If Bitcoin becomes this successful, it will probably have directly or indirectly increased the world's GDP by at least 0.74%, so spending 0.74% of the world's energy on it will have been worthwhile.

Argument 3: Mining would be a waste if there were another, more efficient way to implement a Bitcoin-like currency without proof-of-work. But current research has so far been unable to demonstrate a viable alternative.

Argument 4: Bitcoin is already a net benefit to the economy. Venture capitalists have invested more than $1 billion in at least 729 Bitcoin companies, which have created thousands of jobs. You may disregard the first three arguments, but the bottom line is that spending an estimated 150 megawatts on a system that has so far created thousands of jobs is a valuable economic move, not a waste.

Argument 5: The energy cost per transaction is currently declining thanks to the transaction rate increasing faster than the network's energy consumption.

These arguments are explored in more detail in the sections below.

1. Miners consume a negligible amount of energy

The energy efficiency—in joules per gigahash—of mining chips varies. As of January 2016, there are 5 main families of chips that represent most of the ASIC market share:

  1. BitFury's latest 16nm chip achieves 0.055-0.07 J/GH,
  2. KnC's 16nm Solar 0.07 J/GH,
  3. Spondoolies's 28nm PickAxe 0.15 J/GH,
  4. Bitmain's 28nm BM1385 0.18-0.26 J/GH,
  5. Avalon's 28nm A3218 0.27 J/GH.

Some miners run older, less efficient gear, but the bulk of the global hash rate comes from these 5 chip families. Thus the network's average efficiency falls between 0.055 and 0.27 J/GH.

With the network's global hash rate suddenly doubling from 500 to 1000 PH/sec (petahash/sec) in the last 2 months alone, it is fair to say that most of the computing power comes from the newer, more efficient chips. BitFury's 40 MW Georgia data center (0.055-0.07 J/GH) alone could account for up to 650 PH/sec.

But let's be conservative and assume the network's average efficiency is 0.15 J/GH. With a global hash rate of 1000 PH/sec, this means the network is consuming around 150 MW. Over a year this represents 1.3 TWh (terawatt-hours), or electricity costs of $131 million (assuming the global average rate of $0.10 per kWh).
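For readers who want to check the arithmetic, here is a quick sketch in Python (my own illustration, using the assumed 0.15 J/GH average, $0.10/kWh rate, and the IEA world figure quoted below) that reproduces these numbers:

```python
# Back-of-the-envelope estimate of the network's power draw and electricity
# cost, using the assumptions above: 0.15 J/GH average efficiency,
# 1000 PH/s global hash rate, $0.10 per kWh.
HASH_RATE_GH_PER_SEC = 1000e15 / 1e9   # 1000 PH/s expressed in GH/s
EFFICIENCY_J_PER_GH = 0.15             # assumed network average
PRICE_PER_KWH = 0.10                   # assumed global average electricity rate

power_watts = HASH_RATE_GH_PER_SEC * EFFICIENCY_J_PER_GH  # J/s = W
power_mw = power_watts / 1e6                              # ~150 MW

annual_kwh = power_watts / 1000 * 24 * 365                # kWh per year
annual_twh = annual_kwh / 1e9                             # ~1.3 TWh
annual_cost_usd = annual_kwh * PRICE_PER_KWH              # ~$131 million

world_twh_2013 = 108171                   # IEA, world energy consumption 2013
share = annual_twh / world_twh_2013       # ~0.0012% of world consumption
```

Changing the efficiency or hash rate assumptions propagates directly through every downstream figure, which is why the ±30% caveat at the end of this post applies here too.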

For comparison, according to IEA statistics, the world's consumption of energy for year 2013 was 9 301 Mtoe or 108 171 TWh.

Thus only 0.0012% of the world's energy consumption goes to Bitcoin mining: less than half a drop in the bucket.

Furthermore, most miners use clean renewable energy; ruthless competition in the mining industry forces them to use renewables due to their lower costs (as opposed to more expensive, polluting power sources such as coal power plants). They usually select hydroelectric power, which has a zero carbon footprint: BitFury, KnC, BW, etc.

Bitcoin's 0.0012% should be contrasted with the many instances of energy waste that occur on much larger scales. A 3-ton car wastes 97.5% of its energy transporting a 75-kg person (1 − 75/3000 = 0.975). Power plants waste enormous amounts of energy. The US wastes 61% of its energy (and similarly for the rest of the world). These are real and significant wastes. Bitcoin is not.

2. Mining may grow but its energy footprint will remain relatively small and green

As Bitcoin's value and adoption increase, mining will increase. However, miners are bounded by sheer economic factors, because they are in it for financial gain.

In a hypothetical scenario where Bitcoin continues to be incredibly successful over the next 5 years, its market cap may hit, say, $1 trillion. This is very optimistic. 5 years from now 18 million BTC will be in circulation, so dividing the market cap by 18 million indicates 1 BTC would be worth about $55,000. Meanwhile 3 million BTC will remain to be mined, valued at $165 billion.
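The arithmetic behind these rounded figures is simply (a sketch of my own, with the numbers rounded as in the text above):

```python
# Hypothetical $1 trillion market cap scenario, as assumed above.
market_cap_usd = 1e12        # hypothetical market cap
circulating_btc = 18e6       # BTC in circulation ~5 years from now
remaining_btc = 3e6          # BTC still to be mined at that point

# ~$55,500 per BTC; rounded to $55,000 in the text.
price_per_btc = market_cap_usd / circulating_btc

# ~$167 billion; rounded to $165 billion in the text.
remaining_value_usd = remaining_btc * price_per_btc
```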

If the market cap hypothetically reaches $1 trillion, it will mean Bitcoin has succeeded on a massive scale and has positively impacted our society: foreign trade made more fluid thanks to Bitcoin's disregard of political borders, hundreds of millions of users (vs. today's few millions), millions of merchants accepting it (vs. today's 130 000). Perhaps private and governmental financial institutions would use Bitcoin as part of their settlement systems. All in all, this would be a (hypothetical) incredible success, and Bitcoin would probably have directly or indirectly contributed to increasing the world's GDP by at least a tiny amount.

Now, miners might look at these numbers (3 million BTC to be mined, valued $165 billion) and decide to spend $40 billion in CapEx to build mining hardware and infrastructure, plus $80 billion in OpEx to pay for electricity and other maintenance, then mine $165 billion in BTC over the next 10 years, to potentially make a profit of $45 billion if all goes well.

How much electricity will the miners consume with a budget of $80 billion? If they decide to optimize all costs by refusing to buy electricity from power companies and instead building their own hydroelectric dams—the cheapest source of energy on the planet—then electricity would cost them approximately $0.01 per kWh. So $80 billion buys them 8000 TWh. Over 10 years this is 800 TWh per year.
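In code form, the whole projection boils down to a few lines (again my own sketch, using the assumed $0.01/kWh hydro rate and 10-year horizon):

```python
# Electricity an $80 billion OpEx budget buys at self-built hydro rates.
opex_usd = 80e9
price_per_kwh = 0.01         # assumed cost of self-generated hydro power
years = 10

total_kwh = opex_usd / price_per_kwh   # 8e12 kWh
total_twh = total_kwh / 1e9            # 8000 TWh over 10 years
annual_twh = total_twh / years         # 800 TWh per year

world_twh = 108171                     # IEA, world energy consumption 2013
share = annual_twh / world_twh         # ~0.74% of world consumption
```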

Therefore even a hypothetical Bitcoin market cap of $1 trillion still fails to incentivize miners to consume more than 800 TWh per year, or 0.74% of the world's energy consumption. By this point, however, Bitcoin would probably have also contributed to increasing the world's GDP by at least a tiny amount, and if it is by more than 0.74% then spending 0.74% of the energy on it will have been worthwhile.

3. Proof-of-work is the only way

It would be valid to say Bitcoin's proof-of-work is wasteful if there were a more efficient way to implement a Bitcoin-like currency (decentralized, with no trusted third party). But it appears impossible.

An alternative to proof-of-work might be proof-of-stake, but its core issue is that there is "nothing at stake". The developers of Ethereum have put incredible effort into researching ways around this flaw, publishing numerous fascinating and technical blog posts. But even they had to give up the idea for now, and resort to implementing proof-of-work in the current Ethereum version. They still wish to move to proof-of-stake at some point in 2016, but it is unclear how they will do it, or even whether they will be successful.

Researchers have also studied other alternatives like proof-of-activity and proof-of-burn, but they either directly or indirectly depend on proof-of-work.

I wish we could get rid of proof-of-work, but so far it does not appear possible. (I would love to be proven wrong!)

Or… could we at least use a proof-of-work having some usefulness outside of Bitcoin, such as replacing SHA256 hashing with protein folding? It would certainly be revolutionary. For this to happen we would need to figure out a way to generate protein folding tasks in a decentralized manner, perhaps deterministically from a seed. Unfortunately no one has figured out a way to do this. (And no, FoldingCoin did not find a solution: they were forced to forego decentralization in their design.)

4. Bitcoin already benefits the economy

As of September 2015, at least 729 Bitcoin companies have been created: exchanges, miners, wallets, payment services, infrastructure, etc. As of November 2015, more than $1 billion had been invested in these companies, which employ thousands of people.

In addition, a smaller economic benefit is still worth mentioning: Bitcoin as a payment platform increases economic trade by removing friction from traditional payments, or simply by offering one more option to pay. It leads to sales that might otherwise not take place if Bitcoin did not exist. The same is true of PayPal, credit cards, or any other payment system: the more options exist, the higher the sales. The benefit is hard to quantify but is certainly there. For example BitPay reported a 110% increase in their transaction volume between 2014 and 2015. Some, perhaps most, of these payments probably displaced traditional payments, while a fraction of them would not have taken place if Bitcoin did not exist.

When looking at the big picture, Bitcoin is a system that consumes 150 MW, costing $131 million per year (see argument 1), and has effectively created 729+ tech companies, thousands of jobs, and increased economic trade.

Is this worth it? Yes it is! Just looking at the jobs alone, it would be impossible to hire and pay thousands of employees at tech companies with a budget of $131 million.

Therefore labeling Bitcoin mining as a "waste" is a failure to look at the big picture. It ignores that these jobs alone are a direct, measurable, and positive impact that Bitcoin has already made on the economy.

5. The energy cost per transaction is declining

One criticism of Bitcoin is not about its aggregate energy use but its energy cost per transaction. Today this is a non-issue because miners' energy costs are fully covered by the mining reward. But as the reward decreases, 10-20 years from now it will not be sufficient to cover the energy costs, so transaction fees will have to cover them instead. Effectively, users will be paying for the per-transaction energy cost through transaction fees.

Before I calculate the per-transaction energy cost, I would like to point out that many published estimates are grossly inaccurate. See Critique of Chris Malmo's VICE Motherboard piece, Critique of the Bitcoin Energy Consumption Index, or Critique of John Lilic's LinkedIn post, etc.

So what is the true per-transaction cost?

A year and a half ago, in August 2014, the network was at 200 PH/sec, consumed 150 MW ($131 million per year assuming $0.10 per kWh), and processed 70 000 tx/day. The energy cost per transaction was $5.10.

Today the network has grown 5× to 1000 PH/sec, and I estimate the same 150 MW energy footprint (see argument 1), which makes sense: back in August 2014 miners' efficiency was 5× worse than today, averaging approximately 1 J/GH: Avalon's 55nm A3255 (2 J/GH), Avalon's 40nm A3233 (0.75 J/GH), KnC's 28nm Jupiter (1.5 J/GH), KnC's Neptune (0.6 J/GH), etc. However the network now processes 200 000 tx/day. The energy cost per transaction is $1.80.

The transaction rate is currently sharply increasing and should continue to do so, mainly with block size increases (we are on the brink of a consensus on a first historic increase from 1 MB to 2 MB) but also with off-chain payments (e.g. the Lightning Network, or perhaps Stash, which is based on Open-Transactions). If the current trend continues (transaction rate quadrupling every 2 years), the energy cost per transaction would fall 2 years from now to $0.45. Well, not exactly. We need to take into account miners increasing their energy consumption over the next 2 years. It took 7 years for Bitcoin to go from 0 to 150 MW. Perhaps it could grow approximately +50% to 225 MW in 2 years. On this basis the per-transaction cost would not be $0.45 but $0.68.
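The whole progression can be reproduced with one small helper; this Python sketch (my own, using the power and transaction-rate figures assumed above) computes each of the per-transaction costs:

```python
# Per-transaction energy cost: the network's daily electricity bill
# divided by the daily transaction count, at $0.10 per kWh.
def cost_per_tx(power_mw, tx_per_day, price_per_kwh=0.10):
    daily_kwh = power_mw * 1000 * 24  # MW -> kW, times 24 hours
    return daily_kwh * price_per_kwh / tx_per_day

aug_2014 = cost_per_tx(150, 70000)        # ~$5.1 (150 MW, 70 000 tx/day)
jan_2016 = cost_per_tx(150, 200000)       # $1.80 (150 MW, 200 000 tx/day)
naive_2018 = cost_per_tx(150, 800000)     # $0.45 if power stayed flat
adjusted_2018 = cost_per_tx(225, 800000)  # ~$0.68 with power up 50%
```

Note that the result depends only on the ratio of power to transaction rate, which is why the cost falls whenever transactions grow faster than consumption.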

Overall, the main observation is that the energy cost per Bitcoin transaction is rapidly falling ($5.10 → $1.80 → $0.68) and will continue to fall as long as the transaction rate increases faster than the miners' energy consumption. At some point the energy cost should settle to a certain level because, as explained above, 10-20 years from now miners' revenue will come mainly from transaction fees. So transaction rate, transaction fees, miners' revenue, and miners' energy consumption will be 4 linearly correlated metrics evolving hand in hand.

Even in the worst situation where the per-transaction energy cost settles to a relatively high amount such as half a dollar, and causes transaction fees to be half a dollar 10-20 years from now, Bitcoin would still beat the fees of most legacy financial systems (average remittance fees of 7.68%, bank wires costing a flat $20-30 fee, etc). The person who issued a Bitcoin transaction worth $147 million surely would not mind paying half a dollar in fees next time!

But there is no barrier preventing the per-transaction energy cost from settling much lower, such as $0.10 or $0.01 or less. All we need are block size increases, or the Lightning Network/Stash. It is a common fallacy to assume that the more transactions are processed, the more energy miners "must" consume in order to continue securing the Bitcoin network at the same level. The adequate amount of energy spent securing the network need only be proportional to the Bitcoin market cap, not to the transaction rate.

Think about it this way: it would be pointless to spend 100 MW to secure a cryptocurrency transacting $1 quadrillion per day if its market cap was a puny $1 and all coins were transacted 1 quadrillion times per day. The theoretical maximum value an attacker might steal from this cryptocurrency is $1. Therefore this network should be protected with no more than whatever is worth to spend in order to protect $1.

Many analyses of Bitcoin's energy consumption are flawed

When discussing whether Bitcoin's energy consumption is a waste or not, it does not help that many consumption estimates are wrong in the first place. I thought a small reference section about what they get wrong would be useful here.

If you are a journalist researching and writing about energy consumption, please email me and I would be more than happy to provide feedback in the form of a review.

Critique of Chris Malmo's VICE Motherboard piece

In "Bitcoin is unsustainable" (June 2015), Chris Malmo claims the network consumes 250-500 MW, quoting a then-outdated and still outdated Allied Control's whitepaper (July 2014), which in turn quotes an even more outdated Google doc (December 2013, copy), which estimated the network was actually consuming 45 MW at the time and could grow to 110 MW by April 2014. Nowhere is stated "250-500 MW" as Allied Control claims. They probably attempted to extrapolate the network growth but failed to take into account the quickly increasing efficiency of miners, which was 5 J/GH when this Google doc was written and improved 33× to today's 0.15 J/GH.

Critique of the Bitcoin Energy Consumption Index

Flaws in BECI cause it to overestimate electricity consumption by 4×. Describing these flaws is a task that deserves its own post: Serious faults in the Bitcoin Energy Consumption Index.

Critique of John Lilic's LinkedIn post

John Lilic's analysis is based on the flawed Chris Malmo article as well as the flawed BECI site. Read my critique of John's post here. This is one of the reasons I review this stuff: one incorrect article is released, others start quoting it and referencing it, and falsehoods spread quickly…

Final words

Take all my numbers with a ±30% grain of salt. When I write "0.27 J/GH" or "$0.45", I do not mean exactly these numbers. They are estimates that are sufficiently accurate to support the conclusions I draw.

Please send me feedback! This post is a work-in-progress. I want to improve it.


isarrider wrote: I would really agree if the heat produced is used in heating a house or water...
(as an electrical heating does no calculations and just convert electricity in heat)
26 Jan 2016 11:53 UTC

Kevin wrote: Wait, 0.74% of all the world's energy is trivial? Um... I disagree. 26 Jan 2016 20:56 UTC

mrb wrote: Kevin: 0.74 percent would be significant waste if Bitcoin was a little toy system... But the model that calculated 0.74 percent assumes Bitcoin's market cap is $1 trillion (perhaps I need to state this in the intro). Think carefully of the large societal impact a $1 trillion cryptocurrency would have. If it manages to directly or indirectly increase the world's GDP by more than 0.74 percent then it will be worthwhile to spend 0.74 percent of the world's energy on it. 26 Jan 2016 21:32 UTC

Matt wrote: Proof of stake as a consensus method is just dismissed in this post. We have NXT, Bitshares, Ethereum 1.1 all using proof of stake.

If proof of stake is a viable alternative then yes proof of work is wasteful as a consensus algorithm. It still may be useful for distribution but in that scenario it's usually only temporary.
26 Jan 2016 22:11 UTC

EmperorBag wrote: You argue that 0.74% of the world's energy usage for a 1 trillion market cap sector of finance isn't wasteful. Where's your comparison to the current standard's cost per 1 Trillion in GDP? Wastefulness is a measure of efficiency against alternatives. I don't need Bitcoin to buy or sell something, any currency would suffice. So what's the cost of production for 1 trillion in fiat exchange?

All of this data being absent makes your discussion above useless in consideration of whether Bitcoin is wasteful or not.
27 Jan 2016 17:06 UTC

mrb wrote: EmperorBag: thanks for your comment. You are right I need a comparison. How about this one:

Canada consumes approximately 4000 TWh yearly. Its nominal GDP is $1.8 trillion. So it produces *$450 million per TWh*.

In my model Bitcoin's hypothetical market cap is $1 trillion and miners consume 800 TWh. So its worth is *$1.25 billion per TWh*, or 3 times better than Canada.

Is it a valid comparison? Well, hard to say, it's apples vs. oranges. But it gives us a general idea that Bitcoin could extract value from energy with an efficiency roughly comparable to a country.
28 Jan 2016 03:13 UTC

mrb wrote: Matt,

Ethereum 1.1 does not exist yet (unfinished). No one knows HOW it will implement PoS.

Nxt is—so far—unproven I think. I looked into it and found a vulnerability: I read the whitepaper and skimmed the source code and I am not very impressed by the numerous kludges. For example EC sort of makes it like Ripple because users have to trust certain nodes/accounts. Or there is the fact it won't reorg more than 720 blocks, but this causes forks that can't re-merge when an (accidental or intentional) network split lasts more than 720 blocks. Etc.

Bitshares: I don't know it well enough to criticize. But I share other people's concerns that the delegates represent a certain centralization and security risks (if they are hacked, etc).
30 Jan 2016 07:28 UTC

Vortac wrote: "Or... could we at least use a proof-of-work having some usefulness outside of Bitcoin, such as replacing SHA256 hashing with protein folding? It would certainly be revolutionary. "

Gridcoin has accomplished much of it. It's rewarding BOINC computations (which can be virtually any kind of distributed computing process, including protein folding) in a decentralized manner. Proof-of-Stake is used for securing the blockchain. Currently, 30 different BOINC projects are supported by Gridcoin, covering all fields of science, from biomedicine to math.
24 Sep 2016 05:44 UTC

mrb wrote: Vortac: the big flaw in gridcoin is that it is completely centralized. If someone hacks the BOINC project servers keeping track of the work statistics, then that person can assign themselves arbitrary gridcoin rewards. This design flaw is impossible to fix. This puts gridcoin not in the same class as truly decentralized cryptocurrencies like Bitcoin, Ethereum, etc. 24 Sep 2016 22:56 UTC

Vortac wrote: Thank you for replying so quickly to such an old thread.

You are right, BOINC is not completely decentralized. But there is no one "central BOINC server". 30 different BOINC projects are rewarded by Gridcoin and even if one of them is breached completely, it's possible only to steal 1/30 of the Gridcoins issued for that day.

48000 Gridcoins (GRC) are minted daily, divided to 30 different BOINC projects, that's 1600 GRC per project. That's the maximum amount that can be assigned arbitrarily by a malicious attacker, who has succeeded in hacking one BOINC project. As the number of BOINC projects constantly grows, the BOINC network will become more and more decentralized. If there were 100 BOINC projects supported by Gridcoin, it would be possible to steal only 480 GRC per day. And of course, there are mechanisms to blacklist the hacked BOINC project and/or the hacker's BOINC account itself and stop the theft completely.

So yes, it's not completely decentralized yet (due to nature of BOINC credit system), but I believe Gridcoin is closest to the revolutionary alternative to Proof-of-Work you described in your post.
25 Sep 2016 08:49 UTC

Anonymous wrote: Great post! 31 Dec 2016 14:22 UTC

Digiconomist wrote: First of all I'd like to state that the criticism of the Bitcoin Energy Consumption Index (BECI) contains false information. It is claimed that BECI runs on the assumption that miners "never recover their investments (capex)". This has never been stated or implied by the information provided on BECI, and may relate to a misunderstanding. BECI assumes that the entire network is running at roughly break even, but this doesn't mean this is the case for every miner that is part of it. New machines may still earn themselves back easily under this assumption.

Second I'd like to add that the case laid out in this article is extremely optimistic on the electricity consumption of the Bitcoin network. It is stated that "the network's average efficiency falls between 0.055 and 0.27 J/GH". When this article was published, the best publicly available miner was the Antminer S7, running at ~0.25 J/GH. This one was released just a few months before. The Antminer S5+ was released just a little bit before at ~0.44 J/GH (in August). These machines wouldn't even have hit the market if the estimates in this article were true, as they would have been producing a loss as of day 1.
23 Jan 2017 09:11 UTC

mrb wrote: Digiconomist: I copied your post and replied in 01 Feb 2017 21:57 UTC

Digiconomist wrote: Just to respond to this article specifically; as you wrote the network costs could be estimated at $131 mio (per year) at the time this article was published. Back then, miner income was at $500 mio annualized. You thus imply the average mining farm has ongoing costs equal to 26% of its total income. That's based on a residential rate of 10 cents per KWh. If we apply a more realistic rate of 5 cents per KWh as you later argued (industrial scale miners were already paying just 2 cents per KWh in 2014), then we're even talking about 13% in costs on average.

Now unless mining farms running at $40-60k per month shut down when income falls below $200k per month, average costs for the whole network are certainly higher than that. For the article it means that 0.15 J/GH isn't conservative at all. Conservative would have probably been closer to 0.6 J/GH at the time, taking into account the very high price per kWh and low average costs considered here.
06 Feb 2017 13:50 UTC

mrb wrote: Reviewing this a year later, I believe that yes my estimates were too optimistic. I made two errors: (1) Bitfury 16nm had not seen large scale deployments until Oct 2016, and (2) the Spondoolies PickAxe was only ever produced in engineering sample quantities (Spondoolies declared bankruptcy 4 months after writing this post).

If I were to revise my estimates, I think the average efficiency in Jan 2016 was probably around 0.30 J/GH. Consider that the rapid doubling of the hash rate from ~450 to ~900 PH/s in the 3 months preceding this blog post was in large part due to the Bitmain BM1385 (0.25 J/GH) and Canaan A3218 (0.29 J/GH) released respectively in Oct and Nov. We also know that by Jan 2016 Bitfury had a 40 MW data center full of 28nm (not 16nm) ASICs (0.20 J/GH) which account for 200 PH/s alone. So that is ~650 PH/s out of 900 PH/s of hardware operating at sub-0.30 J/GH.

Anyway I just published a much more precise estimate of the electricity consumption of miners as of Feb 2017:
15 Mar 2017 22:03 UTC

Digiconomist wrote: Okay, and you consider that not at all wasteful based on the arguments presented here.

IMO the best argument for that is in the introduction; "computing power cannot be faked". It's the price you pay for trust. But that trust could come at a much lower price thanks to innovations such as Proof-of-Stake. So with regard to argument 3; there is a more efficient way. Ethereum will be moving to Casper soon, in what's probably the first major test for PoS at a Bitcoin-ish scale.

Yes, it will still have to prove itself, but it shows we don't have to accept the insane amounts of energy going into PoW just yet.

Other arguments are less strong. There are better uses for renewables than Bitcoin, and Bitcoin may be able to create jobs without requiring 150 MW+.

Furthermore transaction growth has completely stalled; so even if that was true it certainly isn't now. And if Bitcoin goes to $1 trillion in value as it is the energy consumption per transaction would be something like 2,500+ KWh. One U.S. household uses that in 3 months.
18 Apr 2017 18:22 UTC

Digiconomist wrote: Additionally, even though "computing power cannot be faked", the security offered by proof-of-work is actually decreasing. More hash doesn't do much for Bitcoin's security, but increased centralization due to economies of scale is increasingly becoming a security risk (take AntBleed for example).

Ethereum's Casper could potentially even be more secure since a simple majority attack won't be enough. Attacking Casper will most likely require a supermajority; and it will have less centralization incentives (economies of scale don't play an equally big role).
27 Apr 2017 15:15 UTC