The security of the Bitcoin block chain fundamentally depends on its proof-of-work, based on calculating SHA256 hashes. The network of miners uses its SHA256 computational power to vote on which block chain to trust in order to confirm transactions. Miners are incentivized to do this by being rewarded with transaction fees as well as newly mined coins. Because computing power cannot be faked, the vote cannot be cheated. This is, in essence, what makes Bitcoin's existence possible.
However, some say this SHA256 proof-of-work used in mining consumes too much energy and is a "huge waste" or "unsustainable".
I strongly disagree. Here is why:
Argument 1: Miners currently use only approximately 0.0012% of the energy consumed by the world. Most are forced to use hydroelectric power (zero carbon footprint!) because using cheap renewable energy is a necessity to win in the ultra-competitive mining industry.
Argument 2: Even in the future, economic modeling predicts that if Bitcoin's market capitalization reaches $1 trillion, miners will still not account for more than 0.74% of the energy consumed by the world. If Bitcoin becomes this successful, it would probably have directly or indirectly increased the world's GDP by at least 0.74%, making it worthwhile to spend 0.74% of the world's energy on it.
Argument 3: Mining would be a waste if there was another more efficient way to implement a Bitcoin-like currency without proof-of-work. But current research has so far been unable to demonstrate a viable alternative.
Argument 4: Bitcoin is already a net benefit to the economy. Venture capitalists invested more than $1 billion into at least 729 Bitcoin companies which created thousands of jobs. You may disregard the first three arguments, but the bottom line is that spending an estimated 150 megawatts on a system that has so far created thousands of jobs is a valuable economic move, not a waste.
Argument 5: The energy cost per transaction is currently declining thanks to the transaction rate increasing faster than the network's energy consumption.
These arguments are explored in more detail in the sections below.
1. Miners consume a negligible amount of energy
The energy efficiency—in joules per gigahash—of mining chips varies. As of January 2016, there are 5 main families of chips that represent most of the ASIC market share:
- BitFury's latest 16nm chip achieves 0.055-0.07 J/GH,
- KnC's 16nm Solar 0.07 J/GH,
- Spondoolies's 28nm PickAxe 0.15 J/GH,
- Bitmain's 28nm BM1385 0.18-0.26 J/GH,
- Avalon's 28nm A3218 0.27 J/GH.
Some miners run older, less efficient gear, but the bulk of the global hash rate comes from these 5 chip families. Thus the network's average efficiency falls between 0.055 and 0.27 J/GH.
With the network's global hash rate suddenly doubling from 500 to 1000 PH/sec (petahash/sec) in the last 2 months alone, it is fair to say that most of the computing power comes from the newer, more efficient chips. BitFury's 40 MW Georgia data center (0.055-0.07 J/GH) alone could account for up to 650 PH/sec.
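As a quick sanity check, here is a minimal Python sketch of that data-center claim; the only inputs are the 40 MW figure and the chip efficiency range quoted above:

```python
# Hash rate a 40 MW facility could sustain: power divided by efficiency.
power_w = 40e6                      # BitFury's Georgia data center
for eff_j_per_gh in (0.055, 0.07):  # chip efficiency range quoted above
    hash_rate_phs = power_w / eff_j_per_gh / 1e6  # GH/s -> PH/s
    print(f"{eff_j_per_gh} J/GH -> {hash_rate_phs:.0f} PH/s")
# Prints ~727 and ~571 PH/s; "up to 650 PH/s" sits around the middle.
```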
But let's be conservative and assume the network's average efficiency is 0.15 J/GH. With a global hash rate of 1000 PH/sec, this means the network consumes around 150 MW. Over a year, this represents electricity costs of $131 million (assuming the global average rate of $0.10 per kWh) and 1.3 TWh (terawatt-hours).
For comparison, according to IEA statistics, the world's consumption of energy for the year 2013 was 9 301 Mtoe, or 108 171 TWh.
Thus only 0.0012% of the world's consumption of energy goes to Bitcoin mining, or literally less than half of a drop in the bucket.
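For readers who want to check the arithmetic, here is a short Python sketch reproducing the figures above; every input is one of this section's estimates, not a measured value:

```python
# Network power, annual energy, electricity cost, and world-energy share,
# from the assumed average efficiency and the current global hash rate.
hash_rate_ghs = 1000e6           # 1000 PH/sec expressed in GH/sec
efficiency = 0.15                # assumed network average, J/GH
usd_per_kwh = 0.10               # assumed global average electricity rate
world_twh = 108_171              # IEA world energy consumption, 2013

power_mw = hash_rate_ghs * efficiency / 1e6        # 150 MW
annual_kwh = power_mw * 1000 * 24 * 365            # ~1.31e9 kWh
annual_twh = annual_kwh / 1e9                      # ~1.3 TWh

print(f"Power: {power_mw:.0f} MW")
print(f"Annual cost: ${annual_kwh * usd_per_kwh / 1e6:.0f} million")
print(f"Share of world energy: {annual_twh / world_twh:.4%}")  # ~0.0012%
```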
Furthermore, most miners use clean renewable energy; ruthless competition in the mining industry forces them to use renewables due to their lower costs (as opposed to more expensive polluting power sources such as coal power plants). They usually select hydroelectric power, which has a zero carbon footprint: BitFury, KnC, BW, etc.
Bitcoin's 0.0012% should be contrasted with the many instances of energy waste that occur on much larger scales. A 3-ton car wastes 97.5% of its energy transporting a 75-kg person (1 − 75/3000 = 0.975). Power plants waste enormous amounts of energy. The US wastes 61% of its energy (and similarly for the rest of the world). These are real and significant wastes. Bitcoin is not.
2. Mining may grow but its energy footprint will remain relatively small and green
As Bitcoin's value and adoption increase, mining will increase. However, miners are constrained by sheer economic factors, because they are in it for financial gain.
In a hypothetical scenario where Bitcoin continues to be incredibly successful over the next 5 years, its market cap may hit, for example, $1 trillion. This is very optimistic. 5 years from now, 18 million BTC will be in circulation, so dividing the market cap by 18 million indicates 1 BTC would be worth $55,000. Meanwhile 3 million BTC will remain to be mined, valued at $165 billion.
If the market cap hypothetically reaches $1 trillion, it will mean Bitcoin has succeeded on a massive scale and has positively impacted our society: foreign trade made more fluid thanks to Bitcoin's disregard of political borders, hundreds of millions of users (vs. today's few millions), millions of merchants accepting it (vs. today's 130 000), and perhaps even private and governmental financial institutions using Bitcoin as part of their settlement systems. All in all, this would be a (hypothetical) incredible success, and Bitcoin would probably have directly or indirectly contributed to increasing the world's GDP by at least a tiny amount.
Now, miners might look at these numbers (3 million BTC to be mined, valued at $165 billion) and decide to spend $40 billion in CapEx to build mining hardware and infrastructure, plus $80 billion in OpEx to pay for electricity and other maintenance, then mine $165 billion in BTC over the next 10 years, to potentially make a profit of $45 billion if all goes well.
How much electricity will the miners consume with a budget of $80 billion? If they decide to optimize all costs by refusing to buy electricity from power companies and instead building their own hydroelectric dams—the cheapest source of energy on the planet—then electricity would cost them approximately $0.01 per kWh. So $80 billion buys them 8000 TWh. Over 10 years this is 800 TWh per year.
Therefore even a hypothetical Bitcoin market cap of $1 trillion still fails to incentivize miners to consume more than 800 TWh per year, or 0.74% of the world's energy consumption. By this point, however, Bitcoin would probably also have contributed to increasing the world's GDP by at least a tiny amount, and if that contribution exceeds 0.74%, then spending 0.74% of the world's energy on it will have been worthwhile.
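Here is a minimal sketch of this hypothetical scenario's arithmetic; every input is an assumption taken from the text above, not a forecast:

```python
# Hypothetical $1 trillion scenario: coin price, value left to mine, and
# the energy an $80B electricity budget could buy at hydro prices.
market_cap = 1e12
circulating_btc = 18e6
remaining_btc = 3e6
opex_usd = 80e9                 # 10-year electricity + maintenance budget
hydro_usd_per_kwh = 0.01        # assumed cost of self-built hydro power
world_twh = 108_171             # IEA world energy consumption, 2013

price = market_cap / circulating_btc      # ~$55,556 (rounded to $55,000 above)
to_mine_usd = remaining_btc * price       # ~$165 billion at the rounded price
# Upper bound: assume the whole OpEx budget goes to electricity.
total_twh = opex_usd / hydro_usd_per_kwh / 1e9   # 8000 TWh over 10 years
annual_twh = total_twh / 10                      # 800 TWh/year

print(f"BTC price: ${price:,.0f}, left to mine: ${to_mine_usd / 1e9:.0f}B")
print(f"{annual_twh:.0f} TWh/year = {annual_twh / world_twh:.2%} of world energy")
```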
3. Proof-of-work is the only way
It would be valid to call Bitcoin's proof-of-work wasteful if there were a more efficient way to implement a Bitcoin-like currency (decentralized, with no trusted third party). But it appears impossible.
An alternative to proof-of-work might be proof-of-stake, but its core issue is that there is "nothing at stake". The developers of Ethereum have put incredible effort into researching ways around this flaw, publishing numerous fascinating and technical blog posts. But even they had to give up the idea for now, and resorted to implementing proof-of-work in the current Ethereum version. They still wish to move to proof-of-stake at some point in 2016, but it is unclear how they will do it or even whether they will be successful.
Researchers have also studied other alternatives like proof-of-activity and proof-of-burn, but these either directly or indirectly depend on proof-of-work.
I wish we could get rid of proof-of-work, but so far it does not appear possible. (I would love to be proven wrong!)
Or… could we at least use a proof-of-work with some usefulness outside of Bitcoin, such as replacing SHA256 hashing with protein folding? It would certainly be revolutionary. For this to happen we would need to figure out a way to generate protein folding tasks in a decentralized manner, perhaps deterministically from a seed. Unfortunately no one has figured out a way to do this. (And no, FoldingCoin did not find a solution: they were forced to forego decentralization in their design.)
4. Bitcoin already benefits the economy
As of September 2015, at least 729 Bitcoin companies have been created: exchanges, miners, wallets, payment services, infrastructure, etc. As of November 2015, more than $1 billion had been invested in these companies, which employ thousands of people.
In addition, a smaller economic benefit is still worth mentioning: Bitcoin as a payment platform increases economic trade by removing friction from traditional payments, or simply by offering one more option to pay. It leads to sales that would otherwise not take place if Bitcoin did not exist. The same goes for PayPal, credit cards, or any other payment system: the more options exist, the higher the sales. The benefit is hard to quantify but is certainly there. For example BitPay reported a 110% increase in their transaction volume between 2014 and 2015. Some or most of these payments probably displaced traditional payments, while a fraction of them would not have taken place if Bitcoin did not exist.
When looking at the big picture, Bitcoin is a system that consumes 150 MW, costing $131 million per year (see argument 1), and has effectively created 729+ tech companies, thousands of jobs, and increased economic trade.
Is this worth it? Yes it is! Just looking at the jobs alone, it would be impossible to hire and pay thousands of employees at tech companies with a budget of $131 million.
Therefore labeling Bitcoin mining as a "waste" is a failure to look at the big picture. It ignores that these jobs alone are a direct, measurable, and positive impact that Bitcoin has already made on the economy.
5. The energy cost per transaction is declining
One criticism of Bitcoin is not about its aggregate energy use but its energy cost per transaction. Today this is a non-issue because miners' energy costs are fully covered by the mining reward. But as the reward decreases, 10-20 years from now it will not be sufficient to cover the energy costs, so transaction fees will have to cover them instead. Effectively, users will be paying for the per-transaction energy cost through transaction fees.
Before I calculate the per-transaction energy cost, I would like to point out that many published estimates are grossly inaccurate. See Critique of Chris Malmo's VICE Motherboard piece, Critique of the Bitcoin Energy Consumption Index, or Critique of John Lilic's LinkedIn post, etc.
So what is the true per-transaction cost?
A year and a half ago, in August 2014, the network was at 200 PH/sec and consumed 150 MW ($131 million per year assuming $0.10 per kWh), and processed 70 000 tx/day. The energy cost per transaction was $5.10.
Today the network has grown 5× to 1000 PH/sec, and I estimate the same 150 MW energy footprint (see argument 1), which makes sense: back in August 2014, miners' efficiency was 5× worse than today and averaged approximately 1 J/GH: Avalon's 55nm A3255 (2 J/GH), Avalon's 40nm A3233 (0.75 J/GH), KnC's 28nm Jupiter (1.5 J/GH), KnC's Neptune (0.6 J/GH), etc. However the network now processes 200 000 tx/day. The energy cost per transaction is $1.80.
The transaction rate is currently sharply increasing and should continue to do so, mainly through block size increases (we are on the brink of a consensus on a first historic increase from 1 MB to 2 MB) but also through off-chain payments (e.g. Lightning Network, or perhaps Stash, which is based on Open-Transactions). If the current trend continues (transaction rate quadrupling every 2 years), the energy cost per transaction would fall 2 years from now to $0.45. Well, not exactly: we need to take into account miners increasing their energy consumption over the next 2 years. It took 7 years for Bitcoin to go from 0 to 150 MW, so it could plausibly grow approximately +50% to 225 MW in 2 years. On this basis the per-transaction cost would not be $0.45 but $0.68.
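All three per-transaction figures follow from one division: the network's daily electricity bill over its daily transaction count. A minimal sketch, using the power estimates from this post and the assumed $0.10/kWh rate:

```python
def cost_per_tx(power_mw: float, tx_per_day: float, usd_per_kwh: float = 0.10) -> float:
    """Daily electricity bill divided by daily transaction count."""
    daily_kwh = power_mw * 1000 * 24
    return daily_kwh * usd_per_kwh / tx_per_day

print(f"Aug 2014: ${cost_per_tx(150, 70_000):.2f}")                  # ~$5.10
print(f"Jan 2016: ${cost_per_tx(150, 200_000):.2f}")                 # $1.80
print(f"2 years out (projected): ${cost_per_tx(225, 800_000):.2f}")  # ~$0.68
```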
Overall, the main observation is that the energy cost per Bitcoin transaction is rapidly falling ($5.10 → $1.80 → $0.68) and will continue to fall as long as the transaction rate increases faster than the miners' energy consumption. At some point the energy cost should settle at a certain level because, as explained above, 10-20 years from now miners' revenue will come mainly from transaction fees. So transaction rate, transaction fees, miners' revenue, and miners' energy consumption will be 4 linearly correlated metrics evolving hand-in-hand.
Even in the worst situation, where the per-transaction energy cost settles at a relatively high amount such as half a dollar and causes transaction fees to be half a dollar 10-20 years from now, Bitcoin would still beat the fees of most legacy financial systems (average remittance fees of 7.68%, bank wires costing a flat $20-30 fee, etc). The person who issued a Bitcoin transaction worth $147 million surely would not mind paying half a dollar in fees next time!
But there is no barrier preventing the per-transaction energy cost from settling much lower, such as $0.10 or $0.01 or less. All we need is block size increases, or Lightning Network/Stash. It is a common fallacy to assume that the more transactions are processed, the more energy miners "must" consume in order to keep securing the Bitcoin network at the same level. The adequate amount of energy spent securing the network need only be proportional to the Bitcoin market cap, not to the transaction rate.
Think about it this way: it would be pointless to spend 100 MW to secure a cryptocurrency transacting $1 quadrillion per day if its market cap were a puny $1 and all coins were transacted 1 quadrillion times per day. The theoretical maximum value an attacker might steal from this cryptocurrency is $1. Therefore this network should be protected with no more than whatever it is worth spending to protect $1.
Many analyses of Bitcoin's energy consumption are flawed
When discussing whether Bitcoin's energy consumption is a waste or not, it does not help that many consumption estimates are wrong in the first place. I thought a small reference section about what they get wrong would be useful here.
If you are a journalist researching and writing about energy consumption, please email me and I would be more than happy to provide feedback in the form of a review.
Critique of Chris Malmo's VICE Motherboard piece
In "Bitcoin is unsustainable" (June 2015), Chris Malmo claims the network consumes 250-500 MW, quoting a then-outdated and still outdated Allied Control's whitepaper (July 2014), which in turn quotes an even more outdated Google doc (December 2013, copy), which estimated the network was actually consuming 45 MW at the time and could grow to 110 MW by April 2014. Nowhere is stated "250-500 MW" as Allied Control claims. They probably attempted to extrapolate the network growth but failed to take into account the quickly increasing efficiency of miners, which was 5 J/GH when this Google doc was written and improved 33× to today's 0.15 J/GH.
Critique of the Bitcoin Energy Consumption Index
Flaws in BECI cause it to overestimate electricity consumption by 4×. Describing these flaws is a task that deserves its own post: Serious faults in the Bitcoin Energy Consumption Index.
Critique of John Lilic's LinkedIn post
John Lilic's analysis is based on the flawed Chris Malmo article as well as the flawed BECI site. Read my critique of John's post here. This is one of the reasons I review this stuff: one incorrect article is released, others start quoting it and referencing it, and falsehoods spread quickly…
Critique of O'Dwyer and Malone
In Bitcoin Mining and its Energy Footprint, as of mid-March 2014, Karl J. O'Dwyer and David Malone estimated the consumption of Bitcoin miners to be comparable to Ireland's electricity consumption (3 GW). To come to this conclusion they first note the efficiency of mining hardware at two extremes: GPUs and ASICs at respectively 500 and 0.5 J/GH (2 and 2000 MH/J). They approximate the network hash rate to 20 PH/s and calculate a power consumption between 10 MW and 10 GW (they erroneously report the lower bound as "100 MW"). O'Dwyer and Malone then declare in an overly simplistic way the actual consumption to be "between the two" (glossing over a difference of 3 orders of magnitude!) and conclude it could plausibly be 3 GW.
In reality, GPU and FPGA miners were obsolete as of March 2014. Virtually all the hash rate added through 2013-2014 came from ASICs. If the entire network of 20 PH/s had been using the least efficient ASIC in the history of Bitcoin (Avalon A3256 at 7 J/GH), the total power consumption would have been 140 MW, a far cry from "3 GW".
In mid-March 2014, BTC was trading at 500-600 USD and the difficulty was 4.25e9, so assuming electricity at $0.05/kWh, hardware needed to be more efficient than 49-59 J/GH to recoup electricity costs alone, and well below 40 J/GH to also recoup other opex: data center facilities, labor, etc. The most popular FPGA used among miners, the Spartan6 LX150, achieved only 50 J/GH. It is impossible that the majority of the hash rate as of March 2014 came from FPGAs.
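Here is a minimal sketch of that break-even calculation; the inputs are the difficulty, the 25 BTC block reward of the time, the BTC price range, and the assumed $0.05/kWh electricity rate:

```python
# Maximum (worst acceptable) efficiency, in J/GH, at which mining revenue
# still covers electricity: revenue per block = energy cost per block.
difficulty = 4.25e9
hashes_per_block = difficulty * 2**32   # expected hashes to solve a block
block_reward_btc = 25                   # block subsidy in March 2014
usd_per_kwh = 0.05

for btc_price in (500, 600):
    revenue = block_reward_btc * btc_price                 # $ per block
    gh_per_block = hashes_per_block / 1e9                  # GH per block
    # eff [J/GH] * GH / 3.6e6 [J/kWh] * $/kWh = revenue  =>  solve for eff
    max_eff = revenue * 3.6e6 / (gh_per_block * usd_per_kwh)
    print(f"BTC at ${btc_price}: break-even at {max_eff:.0f} J/GH")
# Prints ~49 and ~59 J/GH, matching the bound above.
```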
Final words
Take all my numbers with a ±30% grain of salt. When I write "0.27 J/GH" or "$0.45", I do not mean exactly these numbers. They are estimates that are sufficiently accurate to support the conclusions I draw.
Please send me feedback! This post is a work-in-progress. I want to improve it.