The author of the Bitcoin Energy Consumption Index (BECI) makes a fundamentally flawed assumption, causing it to overestimate energy consumption by approximately a factor of 3-4. For example, a real-world $1.5-2 million farm that, on a monthly basis, mines $175k and uses $37k of electricity is assumed in BECI’s theoretical model to actually consume “$175k” of electricity, which is economically nonsensical.
Quoting the website:
“For the Bitcoin Energy Consumption Index it is assumed that the average amount of Watts consumed per GH/s is equal to the amount of Watts that can be afforded by mining revenues per GH/s”
In other words, BECI assumes the average miner uses so much electricity that mining revenues allow miners to barely recoup their electrical costs alone, while losing 100% of all other major expenditures. The fact that this describes the average miner is mind-boggling: it implies roughly half of the miners not only lose 100% of their other expenditures but also lose money on electrical opex, i.e. spend $1 on electricity, mine $0.50, and lose $0.50. And BECI seriously implies this is how half the network operates… Let that sink in for a minute.
- BECI assumes the average miner loses the totality of their initial capital invested in a mining farm (capex). This consists of the mining hardware itself (by far the largest cost), data center construction, labor to deploy the hardware, etc.
- BECI assumes the average miner continuously loses additional money on non-electrical operational expenditures (opex): staff to operate and guard the facility, servicing failing hardware, etc.
Concretely, how bad is it? Here is a realistic example:
An investor spends $1.5-2 million on a farm: $1 million on 700 Antminer S9s (13.5 TH/s, 1320 W, $1400 each), plus $0.5-1 million for everything else (building, power distribution, cooling, down to every nut and bolt). This farm mines at 9.5 PH/s and consumes 1 megawatt. At $0.05/kWh the electricity bill is $37k/month. This farm mines about 185 BTC, or $175k/month (as of 1 Feb 2017, when difficulty = 393e9 and 1 BTC = $950).
According to BECI, $175k/month mined implies a mysteriously high electricity bill of $175k/month, meaning (1) the investor permanently loses the $1.5-2 million capex, (2) the investor barely stays afloat as the totality of his mining revenues is funneled into paying this hypothetical electricity bill, and (3) he continues to lose $5-10k every month (whatever is needed to employ staff to maintain and guard 1 megawatt of hardware).
But in reality the farm’s electricity bill is only $37k/month, as calculated.
BECI overestimated the consumption by 4.7× in this specific example. This is approximately how bad BECI’s model is.
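The numbers in this example can be verified with a quick back-of-the-envelope script, using only figures from the post (the factor `2**32 / 600` is the standard conversion from difficulty to expected network hash rate):

```python
# Back-of-the-envelope check of the farm example above, using figures
# from the post. 2**32 / 600 converts difficulty to network hash rate.
units = 700
ths_per_unit = 13.5      # TH/s per Antminer S9
price_kwh = 0.05         # $/kWh
difficulty = 393e9       # 1 Feb 2017
btc_price = 950          # $/BTC
block_reward = 12.5      # BTC per block
hours_month = 24 * 30.4  # ~730 hours per month

farm_kw = 1000           # total draw rounded to 1 MW in the post (ASICs + overhead)
farm_hashrate = units * ths_per_unit * 1e12          # H/s, ~9.45 PH/s
network_hashrate = difficulty * 2**32 / 600          # H/s, ~2.8e18 (~2800 PH/s)

electricity_bill = farm_kw * hours_month * price_kwh  # ~$36.5k/month
btc_per_month = block_reward * 6 * hours_month * farm_hashrate / network_hashrate
revenue = btc_per_month * btc_price                   # ~$175k/month

print(round(electricity_bill), round(btc_per_month), round(revenue))
print(round(revenue / electricity_bill, 1))  # ~4.8x: BECI's overestimate factor
```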
Clue #1 the model is wrong
BECI’s model fails the laws of economics. As a businessman, if I saw that I would lose $1.5-2 million from day 1 and keep losing more money every month of operation, I would never invest in a farm. But this is not what we observe: the worldwide hash rate keeps increasing. So obviously miners keep mining more and more because electricity consumption is not the huge, hopeless money pit BECI depicts it to be…
In reality, mining farms are launched because financial planning and cost modeling show electricity costs to be small enough that investors can hope to at least recoup their capex, and then hopefully make a little more money beyond this break-even point.
Clue #2 the model is wrong
Another way to show BECI’s model is flawed: between 21 Dec 2016 and 4 Jan 2017 the network hash rate stayed flat; however, because the Bitcoin price increased sharply during this period, BECI incorrectly concluded “hey, electricity consumption must have jumped +30% in these 2 weeks!” (9.3 to 12.0 TWh).
This error is a direct consequence of BECI assuming a 1:1 linear correlation between revenues (the Bitcoin price) and electricity consumption. In fact, the graphs show the absence of correlation continues after the period marked in red: the hash rate increases a bit, while BECI’s estimate decreases. But focusing on this (absent) correlation distracts from the main point, which is that BECI incorrectly assumes miners lose most of their expenditures.
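The linearity can be made explicit: with hash rate and block reward held fixed, BECI’s inferred consumption scales 1:1 with the Bitcoin price, so a price jump reads directly as a consumption jump. A sketch using the 9.3 and 12.0 TWh figures above (the price ratio is implied by the model, not an actual market quote):

```python
# BECI's model makes consumption proportional to revenue, hence to price
# when hash rate and block reward are fixed. Figures are from the post.
beci_before_twh = 9.3   # BECI estimate, 21 Dec 2016
beci_after_twh = 12.0   # BECI estimate, 4 Jan 2017

# Under a 1:1 linear model, the consumption ratio IS the price ratio.
implied_price_ratio = beci_after_twh / beci_before_twh
print(f"+{implied_price_ratio - 1:.0%}")  # the price jump, read as a consumption jump
```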
Clue #3 the model is wrong
As of 1 Feb 2017 BECI’s flawed model infers the following:
“Current break-even Watts per GH/s (used for calculating the index): 0.427”
(Watts per GH/s is the same as J/GH.) The disconnect with reality is that the average efficiency of ASICs manufactured and deployed as of Aug 2016 was already as low as 0.10 J/GH.1 And out of today’s 3000 PH/s global hash rate, we know half of it was put online since Aug 2016.
So if one half (1500 PH/s) mines at 0.10 J/GH, the other half would need to mine at 0.754 J/GH for BECI’s claimed average of 0.427 to hold. The problem is: there are obviously not enough ~0.754 J/GH ASICs manufactured in the entire world to account for anywhere close to 1500 PH/s.2
And more convincingly: why would half the network be mining at 0.754 J/GH? Anything above 0.427 J/GH is way below break-even: you are losing your capex, you are losing money on electrical opex, and you are losing money on non-electrical opex. This would be complete economic nonsense.
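The arithmetic behind the 0.754 figure is a simple weighted average, easily checked with the figures above:

```python
# If half of the 3000 PH/s runs modern ~0.10 J/GH hardware, solve for the
# efficiency the other half must have for BECI's 0.427 J/GH average to hold.
total_phs = 3000
new_half_phs = 1500
new_half_eff = 0.10   # J/GH, 16nm ASICs deployed since Aug 2016
beci_avg_eff = 0.427  # J/GH, BECI's break-even figure

old_half_eff = (beci_avg_eff * total_phs - new_half_eff * new_half_phs) \
    / (total_phs - new_half_phs)
print(round(old_half_eff, 3))  # 0.754 J/GH
```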
How to fix BECI?
There are 2 options.
Option 1: BECI sticks with the technique of estimating (electrical) costs from revenues. In that case the author should revamp his cost model by:
- Taking into account capex and non-electrical opex instead of completely ignoring them.
- Assuming a daily hashrate growth between 0.2 and 0.6% (current trend) which destroys profitability sooner than the author thinks.
- Assuming electricity consumption does not instantly and linearly track Bitcoin price changes, but instead tracks the average price of the last 4-8 months (it takes at least this long for a farm to be built).
- Finally, given all these costs, BECI should conservatively assume miners come out with a return on investment anywhere between 0 and 10%.
Make these changes and BECI will be much more accurate.
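As a rough illustration of what such a revamped model could look like (every parameter below is an assumption chosen for illustration, not a measured figure), one can compute the watts per GH/s a rational miner can actually afford once capex recovery, non-electrical opex, hash rate growth, and a modest ROI are reserved out of revenue:

```python
# Illustrative sketch of a revamped "revenue -> affordable watts" model.
# Every value below is an assumption for illustration, not measured data.
rev_per_ghs_day = 0.00057  # $/GH/s/day: ~$1.7M/day network revenue over 3000 PH/s
price_kwh = 0.06           # BECI's assumed electricity price
daily_growth = 0.004       # 0.4%/day hash rate growth, middle of the 0.2-0.6% trend
lifetime_days = 540        # ~18 months of useful hardware life (assumed)
capex_share = 0.50         # revenue share reserved to recoup capex (assumed)
other_opex_share = 0.10    # staff, repairs, security (assumed)
target_roi = 0.05          # 5% return target (assumed)

# A unit of hash rate earns a shrinking revenue share as the network grows;
# average the decaying daily revenue over the hardware's lifetime.
avg_rev = rev_per_ghs_day * sum(
    (1 + daily_growth) ** -d for d in range(lifetime_days)
) / lifetime_days

left_for_power = avg_rev * (1 - capex_share - other_opex_share - target_roi)
# 1 W running for a day consumes 0.024 kWh.
affordable_watts_per_ghs = left_for_power / (0.024 * price_kwh)
print(round(affordable_watts_per_ghs, 3))  # far below BECI's 0.427 W per GH/s
```

Under these (debatable) assumptions, the affordable figure lands well under 0.1 W per GH/s, i.e. several times lower than BECI’s 0.427.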
Option 2: BECI could change its model to estimate electrical costs from the known electrical efficiency—measured in J/GH—of known Bitcoin ASICs. This is the model I used myself in this post (section 1). It is easy to do. There exist only a handful of ASICs on the market, and in 2016 they all reached the 16nm node, so they are all easily in the range 0.10-0.15 J/GH (some can operate as low as 0.06 J/GH). In Jan 2016 I estimated an average efficiency of 0.15 J/GH. As of 1 Feb 2017 it has probably improved by 20% or so, down to 0.12 J/GH. Multiply this by today’s global hash rate of 3000 PH/s and you end up with a global electricity consumption of 3.15 TWh per year, which is 3.4× lower than BECI’s current estimate (10.85 TWh).
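This bottom-up calculation is simple enough to sketch (1 J/GH sustained at 1 GH/s is exactly 1 W):

```python
# Bottom-up estimate: fleet efficiency times global hash rate gives power;
# annualize to get yearly consumption. Figures are from the post.
avg_efficiency_j_per_gh = 0.12  # J/GH, estimated fleet average, Feb 2017
global_hashrate_ghs = 3000e6    # 3000 PH/s expressed in GH/s

power_watts = avg_efficiency_j_per_gh * global_hashrate_ghs  # ~360 MW
annual_twh = power_watts * 24 * 365 / 1e12  # Wh over a year -> TWh
print(round(annual_twh, 2))          # ~3.15 TWh
print(round(10.85 / annual_twh, 1))  # ~3.4x lower than BECI's 10.85 TWh
```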
Based on this model, BECI overestimates Bitcoin’s electrical consumption by approximately a factor of 3-4, as it currently stands.
As I previously offered: if you are a journalist, researcher, or anyone else who wants to write about Bitcoin’s energy consumption, send me a note. I will be happy to review your work and provide feedback before publication. It is professionally unacceptable to publish analyses as far off from reality as BECI’s.
Other minor errors
BECI makes other, more minor errors. It is incorrect and misleading about the “energy” consumption of the countries compared to Bitcoin. For example, it says Lithuania’s energy use is 11.21 TWh, but in fact this country uses 20.35 TWh (1.75 Mtoe) of energy, of which 11.21 TWh is electricity. BECI picks the smaller 11.21 figure instead of 20.35 for the comparison (which maximizes Bitcoin’s apparent importance), while calling it “energy,” which makes it sound as if Bitcoin rivals this country’s entire energy consumption: electricity + oil + coal + gas… All mentions of “energy” on the BECI site should be replaced with “electricity”. I reported this to the author on 30 Dec 2016, but the error still has not been fixed.
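The conversion behind these numbers, using the standard 1 Mtoe = 11.63 TWh factor:

```python
# Unit check for the Lithuania comparison: total primary energy in Mtoe
# converted to TWh, versus the electricity-only figure BECI uses.
MTOE_TO_TWH = 11.63                      # standard conversion factor
total_energy_twh = 1.75 * MTOE_TO_TWH    # all energy: oil, coal, gas, electricity...
electricity_twh = 11.21                  # electricity consumption only
print(round(total_energy_twh, 2))        # ~20.35 TWh, the figure BECI should compare against
```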
Also, BECI assumes the electricity cost of miners averages $0.06/kWh. In reality I believe it is closer to $0.04/kWh, maybe $0.05/kWh. There are professional miners, like MegaBigPower, located in Douglas County in Washington State, where the local PUD currently charges $0.0233/kWh. (Side note: I myself was GPU-mining in Douglas County in 2011-2012.) Ironically, a lower electricity cost causes BECI’s flawed model to assume miners consume even more electricity, so I really hope BECI does not just fix the $/kWh and nothing else, as that would make it even more wrong. The real fix is for BECI to properly revamp its model.
There are 4 main ASICs on the market (soon to be 3 because the KnC company is in trouble…):
- BitFury BF8162C16 16nm ASICs have been deployed in BitFury’s private data centers (such as their liquid-cooled one in Georgia) since Dec 2015, and operate from 0.06 to 0.12 J/GH.
- Bitmain BM1387 16nm ASICs were launched as part of their Antminer S9/R4/T9 series in mid-2016, and operate at 0.10 J/GH.
- Avalon A3212 16nm ASICs were launched with the AvalonMiner 721 in Nov 2016, and operate at 0.15 J/GH.
- KnC Solar 16nm ASICs have been deployed in KnCMiner’s private data centers since mid-2015, and operate at 0.07 J/GH.
By late 2014 all major ASIC vendors had already reached roughly twice the efficiency of the hypothetical 0.754 J/GH hardware (e.g. the Avalon A3222 at 0.40 J/GH, released in Sep 2014). In late 2014 the entire Bitcoin network measured 300 PH/s. Even assuming all of it were still running on already-obsolete ~0.754 J/GH ASICs, we would still be very far from the 1500 PH/s of such hardware that would need to exist for BECI’s model to make sense. ↩