mrb's blog

Serious faults in Digiconomist's Bitcoin Energy Consumption Index

Keywords: bitcoin electricity energy

The author of the Bitcoin Energy Consumption Index makes a fundamentally flawed assumption, causing it to overestimate Bitcoin's electricity consumption by approximately a factor of 3-4. For example, a real-world $1.5-2 million farm that, on a monthly basis, mines $175k and uses $37k of electricity will be assumed in BECI's theoretical model to consume “$175k” worth of electricity, which is economically nonsensical.

Quoting the website:

“For the Bitcoin Energy Consumption Index it is assumed that the average amount of Watts consumed per GH/s is equal to the amount of Watts that can be afforded by mining revenues per GH/s”

In other words, BECI assumes the average miner uses so much electricity that mining revenues allow miners to barely recoup their electrical costs alone, while losing 100% of all other major expenditures. The fact this is supposed to be the average miner is mind-boggling. It implies roughly half of the miners not only lose 100% of their other expenditures but also lose money on electrical opex, i.e. spend $1 on electricity, mine $0.50, and lose $0.50. And BECI seriously implies this is how half the network operates… Let that sink in for a minute.

  • BECI assumes the average miner loses the totality of their initial capital invested in a mining farm (capex). This consists of the mining hardware itself (by far the largest cost), data center construction, labor to deploy the hardware, etc.
  • BECI assumes the average miner continuously loses additional money on non-electrical operational expenditures (opex): staff to operate and guard the facility, servicing failing hardware, etc.
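To make this concrete, BECI's assumption boils down to a one-line formula: the watts a miner can “afford” per GH/s equal the revenue earned per GH/s divided by the cost of running one watt. A minimal Python sketch of that logic, where the revenue figure and the $0.06/kWh rate are illustrative stand-ins, not BECI's actual inputs:

    # BECI's core assumption: 100% of mining revenue is spent on electricity.
    elec_price = 0.06                 # $/kWh (BECI's assumed rate)
    revenue_per_ghs_per_day = 5.7e-4  # $/day earned by 1 GH/s (illustrative)

    # Running 1 W for 24 hours consumes 0.024 kWh:
    cost_per_watt_day = 24 / 1000 * elec_price   # $0.00144 per W per day

    watts_per_ghs = revenue_per_ghs_per_day / cost_per_watt_day
    print(watts_per_ghs)  # ~0.4 W per GH/s, i.e. ~0.4 J/GH: BECI's "break-even"

(BECI's published 0.427 differs slightly from this sketch because of transaction fees and its exact inputs.)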

Concretely, how bad is it? Here is a realistic example:

An investor spends $1.5-2 million on a farm: $1 million on 700 Antminer S9 units (13.5 TH/s, 1320 W, $1400 each), plus $0.5-1 million for everything else (building, power distribution, cooling, down to every nut and bolt). This farm mines at 9.5 PH/s and consumes 1 megawatt. At $0.05/kWh the electricity bill is $37k/month. This farm mines about 185 BTC or $175k/month (as of 1 Feb 2017, when difficulty = 393e9 and 1 BTC = $950.)
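These numbers can be verified in a few lines of Python, using the standard expected-blocks formula (a hash rate of H solves a block of difficulty D every D×2^32/H seconds on average):

    # Monthly economics of the example farm (figures as of 1 Feb 2017).
    units, ths_each, watts_each = 700, 13.5, 1320   # Antminer S9 specs

    hashrate = units * ths_each * 1e12      # ~9.45e15 H/s, i.e. ~9.5 PH/s
    hours_per_month = 730                   # average month

    # Electricity bill: ~1 MW (incl. overhead) at $0.05/kWh
    bill = 1000 * hours_per_month * 0.05    # = $36,500/month, i.e. ~$37k

    # Expected BTC mined: hashes performed / hashes per block, times the reward
    difficulty = 393e9
    hashes_per_block = difficulty * 2**32
    blocks = hashrate * hours_per_month * 3600 / hashes_per_block   # ~14.7
    btc = blocks * 12.5                     # ~184 BTC/month

    print(btc * 950, bill)   # ~$175k mined vs ~$37k of electricity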

According to BECI: $175k/month mined implies a mysteriously high electricity bill of $175k/month, meaning (1) the investor permanently loses the $1.5-2 million capex, (2) the investor barely stays afloat as the totality of his mining revenues is funneled into paying this hypothetical electricity bill, and (3) he continues to lose $5-10k every month (whatever is needed to employ staff to maintain and guard 1 megawatt of hardware).

But in reality the farm’s electricity bill is only $37k/month, as calculated.

BECI overestimates the consumption by 4.7× ($175k vs $37k) in this specific example. This is approximately how bad BECI's model is.

Clue #1: the model is wrong

BECI fails the laws of economics. As a businessman, if I saw that I would lose $1.5-2 million from day 1 and lose more money every month of operation, I would never invest in a farm. But this is not what we are observing: the worldwide hash rate keeps increasing. So obviously miners mine more and more because electricity consumption is not the huge, hopeless money pit that BECI depicts it to be…

In reality mining farms are launched because financial planning and cost modeling shows the electricity costs to be small enough that investors hope to at least recoup their capex, and then hopefully make a little bit more money beyond this break-even point.

Clue #2: the model is wrong

Another way to show BECI's model is flawed: between 21 Dec 2016 and 4 Jan 2017 the network hash rate stayed flat; however, because the Bitcoin price increased sharply during this period, BECI incorrectly concluded “hey, it must mean electricity consumption jumped +30% in these 2 weeks!” (9.3 to 12.0 TWh).

BECI fluctuates for no reason

This error is a direct consequence of BECI assuming a 1:1 linear correlation between revenues (Bitcoin price) and electricity consumption. In fact the graphs show the absence of correlation continues after the period marked in red: the hash rate increases a bit, but BECI's estimate decreases. But focusing on this (absence of) correlation distracts from the main point, which is that BECI incorrectly assumes miners lose most of their expenditures.
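In effect BECI's estimate reduces to annual mining revenue divided by the electricity price, so it scales linearly with the Bitcoin price even when the hash rate is flat. A sketch, using illustrative round prices for that period rather than exact daily quotes:

    # BECI's estimate boils down to: annual mining revenue / electricity price.
    def beci_twh(btc_price, btc_per_year=12.5 * 52560, usd_per_kwh=0.06):
        # 52,560 blocks/year x 12.5 BTC/block; fees ignored for simplicity
        return btc_price * btc_per_year / usd_per_kwh / 1e9   # kWh -> TWh

    # Hash rate was flat from 21 Dec 2016 to 4 Jan 2017, but the price rose
    # roughly 30%, so BECI's "consumption" rose ~30% with it:
    print(beci_twh(850))    # ~9.3 TWh
    print(beci_twh(1100))   # ~12.0 TWh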

Clue #3: the model is wrong

As of 1 Feb 2017 BECI’s flawed model infers the following:

“Current break-even Watts per GH/s (used for calculating the index): 0.427”

(Watts/GH/s is the same as J/GH.) The disconnect with reality is that the average efficiency of ASICs manufactured and deployed as of Aug 2016 was already as low as 0.10 J/GH [1]. And out of today's 3000 PH/s global hash rate, we know half of it was put online since Aug 2016:

Global hash rate through 2016

So if one half (1500 PH/s) mines at 0.10 J/GH, the other half would need to mine at 0.754 J/GH for BECI's claim to be correct (to average out to 0.427). The problem is: there are obviously not enough ~0.754 J/GH ASICs manufactured in the entire world to account for anywhere close to 1500 PH/s [2].
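The arithmetic, as a quick sketch:

    beci_avg = 0.427   # J/GH: BECI's claimed network-wide average
    new_half = 0.10    # J/GH: the ~1500 PH/s deployed since Aug 2016

    # If half the network runs at 0.10 J/GH, the other half must run at:
    print(2 * beci_avg - new_half)   # 0.754 J/GH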

And more convincingly: why would half the network be mining at 0.754 J/GH? Anything above 0.427 J/GH is way below break-even: you are losing your capex, you are losing money on electrical opex, and you are losing money on non-electrical opex. This would be complete economic nonsense.

How to fix BECI?

There are 2 options.

Option 1: BECI sticks with the technique of estimating (electrical) costs from revenues. In that case the author should revamp his cost model by:

  1. Taking into account capex and non-electrical opex instead of completely ignoring them.
  2. Assuming a daily hash rate growth between 0.2 and 0.6% (the current trend), which destroys profitability sooner than the author thinks.
  3. Assuming the electricity consumption does not instantly and linearly track Bitcoin price changes, but instead tracks the average price of the last 4-8 months (it takes at least this long for a farm to be built).
  4. Finally, given all these costs, BECI should assume miners come out with a return on investment anywhere between 0 and 10%, to be conservative.

Make these changes and BECI will be much more accurate.
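As a rough illustration of what such a revamped model could look like, here is a sketch. Every parameter value below (capex share, hardware lifetime, ROI, etc.) is my own assumption, chosen only to show the structure, not a definitive calibration:

    # Sketch of a revenue-based model accounting for capex, non-electrical
    # opex, hash rate growth, and a modest ROI (all parameters are assumptions).
    def implied_watts_per_ghs(
        revenue_per_ghs_per_day,   # $/day earned by 1 GH/s today
        usd_per_kwh=0.05,          # electricity price
        capex_share=0.40,          # lifetime revenue share repaying hardware/building
        other_opex_share=0.10,     # staff, repairs, security
        roi=0.05,                  # target return on investment
        daily_growth=0.004,        # 0.4%/day hash rate growth dilutes future revenue
        lifetime_days=540,         # useful life of the hardware
    ):
        # Approximate geometric-series average of daily revenue over the
        # hardware's life, shrinking as the network grows:
        decay = (1 - (1 + daily_growth) ** -lifetime_days) / (daily_growth * lifetime_days)
        avg_revenue = revenue_per_ghs_per_day * decay
        # Only what remains after capex, other opex, and ROI buys electricity:
        electricity_budget = avg_revenue * (1 - capex_share - other_opex_share - roi)
        return electricity_budget / (24 / 1000 * usd_per_kwh)   # $/W/day -> W

    print(implied_watts_per_ghs(5.7e-4))   # ~0.09 J/GH with these assumptions

With these (debatable) inputs the model lands near 0.09 J/GH, in the same ballpark as the real hardware discussed above, instead of 0.427.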

Option 2: BECI could change its model by estimating electrical costs from the known electrical efficiency—measured in J/GH—of known Bitcoin ASICs. This is the model I used myself in this post (section 1). It is easy to do. There exist only a handful of ASICs on the market, and in 2016 they all reached the 16nm node, so they are all easily in the range 0.10-0.15 J/GH (some can operate as low as 0.06 J/GH). In Jan 2016 I estimated an average efficiency of 0.15 J/GH. As of 1 Feb 2017 it has probably improved by 20% or so, down to 0.12 J/GH. Multiply this by today's global hash rate of 3000 PH/s, and you end up with a global electricity consumption of 3.15 TWh per year, which is 3.4× lower than BECI's current estimate (10.85 TWh).
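The arithmetic of that bottom-up estimate, sketched (0.12 J/GH being my estimated average, as explained above):

    efficiency = 0.12              # J/GH, estimated network average (Feb 2017)
    hashrate_ghs = 3000e6          # 3000 PH/s = 3.0e9 GH/s

    power_w = efficiency * hashrate_ghs        # 3.6e8 W = 360 MW
    twh_per_year = power_w * 8766 / 1e12       # 8,766 hours/year
    print(twh_per_year, 10.85 / twh_per_year)  # ~3.15 TWh, ~3.4x below BECI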

Based on this model, BECI overestimates Bitcoin’s electrical consumption by approximately a factor of 3-4, as it currently stands.

As I previously offered, if you are a journalist, or researcher, or anyone who wants to write about Bitcoin's energy consumption, send me a note. I will be happy to review your work and provide feedback before publication. It is professionally unacceptable to publish analyses that are as far off reality as BECI.

Other minor errors

BECI makes other, more minor errors. It is incorrect and misleading about the “energy” consumption of the countries compared to Bitcoin. For example, it says Lithuania's energy use is 11.21 TWh, but in fact this country uses 20.35 TWh (1.75 Mtoe) of energy, of which 11.21 TWh is electricity. BECI picks the smaller 11.21 number instead of 20.35 for the comparison (which maximizes Bitcoin's importance), while calling it “energy”, which makes it sound like Bitcoin's consumption rivals this country's entire energy consumption: electricity + oil + coal + gas… All mentions of “energy” on the BECI site should be replaced with “electricity”. I reported this to the author on 30 Dec 2016, but the error still has not been fixed.
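For reference, the conversion (using the standard factor of 11.63 TWh per Mtoe):

    MTOE_TO_TWH = 11.63          # standard conversion factor
    print(1.75 * MTOE_TO_TWH)    # 20.35 TWh of total energy, of which only
                                 # 11.21 TWh is electricity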

Also, BECI assumes the electricity cost of miners is on average $0.06/kWh. In reality I believe it is closer to $0.04/kWh, maybe $0.05/kWh. There are professional miners, like MegaBigPower, located in Douglas County in Washington State, where the local PUD currently charges $0.0233/kWh. (Side note: I myself was GPU-mining in Douglas County in 2011-2012.) Ironically, a lower electricity cost causes BECI's flawed model to assume miners consume even more electricity, so I really hope BECI does not just fix the $/kWh and nothing else, as that would make it even more wrong. The real fix is for BECI to properly revamp its model.
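To see the irony concretely: since BECI divides revenue by the electricity price, lowering the assumed $/kWh inflates the estimate (the revenue figure below is an illustrative round number):

    annual_revenue = 650e6            # $/year of mining revenue (illustrative)
    for price in (0.06, 0.05, 0.04):  # $/kWh
        print(price, annual_revenue / price / 1e9)   # 10.8, 13.0, 16.3 TWh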


Footnotes:

  1. There are 4 main ASICs on the market (soon to be 3 because the KnC company is in trouble…):

    • BitFury BF8162C16 16nm ASICs have been deployed in BitFury’s private data centers (such as their liquid-cooled one in Georgia) since Dec 2015, and operate from 0.06 to 0.12 J/GH.
    • Bitmain BM1387 16nm ASICs were launched as part of their Antminer S9/R4/T9 series in mid-2016, and operate at 0.10 J/GH.
    • Avalon A3212 16nm ASICs were launched with the AvalonMiner 721 in Nov 2016, and operate at 0.15 J/GH.
    • KnC Solar 16nm ASICs have been deployed in KnCMiner’s private data centers since mid-2015, and operate at 0.07 J/GH.

  2. By late 2014 all major ASIC vendors had already reached about twice this efficiency, e.g. the Avalon A3222 at 0.40 J/GH released in Sep 2014. In late 2014 the entire Bitcoin network measured about 300 PH/s. Even assuming all of it was still running on already-obsolete ~0.754 J/GH ASICs, we are still very far from the quantity of hardware (1500 PH/s) that would need to exist for BECI's model to make sense.

Comments

Digiconomist wrote: First of all I'd like to state that the criticism on the Bitcoin Energy Consumption Index (BECI) contains false information. It is claimed that BECI runs on the assumption that miners "never recover their investments (capex)". This has never been stated or implied by the information provided on BECI, and may relate to a misunderstanding. BECI assumes that the entire network is running at roughly break-even, but this doesn't mean this is the case for every miner part of it. New machines may still earn themselves back easily under this assumption.

Second I'd like to add that the case laid out in this article is extremely optimistic on the electricity consumption of the Bitcoin network. It is stated that "the network's average efficiency falls between 0.055 and 0.27 J/GH". When this article was published, the best publicly available miner was the Antminer S7, running at ~0.25 J/GH. This one was released just a few months before. The Antminer S5+ was released just a little bit before at ~0.44 J/GH (in August). These machines wouldn't even have hit the market if the estimates in this article were true, as they would have been producing a loss as of day 1.
01 Feb 2017 21:35 UTC

mrb wrote: Digiconomist: You *do* imply the average miner never recovers their investments (capex), precisely because you imply the average miner barely stays afloat of his electrical costs (opex) alone. See the quote from your own site in the 3rd paragraph.

BECI implies the entire network recoups opex, but loses 100% of the capex. So *not* break even. To break even you'd need to recoup capex+opex.

Do you understand the difference between capex & opex?

Also you are wrong: many mining ASICs online as of Jan 2016 (when I wrote http://blog.zorinaq.com/bitcoin-mining-is-not-wasteful/ which is where our discussion started) beat 0.25 J/GH. It seems you didn't read my post. I provide links and references to each one of them in section 1 (http://blog.zorinaq.com/bitcoin-mining-is-not-wasteful/#sec1):

1. BitFury's latest 16nm chip achieves 0.055-0.07 J/GH (a 40MW data center of them launched in Dec 2015: https://news.ycombinator.com/item?id=10774773)
2. KnC's 16nm Solar 0.07 J/GH,
3. Spondoolies's 28nm PickAxe 0.15 J/GH,
4. Bitmain's 28nm BM1385 0.18-0.26 J/GH,
5. Avalon's 28nm A3218 0.27 J/GH.

It seems you don't know the market of ASICs very well.
01 Feb 2017 21:55 UTC

Digiconomist wrote: I don't think you can collect a random set of hardware and say "this is the J/GH" without even bothering to consider the economics surrounding that because it "seems fair" - while using economic arguments to tackle BECI. Some consistency would be nice.

You also published this post a bit too soon making it hard to discuss. If you had contacted me in advance I could have told you I was collecting data to cover 1 adjustment period in order to account for blocks being created faster (or slower) than 10 minutes on average. This held up the release of version 3, which includes some other adjustments as well. In particular the average costs mentioned here have been relaxed quite a bit. Looking at it from the bright side, you might like the updates. :)
03 Feb 2017 16:09 UTC

mrb wrote: But it's not a random collection of hardware. I didn't hand pick the most efficient ones to prove my point. It's all the known manufacturers of ASICs in the *world*. Literally.

Now, there are a lot of companies that never released silicon, failed, ran out of money, etc, see: https://en.bitcoin.it/wiki/List_of_Bitcoin_mining_ASICs

But nowadays all the known manufacturers of mining ASICs can be counted on the fingers of one hand (the list I gave, minus KnC and Spondoolies, who have failed). And when *all* of them have been shipping ASICs doing 0.06-0.15 J/GH for a while, it's pretty obvious BECI's claimed 0.427 J/GH is impossible.

I'll say it again: it seems you don't know the ASIC market, and that's what prevents you from understanding how far off reality BECI is.
04 Feb 2017 00:29 UTC

Digiconomist wrote: But you didn't check how your estimate works out economically. You're saying the network is running at something like 0.12 J/GH. 0.15 J/GH if there has been zero improvement for a year, but that would be odd. Today the network is at 3,100 PH/s, so we're talking about 3.26-4.07 TWh per year.

We can translate this to costs directly since we can assume miners get 1 KWh per 5 cents spent on costs (per your own numbers). We get that on 3.26-4.07 TWh that translates to USD 163-204 mio in ongoing costs.

Annual income available to miners is easy to estimate as well, and it comes down to 817 mio per year (including fees). What we get is that miners are thus implied to be spending 20-25% on ongoing costs on average.

Now, of course this is in line with your example where the farm is paying 21% in costs, but you realize very well profit margins don't stay at 70-80% during the entire lifetime. On Twitter you wrote: "When I was mining with 20kW of GPUs it was pointless to mine when elec costs were ~70% of my revenues." So in my opinion you're arguing against yourself here.

If you want this to work out you're going to need something like 50% in ongoing costs on average. That's not the number I'm going with (65%), but then we're suddenly talking about BECI being just ~1.3 times the resulting estimate. That wouldn't be a massive gap at all.
04 Feb 2017 14:13 UTC