mrb's blog

Serious faults in Digiconomist's Bitcoin Energy Consumption Index

Keywords: beci critic debunking bitcoin mining electricity energy estimation

The author of the Bitcoin Energy Consumption Index makes fundamentally flawed assumptions, causing it to demonstrably overestimate the electricity consumption of Bitcoin miners by 1.5× to 3.6×, and likely by 2.0× to 2.5×.

The main fault, among others, in BECI’s model is the overly simplistic and wrong assumption that “65%” of mining revenues are spent on electricity. This “65%” is essentially pulled out of thin air, the result of a misunderstanding by the author.

This motivated me to research and publish my own estimates using a model built from real-world data: Electricity consumption of Bitcoin: a market-based and technical analysis bounds the annual consumption with certainty to the range 2.85 to 6.78 TWh. By comparison, BECI claims a consumption of “10.23 TWh” as of 1 March 2017.

About a month ago I wrote a first version of this BECI critique. At the time, the author made the even worse assumption that 100% of mining revenues were spent on electricity. After I discussed it with him, he made some changes in his “final release”. However, BECI remains fundamentally flawed, so I rewrote this post from scratch, basing my arguments on data from my model.

Critique #1: BECI misapplies economic theory

BECI’s entire model is based on a flawed application of economic theory. Quoting the website:

Economic theory suggests that the marginal product of mining should theoretically equal its marginal cost in a competitive market. This would mean we can calculate the network’s energy efficiency by solving for the break-even electricity costs.

The author erroneously assumes the marginal cost is equal to the electricity cost (opex) and makes the case that in theory (close to) 100% of mining revenues are spent on electricity. In reality the marginal cost also includes the cost of the mining hardware (capex). Because BECI fails to account for capex when attempting to apply this economic theory, it ignores the largest initial cost of a mining farm.1

BECI would be right to ignore capex if and only if the hash rate were stagnating or decreasing. That would mean some or all existing miners continue to mine because they recoup their electrical opex, and no new miners join because they expect to be unable to recoup their capex. In that case, the marginal cost of continuing to mine consists only of electrical opex. But in reality the global hash rate has been increasing significantly for years, precisely because miners expect to recoup their capex, so the marginal cost of adding mining capacity comprises opex plus capex.
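As a rough sketch of why capex cannot be ignored, consider a single mining unit. The $418 hardware price and $0.71/day electricity cost are the Antminer S5 figures that come up later in the comment thread; the 2-year useful life is my assumption for illustration:

```python
# Sketch: the marginal cost of adding mining capacity includes amortized
# hardware cost (capex), not just electricity (opex).
# Antminer S5 figures ($418 hardware, $0.71/day electricity) come from the
# comment thread below; the 2-year useful life is an assumption.
hardware_cost = 418.00         # capex, USD
electricity_per_day = 0.71     # opex, USD/day
useful_life_days = 2 * 365     # assumed hardware lifetime

capex_per_day = hardware_cost / useful_life_days
marginal_cost_per_day = electricity_per_day + capex_per_day

print(f"amortized capex: ${capex_per_day:.2f}/day")
print(f"marginal cost:   ${marginal_cost_per_day:.2f}/day")

# Days of operation before cumulative electricity spending exceeds the
# hardware cost:
print(f"opex overtakes capex after {hardware_cost / electricity_per_day:.0f} days")
```

Under these assumptions, amortized capex is comparable to the daily electricity spend, so a model that equates marginal cost with electricity alone misses roughly half the cost of each unit of mining capacity.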

Critique #2: BECI’s author misread a working paper and arbitrarily decided the ratio of electrical opex to mining revenues was “65%”

Initially, BECI’s author applied his flawed interpretation of the above economic theory literally, assuming that 100% of mining revenues were spent on electricity, i.e. that no profit was ever made.

I pointed out that modern 16nm ASIC miners in fact mine bitcoins worth far more than their electricity costs. He realized something was wrong in his model, so in his “final release” he attempted to find the fraction of mining revenues spent on electricity. He found and quoted an SSRN working paper by Tomaso Aste:

Tomaso Aste (2016) […] implies that the average costs of mining are closer to 55 percent of the available miner income. Aste, however, doesn’t provide many details along with his estimate. To be on the conservative side, the average cost percentage used to calculate the Index is set at 65 percent

However, Aste never implies an exact 55%. Aste writes:

it can be estimated that, with current hardware, the computation of a billion of hashes consumes, with state-of-the-art technology, between 0.1 to 1 Joule of energy. This implies that currently about a billion Watts are consumed globally every second (1GW/sec [sic]) to produce a valid proof of work for Bitcoin. [Side note: “GW/sec” is incorrect; the unit should be GW, i.e. gigajoules per second.]

At the time of Aste’s writing in June 2016, the network’s hash rate was 1500 PH/s, so “0.1 to 1 Joule” implies 150 MW to 1.5 GW, a range within an order of magnitude of 1 GW. The hourly cost of 1 GW is $50,000 assuming $0.05/kWh, or $8333 per 10-minute Bitcoin block. Dividing by the per-block income of about $15,000 at the time yields the 55%: 8333 / 15000 ≈ 0.56.

But all these figures are merely orders of magnitude computed from “0.1 to 1 Joule.” Aste’s conclusion makes it clear:

This is indeed the order of magnitude of the present electricity cost for the proof of work in Bitcoin.

I even emailed Aste to erase any doubt; he replied:

Yes Marc, we do not know which is the average consumption per GH, it depends on the hardware, we can only produce order of magnitude estimates.

To be numerically exact, all Aste implies is that the electrical consumption is 150-1500 MW, which corresponds to a per-block electricity cost of $1250-$12,500, or 8.3-83.3% of the mining income.
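Aste’s arithmetic is easy to reproduce; a sketch using the figures above ($0.05/kWh, roughly $15,000 of income per block):

```python
# Reproduce the per-block electricity cost implied by Aste's "0.1 to 1 Joule"
# per billion hashes, at the June 2016 network hash rate of 1500 PH/s.
# The $0.05/kWh price and ~$15,000 per-block income are the figures above.
HASH_RATE = 1500e15       # H/s (June 2016)
BLOCK_TIME = 600          # seconds per Bitcoin block
PRICE_KWH = 0.05          # assumed electricity price, USD/kWh
INCOME_PER_BLOCK = 15000  # approximate mining income per block, USD

def per_block_cost(j_per_gh):
    """Power draw (W) and electricity cost of one block at a given J/GH."""
    watts = HASH_RATE * j_per_gh / 1e9   # J/GH -> J/H, times H/s = W
    kwh = watts * BLOCK_TIME / 3.6e6     # joules -> kWh
    return watts, kwh * PRICE_KWH

for eff in (0.1, 1.0):
    watts, cost = per_block_cost(eff)
    print(f"{eff} J/GH -> {watts / 1e6:.0f} MW, ${cost:.0f}/block, "
          f"{100 * cost / INCOME_PER_BLOCK:.1f}% of income")
# -> from 150 MW / $1250 / 8.3% up to 1500 MW / $12500 / 83.3%

# The "55%" is merely the 1 GW point inside this wide range:
gw_cost = 1e9 * BLOCK_TIME / 3.6e6 * PRICE_KWH   # $8333 per block
print(f"1 GW -> ${gw_cost:.0f}/block = {100 * gw_cost / INCOME_PER_BLOCK:.1f}%")
```

So the 55% is just one point in an order-of-magnitude band stretching from 8.3% to 83.3%.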

BECI’s author misread a working paper by interpreting an order of magnitude as an exact estimate (55%), then arbitrarily picked “65%”, when in fact the paper merely implies the percentage lies somewhere between 8.3% and 83.3%.

Critique #3: BECI wrongly assumes electricity consumption and Bitcoin price are correlated over a “few weeks”

[Update: BECI’s author informed me he computes the average over 60 days, so this section can be ignored.]

Even if BECI estimated the correct percentage of mining revenues spent on electricity, another error would remain:

the Bitcoin price used for determining the Index is based on a moving average over the last few weeks

BECI is opaque: it fails to disclose how exactly this moving average is computed. Is a “few weeks” 2 weeks, 3 weeks, 4 weeks? Either way, the window is too short. It takes not weeks but months to plan, finance, build, and launch a significant mining farm in response to a Bitcoin price increase that opens a mining venture opportunity.2 Miners put mining hardware online in response to the Bitcoin price of months ago, not weeks ago.

Because Bitcoin’s price has been generally increasing lately (+30% every 2 months between October 2016 and February 2017), averaging over only a few weeks causes BECI to consistently overestimate the revenues miners planned for, and hence causes its simplistic model to overestimate the global electricity consumption.
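A toy simulation illustrates the size of this bias. The +30% per 60 days growth rate comes from the paragraph above; the 21-day averaging window, the 90-day planning lag, and the starting price are my assumptions:

```python
# Toy model: under steady +30%/60-day price growth, a trailing average over a
# few weeks tracks today's price, while miners planned farms against the
# price of months ago. Window (21 days) and lag (90 days) are assumptions.
growth_per_60d = 1.30
price_today = 100.0          # arbitrary unit

def price(days_ago):
    """Price `days_ago` days in the past under steady exponential growth."""
    return price_today * growth_per_60d ** (-days_ago / 60)

beci_avg = sum(price(d) for d in range(21)) / 21   # ~3-week trailing average
planned_price = price(90)                          # price miners planned on

print(f"3-week average:    {beci_avg:.1f}")
print(f"price 90 days ago: {planned_price:.1f}")
print(f"overestimate:      {beci_avg / planned_price:.2f}x")
```

In this toy setup the few-weeks average overstates the price miners actually planned against by roughly 40%.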

Critique #4: BECI uses misleading terminology

A more minor complaint I have about BECI is that it claims to compare Bitcoin to the energy used by countries (electricity + oil + coal + gas…) when in fact it compares it only to their electricity usage. BECI references the International Energy Agency for these numbers, and in my opinion it should follow the agency’s example and use proper terminology regarding energy vs. electricity. Notably, it should title the chart “Electricity consumption by country” and should say “The entire Bitcoin network now consumes more electricity than a number of countries”. I reported this to the author on 30 December 2016, but he does not seem to want to fix it…

How to fix BECI’s model?

Step 1

First of all, it is unscientific and misleading for BECI to claim to know a precise estimate of the electricity consumption of Bitcoin. No one knows it, because the market share of the various ASICs is largely unknown, and no large-scale polling or study has ever been conducted to determine it. A proper scientific method would, at a minimum, estimate lower and upper bounds.

Step 2

BECI needs to accurately estimate the percentage of mining revenues spent on electricity.

We can determine this from my analysis (Electricity consumption of Bitcoin: a market-based and technical analysis) which computed the lower and upper bounds as well as a best guess for the average energy efficiency of mining ASICs as of 26 February 2017:

  • Lower bound: 0.100 J/GH
  • Best guess: 0.145-0.166 J/GH
  • Upper bound: 0.238 J/GH

This analysis also computed that mining hardware is profitable below 0.56 J/GH (assuming $0.05/kWh).

Therefore the average amount of mining revenues spent on electricity is 18 to 42%3 (0.100/0.56 to 0.238/0.56), which is 1.5× to 3.6× lower than BECI’s arbitrary choice of “65%”. And my best guess is 26-30% (2.0× to 2.5× lower than BECI).
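The arithmetic of this step can be checked directly; a sketch using the break-even figure and efficiency bounds quoted above:

```python
# Fraction of mining revenues spent on electricity = average fleet efficiency
# divided by the break-even efficiency, 0.56 J/GH at $0.05/kWh (figures from
# the analysis cited above).
BREAKEVEN = 0.56   # J/GH: above this, electricity alone exceeds revenues
bounds = {"lower bound": 0.100, "best guess (low)": 0.145,
          "best guess (high)": 0.166, "upper bound": 0.238}  # J/GH

pct = {name: 100 * eff / BREAKEVEN for name, eff in bounds.items()}
for name, p in pct.items():
    print(f"{name:>17}: {p:.0f}% of revenues spent on electricity "
          f"(BECI's 65% is {65 / p:.1f}x higher)")
# lower bound -> 18% (BECI 3.6x higher); upper bound -> 42% (1.5x higher)

# Including fees (~1/10th of the block reward as of March 2017) dilutes the
# denominator, giving 16-39% of total mining income:
print(f"{pct['lower bound'] / 1.1:.0f}% to {pct['upper bound'] / 1.1:.0f}% incl. fees")
```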

Step 3

BECI needs to calculate the average mining income by averaging the Bitcoin price over a few months, not weeks, as explained above.


BECI’s author wrote:

In the past, electricity consumption estimates typically included an assumption on what machines were still active and how they were distributed, in order to arrive at a certain number of Watts consumed per Gigahash/sec (GH/s). This arbitrary approach has led to a wide set of electricity consumption estimates

But it is his approach that is arbitrary, specifically his choice of “65%.”

The lower and upper bound estimates I presented are established with very high confidence. They literally cover the best and worst possible cases, assuming respectively that all miners run the most energy-efficient hardware possible, or the least efficient hardware available at their time. For details see Electricity consumption of Bitcoin: a market-based and technical analysis.

As I previously offered: if you are a journalist, researcher, or anyone else who wants to write about Bitcoin’s energy consumption, send me a note. I will be happy to review your work and provide feedback before publication. It is professionally unacceptable to publish analyses as far off from reality as BECI’s.


  1. I would make the argument that the marginal cost also includes other capital expenditures (data center buildings, etc.) and non-electrical operational expenditures (labor to maintain and operate data centers, etc.). But this is debatable, so I will not use this argument in my critique of BECI.

  2. It takes less time for a small miner to upgrade from 1 to 2 kW than for a large miner to upgrade from 10 to 20 MW. And the network is mostly made of a small number of large miners, not the other way around (a large number of small miners).

  3. 18-42% is calculated excluding fees. Since fees averaged approximately 1/10th of the block reward as of March 2017, the real percentage of mining revenues spent on electricity is 16-39%.


Digiconomist wrote: First of all I'd like to state that the criticism of the Bitcoin Energy Consumption Index (BECI) contains false information. It is claimed that BECI runs on the assumption that miners "never recover their investments (capex)". This has never been stated or implied by the information provided on BECI, and may relate to a misunderstanding. BECI assumes that the entire network is running at roughly break-even, but this doesn't mean this is the case for every miner that is part of it. New machines may still earn themselves back easily under this assumption.

Second I'd like to add that the case laid out in this article is extremely optimistic on the electricity consumption of the Bitcoin network. It is stated that "the network's average efficiency falls between 0.055 and 0.27 J/GH". When this article was published, the best publicly available miner was the Antminer S7, running at ~0.25 J/GH. This one was released just a few months before. The Antminer S5+ was released just a little bit before that, at ~0.44 J/GH (in August). These machines wouldn't even have hit the market if the estimates in this article were true, as they would have been producing a loss from day 1.
01 Feb 2017 21:35 UTC

mrb wrote: Digiconomist: You *do* imply the average miner never recovers their investments (capex), precisely because you imply the average miner barely covers his electrical costs alone (opex). See the quote from your own site in the 3rd paragraph.

BECI implies the entire network recoups opex, but loses 100% of the capex. So *not* break even. To break even you'd need to recoup capex+opex.

Do you understand the difference between capex & opex?

Also you are wrong: many mining ASICs online as of Jan 2016 (when I wrote the post that started our discussion) beat 0.25 J/GH. It seems you didn't read my post. I provide links and references to each one of them in section 1:

1. BitFury's latest 16nm chip achieves 0.055-0.07 J/GH (a 40 MW data center of them launched in Dec 2015),
2. KnC's 16nm Solar 0.07 J/GH,
3. Spondoolies's 28nm PickAxe 0.15 J/GH,
4. Bitmain's 28nm BM1385 0.18-0.26 J/GH,
5. Avalon's 28nm A3218 0.27 J/GH.

It seems you don't know the market of ASICs very well.
01 Feb 2017 21:55 UTC

Digiconomist wrote: I don't think you can collect a random set of hardware and say "this is the J/GH" without even bothering to consider the economics surrounding that because it "seems fair" - while using economic arguments to tackle BECI. Some consistency would be nice.

You also published this post a bit too soon making it hard to discuss. If you had contacted me in advance I could have told you I was collecting data to cover 1 adjustment period in order to account for blocks being created faster (or slower) than 10 minutes on average. This held up the release of version 3, which includes some other adjustments as well. In particular the average costs mentioned here has been relaxed quite a bit. Looking at it from the bright side, you might like the updates. :)
03 Feb 2017 16:09 UTC

mrb wrote: But it's not a random collection of hardware. I didn't hand pick the most efficient ones to prove my point. It's all the known manufacturers of ASICs in the *world*. Literally.

Now, there are a lot of companies that never released silicon, failed, ran out of money, etc, see:

But nowadays all the known manufacturers of mining ASICs can be counted on the fingers of one hand (the list I gave, minus KnC and Spondoolies, who have failed). And when *all* of them have been shipping ASICs doing 0.06-0.15 J/GH for a while, it's pretty obvious BECI's claim of 0.427 J/GH is impossible.

I'll say it again: it seems you don't know the ASIC market, and that's what prevents you from understanding how far off reality BECI is.
04 Feb 2017 00:29 UTC

Digiconomist wrote: But you didn't check how your estimate works out economically. You're saying the network is running at something like 0.12 J/GH. 0.15 J/GH if there has been zero improvement for a year, but that would be odd. Today the network is at 3,100 PH/s, so we're talking about 3.26-4.07 TWh per year.

We can translate this to costs directly since we can assume miners get 1 KWh per 5 cents spent on costs (per your own numbers). We get that on 3.26-4.07 TWh that translates to USD 163-204 mio in ongoing costs.

Annual income available to miners is easy to estimate as well, and it comes down to 817 mio per year (including fees). What we get is that miners are thus implied to be spending 20-25% on ongoing costs on average.

Now, of course this is in line with your example where the farm is paying 21% in costs, but you realize very well profit margins don't stay at 70-80% during the entire lifetime. On Twitter you wrote: "When I was mining with 20kW of GPUs it was pointless to mine when elec costs were ~70% of my revenues." So in my opinion you're arguing against yourself here.

If you want this to work out you're going to need something like 50% in ongoing costs on average. That's not the number I'm going with (65%), but then we're suddenly talking about BECI being just ~1.3 times the resulting estimate. That wouldn't be a massive gap at all.
04 Feb 2017 14:13 UTC

mrb wrote: I wrote a new post:

And I fully rewrote this post to take into account your "final release" of BECI which is still flawed.

My numbers prove to you that it is IMPOSSIBLE that miners spend more than 42% of mining income on electricity.

I think you fail to visualize how efficiency averages out. To calculate the average efficiency you need to average PER UNIT OF HASH RATE. And at any point in time, in general, most of the network hash rate is provided by newer farms (perhaps this is the crucial insight that makes it hard for you to accept my numbers?) So if 80% of the hash rate is provided by miners spending 20% on electricity, and the remaining 20% spend 80% on electricity, the average is not 50% but 32%. Do the math. Read my new analysis and let me know if you have any questions.
15 Mar 2017 22:31 UTC

Digiconomist wrote: Did you really write this all just to point out that my estimate should be on the bottom end of my own error margin? From today's numbers: revenue $957M per year, costs $523M per year, so ~55.5%. As stated: "within reasonable economic boundaries one might expect to find a number that is 25 percent higher or lower". 42% costs is within that range. Anyway, anyone is welcome to pick another number if they like. I'm just trying to establish a method that produces a number that is plausible economically and not just technically (if you look at past estimates you can easily see why). BECI does that just fine.

By the way, the price is averaged over 60 days. It's not like I don't take feedback seriously ;)
16 Mar 2017 19:41 UTC

mrb wrote: Your "25%" is another figure pulled out of thin air. And this error margin is wrong too: $523M ±25% is $392-654M, which is not within the bounds of $142-339M from my analysis. Remember: anything above $339M is provably wrong. It assumes the worst possible case of miners deploying the least efficient hardware available at their time. So your entire range of $392-654M is *really* in the wrong. You cannot keep saying "this isn't economically plausible" when I present factual, verifiable data proving your model is invalid.

You don't seem to believe in the "economics of mining" so here are numbers for a real-world miner showing that it can be quite profitable to mine (new added section):

Do something about it. Fix BECI.

PS: ok it's good that you average over 60 days. 1 (out of many) issue fixed :)
17 Mar 2017 03:01 UTC

Digiconomist wrote: I found it really hard to understand why you insist there's no overlap until I realised 42% in your story isn't the same as 42% in mine, as your total revenues are based on unadjusted block rewards only. I also include fees and adjust for increasing hash rate (blocks are mined faster than once per 10 minutes on average). That leaves a serious gap of $130M between our revenue assumptions lol.

If you take 42% of the actual revenue we're going to be a lot closer, unless of course you'd like to tweak your scientific bounds in that case.
17 Mar 2017 15:05 UTC

mrb wrote: There is no overlap. My calculation is that worst case electricity costs are $339M/yr regardless of fees. The amount of fees is not a variable in my model because the model is based on what ASICs are used by miners and what are their energy efficiency.

So $339M/yr represents 42% of miner income excluding fees, or 38-39% of miner income including fees.

Arguing that your lower bound being close to my upper bound makes your model "ok" is wrong. Your bigger problem is your upper bound that should be close to $339M/yr.
17 Mar 2017 19:10 UTC

Digiconomist wrote: Okay, I really appreciate the effort you're putting into all this, so I'm checking out the new article later. But seriously, please fix this post.

I come to this page, and the first thing I see is a statement that I'm making a mistake on marginal costs. All I can say to that is that marginal costs don't include fixed costs like depreciation on buildings/machines or salaries like you're stating. So you'll then find that it really is mostly electricity costs.

Then I also find that you're just leaving out $100M+ from the revenue, and on top of the previous that really makes my brain explode.
18 Mar 2017 09:40 UTC

mrb wrote: Marginal costs do include the cost of the hardware. What is debatable is whether they include buildings and salaries (I edited this part of the post, see the new footnote about marginal costs.) But you are categorically wrong that they do not include the cost of the hardware. A miner adding one unit of mining capacity certainly needs to pay for this hardware.

In fact, the hardware cost is the largest initial marginal cost of setting up a new farm. Look at the CSV file in my analysis: an Antminer S5 cost $418 and consumes only $0.71/day in electricity. It takes more than 1.5 years for the cost of electricity to surpass the cost of the hardware.

Finally, I am not leaving out $100M of fees revenue. See the new footnote explaining how 18-42% is calculated.

Let me know when you will have fixed BECI.
21 Mar 2017 03:30 UTC

Digiconomist wrote: You should really check out the paper by Hayes (2015)

Specifically this part is relevant:

“Each unit of mining effort has a fixed sunk cost involved in the purchase, transportation and installation of the mining hardware. It also has a variable, or ongoing cost which is the direct expense of electricity consumption.”

Since sunk costs (unrecoverable expenses) aren’t relevant to marginal costs (it's not like they're not paid for), that’s how he’s left with electricity consumption.
21 Mar 2017 10:19 UTC

mrb wrote: "Since sunk costs aren't relevant to marginal costs..." → you jump to this conclusion, but no one supports this conclusion. Not even this paper by Hayes. You quoted a part that just explains "there is capex, and there is opex" which is obvious to you and I, and which is not what we are arguing about.

You try to argue that if a miner purchases an Antminer S5 for $418 and powers it for $0.71 per day, then economic theory suggests that mining revenues will amount to $0.71 per day. That is false. In reality miners expect to recover $0.71/day plus the $418.
22 Mar 2017 01:21 UTC

Digiconomist wrote: That's basic economic theory and also the reason why Hayes subsequently ignores them in the rest of his paper...

Investopedia explains this very well:

"Since decision-making only affects the future course of business, sunk costs should be irrelevant in the decision-making process. Instead, a decision maker should base her strategy on how to proceed with business or investment activities on future costs."

So basically, the price that was paid for a miner isn't relevant because it's purely retrospective. Looking forward, only electricity consumption (and some other negligible costs) matters. Putting these at 100% of revenues isn't that crazy from an economic PoV (after all, the optimal output is where marginal cost = marginal revenue).

Now you're stating: I can show a new farm starting with electricity costs as low as 15%, so are you kidding me with the number of miners needed at 100%+ to compensate for that.

I do agree with you that it goes against intuition, but first let's examine some reasons mentioned by Hayes why costs could exceed 100%:

"Individual decision makers may operate regardless of cost if they believe that there is enough speculative potential to the upside. Bitcoin mining may draw in those who find the features of anonymity and lack of governmental oversight attractive. Some miners may decide to hoard some or all of their lot and not regularly engage in offering mined bitcoins in the open market, a sort of bitcoin 'fetishism'"

Now I'm obviously a bit skeptical about this myself, otherwise I wouldn't have lowered the percentage.

Hayes actually mentions an important reason too why 100% could simply be too much:

"Some miners may be subject to an opportunity cost whereby it would be more profitable to expend the same electrical capacity for some other pursuit"

Objectively it should be (close to) 100% though. At least from an economic PoV. If reality differs that's not a failure of correctly applying economic theory.
22 Mar 2017 10:56 UTC

mrb wrote: I agree this economic theory makes sense in a theoretical case.

But if it was in effect in the present situation, then the global hash rate would be stagnating: existing miners would continue to mine because they can recoup their electrical opex, and no new miners would join in because they expect to be unable to recoup the sunk cost of capex. Obviously that is not what is happening. The hash rate has been increasing for years precisely because new miners expect to recoup sunk costs. Therefore this theory cannot possibly apply to the present situation.

I edited “Critique #1” to better explain all this.
26 Mar 2017 02:16 UTC