mrb's blog

Electricity consumption of Bitcoin: a market-based and technical analysis

Keywords: mining asic energy efficiency

I drew a chart juxtaposing the Bitcoin hash rate with the market availability of mining ASICs and their energy efficiency. Using pessimistic and optimistic assumptions (miners using either the least or the most efficient ASICs) we can calculate the upper and lower bounds for the global electricity consumption of miners. I decided to do this research after seeing that so many other analyses were flawed. See for example the flagrant errors in Digiconomist’s Bitcoin Energy Consumption Index. I believe that my market-based and technical approach is superior and more accurate.

[Edit: the present research was cited in Nature Sustainability, cited in Nature Climate Change, quoted in the New York Times, published in Bitcoin Magazine, cited by the Electric Power Research Institute in a Hearing of the U.S. Senate Committee on Energy and Natural Resources, cited by Bloomberg New Energy Finance analysts, and others.]

I split the timeline into 10 phases representing the releases and discontinuances of mining ASICs. See the references and a commentary on the data behind this chart:

Hash rate and mining ASICs efficiency

I reached out to some Bitcoin ASIC manufacturers when doing this market research. Canaan was very open and transparent (thank you!) and gave me one additional extremely useful data point: they manufactured a total of 191 PH/s of A3218 ASICs.1

Determining the upper bound for the electricity consumption is then easily done by making two worst-case assumptions. Firstly, we assume that 100% of the mining power added during each phase came from the least efficient hardware available at the time that still mines profitably.2 Furthermore, although the A3218 was the least efficient ASIC in phases 5-8, we can only assume that 191 PH/s of it were deployed; the rest of the hash rate is attributed to the second least efficient ASIC:

  • Phase 0: 290 PH/s @ 0.51 J/GH (BM1384)³
  • Phases 1-3: 150 PH/s @ 0.51 J/GH (BM1384)
  • Phase 4: 40 PH/s @ 0.25 J/GH (BM1385)
  • Phase 5: 191 PH/s @ 0.29 J/GH (A3218) +
    159 PH/s @ 0.25 J/GH (BM1385)
  • Phase 6: 670 PH/s @ 0.25 J/GH (BM1385)
  • Phase 7: 350 PH/s @ 0.20 J/GH (Bitfury 28nm)
  • Phase 8: 150 PH/s @ 0.13 J/GH (BF8162C16)
  • Phase 9: 1250 PH/s @ 0.15 J/GH (A3212)
  • Average weighted by PH/s: 0.238 J/GH

Secondly, we assume that none of this mining power, some of it barely profitable, was ever upgraded to more efficient hardware.

Therefore the upper bound for the electricity consumption of the network at 3250 PH/s, assuming the worst-case scenario of miners deploying the least efficient hardware of their time (0.238 J/GH on average), is 774 MW or 6.78 TWh/year.
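Because 1 PH/s equals 10⁶ GH/s, multiplying PH/s by J/GH yields megawatts directly (10⁶ GH/s × 1 J/GH = 10⁶ W). A minimal sketch of the upper-bound arithmetic above:

```python
# Worst-case deployment per phase: (added hash rate in PH/s, efficiency in J/GH).
phases = [
    (290,  0.51),  # phase 0, BM1384
    (150,  0.51),  # phases 1-3, BM1384
    (40,   0.25),  # phase 4, BM1385
    (191,  0.29),  # phase 5, A3218 (capped at Canaan's production volume)
    (159,  0.25),  # phase 5, BM1385
    (670,  0.25),  # phase 6, BM1385
    (350,  0.20),  # phase 7, Bitfury 28nm
    (150,  0.13),  # phase 8, BF8162C16
    (1250, 0.15),  # phase 9, A3212
]

total_phs = sum(ph for ph, _ in phases)            # 3250 PH/s
# 1 PH/s at 1 J/GH draws exactly 1 MW, so this sum is already in megawatts.
power_mw = sum(ph * jgh for ph, jgh in phases)     # ~774 MW
avg_jgh = power_mw / total_phs                     # ~0.238 J/GH
twh_per_year = power_mw * 8760 / 1e6               # ~6.78 TWh/yr

print(f"{total_phs} PH/s, {avg_jgh:.3f} J/GH, {power_mw:.0f} MW, {twh_per_year:.2f} TWh/yr")
```

The same identity gives the lower bound directly: 3250 PH/s × 0.10 J/GH = 325 MW.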

Now, what about a lower bound estimate? We start with a few observations about the four latest and most efficient ASICs:

  • Bitfury BF8162C16’s efficiency can be as low as 0.06 J/GH. But the clock and voltage configuration can be set to favor speed over energy efficiency. All known third party BF8162C16-based miner designs favor speed at 0.13 J/GH (1, 2). Bitfury’s own private data centers also favor speed with their immersion cooling technology (1, 2, 3). The company once advertised the BlockBox container achieved 0.13 J/GH (2 MW for 16 PH/s), presumably close to the efficiency achieved by their data centers. But we want to calculate a lower bound, so let’s assume the average BF8162C16 deployed in the wild operates at 0.10 J/GH.
  • KnCMiner Solar is exclusively deployed in their private data centers and achieves an efficiency of 0.07 J/GH.
  • Bitmain BM1387’s efficiency is 0.10 J/GH.
  • Canaan A3212’s efficiency is 0.15 J/GH.

As to market share, we know KnCMiner declared bankruptcy and was later acquired by GoGreenLight. They currently account for 0.3% of the global hash rate… a rounding error we can ignore.

Therefore the lower bound for the electricity consumption of the network at 3250 PH/s, assuming the best-case scenario of 100% of miners currently running one of the three latest and most efficient ASICs (at best 0.10 J/GH), is 325 MW or 2.85 TWh/year.

Can we do better than merely calculating lower and upper bounds? I think so, but with the exception of Canaan,1 mining hardware manufacturers tend to be secretive about their market share, so everything below is just an educated guess…

Virtually all of the 1750 PH/s added after June 2016 came from BF8162C16, BM1387, and A3212, with the latter having the smallest market share. So the average efficiency of this added hash rate is likely around 0.11-0.13 J/GH. This represents 190-230 MW.

I would further venture that out of the 1500 PH/s existing as of June 2016, perhaps half was upgraded to BF8162C16/BM1387/A3212, while the other half remains a mixture of BM1385 and A3218. This represents 750 PH/s at 0.11-0.13 J/GH, and 750 PH/s at 0.26-0.28 J/GH, or a total of 280-310 MW.

I believe an insignificant proportion of the hash rate (less than 5%?) comes from all other generations of ASICs. Bitfury BF864C55 and 28nm deployments were upgraded to BF8162C16. KnCMiner/GoGreenLight represents 0.3%. BM1384 is close to being unprofitable. RockerBox, A3222, Neptune have long been unprofitable.

Therefore my best educated guess for the electricity consumption of the network at 3250 PH/s adds up to 470-540 MW or 4.12-4.73 TWh/year.
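The arithmetic behind this best guess, using the same PH/s × J/GH = MW identity (all ranges are the assumptions stated above, not measurements):

```python
# Post-June-2016 additions: 1750 PH/s at an assumed 0.11-0.13 J/GH,
# which the post rounds to 190-230 MW.
print(round(1750 * 0.11, 1), round(1750 * 0.13, 1))   # 192.5 227.5

# Pre-June-2016 base of 1500 PH/s: half assumed upgraded to
# BF8162C16/BM1387/A3212, half still a BM1385/A3218 mixture,
# which the post rounds to 280-310 MW.
print(round(750 * 0.11 + 750 * 0.26, 1),
      round(750 * 0.13 + 750 * 0.28, 1))              # 277.5 307.5

# Summing the rounded ranges gives the best-guess estimate.
total_mw = (190 + 280, 230 + 310)                     # (470, 540)
total_twh = tuple(mw * 8760 / 1e6 for mw in total_mw)
print(total_mw, tuple(round(t, 2) for t in total_twh))
```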

Economics of mining

Given the apparently high energy efficiency, and hence the relatively small percentage of mining income needed to cover the electricity costs of operating an ASIC miner, it may seem that mining is an extremely profitable, risk-free venture, right?

Not necessarily. Though mining can be quite profitable, in reality profitability depends mostly on (1) luck, i.e. when BTC gains in value, and (2) timing, i.e. how early a given model of mining machine is put online compared to other competing miners deploying the same machines. I say this as founder of mining ASIC integrator TAV, as an investor who deployed over time $250k+ of GPUs, FPGAs, and ASICs, and as someone who once drove 2000+ miles to transport his GPU farm to East Wenatchee, Washington State in 2011 in order to exploit the nation’s cheapest electricity at $0.021/kWh—yes it was worth it!

To demonstrate real-world profitability of mining, I modeled the income and costs generated by every single machine model released in the last three and a half years in the following CSV files. The model assumes mined bitcoins are sold on a daily basis at the Coindesk BPI, and $0.05/kWh. Some machines have reached their end of life while others continue to mine profitably to this day. All data as of 11 March 2018:

Antminer S5 batch 1 ($418, 590 W, 1155 GH/s, released on 27 December 2014):

  • Lifetime mining revenues: $2018.27
  • Lifetime electricity costs: $779.51 (38.6% of revenues)
  • Lifetime profits: $1238.77⁴

Antminer S7 batch 1 ($1823, 1210 W, 4860 GH/s, released on 13 October 2015):

  • Lifetime mining revenues: $5080.41
  • Lifetime electricity costs: $1279.21 (25.2% of revenues)
  • Lifetime profits: $3801.19⁴

Antminer S9 batch 1 ($2100, 1375 W, 14.0 TH/s, released on 12 June 2016):

  • Lifetime mining revenues: $8589.11
  • Lifetime electricity costs: $1052.70 (12.3% of revenues)
  • Lifetime profits: $7536.41⁴

Canaan Avalon 4 ($?, 680 W, 1.0 TH/s, released on 12 September 2014):

  • Lifetime mining revenues: $2102.79
  • Lifetime electricity costs: $763.78 (36.3% of revenues)
  • Lifetime profits: $1339.01⁴

Canaan Avalon 6 ($1100, 1000 W, 3.5 TH/s, released on 21 November 2015):

  • Lifetime mining revenues: $3312.28
  • Lifetime electricity costs: $1010.40 (30.5% of revenues)
  • Lifetime profits: $2301.88⁴

Canaan Avalon 721 ($888, 900 W, 6.0 TH/s, released on 21 November 2016):

  • Lifetime mining revenues: $2852.62
  • Lifetime electricity costs: $514.08 (18.0% of revenues)
  • Lifetime profits: $2338.54⁴

Bitfury BF864C55 (comes in different configurations, model assumes a 0.50 J/GH, 5000 W, 10.0 TH/s machine in operation since 3 March 2014):

  • Lifetime mining revenues: $80833.65
  • Lifetime electricity costs: $8544.00 (10.6% of revenues)
  • Lifetime profits: $72289.65⁴

Bitfury 28nm (comes in different configurations, model assumes a 0.20 J/GH, 2000 W, 10.0 TH/s machine in operation since 28 February 2015):

  • Lifetime mining revenues: $16046.10
  • Lifetime electricity costs: $2659.20 (16.6% of revenues)
  • Lifetime profits: $13386.90⁴

Bitfury BF8162C16 (comes in different configurations, model assumes a 0.13 J/GH, 1300 W, 10.0 TH/s machine in operation since 12 October 2016):

  • Lifetime mining revenues: $5022.90
  • Lifetime electricity costs: $804.96 (16.0% of revenues)
  • Lifetime profits: $4217.94⁴

KnCMiner Solar (comes in different configurations, model assumes a 0.07 J/GH, 700 W, 10.0 TH/s machine in operation since 4 June 2015):

  • Lifetime mining revenues: $13559.85
  • Lifetime electricity costs: $850.08 (6.3% of revenues)
  • Lifetime profits: $12709.77⁴

Let’s study the first machine of this list, the Antminer S5 batch 1:

On 27 December 2014 (day 1) mining starts; electricity represents 15.3% of the first day’s mining revenues ($0.71 of $4.64), generating profits of $3.93.

On 27 January 2016 (day 397), after 13 months, electricity represents 37.1% of daily revenues ($0.71 of $1.91), generating daily profits of $1.20. Total profits stand at $869.77. Some miners may want to already consider replacing the S5 with a more efficient machine. For example the S7 generates daily profits 5× higher ($6.58) at only 2× the power consumption.

On 9 July 2016 (day 561), profits stand at $1012.37. However the halving occurs and drops the reward from 25 to 12.5 BTC per block. Electricity now represents 80% of daily revenues ($0.71 of $0.89), leaving meager daily profits of $0.18 (1/22nd of day 1’s profits.) It is practically futile to continue mining past this point. The S5 should be decommissioned or upgraded. For example the S9, two generations ahead, produces on the same day daily profits 50×(!) higher at only 2.3× the power consumption. An S5 decommissioned on this day would have spent 28.2% of its total revenues on electricity ($397.19 of $1409.56.)

On 8 October 2016 (day 652) for the first time the S5 encounters a day where it is unable to mine more than it costs in electricity; profits stand at $1021.42 (up $9 in 3 months, futile indeed!) Over the next few months some days it can make a tiny profit, some days it cannot.

In the second half of 2017 the S5 becomes unexpectedly profitable again thanks to the Bitcoin price increasing faster than the difficulty level.

By 11 March 2018 (day 1171) profits stand at $1238.77 (“lifetime profits.”) Electricity represents 84.1% of daily revenues and overall ate 38.6% of lifetime revenues.

Approximately 70% of lifetime profits were generated in the first 30% of the machine’s life ($869.77 generated in the first 397 days) and 80% of lifetime profits in the first 50% of the machine’s life ($1012.37 in the first 561 days.)

A miner who had invested $418 into purchasing an S5 would have, after 561 days, turned it into $1012.37, a 2.4× gain. Mining was quite profitable.5
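The per-day model behind these CSV files can be sketched as follows. The difficulty and price used here are illustrative round numbers for the S5’s launch day (roughly 40×10⁹ and $320), not the exact values in the CSVs:

```python
def daily_mining(ghs, watts, difficulty, block_reward_btc, usd_per_btc,
                 usd_per_kwh=0.05):
    """Expected daily revenue, electricity cost, and profit of one machine."""
    # Expected blocks found per day: hashes per day divided by the
    # expected hashes per block (difficulty * 2^32).
    blocks_per_day = ghs * 1e9 * 86400 / (2**32 * difficulty)
    revenue = blocks_per_day * block_reward_btc * usd_per_btc
    electricity = watts / 1000 * 24 * usd_per_kwh
    return revenue, electricity, revenue - electricity

# Antminer S5 on its first day (27 Dec 2014), illustrative inputs:
rev, cost, profit = daily_mining(1155, 590, 40e9, 25, 320)
print(f"revenue ${rev:.2f}, electricity ${cost:.2f}, profit ${profit:.2f}")
# → revenue $4.65, electricity $0.71, profit $3.94
```

Summing this over every day of a machine’s life, with the actual daily difficulty and Coindesk BPI price, yields the lifetime figures above.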

Profitability threshold assumption

The model presented in this post makes one assumption: it looks at the difference in hash rate between the beginning and end of a phase, and assumes it indicates how many machines were manufactured during that phase.

Hypothetically, if a machine is first put online, immediately decommissioned within the same phase (e.g. mining suddenly becomes unprofitable), and then put online again in a subsequent phase (e.g. mining becomes profitable again), the model would attribute the extra hash rate to the wrong phase.

However this hypothetical scenario is implausible given the model’s parameter of $0.05/kWh, as shown by the chart below. The efficiency of the best and worst hardware manufactured over time (“best J/GH” and “worst J/GH” data from the first chart) is compared to the profitability threshold2 below which a machine mines profitably:

Efficiency of machines compared to profitability threshold

The worst line never intersects the threshold. The least efficient machines remain profitable during their entire phase of production. So the model’s assumption is valid.


Conclusion

We can calculate the upper bound for the global electricity consumption of Bitcoin miners by assuming they deploy the least efficient hardware of their time and never upgrade it. The lower bound can be calculated by assuming everyone has upgraded to the most efficient hardware. The table below summarizes the model’s estimates as of 26 February 2017:

  Lower bound Best guess Upper bound
Power consumption (MW) 325 470-540 774
Energy consumption (TWh/yr) 2.85 4.12-4.73 6.78
Energy consumption (Mtoe/yr) 0.245 0.354-0.407 0.583
Energy consumption (quad Btu/yr) 0.010 0.014-0.016 0.023
Percentage of world’s energy consumption⁶ 0.00260% 0.00376-0.00432% 0.00619%
Percentage of world’s electricity consumption⁶ 0.0144% 0.0208-0.0239% 0.0342%
Electricity cost (million USD/yr)⁷ $142 $206-$237 $339
Energy efficiency (J/GH) 0.100 0.145-0.165 0.238
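The derived rows of this table follow from standard unit conversions (1 Mtoe = 11.63 TWh, 1 quad Btu ≈ 293.07 TWh) and the IEA world totals given in footnote 6. A quick check of the lower-bound column:

```python
TWH_PER_MTOE = 11.63            # 1 Mtoe = 11.63 TWh
TWH_PER_QUAD = 293.07           # 1 quad Btu = 1e15 Btu ≈ 293.07 TWh
WORLD_ENERGY_TWH = 109_613      # IEA 2014 world energy consumption
WORLD_ELECTRICITY_TWH = 19_841  # IEA 2014 world electricity consumption

twh = 2.85                      # lower-bound energy consumption, TWh/yr
print(f"{twh / TWH_PER_MTOE:.3f} Mtoe/yr")                      # 0.245
print(f"{twh / TWH_PER_QUAD:.3f} quad Btu/yr")                  # 0.010
print(f"{twh / WORLD_ENERGY_TWH * 100:.5f}% of world energy")
print(f"{twh / WORLD_ELECTRICITY_TWH * 100:.4f}% of world electricity")
print(f"${twh * 1e9 * 0.05 / 1e6:.0f}M/yr at $0.05/kWh")        # $142M
```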

Updated estimates for more recent dates can be found below.

As of 28 July 2017 (average hash rate of 6398 PH/s):

  Lower bound Best guess Upper bound
Power consumption (MW) 640 816-944 1248
Energy consumption (TWh/yr) 5.61 7.15-8.27 10.93
Energy consumption (Mtoe/yr) 0.482 0.615-0.711 0.940
Energy consumption (quad Btu/yr) 0.019 0.024-0.028 0.037
Percentage of world’s energy consumption⁶ 0.00511% 0.00652-0.00755% 0.00998%
Percentage of world’s electricity consumption⁶ 0.0283% 0.0360-0.0417% 0.0551%
Electricity cost (million USD/yr)⁷ $280 $357-$413 $547
Energy efficiency (J/GH) 0.100 0.128-0.148 0.195

As of 11 January 2018 (average hash rate of 16200 PH/s):

  Lower bound Best guess Upper bound
Power consumption (MW) 1620 2100 3136
Energy consumption (TWh/yr) 14.19 18.40 27.47
Energy consumption (Mtoe/yr) 1.220 1.582 2.362
Energy consumption (quad Btu/yr) 0.048 0.063 0.094
Percentage of world’s energy consumption⁶ 0.01295% 0.01678% 0.02506%
Percentage of world’s electricity consumption⁶ 0.0715% 0.0927% 0.1385%
Electricity cost (million USD/yr)⁷ $710 $920 $1374
Energy efficiency (J/GH) 0.100 0.130 0.194

Bitcoin’s level of power consumption can be presented in a way to make it look large⁸:

Or we can make Bitcoin’s consumption look small:

When considering the big picture I believe Bitcoin mining is not wasteful due to the various benefits we extract from it.

Lastly, modeling the costs and revenues of a miner over its entire life such as the Antminer S9 or S7 reveals that the hardware cost is greater than its lifetime electricity cost. Therefore a miner’s business plan should not look at the electricity costs alone when calculating expected profitability.


On 11 March 2017 I removed the assumption that sales of A3218 dwindled down to practically zero post-June 2016, because although sales volume did decrease I do not have precise metrics to justify it.9

On 13 March 2017 I made the calculation of the upper bound for the electricity consumption more accurate (was 861 MW, now 774 MW), thanks to A3218 production volume provided by Canaan.

On 16 March 2017 I added the section Economics of mining.

On 30 March 2017 I added the comparison to the electricity consumption of decorative Christmas lights.

On 16 May 2017 I reworked the section Economics of mining to add more miners such as S7, S9.

On 4 June 2017 I added all miners released in the last 2.5 years to the section Economics of mining.

On 28 July 2017 I produced updated estimates in the conclusion.

On 28 August 2017 I added the section Profitability threshold assumption.

On 12 January 2018 I updated my estimate and added a comparison to the world’s consumption of electricity.

On 11 March 2018 I updated the CSV files in the section Economics of mining.

References and commentary

The chart covers the period 15 December 2014 to 26 February 2017. Starting as early as December 2014 is sufficient for accurate modeling because only one ASIC released in phase 0 is still profitable: Bitfury BF864C55. All others are no longer profitable.

The daily hash rate data was obtained from Quandl; the curve was smoothed out by calculating each day as the average of this day and the 9 previous ones.

The cost of electricity is assumed to be $0.05/kWh which is half the worldwide average. It is logical to assume miners seek geographical locations with the cheapest electricity.

All energy efficiency values given in joule per gigahash are reported at the wall, taking into account the power supply’s efficiency.

Mining hardware manufacturers only sell one generation of miners at any given time. Usually it is a result of producing and selling small batches one by one, as Bitmain and Canaan have done. But it is also a result of aggressive competition: when a company launches a new ASIC significantly outperforming the efficiency of the competition, their sales come to a stop until a more efficient successor is available, as Canaan CEO N.G. Zhang recounted. Therefore my model actually errs toward overestimating electricity consumption by assuming that the previous ASIC generation is being sold/deployed at the same rate until the very day preceding the introduction of the next generation, which we know is not true in some cases.9


KnCMiner

  • Neptune launched in June 2014 and achieves 0.70 J/GH.
  • Solar launched in June 2015 and achieves 0.07 J/GH. The company declared bankruptcy in May 2016, however they certainly stopped deploying mining capacity months earlier. This chart assumes they stopped in January 2016. Later, KnCMiner was bought by GoGreenLight. So far they have not added new hash power, but merely reactivated the hardware they acquired.


Spondoolies

  • RockerBox was included in the SP20/SP30/SP31/SP35 Yukon product series; it launched in May 2014 and achieves 0.66 J/GH. The company failed to launch its successors (PickAxe, RockerBox II) and declared bankruptcy in May 2016; however, they certainly stopped selling products months earlier as they were far behind the competition in terms of energy efficiency. This chart assumes sales stopped in January 2016.10


Bitfury

  • BF864C55 launched in March 2014 and achieves 0.50 J/GH.
  • Their 40nm ASIC never entered full-scale production, hence its absence from the chart.
  • Their 28nm ASIC launched in February 2015 and achieves 0.20 J/GH.
  • BF8162C16 launched in October 2016¹¹ and achieves 0.06 to 0.13 J/GH. This wide efficiency range is due to the ASIC being operated in a variety of configurations—sometimes manufactured by third parties—from air cooling (1, 2) to immersion cooling (1, 2, 3) where voltages and clocks are pushed to their limits.




  1. In an email exchange with Canaan staff (VP of Engineering Xiangfu Liu, Jon Phillips, and Wolfgang Spraul) they gave me the following historic metrics:

    • A3222:
      • number of wafers made: ca. 950
      • number of chips per wafer: ca. 3330
      • performance per chip: 25 GH/s
      • total: 79 PH/s
    • A3218:
      • number of wafers made: ca. 1100
      • number of chips per wafer: ca. 3650
      • performance per chip: 47.5 GH/s
      • total: 191 PH/s


  2. The profitability threshold in joule per gigahash is calculated as such:
    1e9 (1 GH/s) / (2^32 × difficulty) × block_reward × usd_per_btc / usd_per_kwh × 3.6e6 (joules per kWh)
    As of 26 February 2017 (difficulty = 441e9, 1 BTC = 1180 USD, $0.05/kWh) the profitability threshold is 0.56 J/GH:
    1e9 / (2^32 × 441e9) × 12.5 × 1180 / 0.05 × 3.6e6 = 0.56 J/GH
    So 3 ASICs in the chart are no longer profitable: Neptune, RockerBox, and A3222.
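    As a sketch, the threshold calculation can be expressed as a function (same inputs as the 26 February 2017 example above):

```python
def profitability_threshold_jgh(difficulty, block_reward_btc, usd_per_btc,
                                usd_per_kwh=0.05):
    """Efficiency (J/GH) below which a machine mines profitably."""
    blocks_per_s = 1e9 / (2**32 * difficulty)  # expected blocks/s for 1 GH/s
    usd_per_s = blocks_per_s * block_reward_btc * usd_per_btc
    # Convert revenue per second into the energy budget it can buy:
    # USD/s divided by USD/kWh gives kWh/s; × 3.6e6 J/kWh gives J/s = W,
    # i.e. J/GH for our 1 GH/s reference machine.
    return usd_per_s / usd_per_kwh * 3.6e6

print(f"{profitability_threshold_jgh(441e9, 12.5, 1180):.2f} J/GH")  # 0.56
```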

  3. Most of the hardware deployed during phase 0—CPUs, GPUs, FPGAs, first-generation ASICs—has not been profitable for a long time, so we make the assumption these miners upgraded to the least efficient ASIC available at the end of phase 0 that is still profitable: BM1384. 

  4. Real-world profits are less than this figure because other costs are not taken into account: hardware, facility, maintenance, labor…

  5. However another investor who on 27 December 2014 bought $418 worth of bitcoins would have seen them grow to $863 by 9 July 2016, a 2.1× gain. An important reason why mining was profitable was simply that BTC gained value.

  6. According to IEA statistics for the year 2014 the world’s consumption of energy was 9 425 Mtoe, or 109 613 TWh (equivalent to 12.51 TW non-stop) and the world’s consumption of electricity was 1 706 Mtoe, or 19 841 TWh (equivalent to 2.26 TW non-stop).

  7. Electricity cost assumes $0.05/kWh.

  8. I like Leo Weese’s idea to make various comparisons to make it look either large or small. 

  9. In an email thread with Canaan staff, although they were unable to give metrics, they confirmed that sales of the A3218 slowed down after June 2016 when competitor Bitmain launched BM1387 with a 3× better efficiency.

  10. I reached out to Spondoolies CEO Guy Corem to get official confirmation of when their sales stopped, but have not received a reply so far. 

  11. In a 2015 post on HackerNews I incorrectly assumed that hardware based on this ASIC launched in December 2015, but large-scale deployment only started in October 2016, as shown by this tweet from Bitfury Executive Vice Chairman George Kikvadze.


oregonmines wrote: Nice chart , the 100% by least efficient method, is an interesting way to look at a chips contribution hash wise. 11 Mar 2017 18:10 UTC

hashingitcom wrote: I like it - I've not run the estimates on mining for a while (busy with other stuff), but I just found one from about 2 years ago where I'd estimated a best case of 160 MW, and a more likely 320 MW at that point in time.

Do your energy figures allow for just the ASIC characteristic or have you factored other inefficiencies (especially in PSUs, cooling, etc.)?
30 Mar 2017 09:46 UTC

mrb wrote: Yes my estimates do take into account inefficiencies (1) of power supplies and (2) of the cooling at the chassis level, because J/GH figures are measured "at the wall."

There is also the PUE to consider (overhead at the data center level) which is typically very low for mining farms, 1.05 or so. But the *lower and upper bounds* do not need to take into account the PUE. The lower bound, by nature, needs to assume the overhead is zero. And the upper bound is such a worst-case & unrealistic estimate (by assuming miners deploy the least efficient hardware) that it must surely already overestimate power by at least 5%.

For my *best guess* estimates, in theory I should add 5%. But the variance is pretty large at 470-540 MW and not meant to be 5% accurate, so it would not be useful to do so.
30 Mar 2017 21:13 UTC

CW wrote: Thanks for sharing your analysis. It helped me clear a lot of misunderstandings I had.

I still don't understand how to calculate the lifetime income of a miner "By 15 January 2016 84% of the lifetime income of the S5 had been generated". I reviewed the income-antminer-s5.csv but still I cannot understand how you make this calculation. Thanks.
21 Apr 2017 00:43 UTC

mrb wrote: CW: By 15 Jan 2016 it had generated $854 of income (last column) which is 84% of the lifetime income of $1021. Well, to be pedantic the lifetime income moved a little bit upward and downward past Jan 2016, depending on diff and BTC price, and stabilized at $1030... 23 Apr 2017 04:10 UTC

janhoy wrote: This article claims an electricity consumption of 110KWh per transaction. What is your take on that? 09 May 2017 13:16 UTC

janhoy wrote: Ok, I just found your other article at which I believe explains it :) 09 May 2017 13:43 UTC

mrb wrote: Yes. Although my full response to and criticism against Digiconomist is at 10 May 2017 22:18 UTC

SRSroccoReport wrote: Marc... excellent work on Bitcoin energy consumption. I run the site and I analyze how energy & the falling EROI - Energy Returned on investment will impact precious metals, mining, paper assets and the overall economy.

I see you have had a debate with Digiconomist on the energy consumption and cost to produce bitcoin.

I am trying to find out a basic cost of production for bitcoin and ethereum, as I believe this would at least provide a floor for their price.

Can you reply here or contact me at I would enjoy hearing what you would gauge as a current total cost to produce bitcoin and ethereum. I do realize their costs will continue to increase as time goes by, but it would be helpful in comparing cost of production to their market price... especially now that they are highly inflated above their cost of production.

best regards,

29 May 2017 05:02 UTC

mrb wrote: steve: Per the math in footnote 2, currently (difficulty = 596e9, 1 BTC = 2300 USD, assuming $0.05/kWh) the efficiency under which an ASIC is no longer profitable is 0.81 J/GH. An Antminer S9 operates at 0.10 J/GH, so electricity costs represent 12% of mining revenues (0.10/0.81), therefore 1 BTC costs 280 USD in electricity.

This doesn't count the cost of the hardware which has to be amortized over the lifetime of a miner. My models show that for an end-of-life miner like the S5, 28% of mining revenues need to cover hardware costs, so 1 BTC needs to recover 640 USD of hardware costs.

So the overall production cost of 1 BTC is 920 USD. But even this number doesn't account for other smaller expenses: data center, labor. Call it ~10%. This places the overall production cost of 1 BTC at around 1010 USD.
31 May 2017 20:42 UTC

Robert L wrote: Thank you for sharing! Really enjoyed reading your analysis. It would be interesting to consider the true energy cost by considering the additional upper bounds of ~30% inefficiency in energy production and transportation to user. I would imagine the global mean is even higher.

12 Jun 2017 02:14 UTC

Digiconomist wrote: The problem of estimating Bitcoin energy consumption is a lack of a central register with all active machines. The only certain number is the absolute minimum energy consumption (hash * most efficient miner), but that number doesn't get close to reality as newer machines only slowly push the old ones out.

If you're going to derive energy consumption from actual hash you're going to have a pretty big error on the tail. This is the part with most older machines, that relatively have the most impact on total energy use (eg. just 50 PH/s of old S2 miners has the same weight as 500 PH/s of S9 miners).

The author heavily relies on economic assumptions in determining the activity of these older machines, which adds a lot of uncertainty regarding this so-called "bound". IMO this hasn't been properly disclaimed in the article. Still I'm happy with it, since it also validates the need for an economic indicator given the reliance on profitability assumptions.
15 Jun 2017 17:24 UTC

mrb wrote: Digiconomist: my upper bound already estimates the amount of such older machines by calculating the profitability breakeven point (0.56 J/GH), as explained and disclosed. 15 Jun 2017 19:44 UTC

Digiconomist wrote: Yes, but it doesn't disclose uncertainty surrounding that number. Average cost per KWh are an estimate, not a given. With 2 cents per KWh that break-even point would more than double & have a big impact on the tail. Only the lower bound is an actual bound. 16 Jun 2017 07:51 UTC

Digiconomist wrote: The way it's presented makes it seem like the upper bound is of equal strength as the lower bound. While the lower bound only has some performance uncertainty surrounding it, the upper bound is a different story. If the network was running even just slightly lower at 4 cents per KWh you'd have to include the A3222, which immediately adds 10%(+) to your bound (0.81 TWh). It's not that solid. 16 Jun 2017 08:07 UTC

Digiconomist wrote: On top of the previous the number is also sensitive to timing (after all there's no guarantee to when machines are actually deployed - shipping and setting up take time too) and hashrate measurement errors. Timing alone can easily add 5% to the bound (just try shifting the months up by one). And worse: these errors can also stack up. Not taking the correct machines into account and getting the timing wrong will quickly result in a 20-30% error on the upper bound. 16 Jun 2017 19:20 UTC

mrb wrote: Yes the upper bound is influenced by the assumed cost of electricity, and there is some uncertainty about the cost. I disclose this assumption in multiple places.

But I do not believe a lower cost would have a significant impact on the tail. Machines produced pre-Dec 2014 (where my chart starts) were produced in relatively small quantities that even their aggregate power consumption is not that high.

Case in point: you are wrong about the A3222 potentially raising the upper bound by 10%, because only 79 PH/s of it were produced (see footnote.) If you do the math and assume 79 PH/s of it, then the upper bound only raises by 1.6%.

What about RockerBox and Neptune? Well again none of them were produced in large quantities: 0.3% of the hashrate is KnCMiner hardware, and Spondoolies bankrupted themselves due to low volume.

As to the timing of ASIC releases and hashrate measurements, the small inaccuracies should average out to zero (some data points slightly overestimating, others slightly underestimating.)

Finally, I want to reiterate that the upper bound is based on such an extreme assumption (everyone deploying the LEAST EFFICIENT machines) that it gives us an error margin large enough to account for the bound being potentially 5 or 10% off.

In the end, the real power consumption is going to be in between the lower and upper bound, far from each bound.
20 Jun 2017 04:56 UTC

Digiconomist wrote: Okay, so I quickly calculated the weight each phase has and put the result in the chart here:

As you can see early phases are responsible for just 14% of the hash, but 29% of the calculated energy consumption is due to these machines. Late stage miners obviously carry a lot more hash (38%), but have a lot less impact on the final figure (they contribute 24%).

I also created one showing what happens when you change phase 0 only. Here's the result of putting it closer to 1 J/GH:

Doing so would cause early phases to be responsible for 40% of the total energy consumption figure, and raise the overall number by 15%. This is what an error in just 290 PH/s at the start can do to your figure. Small number, massive energy weight.

Today the break-even J/GH for the network is 0.92 at 5 cents per KWh. We know rates can go as low as 2 cents. Where does that leave old Bitmain S2, S3 and S4 miners? They run at <= 1 J/GH. I'd say this really adds quite some uncertainty to the proposed method.

One more thing. I was wondering why the method wasn't repeated in reverse? If you can do it for the least efficient machines why not with the most efficient ones? This would make for a more interesting "lower bound". At least more comparable to the upper one.
20 Jun 2017 20:59 UTC

mrb wrote: If you are going to input bad numbers into my model, you are going to get bad results out :)

You cannot put phase 0 at 1 J/GH. As my chart shows, it is impossible to have 290 PH/s of ASICs averaging 1 J/GH. Why? Because as of Dec 2014 all ASICs being sold (Neptune, RockerBox, BF864C55, BM1384, A3222) were in the range 0.50-0.70 J/GH and the network was measuring 290 PH/s. That means an unknown but significant fraction of the hashrate was already 0.50-0.70 J/GH. Therefore the average of a distribution of machines between 0.50 (best) and 1 J/GH (break-even) is going to be a number below 1 J/GH.

In other words you would need a break-even point significantly above 1 J/GH (maybe 2 J/GH or so) for the average of phase 0 to be 1 J/GH.

Furthermore, you are wrong about 0.92 J/GH. As of 20 Jun 2017 (when you wrote your post) the break-even at $0.05/kWh was 0.79 J/GH (1 BTC = 2700 USD, diff = 712e9).

And as of 10 Jul 2017 (1 BTC = 2350 USD, diff = 709e9) the break-even is even lower at 0.69 J/GH.

If we bumped phase 0 from 0.51 to 0.69 J/GH it would increase my worst case power consumption from 774 to 826 MW, a 7% increase. And just as I said in my previous message, being 5 or 10% off is already accounted for.
11 Jul 2017 01:42 UTC
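The break-even figures in the comment above can be re-derived from the block reward, the difficulty, and the electricity price. A minimal sketch, under the assumptions used in the discussion (12.5 BTC block reward, fees ignored, expected hashes per block = difficulty × 2^32):

```python
# Break-even mining efficiency in J/GH: the efficiency at which the daily
# electricity cost of 1 GH/s equals its expected daily mining revenue.

def break_even_j_per_gh(btc_price_usd, difficulty, usd_per_kwh):
    hashes_per_day = 1e9 * 86400                     # 1 GH/s for one day
    btc_per_day = hashes_per_day * 12.5 / (difficulty * 2**32)
    revenue_per_day = btc_per_day * btc_price_usd    # USD earned per day
    # A rig at E J/GH running 1 GH/s draws E watts, i.e. E*24/1000 kWh/day,
    # costing E * 24/1000 * usd_per_kwh dollars per day. Solve for E:
    return revenue_per_day / (24 / 1000 * usd_per_kwh)

print(round(break_even_j_per_gh(2700, 712e9, 0.05), 2))  # 0.79 (20 Jun 2017)
print(round(break_even_j_per_gh(2350, 709e9, 0.05), 2))  # 0.69 (10 Jul 2017)
```

At $0.05/kWh this reproduces both thresholds quoted in the comment.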

Digiconomist wrote: I put in 1 J/GH to show how sensitive your model is to estimation errors in the small tail (in terms of hashrate). Market leader Bitmain was only putting out less efficient machines at this point, but they haven't really been considered for this article even though there's a potential big impact in there. The new machines are never the problem when estimating energy consumption, it's the older ones that present the biggest challenge and are responsible for the majority consumption (e.g. phase 9 accounts for ~40% of the hash but only ~20% of the energy consumption).

A second problem I see in the model is the lack of consistency. Economics are used to cut the tail, but if applied consistently I would expect some spillover effects (so the hashrate increase during a period can never be completely attributed to the machines available during that period). This is as simple as: the price goes down at time t, I turn off my machines; the price increases fourfold at time t+1, I turn my machines on again. This should be analyzed throughout, as it could cause a bias toward newer, more efficient machines.

I think you're missing fees in the break-even calculation, by the way. There can be a small difference depending on which difficulty adjustment is applied, but the gap is too big, so fees must have been left out (around 18% of the total reward on Jun 20). I'm a lot closer for Jul 10 (0.77 J/GH), but fees have more than halved by that point.
14 Jul 2017 11:56 UTC

mrb wrote: As per my previous msg, I calculated that if we significantly bumped phase 0 by 35% (0.51 to 0.69 J/GH) it would increase the worst case power consumption by 7%. So, no, a 35% input variation causing a 7% output variation doesn't show a particularly high sensitivity in the tail estimation.

The last machine Bitmain was shipping before Dec 2014 (AntMiner S4) was 0.69 J/GH. Therefore the snapshot presented in this research as of 26 Feb 2017 (break-even of 0.56 J/GH) was correct to ignore the S4 as it has worse efficiency. There is no potential big impact that the model is missing.

As to miners shutting down at t and resuming at t+1, yes it may have a small effect on the worst case numbers. If I have time I will correlate the evolution of the break-even with phases and see where its impact might be the greatest. But I don't expect the effect to be very large. After all, later phases saw the introduction of machines more powerful than earlier phases, so only a minority of the hashrate added at a given phase might come from older machines.

As to fees, I didn't take them into account because they don't need to be. As of 26 Feb 2017 they represented 9% of the mining income (170 BTC daily), so the break-even including fees was 0.61, not 0.56 J/GH. This doesn't change my phase 0 assumption: the least efficient yet profitable miner is still the BM1384.
16 Jul 2017 01:10 UTC

Digiconomist wrote: So we've exchanged some emails in which I explained why I think this method is based on cherry-picking, and instead of providing any follow-up you decided to quote-mine my emails to support false statements about my energy index.

Rather ironic, don't you think? For anyone interested, the email dump can be found here:
22 Aug 2017 07:42 UTC

Digiconomist wrote: Meanwhile, Bitmain has delivered some evidence that a technical approach typically underestimates energy consumption. Even their own farm consumes 70% more than the theoretical optimum (even though they run only the most efficient machines), simply because this number excludes relevant factors such as machine reliability, climate, and cooling costs (which can have a major impact on a large-scale operation). Details here: 26 Oct 2017 08:40 UTC

mrb wrote: You said my model was "cherry picking" because it ignored (1) people with free/stolen electricity and (2) people who mine "some kind of fork" at a loss. My answers:

(1) There has NEVER been any evidence that a significant fraction of the hashrate came from free/stolen electricity. If free/stolen electricity is your explanation for why BECI's estimates were twice as high as mine as of, say, 2017-04-12 (see our emails), then you would be implying that more than half of the hashrate runs on free/stolen electricity, which would be an absolutely ludicrous claim.

(2) If people mined some fork, then they are not mining Bitcoin and they are irrelevant to Bitcoin's energy consumption.

If I may bring up again other points you commented on:

(3) I looked into the potential case of miners turning off miners at phase t and resuming at phase t+1: see the newly added "Profitability threshold assumption" section. As I expected, it is a non-problem and does not influence the model.

(4) There is no particular sensitivity in the tail estimation (see my message from 16 Jul 2017 01:10 UTC, to which you never replied).

So, of the 4 points you have brought up so far, all have been refuted. Meanwhile, practically all of my criticisms of BECI remain ignored... See my comments (including those about the points in your email dump).

About the Bitmain Ordos mine: your calculations are invalid. The Quartz article never says "1,000 miners were equal to 10 petahashes per second in processing power". But even if you found a source making this claim, these numbers are obviously grossly rounded. In reality the overhead of the Ordos mine is either 11% or 33% depending on which journalist's numbers we trust:
31 Oct 2017 22:01 UTC

Mega wrote: Bloomberg recently wrote:
"Even as bitcoin approaches $8,000, the price required for mining to be marginally profitable may reach a jaw-dropping $300,000 to $1.5 million by 2022, according to Christopher Chapman, an analyst at Citigroup Inc. He based his estimate on current growth rates for mining and the electricity consumed by computers doing the work. At that pace, the power consumption implied by bitcoin’s growth may eventually match what Japan uses."

Do you foresee this kind of trajectory? Also, what's your advice for issuing an ICO to finance a new BTC mining operation?
16 Nov 2017 18:56 UTC

mrb wrote: I couldn't find the original source of Chapman's analysis, but his core premise is flawed anyway.

He assumes the energy consumption will double ($300k) or triple ($1.5M) every year regardless of the Bitcoin price. That's not how things work. The energy consumption follows the Bitcoin price, not the other way around.

Also, his numbers imply miners will add between ~30 and ~240 gigawatts in the year 2022 alone, which is physically impossible. The entire world's electricity production increases by only about 70 gigawatts per year.

Chapman is blindly extrapolating current growth, without considering the real-world impossibilities of unbounded exponential growth.

My personal estimate: assuming Bitcoin stays around $10k for the next one or two years (which probably won't be the case), the energy consumption of miners would be around 3-5 gigawatts.
16 Nov 2017 22:21 UTC

concernedCitizen wrote: Hi... isn't mining the necessary evil required for the whole system to work in a distributed manner? It feels like more and more people are mining, but those who actually use it are pretty much the same.
Is the system actually becoming more secure with so much energy being used?
I mean, doesn't the system have a target of 10 minutes per block? And as more people join, isn't the difficulty adjusted every 2016 blocks to match that target?
Maybe mining shouldn't be so profitable...
It seems we are all fighting each other and we are all greedy bastards.
What would be the necessary amount of energy required to run this system?
What are your thoughts on this?
05 Dec 2017 00:02 UTC

Parijat wrote: It's unclear to me whether you account for the fact that several different clusters could be trying to mine the next block: only one of them will succeed, while the rest simply throw away the work they just did. Do those wasted hash computations figure in your power consumption calculations? 07 Dec 2017 02:50 UTC

babug wrote: Why do your calculations assume 1 J = 1 W·h? In reality 1 J = 0.0002778 W·h. 07 Dec 2017 07:35 UTC

mrb wrote: concernedCitizen: mining is required to secure the system. The global mining hashrate, currently above 10 exahash/s, means it would cost over $1 billion to execute a majority attack capable of reverting Bitcoin transactions. It's desirable to have this security margin.

Parijat: this is implicitly accounted for by the rate at which blocks are discovered. Multiple miners working on the same block means the statistical average amount of time it will take to solve it is reduced.

babug: no, my calculations assume 1 joule = 1 watt-second.
19 Dec 2017 19:59 UTC
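The units exchange above is easy to settle: both statements are consistent, since a joule is a watt-second, and there are 3600 seconds in an hour. A trivial check:

```python
# 1 J = 1 W*s, and 1 h = 3600 s, hence 1 J = 1/3600 Wh ~= 0.0002778 Wh.
watt_hours_per_joule = 1 / 3600
print(round(watt_hours_per_joule, 7))  # 0.0002778
```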

cosmiclimber wrote: Fantastic work. I'd be curious to see something similar for the total electricity use for mining and transacting all crypto, including alts. Any guess at how much additional energy they are requiring as well? 22 Dec 2017 15:03 UTC

Bombtrack wrote: This is exactly the giant problem most cryptos face. There is just no keeping up with that... unless...
Burstcoin mines on hard disks, making it 400x less power-hungry than BTC. Even if you don't care about all the other features the coin has, it is still a very interesting coin simply due to its extremely low power consumption!
Have a look:
29 Dec 2017 10:23 UTC

badgerd wrote: In the example of the Antminer S7 and S9, you expensed 100% of the cost of the equipment before the life of the equipment ran out. This would explain why the hardware cost for the S5 was lower on a percentage basis. Doesn't the price going higher extend the usable life of the hardware? 13 Jan 2018 11:14 UTC

badgerd wrote: I'm taking a different approach. At 12.5 bitcoin per 10 minutes, that is 75 per hour, 1,800 per day, and 657,000 per year. The total "prize" is 657,000 times the price of bitcoin.
657,000 × $14,000 ≈ $9.2 billion per year. If you were the only one mining, electricity consumption would be very low and profit would be high. But Economics 101 says that as people catch on to the easy money, more competition comes in. So the total spent on electricity depends on how many people are spending money on electricity trying to win the prize. It's like any business: if people see someone making money, competition comes in, and the percentage spent on electricity rises until it is no longer economical to produce the product. As the price goes up, the percentage spent on electricity goes up; when the price comes down, the percentage comes down, to a point. If electricity costs $0.10 per kWh and electricity were only 30% of the total ($2,759,400,000), then profit before hardware would be $6,438,600,000 and 27,594,000,000 kWh would be used, at $14,000 per bitcoin. I think this seems low. I also think bitcoin going from $4,000 to $14,000 would extend the life of some of the old hardware? Right or wrong? I'm no engineer.
13 Jan 2018 11:31 UTC
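The arithmetic in the comment above checks out. A sketch, where every input is the commenter's own assumption ($14,000/BTC, 30% of revenue spent on electricity, $0.10/kWh):

```python
# Reproducing badgerd's back-of-the-envelope yearly figures.
btc_per_year = 12.5 * 6 * 24 * 365              # 657,000 BTC mined per year
prize_usd = btc_per_year * 14_000               # total yearly "prize"
electricity_usd = 0.30 * prize_usd              # hypothetical 30% on power
profit_before_hw = prize_usd - electricity_usd  # before hardware costs
kwh_per_year = electricity_usd / 0.10           # kWh bought at $0.10/kWh

print(f"{prize_usd:,.0f}")     # 9,198,000,000
print(f"{kwh_per_year:,.0f}")  # 27,594,000,000
```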

Digiconomist wrote: Morgan Stanley actually talked with Bitmain's suppliers, and looked at the number of chips delivered to figure out how much the company can put out. They estimate it costs $3,000 to $7,000 to produce a Bitcoin.

This article puts it at about $1,000 per Bitcoin based on a weak guess (ignoring basic stuff like seasonality and cooling requirements; the heat generated by thousands of machines needs to go somewhere).
15 Jan 2018 16:09 UTC

mrb wrote: Digiconomist/Alex: you spread false information, again.

Morgan Stanley includes non-electrical costs, so you are comparing apples to oranges. Look at their electrical-only cost per BTC: it's $1352.74 at $0.06/kWh (see exhibit 4). Adjusted to $0.05/kWh, to be comparable to my model and yours (both assume this price per kWh), this implies a cost of $1127 per BTC.

My model estimates as of 11 Jan 2018 between $710M and $1374M in electricity cost per year. See the summary. If you divide it by the number of BTC mined yearly (365×24×6×12.5), this implies between $1081 and $2091 per BTC.

Morgan Stanley's estimate of $1127 perfectly fits in my bounds of [$1081 .. $2091].

On the other hand, your Broken Energy Consumption Index estimates as of 21 Jan 2018 $2,171,817,216 in yearly electricity costs. This implies $3306 per BTC, which is significantly higher than both Morgan Stanley's and my estimates.
22 Jan 2018 02:20 UTC
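The per-BTC comparison in the comment above can be reproduced directly. A sketch using only the figures quoted there (the model's $710M-$1374M yearly bounds, Morgan Stanley's $1352.74/BTC at $0.06/kWh, and BECI's $2,171,817,216/year):

```python
# Implied electricity cost per BTC mined, at 12.5 BTC per 10-minute block.
btc_mined_per_year = 365 * 24 * 6 * 12.5      # 657,000 BTC

low  = 710e6  / btc_mined_per_year            # model's lower bound
high = 1374e6 / btc_mined_per_year            # model's upper bound
print(round(low), round(high))                # 1081 2091

# Morgan Stanley's electrical-only cost, rescaled from $0.06 to the
# $0.05/kWh that both models assume:
ms = 1352.74 * (0.05 / 0.06)
print(round(ms))                              # 1127

beci = 2_171_817_216 / btc_mined_per_year     # BECI's implied cost per BTC
print(round(beci))                            # 3306
```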

mrb wrote: badgerd: yes, Bitcoin gaining value extends the usable life of the hardware.

However it is not true that as the Bitcoin price goes up the percentage of mining revenues spent on electricity necessarily goes up. As a direct counter-example you can look at the period from March through December 2017: the S9 went from spending 20% to only 5% of its daily mining revenues on electricity, because the price has gone up so quickly that miners have been unable to deploy hashrate to match the pace of the price increase. In other words, mining has become more profitable.

That said, regardless of the Bitcoin price, I do expect that the long-term trend (5-10 years) will be that electricity represents a bigger and bigger percentage. It's represented by the profitability threshold curve in the section "Profitability threshold assumption": the blue curve goes down over time, but there are certainly periods of time where it goes up.
22 Jan 2018 22:07 UTC

Digiconomist wrote: LOL, I'm spreading false information, ironic..

In any case, I've covered the full Morgan Stanley report here:
25 Jan 2018 21:19 UTC

mrb wrote: Digiconomist/Alex: You avoided replying to the points in my message, again. This is a repeating pattern. I debunk your arguments. You don't reply but make unrelated arguments on another point. I debunk them. You don't reply, etc.

As to the Morgan Stanley stuff, you conveniently avoid mentioning their chart (literally page 1 exhibit 1) which agrees with my estimates. Read

And you literally fabricated your energy forecast. Read

So, yes you are spreading false information.
07 Feb 2018 20:40 UTC

DD wrote: If crypto mining projects were able to obtain electricity at no cost, how do you anticipate that this would affect the future growth of the crypto mining industry and continuity of use of proof-of-work coins?

This is more than a hypothetical question.

04 Mar 2018 03:48 UTC

Digiconomist wrote: There's no point in replying when you just take some parts that you like, make up some story around them, and then accuse me of spreading false information. That's why I had to dump this. So I'm not really up for that again; I'm just here to point people to an innovative, simple and superior approach to getting production numbers. 05 Mar 2018 13:52 UTC

mrb wrote: Let it be clear: I do not "make up stories." You are certainly spreading false information, as evidenced right here and in the numerous criticisms which you have largely ignored, such as criticism #2 (e.g. real-world data proving your "60%" parameter wrong).

And for the 2nd time: I already replied to that email dump, see my comments dated 31 Oct 2017 21:59 UTC and 07 Jan 2018 03:17 UTC. But you have largely ignored those too. A repeating pattern.
05 Mar 2018 20:48 UTC

mrb wrote: DD: the amount of legitimate paid-for electricity spent on mining is such that I doubt users stealing electricity could make any sort of significant impact on the market. 06 Mar 2018 21:25 UTC

Peter wrote: Can you tell me whether these calculations are correct?
The calculation is based on the Antminer S9 (13 TH/s, 1375 W).
Today, 03/24/2018: 26,000,000 TH/s.
(26,000,000 / 13) × 1375 = 2,750,000,000 W
i.e. about 24 TWh per year.
Is the Antminer S9 still profitable?
24 Mar 2018 21:13 UTC
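The extrapolation in the comment above treats the whole network as if it ran entirely on Antminer S9s. A quick check of the arithmetic (the 26,000,000 TH/s network figure is the commenter's):

```python
# Scale the network hashrate to S9 units (13 TH/s at 1375 W per unit).
network_ths = 26_000_000
s9_ths, s9_watts = 13, 1375

power_w = network_ths / s9_ths * s9_watts  # 2,750,000,000 W = 2.75 GW
twh_per_year = power_w * 8760 / 1e12       # watts * hours/year -> TWh/year
print(round(twh_per_year, 1))              # 24.1
```

So the commenter's 2.75 GW is right, and the "24 TW/h" should read roughly 24 TWh per year.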

Bitcoin Mexico wrote: Very interesting analysis! Thanks for sharing this information! We also published more data related and translated to Spanish. 28 Sep 2018 23:52 UTC

Digiconomist wrote: Cambridge launched the Cambridge Bitcoin Electricity Consumption Index based on this methodology, and the results barely deviate from the estimates provided by the Bitcoin Energy Consumption Index. So much for superiority.
01 Aug 2019 20:03 UTC

mrb wrote: Digiconomist: Incorrect, again. The Cambridge Bitcoin Electricity Consumption Index has historically differed significantly from BECI. In year 2018 for example, BECI was on average 51% higher than CBECI:

You characterize a 51% difference as "barely deviate"? Add this to your long list of false claims you've made over the years...
19 May 2020 02:10 UTC

Hello wrote: This information is really out of date. 13 May 2021 16:12 UTC