Computing scales

Scale | FLOPS (Floating-point Operations Per Second) | Bits
Deciscale (10⁻¹) 5×10⁻¹ Average human mental calculation 7 bits – the size of a character in ASCII
Unit (10⁰) 1 Average human addition calculation 8 bits = 1 byte or “character”
Equivalent to 1 “word” on 8-bit computers
Decascale (10¹) 5×10¹ Upper end of human perception 16 bits = 2 bytes, which is needed for a Unicode character that represents all languages and many symbols
Equivalent to 1 “word” on 16-bit computers
Hectoscale (10²) 2.2×10² Upper end of human throughput
4×10² – Commodore 64
128 bits (16 bytes) is the size of addresses in IPv6, the successor protocol to IPv4
512 bits (64 bytes) – the size of some vector registers in 64-bit CPUs
Kiloscale (10³) 92×10³ Intel 4004 (the first commercial CPU, in 1971)
2×10³ – Commodore 64 doing additions
Typical RAM capacity of early-1980s home computers ranged from 1K to 64K
The Commodore 64 had 64K of RAM
Megascale (10⁶) 1×10⁶ The Atlas supercomputer was the first to achieve 1 megaFLOPS in 1962
1×10⁶ (Motorola 68000 CPU in 1979)
1.6×10⁷ Cray-1 supercomputer in 1976
640K was the conventional-memory limit of the original IBM PC
1.44 megabytes was the capacity of a 3.5-inch floppy disk
Early hard disk drives held 10 megabytes of magnetic storage
Gigascale (10⁹) 1.41×10⁹ The Cray-2 supercomputer was the first to achieve gigaFLOPS in 1985
147.6×10⁹ (Intel Core i7-980X Extreme CPU in 2010)
170×10⁹ Numerical Wind Tunnel by Fujitsu
650 megabytes was the capacity of a compact disc (CD)
6.4×10⁹ bits – capacity of the human genome
4 gigabytes – maximum addressable RAM for a 32-bit CPU (2³² bytes), with operating systems typically leaving 2 gigabytes usable per process
4.7 gigabytes – capacity of a DVD
25 gigabytes – capacity of a Blu-ray
Terascale (10¹²) 1×10¹² Intel’s ASCI Red was the first computer to achieve one teraFLOPS in 1997
11.3×10¹² (GeForce GTX 1080 Ti GPU in 2017)
35×10¹² (NVIDIA RTX 3090 GPU in 2020)
86×10¹² (NVIDIA RTX 4090 GPU in 2022)
596×10¹² (IBM BlueGene/L supercomputer in 2007)
1 terabyte was a typical hard disk in 2010
1.25 terabytes – capacity of a human being’s functional memory, according to Kurzweil in The Singularity Is Near
Petascale (10¹⁵) 1.026×10¹⁵ (IBM Roadrunner supercomputer in 2008) – first to achieve petaFLOPS
8 petaFLOPS – NVIDIA H100 NVL Tensor Core GPU, built for trillion-parameter language models; 256 of them can be connected to achieve 2 exaFLOPS
17.59×10¹⁵ Cray Titan supercomputer in 2012
20×10¹⁵ Roughly the hardware equivalent of the human brain, according to Kurzweil in The Age of Spiritual Machines: When Computers Exceed Human Intelligence
36.8×10¹⁵ Estimated computational power required to simulate a human brain in real time.
10 petabytes – the Library of Congress’s collection
20 petabytes – a full human brain map
40 petabytes – the Internet Archive in 2018 with over 400 billion pages and all media
8×10¹⁷ – the storage capacity of the Star Trek character Data
Exascale (10¹⁸) 1×10¹⁸ (Fugaku supercomputer in 2020) – the first to achieve exaFLOPS at peak performance
1.1×10¹⁸ (Frontier supercomputer in 2022) – the first true exascale machine
2.43×10¹⁸ Folding@home distributed computing system during the COVID-19 pandemic response
The million-plus-core SpiNNaker neuromorphic machine of the Human Brain Project
300 petabytes – storage at Facebook’s data warehouse in 2014, whereas the total printed material in the world is around 200 petabytes. By 2020 the warehouse held about 1,500 petabytes, growing by roughly 4 petabytes of new storage per day.
15 exabytes – storage at Google’s data warehouse in 2013
Zettascale (10²¹) The first zettascale system is predicted for 2035
1×10²¹ (global weather prediction feasible around 2030, in Type 0.8)
Zettascale computers will be able to accurately forecast global weather two weeks into the future and accurately model the whole human brain.
0.36 zettabytes – amount of information that can be stored in 1 gram of DNA
18 zettabytes – total global data in 2018, predicted to reach 175 zettabytes by 2025
64 zettabytes – total global data in 2020
120 zettabytes – data generated in 2023
Yottascale (10²⁴) Expected scale of quantum computers in Type I
10²⁵ to 10²⁶ – simulating metabolome (small-molecule) and proteome (protein) processing in cells or organisms
7.3×10²⁴ bits – information content of 1 mole of liquid water at 25 °C, equivalent to an average of 12.14 bits per molecule
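As an arithmetic check (a sketch using the standard value of Avogadro's number, which is not stated in the original), multiplying the quoted 12.14 bits per molecule by the number of molecules in a mole reproduces the figure above:

```python
AVOGADRO = 6.02214076e23  # molecules per mole (standard value, assumed here)

bits_per_molecule = 12.14              # average figure quoted above
bits_per_mole = AVOGADRO * bits_per_molecule

print(f"{bits_per_mole:.2e} bits")     # ≈ 7.31e+24 bits, matching 7.3×10²⁴
```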
Beyond yottascale (>10²⁴) 10³⁶ onwards – Matrioshka brain
10⁴³ – simulating the stochastic behavior of a single molecule
2.6×10⁴² bits – required to recreate the natural matter of the human brain down to the quantum level on a computer. Teleportation of life forms would require orders of magnitude more than this.
1.5×10⁷⁷ bits – information content of a one-solar-mass (1 M☉) black hole.
Upper limits

Bremermann’s Limit is the maximum rate of computation allowed under the known laws of physics. This is about 2×10⁴⁷ bits per second per gram of computing substrate.

A transcomputational problem requires the processing of more than 10⁹³ bits of information. The number 10⁹³, called Bremermann’s limit, is the total number of bits processed by a hypothetical computer the size of the Earth within a time period equal to the estimated age of the Earth.
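This back-of-the-envelope figure can be sketched numerically. The Earth mass and age below are standard rounded values (assumptions of this sketch, not stated in the original), and the per-gram rate is derived from E = mc² divided by Planck's constant:

```python
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s

# Maximum bit rate of 1 gram of matter: m*c^2 / h
rate_per_gram = 1e-3 * C**2 / H          # ≈ 1.4e47 bits/s, close to the
                                         # quoted ~2e47 bits/s per gram

EARTH_MASS_G = 5.972e27                  # Earth's mass in grams (assumed)
EARTH_AGE_S = 4.54e9 * 3.156e7           # ~4.54 billion years in seconds

total_bits = rate_per_gram * EARTH_MASS_G * EARTH_AGE_S
print(f"{total_bits:.1e} bits")          # within an order of magnitude of 1e93
```

The result lands around 10⁹², within an order of magnitude of the commonly quoted 10⁹³; the exact figure depends on the rounding used for the per-gram rate.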
The Bekenstein Bound is the upper bound of the amount of information I inside a Matrioshka Brain with a given energy E. This is also the upper limit on the thermodynamic entropy of the Brain, or the maximum amount of information required to perfectly describe it at the quantum level.

    I ≤ (2π E R) / (ħ c ln 2)
where I is the information in bits, E is the energy, R is the radius of the region, ħ is the reduced Planck constant, and c is the speed of light.

It can also be written as:
    I ≤ k M R
where M is the mass in the region and k is a constant with the value ≈2.57686×10⁴³ bits/(kg·m).
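Both forms of the bound can be checked numerically. Plugging in a one-solar-mass Schwarzschild black hole (the solar mass and physical constants below are standard values; this is an illustrative sketch, not part of the original source) reproduces the 1.5×10⁷⁷-bit figure quoted earlier:

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s
G = 6.67430e-11         # gravitational constant, m^3/(kg*s^2)

# k = 2*pi*c / (hbar * ln 2), so that I <= k*M*R
k = 2 * math.pi * C / (HBAR * math.log(2))
print(f"k = {k:.4e} bits/(kg*m)")        # ≈ 2.577e+43, matching the constant

M_SUN = 1.989e30                         # one solar mass, kg (assumed)
R_S = 2 * G * M_SUN / C**2               # Schwarzschild radius, ≈ 2.95 km

bits = k * M_SUN * R_S
print(f"I <= {bits:.2e} bits")           # ≈ 1.5e+77 bits
```

The agreement with the 1.5×10⁷⁷-bit entry above also confirms that the constant k was derived using the reduced Planck constant ħ.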