What technology would be needed to virtually simulate... everything?


7


In my story, I have a machine that people connect to with electrodes. It lets them enter a small virtual-reality space in which everything is simulated, from gravity and the electromagnetic force down to the atoms and quarks of every molecule. This means people can fight exactly as they would in real life, but with no possibility of death or injury.

What technology would be required to create a "perfect" simulation of the world in a small space?

Specifically, what kind of computing power would such a device need to pull this off?

For this question, the small space can be defined as a room roughly 10 m by 20 m in floor area, with a ceiling height of about 3 m.

The simulation must be able to:

  1. Contain objects such as apples, weapons, and so on.
  2. Simulate everything in such a way that anything that happens in the virtual reality would also happen in real life if it were repeated or replicated.
  3. Reproduce the users' superpowers, even if the users themselves have no idea how those powers work.

Assume that in my story scientists have discovered the theory of everything, and that it turns out to be very close to everything we already know about the real world. (In other words, treat this problem as if you were simulating everything that is known or theorized about our current universe.)

For this story element, I'm considering a character with a unique superpower: by touching a machine or device, they can boost its performance by a factor of hundreds or even thousands. Would that help explain the enormous computing power required?

Edit: The fact that no limits are imposed on this machine/technology is exactly the problem. The machine in my story exists to help people practice and understand their superpowers. Since superpowers vary enormously, the simulation needs to be as precise as possible to guarantee that a power works the same way inside the machine as it does outside. Otherwise, someone who can turn hydrogen atoms into gold atoms inside the machine might, in real life, actually turn hydrogen atoms into boron atoms.

11

For a perfectly realistic simulation like the one you imagined, you would have to go down to the subatomic level. That way, when someone punches a hole through a table, the splinters are perfectly realistic, possibly to the point of being indistinguishable from real life.

I will take a step back, though, and instead of calculating all the way down to quarks, I'll stay in the proton/neutron world to make things simpler.

You'd have to store the following properties in memory for each particle:

  • position (relative to an origin, say an arbitrary corner of the room)
  • momentum
  • mass
  • charge
  • spin

Let's represent each property as a 1024 bit array. Why? For precision - if you are going to play with post-singularity technology, might as well make measurements more precise than today (1024 bits is arbitrary, and makes measurements $2^{960}$ times more precise than 2019's 64-bit IT industry standard).

We also need to assign a memory address for each particle in the room. Let's deal with protons, neutrons and electrons. Why? Because then we can roughly approximate the number of particles we will have to deal with just by adding up the mass of everything in the room.

10 m $\times$ 20 m $\times$ 3 m equals 600 m³ of air. The density of air is 1.225 kg/m³ at standard conditions, so the air in the room has a mass of 735 kg. Let's add two fully-clothed adults, floor, walls, a ceiling, a wooden table, some columns, some fruits, swords, for a total of an arbitrary 1,265 kg. I have taken this number out of a body cavity, but it is quite believable. If we just compress the air a bit when adding all the other stuff, we have a nice, round number of 2 metric tons of stuff.

Protons and neutrons have different masses, but they are close enough to one another. Let's assume a neutron for each proton, so we can use an average mass of $1.673776 \times 10^{-27}$ kg per particle. Let's ignore electron mass, since it is negligible at this level of approximation.

So we have like...

$$ \frac{2 \times 10^3}{1.67 \times 10^{-27}} \approx 1.2 \times 10^{30} \;\text{atomic nucleus particles} $$

If the room is electrically neutral, we'll have an electron for each proton (protons being half of the particles above), so the actual total would be more like $1.8 \times 10^{30}$ particles.
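As a quick Python sanity check of that arithmetic, using the same 2-ton figure and averaged nucleon mass assumed above:

```python
# Back-of-the-envelope particle count for ~2 metric tons of matter,
# counting protons, neutrons, and electrons as in the text above.
AVG_NUCLEON_MASS = 1.67e-27  # kg, roughly the average of proton and neutron masses

total_mass_kg = 2e3                          # 2 metric tons of room contents
nucleons = total_mass_kg / AVG_NUCLEON_MASS  # protons + neutrons, ~1.2e30
electrons = nucleons / 2                     # one electron per proton
particles = nucleons + electrons             # ~1.8e30

print(f"nucleons:  {nucleons:.1e}")   # ~1.2e+30
print(f"particles: {particles:.1e}")  # ~1.8e+30
```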

If for some arcane reason the future people are still using bytes, we need a 128-bit address architecture (i.e., each address takes 128 bits, or 16 bytes). 64-bit addressing tops out at about $1.8 \times 10^{19}$ addresses, while our $1.8 \times 10^{30}$ particles need about 101 bits of address space.

Each particle will have its own address, which takes 16 bytes in the address table. Each particle's properties occupy another 640 bytes (1024 bits per property = 128 bytes per property, and each particle has five properties). So: 656 bytes per particle.

$656 \times 1.8 \times 10^{30} \approx 1.18 \times 10^{33} \mathrm{\; bytes}$.

We're talking about needing approximately 1,180 geopbytes (a geopbyte being $10^{30}$ bytes).
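A few lines of Python reproduce the whole memory estimate; the 656-byte layout and the geopbyte conversion are straight from the numbers above:

```python
# Memory estimate: five 1024-bit properties plus one 128-bit address per particle.
BITS_PER_PROPERTY = 1024
PROPERTIES = 5
ADDRESS_BITS = 128
PARTICLES = 1.8e30

bytes_per_particle = (BITS_PER_PROPERTY * PROPERTIES + ADDRESS_BITS) // 8  # 656
total_bytes = bytes_per_particle * PARTICLES

GEOPBYTE = 1e30  # bytes
print(f"{bytes_per_particle} bytes per particle")
print(f"{total_bytes:.2e} bytes, i.e. {total_bytes / GEOPBYTE:,.0f} geopbytes")
```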

For comparison, Cisco, the largest router and switch maker in the world, claims that in 2016 the internet finally reached a combined annual traffic of one zettabyte ($10^{21}$ bytes). A single geopbyte is greater than that by nine orders of magnitude. In other words, your simulation would require roughly a trillion times more bytes than the amount that circulated on the internet in all of 2016.

When we reach the point where we can do that, quantum processors might already be as obsolete as the abacus is today, so I don't even want to imagine the amount of processing power involved. Let's just say the processors will run on Clarkean magic or handwavium.


5

It depends on what the eventual Unified Theory of Everything actually proves

Right now there are two theories about how small small can go. One theory is that space is quantized at the Planck scale: space is made up of discrete, band-limited units, and nothing exists at a smaller scale than this. The second theory is that while nothing smaller than the Planck scale can exist, space itself is not a discrete grid; things can exist at larger scales that do not divide evenly down to the Planck scale.

According to the first theory, in order to account for everything in every situation, both known and unknown, you can use the Planck scale, at which the universe is theoretically indivisible for any practical purpose. Your room is 1.25e+36 by 6.25e+35 by 1.875e+35 Planck lengths, giving you a grid of about 1.465e+107 data points. Assuming your computer is made up of molecules, you would need a computer made up of about $10^{40}$ universes just to create disk space that can hold all that data; so, 100% true fidelity is way, way, way beyond doable.
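A quick back-of-the-envelope check of that grid count in Python, using the same rounded Planck length of $1.6 \times 10^{-35}$ m:

```python
# Number of Planck-length grid points in a 10 m x 20 m x 3 m room.
PLANCK_LENGTH = 1.6e-35  # meters (rounded, as in the text above)

dimensions_m = (10.0, 20.0, 3.0)
grid_points = 1.0
for d in dimensions_m:
    grid_points *= d / PLANCK_LENGTH

print(f"{grid_points:.3e} data points")  # ~1.465e+107
```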

According to the second theory, space is analogue no matter how small you go; so, there is actually no way for a computer to achieve absolute 100% resolution of it, regardless of how many universes' worth of matter you throw at the problem. This makes the issue go from intractable to truly impossible.

The good news is that the law of averages is your friend

By this I mean that when you take a sample grouping of similar things, you can make increasingly accurate predictions the larger the sample becomes. In other words, you don't need 100% fidelity to know exactly what will happen 99.99999% of the time at the macroscopic scale.

One thing computers are good at is statistically simulating complexity and compressing data. As long as all of your powers rely on the known properties of subatomic physics, you can simplify any pattern. For example: if your power relies on a certain exotic subatomic particle made up of a particular arrangement of techni-quarks, higgs-bosons, and handwavium that binds with a certain percentage of standard matter to form "unobtainium", which in turn binds with a certain protein in your sweat glands, then you can simulate all of those known properties as they apply to each layer of interactions above, abstracting behaviors into accurate but probabilistic output at much larger scales. That is: first you index what the subatomic particles are doing, then the molecules, then the cells, then the tissues, etc. In the end, your program could simulate and abstract your entire body into macroscopic blobs of tissue represented by mathematical seeds that, when pushed through the right functions, are predictive of all the countless repeating structures inside of it working in tandem.
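Here is a toy Python sketch of that layered idea; the particle "energies" and their distribution are purely illustrative stand-ins, not real physics:

```python
# Toy sketch of statistical coarse-graining: simulate fine detail once,
# compress it into a statistical "seed", then let higher layers sample
# from the seed instead of re-simulating every particle.
import random
import statistics

def sample_particle_energy() -> float:
    """Stand-in for an expensive subatomic-level simulation step."""
    return random.gauss(1.0, 0.05)

# Layer 1: one expensive fine-grained pass over a small cell of matter.
cell = [sample_particle_energy() for _ in range(100_000)]

# Layer 2: compress the cell into a seed (mean and spread suffice here).
seed = (statistics.mean(cell), statistics.stdev(cell))

def sample_cell_energy(seed, n_particles):
    """Layer 3: macroscopic behavior sampled from the seed, not the particles."""
    mean, stdev = seed
    # By the central limit theorem, the aggregate is approximately normal.
    return random.gauss(mean * n_particles, stdev * n_particles ** 0.5)

print(f"seed: mean={seed[0]:.4f}, stdev={seed[1]:.4f}")
print(f"macroscopic cell energy: {sample_cell_energy(seed, 10**23):.4e}")
```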

Scanning your body in the detail you need to simulate it in this manner could take a very long time as the scanner samples, aggregates, tests, and resamples data, but once your anatomy is "compressed" into the system, you could run this simulation on relatively plausible computers. Because humans live in the macroscopic, having a margin of error is generally fine. If you fire a 1244.7°C ball of fire in the simulator and in real life it's 1244.6°C, because you failed to account for a few particles of unobtainium that were unevenly distributed, who cares? No human will notice the difference, making the training you get in the simulator perfectly applicable to the real-world scenarios you are training for.

This is also true of questions like whether your power will form gold or boron. The important thing here is not mapping out the exact molecular activity, but understanding the rules by which your powers work, and having a scanning method precise enough to capture the states in which either one or the other would be true.


1

When people think of simulation, they often go directly to brute-force solutions that put 100% of the strain on the given computer and its parts, which basically amounts to "try as best you can to fool a fully conscious person into believing something fake is real."

A more elegant solution (or a more fucked-up one, depending on your point of view) would be to jam a piece of technology deep into the brain, primarily into the older parts like the thalamus; technology that everybody has and simply accepts, the way we accept that everybody has a rectangular computer in their pocket these days.

This may well be the method by which the Matrix story practically works.

Why: Although we definitely do not understand consciousness or what causes it, one theory suggests that a good part of it is a sort of 'compiler' or 'zipping algorithm' the brain uses to intertwine all of the asynchronous and occasionally contradictory information it deals with into a "story" it tells itself. (Consciousness then arises from 'the self' getting caught up in this compiler as a variable and source of stimuli, like a snake eating its own tail.) This strange function of crushing information into a "story" means that inconvenient things (the blind spot in your eye, the differing processing times of vision compared to hearing, the fact that you really, really want a cigarette versus the conflicting knowledge that it is increasing your chances of death, and all the rest of it) are simply smoothed over.

How: If you could essentially toss digitally created stimuli into the brain before this 'zipping algorithm' takes place, it's conceivable that the brain would happily fold the false reality into the overall 'consciousness hallucination' that is our everyday waking life. Logical discrepancies and problems with the fidelity of the simulated input would just melt or be crushed away in the zipping/compiling process, and the conscious creature likely wouldn't notice anything at all, except that afterwards they might have some pretty messed-up dreams while the brain is essentially de-compiling and working through the problems it had tossed into the unconscious during the daily bullshit-a-thon that is consciousness.

--

Edit: It's conceivable you could include the aforementioned 'messed-up dreams' (if you choose to include that imaginary phenomenon at all) as a kind of escalating risk or drawback of using the simulator too much or too often. It could hypothetically cause people to suffer psychosis, have mental breakdowns, become paranoid, develop split personas, or simply come to believe they are still in the simulator, or that dark forces are trying to insert 'small lies' into their daily life through the embedded device. (A physical kill-switch that would guarantee this is not the case, if you wanted to include one, could be a receiver or switch in the back of the head that must be on in order to receive any kind of hallucination.)


1

An intuitive approach

It is trivial to show from combinatorics that, classically, to represent the state of one atom, you must have more than one atom (in fact, many more than one atom).

The proof: Let's assume your computer's memory works by storing bits in the spin state of an atom (the type of atom doesn't really matter). Atomic spins are quantized, and can either be "up" or "down," which is convenient for building a binary system, where we can say 0 is "up" and 1 is "down."

If you assume you require 32 bits to represent all the possible states of a single hydrogen atom, it will take 32 memory atoms just to represent this single hydrogen atom.

In reality, for all the possible properties an atom can have, you will need a lot more than 32 bits. The number of bits you actually need is dependent on the number of properties your atom can have (spin, momentum, charge, etc...), as well as the resolution you need (the dynamic range).

This implies that, classically, in order to represent a simulation of a room down to the atomic level, you need a room much, much larger (in mass) than the room you intend to simulate to contain all of your computing hardware.

Even if we look at it from a quantum point of view (i.e. a post-singularity society that has created working general-purpose quantum computers), you can trivially show that there is at best a 1:1 correspondence.

If your simulated hydrogen atom has 500 possible quantum states (a gross underestimate to be sure), and you can somehow store this in the quantum state of a real hydrogen atom, then you need at least one real atom for every simulated atom you want to compute, simply to store the information about its state.
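To make the counting concrete, here is the classical bookkeeping in Python, using the answer's own 500-state underestimate:

```python
# Classical storage for a particle with N distinguishable states needs
# ceil(log2(N)) bits; with one spin-atom per bit, that is the number of
# memory atoms needed per simulated atom.
import math

states = 500  # deliberately low estimate for one hydrogen atom
bits = math.ceil(math.log2(states))
print(f"{bits} memory atoms per simulated atom, classically")  # 9
```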

But what do we need then?

All of these intuitive concepts about what it takes to simulate the world with "exact precision" led to a more exact formulation known as the Bekenstein bound.

Essentially, what the Bekenstein bound says is that the amount of information you can place in a given amount of space is limited. Conversely, it also shows that the amount of information you need to represent any physical system at the quantum level is directly related to its mass and volume. It also shows that there is an upper limit on the amount of processing you can do with any given amount of mass and space.

The Bekenstein bound was almost immediately found to have a direct relation to black holes: namely, if you attempt to exceed the Bekenstein bound (i.e. put more information in a given volume than it can support), your computer will collapse into a black hole!

Thinking back to our intuitive thought experiment before, this makes sense. To simulate your world you need bits. If you need atoms to represent bits, and you place too many atoms together in a given volume, of course they'd exceed the Schwarzschild radius and collapse into a black hole.

So what does the Bekenstein bound say about your simulated room?

Well, as we've established, the amount of information you need to simulate a given space at its quantum level is directly related to the size of that space and the amount of mass in it.

Your question doesn't state anything about the mass in the room, but it gives us the room's dimensions, which fit inside a sphere of radius around 12 m (as an aside, rather than a cube, a sphere is the best configuration for your room, as it minimizes surface area for a given volume).

So, by the Bekenstein bound, your room requires approximately

$$ 3.08 \times 10^{44} \;\mathrm{bits/kg} $$

to exactly represent at the quantum level, and this is just the memory to store the states of all the atoms. It says nothing about computing the states of those atoms.
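For reference, the bound is $I \le 2\pi R E / (\hbar c \ln 2)$ with $E = mc^2$; plugging in the 12 m radius reproduces the figure above:

```python
# Bekenstein bound: I <= 2*pi*R*E / (hbar*c*ln 2) bits, with E = m*c^2,
# so the bits-per-kilogram capacity of a sphere of radius R is
# 2*pi*R*c / (hbar*ln 2).
import math

HBAR = 1.0545718e-34  # J*s
C = 2.99792458e8      # m/s

def bekenstein_bits_per_kg(radius_m: float) -> float:
    return 2 * math.pi * radius_m * C / (HBAR * math.log(2))

# A 10 m x 20 m x 3 m room fits inside a sphere of radius ~12 m.
print(f"{bekenstein_bits_per_kg(12.0):.2e} bits/kg")  # ~3.09e+44
```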


-1

If you need everything down to the quarks, you'll need to simulate everything down to the Planck length, $1.6 \times 10^{-35}$ m, and the Planck time, $5.4 \times 10^{-44}$ seconds. That's the same length scale as strings.

Assuming your theory of everything works out to be something like string theory, you will need to compute the second derivative and the running first and second integrals (force/acceleration, energy/velocity, and position) of all of those elements in 10 spatial dimensions (not counting time).

Per simulated second, then, for a 10 m × 20 m × 3 m room, your computer will need $2.8 \times 10^{151}$ calculations.
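That figure appears to follow from one calculation per Planck-length grid point per Planck time, multiplied by the 10 spatial dimensions; a short Python sketch under that assumption:

```python
# One calculation per Planck-length grid point per Planck time,
# multiplied by 10 spatial dimensions.
PLANCK_LENGTH = 1.6e-35  # m
PLANCK_TIME = 5.4e-44    # s
DIMENSIONS = 10

room_volume_m3 = 10 * 20 * 3
grid_points = room_volume_m3 / PLANCK_LENGTH ** 3  # ~1.5e107
updates_per_second = 1 / PLANCK_TIME               # ~1.9e43

ops = grid_points * updates_per_second * DIMENSIONS
print(f"{ops:.1e} calculations per simulated second")  # ~2.7e+151
```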