Entropy and the arrow of time

When you whisk an egg and then leave it alone, why don't the white and the yolk separate again on their own? If you light a fire, why does it keep burning? And why do we perceive that time never changes its direction? All of these questions are answered by a single concept in physics: entropy.

Simply put, entropy is a measure of the microscopic disorder of a system, and the second law of thermodynamics says that in a closed system it can only increase or stay constant. And all of this follows from simple statistics.


Let's put this in more concrete terms with a simple example: we flip three identical coins and record the outcome. First of all, what does this have to do with anything? In thermodynamics, we are interested in the energy of the system under study. That energy is made up of all of the individual energy states of the particles that make up the system. The two sides of a coin represent the two possible states of a two-level system, which makes coin flipping a popular example.

Now, if heads is one and tails is zero, then all of the possible outcomes form a list like this one \begin{bmatrix} 1 & 1 & 1 \\ 1 & 1 & 0 \\ 1 & 0 & 1 \\ 0 & 1 & 1 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \\  \end{bmatrix} In thermodynamic terms, each row is a microstate, and all of the rows that have the same number of ones and zeros together form a macrostate. The above list thus has eight microstates (obviously) and four macrostates, which can be grouped like this \begin{align} M_1=\begin{bmatrix} 1 & 1 & 1 \end{bmatrix},~ M_2=\begin{bmatrix} 1 & 1 & 0 \\ 1 & 0 & 1 \\ 0 & 1 & 1 \\ \end{bmatrix},~ M_3=\begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix},~ M_4=\begin{bmatrix} 0 & 0 & 0 \end{bmatrix} \end{align} The macrostates arise simply from the fact that we can't tell the individual coins apart, so the order of the ones and zeros doesn't matter. These macrostates are characterized by their multiplicity, \(\Omega\), which is the number of different configurations (microstates) that fit in a single macrostate. For example, the multiplicity of state \( M_2 \) is simply \(3\), because it contains three different microstates, whereas \(M_1\) has a multiplicity of \(1\).
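If you want to check the bookkeeping yourself, here is a minimal Python sketch (the variable names are mine) that enumerates all eight microstates of three coins and groups them into macrostates by the number of heads:

```python
from itertools import product
from collections import Counter

# Enumerate all 2^3 = 8 microstates of three two-state "coins"
# (1 = heads, 0 = tails).
microstates = list(product([0, 1], repeat=3))

# Group microstates into macrostates by their number of heads:
# the coins are indistinguishable, so only the count matters.
multiplicity = Counter(sum(state) for state in microstates)

for heads, omega in sorted(multiplicity.items(), reverse=True):
    print(f"{heads} heads: multiplicity = {omega}")
```

Running this prints the multiplicities \(1, 3, 3, 1\), matching the grouping above.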

Now one might ask: what is the most likely outcome of this triple coin flip? We can find out by dividing each multiplicity by the total number of microstates, as in \begin{align} P_{1}=\frac{1}{8},  P_{2}=\frac{3}{8}, P_{3}=\frac{3}{8}, P_{4}=\frac{1}{8} \end{align} So it's clear that the most probable states are \(M_2\) and \(M_3\), because they have the highest multiplicity. We can easily extend this to larger systems by noting that for \(N\) two-state coins the multiplicity of the macrostate with \(q\) heads is the binomial coefficient \begin{equation} \Omega(N,q) = \frac{N!}{q!(N-q)!}, \end{equation} and, more generally, if each of the \(N\) particles can hold any number of energy units and the system stores \(q\) units in total, the multiplicity becomes \begin{equation} \Omega(N,q) = \frac{(q+N-1)!}{q!(N-1)!}. \end{equation} Now we don't need to make cumbersome lists that blow up really easily, and the probability of finding the system in a given macrostate is \begin{equation} P(q) = \frac{\Omega(N,q)}{\sum_i \Omega(N,q_i)}, \end{equation} where the summation is taken over all possible energy states \(q_i\). This is the heart of all thermodynamics: probability!
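As a quick sanity check, here is a small Python sketch (the function names are my own) that evaluates both multiplicity formulas with `math.comb` and reproduces the probabilities \(1/8, 3/8, 3/8, 1/8\) for the three coins:

```python
from math import comb

def coin_multiplicity(N, q):
    """Multiplicity of the macrostate with q heads out of N two-state coins."""
    return comb(N, q)

def solid_multiplicity(N, q):
    """Multiplicity of N particles sharing q energy units, where each particle
    can hold any number of units: (q + N - 1)! / (q! (N - 1)!)."""
    return comb(q + N - 1, q)

N = 3
total = sum(coin_multiplicity(N, q) for q in range(N + 1))  # 2^N = 8 microstates
for q in range(N + 1):
    print(f"P(q={q}) = {coin_multiplicity(N, q)}/{total}")

# The same idea scales to big systems without listing anything:
print(f"Omega(N=500, q=100) = {solid_multiplicity(500, 100):.3e}")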

Finally, we can start talking about entropy. It would be sufficient to think only in terms of these multiplicities, but they become very large very fast, so it's more convenient to represent the same information with the much more manageable entropy, \(S\), defined as \begin{equation} S = k_{\rm B} \ln(\Omega). \end{equation} It needs to be noted that we have made two important assumptions here. First of all, we have assumed that the individual particles (our coins) cannot be distinguished from each other, so only the composition of a microstate matters, not its order. And secondly, all of the microstates have the same probability. That's right, they are all equally likely. That means that the microstates in which the egg yolk and white are separated are just as likely as those in which they are mixed. But once the egg is whisked, the separation will never happen!
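To get a feel for the numbers, here is a short, illustrative Python sketch of the definition \(S = k_{\rm B}\ln(\Omega)\); for large systems it uses the log-gamma function so the huge factorials never have to be evaluated directly (the function names are mine):

```python
from math import lgamma, log

K_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy_from_multiplicity(omega):
    """Boltzmann entropy S = k_B ln(Omega) for a directly countable multiplicity."""
    return K_B * log(omega)

def coin_entropy(N, q):
    """S = k_B ln( N! / (q! (N-q)!) ) computed via log-gamma, so very large N
    does not overflow."""
    ln_omega = lgamma(N + 1) - lgamma(q + 1) - lgamma(N - q + 1)
    return K_B * ln_omega

print(entropy_from_multiplicity(3))       # macrostate M_2 of the three coins
print(coin_entropy(10**23, 5 * 10**22))   # a mole-sized two-state system, ~1 J/K
```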

This is because although the individual microstates are all equally likely, there are far more configurations in which the yolk and white stay mixed than ones in which they separate. That is, the mixed macrostates have the highest multiplicity, and thus the highest entropy and probability. In realistic systems, the states with the lowest multiplicities are so unlikely that it's safe to say they will never occur. This is illustrated in the figure below for two small interacting systems.

Here we can see that the whole system has a whopping \(\approx 10^{116}\) different microstates, even though it is made up of only \(500\) particles and \(100\) energy units! A real system would be several orders of magnitude larger still, and the peak in the multiplicity graph would be correspondingly narrower. For an infinitely large system the peak would be infinitesimally thin, and there would be only one energy state where the system could settle into equilibrium.

In system \(A\), the state \(q_A=60\) is the most likely one when \(A\) and \(B\) are allowed to interact weakly, and \(A\) will spontaneously drift towards this state if it is supplied the energy to do so. If \(A\) starts from some low energy level and \(B\) from a higher one, the two will share energy until they reach equilibrium, that is, the highest-entropy state they can reach.
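The figure itself isn't reproduced here, but the behaviour it shows can be sketched numerically. The text only states that the combined system has \(500\) particles, \(100\) energy units and its peak at \(q_A = 60\), so the split \(N_A = 300\), \(N_B = 200\) in the Python sketch below is my assumption, chosen to be consistent with those numbers:

```python
from math import comb, log10

# Two weakly interacting systems sharing q_total energy units.
# N_A = 300 and N_B = 200 are assumed values: the post only gives the total of
# 500 particles and the peak location q_A = 60, not the split itself.
N_A, N_B, q_total = 300, 200, 100

def omega(N, q):
    """Multiplicity of N particles sharing q energy units."""
    return comb(q + N - 1, q)

# Multiplicity of the combined system for every possible way of splitting the energy.
combined = [omega(N_A, qA) * omega(N_B, q_total - qA) for qA in range(q_total + 1)]

total = sum(combined)  # all microstates of the whole system
most_likely_qA = max(range(q_total + 1), key=lambda qA: combined[qA])

print(f"total number of microstates ~ 10^{log10(total):.0f}")
print(f"most likely q_A = {most_likely_qA}, P = {combined[most_likely_qA] / total:.3f}")
```

With these assumed values the script reports roughly \(10^{116}\) microstates in total and the most likely macrostate at \(q_A = 60\), in line with the numbers quoted above.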

This drift towards the highest-entropy state is what we call the arrow of time. Since entropy can only increase, we perceive time as always going forward and never back. Although in closed systems the second law is absolute, we have found ways to reduce entropy locally by supplying external energy. But is this the same as turning back time (locally)? Well, in my opinion, not really.
