Entropy and the Singularity

Reverse Predictability

The foundation of this idea is the concept of entropy, as formulated by Ludwig Boltzmann in statistical mechanics:

Entropy is a measure of the number of microscopic configurations (Ω) that correspond to a thermodynamic system in a state specified by certain macroscopic variables.

Assuming that each of the microscopic configurations is equally probable, the entropy of the system is the natural logarithm of that number of configurations, multiplied by the Boltzmann constant (kB). That constant provides consistency with the original thermodynamic concept of entropy and gives entropy the dimensions of energy divided by temperature.
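Written out, this is Boltzmann's entropy formula, where S is the entropy, kB the Boltzmann constant, and Ω the number of equally probable microscopic configurations:

```latex
S = k_B \ln \Omega
```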

Considering this, my thoughts diverged to the more generally accepted rule that energy = change. The volatility of an object in its environment dictates its capacity to change and how it will change. Meanwhile entropy governs this process, or predicts it…

Entropy can be understood as a measure of molecular disorder within a macroscopic system. Each of the possible configurations the molecules could assume is equally probable, so the configuration the system actually takes is random with respect to that macroscopic environment.

So my idea of reverse predictability uses this certainty in order to deduce non-random configurations. This is not to say that they are likely, but that their existence is predictable because of their environment. There is something that is not random about the entropy of each event. If all the macroscopic factors are halved (a crude example), then the resulting change in entropy must be relevant in predicting how the macroscopic system/state behaves?

There are outcomes that were possible in the first environment that become impossible in the new environment, and the new environment creates possible outcomes that were not possible in the previous one.

Therefore we have certainty in the differences that exist between the two environments. By identifying the outcomes that cannot be achieved in both environments, we can deduce the outcomes that can occur in both environments – the shared entropy.

Shared entropy is the number of configurations that are equally likely to occur microscopically, even where macroscopic influences differ. Depending on the number of macroscopic influences that are involved in identifying shared entropy, a level of predictability can be set out based on the number of configurations that are possible for all macroscopic environments.
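As a sketch of this idea (my illustration, not part of the original argument), each macroscopic environment can be modelled as the set of microscopic configurations it permits, and the shared entropy as the intersection of those sets. The environment names and configuration labels below are hypothetical:

```python
# Toy model of shared entropy: each macroscopic environment is
# represented by the set of microscopic configurations it permits.
# All names and labels here are hypothetical placeholders.
environments = {
    "A": {"c1", "c2", "c3", "c4"},
    "B": {"c2", "c3", "c5"},
    "C": {"c2", "c3", "c4", "c6"},
}

# Shared entropy across all environments: the configurations that
# remain possible under every set of macroscopic influences at once.
shared = set.intersection(*environments.values())
print(shared)  # {'c2', 'c3'}
```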

Furthermore, if there are configurations that are shared between some environments but do not form part of the shared entropy of all (more than two) environments, then the macroscopic environments that permit those configurations have their own, smaller shared entropy. The diagram below helps to demonstrate this:

[Diagram: "shared-entropy", showing overlapping macroscopic environments and the configurations they share]

Hypothetically speaking, the configurations/outcomes that can occur in A, but not in B or in any of the other macroscopic environments, can by deduction only occur in A's specific macroscopic environment.
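Continuing the same toy model (again, my sketch with hypothetical labels), the outcomes exclusive to A fall out of a set difference: everything A permits, minus everything any other environment also permits.

```python
# Configurations possible only in environment A: subtract every
# configuration that at least one other environment also permits.
others = set.union(*(s for name, s in environments.items() if name != "A"))
unique_to_a = environments["A"] - others
print(unique_to_a)  # {'c1'}
```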

If every configuration has an equal chance of occurring, then it matters little that only a handful of the possible outcomes are exclusive to a particular environment…but if there is an environment where a lot of events occur simultaneously, then the resultant microscopic configuration (outcome) is more likely to fall within the shared entropy range.

We can monitor events and changes to macroscopic factors so as to alert us to the level of shared entropy of specific microscopic configurations in relation to their macroscopic influences. So if there is a large spectrum of macroscopic factors, and there is still shared entropy, then those shared configurations/possible outcomes are more likely to occur because those macroscopic factors are influencing the same subject matter.
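A sketch of what that monitoring could look like in the toy model above: layering on each additional environment (constraint) can only shrink the shared set or leave it unchanged, so whatever survives every constraint is, in relative terms, increasingly predictable.

```python
# As more macroscopic environments are layered on, the shared
# entropy can only shrink or stay the same size.
running = None
for name, configs in environments.items():
    running = configs if running is None else running & configs
    print(f"after {name}: {len(running)} shared configuration(s)")
# after A: 4 shared configuration(s)
# after B: 2 shared configuration(s)
# after C: 2 shared configuration(s)
```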

Conclusion

It is agreed that entropy represents the random configuration of molecules subjected to a determined set of macroscopic factors. Presumably it can also be agreed that varying sets of macroscopic factors working on the same set of molecules simultaneously impact the hypothetical perfection of entropy (complete randomness and equal likelihood). It is therefore contradictory to say that all microscopic configurations are equally likely to occur when there are also configurations that can occur in one macroscopic environment but not in another in isolation.

So the configurations that occur when different macroscopic factors are at play signify that, although those outcomes are random within each environment, they are the more likely outcomes given the combination of macroscopic factors involved. By this logic we can say that, hypothetically, the greater the number of different macroscopic environments at play, the more likely the shared entropy configurations/outcomes are to occur.
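To make the arithmetic concrete with a hypothetical example of my own: suppose environment A permits 10 equally likely configurations and environment B permits 10, of which 2 are shared. An outcome that must satisfy both sets of macroscopic constraints can only be one of those 2 shared configurations, so each shared configuration's chance rises from 1/10 under either environment alone to 1/2 under both combined. Each further overlapping environment can only narrow the intersection, concentrating probability on fewer shared outcomes.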

I think this process is identifying how unlikely it is for the macroscopic environment to arrange itself in such a way that the resultant microscopic outcome is predictable, rather than random or equally likely. But we do not need this unlikely macroscopic environment to occur; we just need to know where shared entropy exists. The most unlikely combination of environments that produces a single shared outcome is the starting point of reverse predictability: the singularity.


3 thoughts on “Entropy and the Singularity”

  1. That is a very interesting idea you’ve put there. Entropy has always been considered (and taught) as a random event. But, it isn’t random in the truest sense. Even that randomness is governed by the macroscopic influences.

    This reminds me of a class on environmental statistics I had this semester. My teacher stressed never to use the term “randomly collected data” when doing statistical analysis. Data can never be random because it is always influenced by the area we consider, the energy and patience we have, the time we have to work with, and invariably, our biases.

    Great post!

