Cosmologic philosophy: entropy, time and clocks (Introduction)

by David Turell @, Tuesday, August 31, 2021, 19:46 (1180 days ago) @ dhw

New approaches and theories:

https://www.quantamagazine.org/the-new-science-of-clocks-prompts-questions-about-the-na...

"Huber, Erker and their colleagues realized that a clock is anything that undergoes irreversible changes: changes in which energy spreads out among more particles or into a broader area. Energy tends to dissipate — and entropy, a measure of its dissipation, tends to increase — simply because there are far, far more ways for energy to be spread out than for it to be highly concentrated. This numerical asymmetry, and the curious fact that energy started out ultra-concentrated at the beginning of the universe, are why energy now moves toward increasingly dispersed arrangements, one cooling coffee cup at a time.

"Not only do energy’s strong spreading tendency and entropy’s resulting irreversible rise seem to account for time’s arrow, but according to Huber and company, it also accounts for clocks. “The irreversibility is really fundamental,” Huber said. “This shift in perspective is what we wanted to explore.”

***

"The more regular the ticks, the more accurate the clock. In their first paper, published in Physical Review X in 2017, Erker, Huber and co-authors showed that better timekeeping comes at a cost: The greater a clock’s accuracy, the more energy it dissipates and the more entropy it produces in the course of ticking.

“'A clock is a flow meter for entropy,” said Milburn.

***

"In precise terms, entropy is a measure of the number of possible arrangements that a system of particles can be in. These possibilities grow when energy is spread more evenly among more particles, which is why entropy rises as energy disperses. Moreover, in his 1948 paper that founded information theory, the American mathematician Claude Shannon showed that entropy also inversely tracks with information: The less information you have about, say, a data set, the higher its entropy, since there are more possible states the data can be in.

“'There’s this deep connection between entropy and information,” Huber said, and so any limit on a clock’s entropy production should naturally correspond to a limit of information — including, he said, “information about the time that has passed.”

***

"In short, it’s the irreversible rise of entropy that makes timekeeping possible, while both periodicity and complexity enhance clock performance."

Comment: Using the concept of entropy, time's arrow can exist only if time begins at an event of ultra-concentrated, low-entropy energy, i.e., a Big Bang. Time therefore exists only after a BB.

