I am reading two books at the same time: The Information by James Gleick and Cycles of Time by Roger Penrose.
By coincidence, both have Entropy at their core.
Roger Penrose develops Conformal Cyclic Cosmology (CCC), primarily from an analysis of the exceptionally low entropy at the time of the Big Bang, and his argument that the theory of "inflation" in the early universe does not sync up with Entropy calculations.
Gleick, on the other hand, studies the rise of information, and while the whole book is interesting, I found the sections on Claude Shannon and the development of Information Theory to be the most compelling. While I have used the outcome of Shannon's theory in some of my digital signal processing projects (years ago!), I did not know that his original insight was around Information Entropy. That's pretty cool.
Why am I reading two books at once? Well, the Penrose book is tough going! The arguments are hard to follow, the writing is dense, and the diagrams are confusing. I know where he is going (time exists before the Big Bang, and after the Big Crunch - rinse and repeat - and somehow Entropy increases all the way through this mess), but I am not following all of the connecting bits.
So, I am reading Gleick when I get stuck. Amazingly, the high-level descriptions of Entropy in The Information are giving me some insights into what Penrose is describing. Lucky coincidence :-)