
Entropy

I am reading two books at the same time: The Information by James Gleick and Cycles of Time by Roger Penrose.

By coincidence, both have Entropy at their core.

Roger Penrose develops Conformal Cyclic Cosmology (CCC) primarily from an analysis of the exceptionally low entropy at the time of the Big Bang, and from his argument that the theory of "inflation" in the early universe does not square with Entropy calculations.

Gleick, on the other hand, studies the rise of information, and while the whole book is interesting, I found the sections on Claude Shannon and the development of Information Theory to be the highlights.  While I have used the outcome of Shannon's theory in some of my digital signal processing projects (years ago!), I did not know that his original insight was around Information Entropy.  That's pretty cool.
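For the curious: Shannon's Information Entropy is simple enough to sketch in a few lines of Python. This is my own illustration, not something from either book — it just computes H = -Σ p·log2(p) for a discrete probability distribution:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over a distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: 1 bit per flip.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A heavily biased coin is mostly predictable, so each flip
# carries less information (about 0.47 bits).
print(shannon_entropy([0.9, 0.1]))
```

The neat part, and the reason the word "entropy" appears at all, is that this formula has the same shape as Boltzmann's entropy from thermodynamics — which is presumably why the two books keep colliding.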

Why am I reading two books at once?  Well, the Penrose book is tough going!  The arguments are hard to follow, the writing is dense, and the diagrams are confusing.  I know where he is going (time exists before the Big Bang, and after the Big Crunch - rinse and repeat - and somehow Entropy increases all the way through this mess), but I am not following all of the connecting bits.

So, I am reading Gleick when I get stuck.  Amazingly, the high-level descriptions of Entropy in The Information are giving me some insights into what Penrose is describing.  Lucky coincidence :-)

