Tuesday, November 20, 2012

Natural Selection through the eyes of Information Theory

I'm going to cover different sections of this fundamental paper piece by piece over the next month or so. Thanks to the Haldane's Sieve blog for posting it, and especially to the author, Steven A. Frank. For almost two years I've been stumbling around, knowing that entropy was the underlying connection between population genetics and communication theory but lacking the mathematical formalism to describe the connection. Interestingly, Frank is at UC Irvine, home of the Henry Samueli School of Engineering. Here's the introduction to the paper:

Natural selection. V. How to read the fundamental equations of evolutionary change in terms of information theory

Steven A. Frank
Journal of Evolutionary Biology
Article first published online: 16 NOV 2012
(link)

Introduction

   "I show that natural selection can be described by the same measure of information that provides the conceptual foundations of physics, statistics and communication. Briefly, the argument runs as follows. The classical models of selection express evolutionary rates in proportion to the variance in fitness. The variance in fitness is equivalent to a symmetric form of the Kullback-Leibler information that the population acquires about the environment through the changes in gene frequency caused by selection.

   "Kullback-Leibler information is closely related to Fisher information, likelihood, and Bayesian updating from statistics, as well as Shannon information and the measures of entropy that arise as the fundamental quantities of communication theory and physics. Thus, the common variances and covariances of evolutionary models are equivalent to the fundamental measures of information that arise in many different fields of study.

   "In Fisher's fundamental theorem of natural selection, the rate of increase in fitness caused by natural selection is equal to the genetic variance in fitness. Equivalently, the rate of increase in fitness is proportional to the amount of information that the population acquires about the environment [2].

   "In my view, information is a primary quantity with intuitive meaning in the study of selection, whereas the genetic variance just happens to be an algebraic equivalence for the measure of information. The history of evolutionary theory has it backwards, using statistical expressions of variances and covariances in place of the equivalent and more meaningful expressions of information. To read the fundamental equations of evolutionary change, one must learn to interpret the standard expressions of variances and covariances as expressions of information."
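The equivalence Frank describes can be sketched numerically. After one round of selection, frequencies update as q'_i = q_i w_i / w̄, and under weak selection the symmetric (Jeffreys) form of the Kullback-Leibler divergence between the pre- and post-selection populations approximately equals the variance in relative fitness. A minimal sketch, with illustrative frequencies and fitnesses of my own choosing (not numbers from the paper):

```python
import numpy as np

# Hypothetical allele frequencies and fitnesses (illustrative only)
q = np.array([0.5, 0.3, 0.2])      # frequencies before selection
w = np.array([1.00, 1.02, 0.99])   # absolute fitness of each type (weak selection)

w_bar = np.dot(q, w)               # mean fitness of the population
q_new = q * w / w_bar              # frequencies after one round of selection

def kl(p, r):
    """Kullback-Leibler divergence D(p || r) in nats."""
    return np.sum(p * np.log(p / r))

# Symmetric (Jeffreys) divergence between pre- and post-selection populations
J = kl(q_new, q) + kl(q, q_new)

# Variance in relative fitness, Var(w / w_bar)
var_rel_fitness = np.dot(q, (w / w_bar - 1.0) ** 2)

print(J, var_rel_fitness)          # nearly equal when selection is weak
```

The two printed numbers agree to second order in the selection differentials, which is the algebraic equivalence the excerpt refers to: the information the population "acquires" about its environment is, to this order, just the variance in fitness.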
