# Level 1. Machine Capacity: energetics of macromolecules.

The results described above indicate that we can successfully apply ideas from information theory to molecular interactions. This suggests that other concepts from information theory should also apply. An important concept is that of the channel capacity. A given communications channel, such as a radio signal, will operate over a certain range of frequencies W and the signal will dissipate some power P into the receiver. The receiver must distinguish the signal from thermal noise N it is also receiving. Shannon found that these factors alone define the highest rate of information that can pass across the channel:

    C = W log2(P/N + 1)   bits per second        (5)
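As a quick numerical illustration of equation (5), the sketch below computes the capacity of a hypothetical channel; the bandwidth and signal-to-noise values are made up for illustration only.

```python
import math

def channel_capacity(W, P, N):
    """Shannon channel capacity C = W * log2(1 + P/N), in bits per second,
    for bandwidth W (Hz), signal power P, and noise power N (same units)."""
    return W * math.log2(1.0 + P / N)

# Example: a 3 kHz channel with a signal-to-noise ratio P/N of 1000 (30 dB)
C = channel_capacity(3000.0, 1000.0, 1.0)
print(round(C))  # roughly 30,000 bits per second
```

Note that capacity grows only logarithmically with signal power: doubling P/N adds about W extra bits per second, rather than doubling C.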

He also proved a remarkable theorem about the channel capacity [10]. If the rate of communication R is greater than the capacity C, at most C bits per second will get through. On the other hand, if R ≤ C, the error rate may be made as small as desired, though not zero. The way to do this is to encode the signal to protect it from noise, so that when the signal is decoded, errors can be corrected. Coding is used in compact disks to correct up to 4000 simultaneous bit errors [11], which is why CD music is so clear.
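The error-correcting idea can be demonstrated with the simplest possible code, a three-fold repetition code (real CDs use far more sophisticated Reed-Solomon codes [11]; this toy example and its 5% error rate are assumptions for illustration). Each bit is sent three times and decoded by majority vote, so any single flip within a triple is corrected:

```python
import random

def encode(bits, n=3):
    # Repetition code: send each bit n times.
    return [b for b in bits for _ in range(n)]

def decode(received, n=3):
    # Majority vote over each group of n received bits.
    return [1 if sum(received[i:i + n]) > n // 2 else 0
            for i in range(0, len(received), n)]

def noisy(bits, p, rng):
    # Binary symmetric channel: flip each bit with probability p.
    return [b ^ (1 if rng.random() < p else 0) for b in bits]

msg = [random.Random(0).randint(0, 1) for _ in range(1000)]

# Uncoded transmission: errors occur at roughly the channel's flip rate.
raw_errors = sum(a != b for a, b in
                 zip(msg, noisy(msg, 0.05, random.Random(1))))

# Coded transmission: a bit is lost only if 2 of its 3 copies flip.
received = noisy(encode(msg), 0.05, random.Random(1))
coded_errors = sum(a != b for a, b in zip(msg, decode(received)))

print(raw_errors, coded_errors)  # coded_errors is far smaller
```

The price of the protection is rate: the code sends three channel bits per message bit, so R drops to one third of the raw rate. Shannon's theorem says this trade-off can be pushed until the error rate is as low as desired, provided R stays below C.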

The corresponding ideas can be constructed for molecular interactions in which a molecule (a "molecular machine") makes choices from among several possibilities [12,4]. The corresponding statement of the theorem is that so long as the molecular machine does not exceed the machine capacity, the molecular interactions can have as few errors as necessary for survival of the organism. Of course, statements in molecular biology cannot be about "desires", so the theorem is framed instead in terms of the evolution of the system. This mathematical result explains the observed precision of genetic control systems.

Tom Schneider
2000-10-13