The results described above indicate that we can successfully apply
ideas from information theory to molecular interactions. This suggests
that other concepts from information theory should also apply.
An important concept is that of the channel capacity. A given communications
channel, such as a radio link,
operates over a certain range of frequencies *W*, and the signal dissipates
some power *P* into the receiver. The receiver must distinguish the signal
from the thermal noise *N* it also receives. Shannon found that these factors
alone define the highest rate of information that can pass across the channel, the channel capacity *C*:

*C* = *W* log₂(1 + *P*/*N*) bits per second.

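As a concrete illustration (not from the original), Shannon's capacity *C* = *W* log₂(1 + *P*/*N*) can be evaluated numerically; the bandwidth and signal-to-noise values below are hypothetical:

```python
import math

def channel_capacity(bandwidth_hz, signal_power, noise_power):
    """Shannon channel capacity C = W * log2(1 + P/N), in bits per second."""
    return bandwidth_hz * math.log2(1.0 + signal_power / noise_power)

# Hypothetical example: a 3 kHz channel with a signal-to-noise ratio of 1000 (30 dB)
C = channel_capacity(3000.0, 1000.0, 1.0)
print(round(C))  # about 29902 bits per second
```

Note that the capacity grows only logarithmically with the signal-to-noise ratio, so raising the power *P* buys ever smaller gains in rate.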
He also proved a remarkable theorem about the channel capacity [10]: if the rate of communication is less than the channel capacity, then the error rate of the communication may be made as small as desired, whereas if the rate exceeds the capacity, such error reduction is impossible.

The corresponding ideas can be constructed for molecular interactions, in which a molecule (a ``molecular machine'') makes choices from among several possibilities [12,4]. The corresponding statement of the theorem is that so long as the molecular machine does not exceed its machine capacity, the molecular interactions can have as few errors as necessary for the survival of the organism. Of course, statements about ``desires'' are not allowed in molecular biology, so the theorem is instead framed in terms of the evolution of the system. This mathematical result explains the observed precision of genetic control systems.
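The trade-off behind the theorem can be sketched with a deliberately crude code: sending each bit several times over a noisy binary channel and taking a majority vote. This is only an illustration of redundancy suppressing errors; Shannon's theorem is far stronger, guaranteeing arbitrarily small error at any fixed rate below capacity, without the rate shrinking toward zero. The flip probability and trial counts below are arbitrary choices for the demonstration:

```python
import random

def repetition_error_rate(p_flip, n_repeats, trials=20000, seed=1):
    """Estimate the decoding error of an n-fold repetition code over a
    binary symmetric channel that flips each transmitted bit with
    probability p_flip; the receiver decodes by majority vote."""
    rng = random.Random(seed)
    errors = 0
    for _ in range(trials):
        flips = sum(rng.random() < p_flip for _ in range(n_repeats))
        if flips > n_repeats // 2:  # majority of copies corrupted
            errors += 1
    return errors / trials

# Error falls steeply as redundancy grows (rate = 1/n shrinks)
for n in (1, 3, 5, 9):
    print(n, repetition_error_rate(0.1, n))
```

The decoded error rate drops rapidly with *n* even though the raw channel stays equally noisy; the analogous claim for a molecular machine is that evolution can tune its error rate to whatever the organism's survival requires, provided the machine capacity is not exceeded.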