Bit

A bit is a binary digit, taking a value of either 0 or 1. For example, the number 10010111 is 8 bits long; on most modern computers, 8 bits make up one byte. Binary digits are a basic unit of information and communication in digital computing and digital information theory. Information theory also often uses the natural unit of information, called either a nat or a nit. Quantum computing uses qubits, quantum analogues of the bit that can exist in a superposition of 0 and 1 and yield either value with some probability when measured.
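
As a small illustrative sketch (not from the original article), the unsigned integer value of the 8-bit pattern 10010111 can be computed by weighting each bit with its power of two; the Python snippet below assumes a most-significant-bit-first reading of the string and uses only the standard library.

    # Interpret the bit string "10010111" as an unsigned binary number,
    # most significant bit first: 128 + 16 + 4 + 2 + 1 = 151.
    bits = "10010111"
    value = 0
    for b in bits:
        value = value * 2 + int(b)   # shift the running value left by one bit, then add the new bit
    print(value)                     # prints 151
    print(int(bits, 2))              # Python's built-in parser gives the same result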

The bit is also a unit of measurement, the information capacity of one binary digit. It has the symbol bit, or b (see discussion below). The unit is also known as the shannon, with symbol Sh.

Historical background

Claude E. Shannon first used the word bit in his 1948 paper A Mathematical Theory of Communication. He attributed its origin to John W. Tukey, who had written a Bell Labs memo on 9 January 1947 in which he contracted "binary digit" to simply "bit". Interestingly, Vannevar Bush had written in 1936 of "bits of information" that could be stored on the punch cards used in the mechanical computers of that time.

A bit of storage is like a light switch; it can be either on (1) or off (0). A single bit is a one or a zero, a true or a false, a "flag" which is "on" or "off", or in general, the quantity of information required to distinguish two mutually exclusive, equally probable states from each other. Gregory Bateson defined a bit as "a difference that makes a difference".
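
As a hedged sketch of that last point (not part of the original text), the information gained from observing an event of probability p is -log2(p) bits, so picking out one of two equally probable states yields exactly one bit, i.e. one shannon. The short Python example below uses only the standard math module.

    import math

    def information_bits(p):
        """Self-information of an event with probability p, in bits (shannons)."""
        return -math.log2(p)

    # Two mutually exclusive, equally probable states: p = 1/2 each.
    print(information_bits(0.5))    # 1.0 bit, i.e. one shannon
    # A less likely event carries more information, e.g. p = 1/8:
    print(information_bits(0.125))  # 3.0 bits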

