bit

The smallest possible quantum of information in a binary digital computer: an amount of memory that can record one of only two possible states, which might be interpreted as “yes” or “no,” “off” or “on,” “0” or “1,” “true” or “false,” etc. In a binary number such as “1001101,” each digit represents a bit.
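To make the last phrase concrete, a minimal sketch in Python (the numeral and variable names are purely illustrative):

    # Each character of the binary numeral "1001101" is one bit.
    bits = "1001101"
    value = int(bits, 2)   # interpret the string as base 2 -> 77
    print(value)

    # The individual bits can be read back by shifting and masking.
    for position in range(value.bit_length() - 1, -1, -1):
        print((value >> position) & 1, end="")
    print()   # prints 1001101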

The word was coined by the statistician and polymath John Tukey in 1946¹, ostensibly from “binary digit”, but clearly also from its most usual meaning, as in “he ate the last bit.”

Such features of computer hardware as the size of the gulps of data a microprocessor takes at a time (an “8-bit microprocessor”) or the width of data buses (“The first IBM PC had an 8-bit bus.”) are described in bits. Not to be confused with byte.
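As a rough illustration of why the bit-width matters, a small Python sketch (figures only, no particular hardware assumed): an n-bit quantity can represent 2ⁿ distinct values, whereas a byte is a group of (normally eight) bits.

    # An n-bit quantity can represent 2**n distinct values.
    for n in (8, 16, 32):
        print(f"{n}-bit: {2 ** n:,} distinct values")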

The SI prefixes “kilo,” “mega,” and “tera” are sometimes applied to the bit. To be strictly correct, when powers of two rather than powers of ten are meant, the binary prefixes (“kibi,” “mebi,” and so on) should be used instead.
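A quick Python comparison of the two conventions; the whole distinction is the factor of 1000 per SI prefix versus 1024 per binary prefix:

    # SI prefixes step by powers of 1000; binary prefixes by powers of 1024.
    kilobit, kibibit = 1000, 1024
    megabit, mebibit = 1000 ** 2, 1024 ** 2
    print(kibibit - kilobit)   # 24: the discrepancy at the "kilo" scale
    print(mebibit - megabit)   # 48576: the gap grows at each larger prefix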

1. David Leonhardt. “John Tukey, 85, Statistician; Coined the Word ‘Software’” (obituary). New York Times, 29 July 2000, page A19.
