LESSON 1
COMPUTER OPERATIONS
Section I. THE DEVELOPMENT OF BINARY AND OTHER NUMBERING SYSTEMS
1-1. INTRODUCTION
As stated in the previous subcourse, a modern computer's chief capability is its
capacity to react with lightning speed to binary coded bursts of voltage expressed as
zeros and ones. This lesson looks more closely at that capability: the computer's use
of the binary code, the manipulation of characters, symbols, and numbers through the
presence (1) or absence (0) of an electrical signal. (See figure 1-2 for a visual model of
the binary code.) It covers the development of the binary code, its roots in philosophy
and logic, and its eventual application to the computer. To provide a fuller
appreciation of how a computer encodes (converts) information into the binary code, a
section on converting decimal numbers into the binary system and vice versa is
included. Also covered are the concept of the byte and its significance to users, and a
description of the American Standard Code for Information Interchange (ASCII), the
shared electronic language of computers.
binary code: a system for representing things by combinations of two symbols,
such as 1 and 0, TRUE and FALSE, or the presence or absence of voltage.
1-2. THE BINARY CODE
a. What It Is. Basically, every digital computer, regardless of size or purpose, is
a traffic system for information expressed in zeros and ones. Although some of the
early computers, such as ENIAC (1945), used an internal language based on the
decimal number system, nearly every computer since 1950 has used the binary number
system. ENIAC was huge, cumbersome, and unreliable in part because it processed
numbers in decimal form and thus required over 17,000 vacuum tubes to handle all its
circuitry. The binary system, with only two symbols, is a much more efficient means of
encoding information because it requires much less circuitry. The microscopic
electronic switches in a modern CPU have to deal with only two states, on or off,
represented by one and zero, rather than the ten needed for a decimal circuit.
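To make the two-state idea concrete, here is a minimal Python sketch that expresses a
decimal number as a pattern of on (1) and off (0) states by repeated division by two.
The language and the sample value 13 are illustrative choices, not part of the lesson:

    # Express a decimal number as a string of on/off (1/0) states.
    def to_binary(n):
        """Return the binary representation of a nonnegative integer."""
        if n == 0:
            return "0"
        bits = ""
        while n > 0:
            bits = str(n % 2) + bits  # remainder is the next lower-order bit
            n //= 2                   # shift attention to the higher-order bits
        return bits

    print(to_binary(13))  # prints "1101": on, on, off, on

A decimal circuit would need ten distinguishable signal levels for each digit; here every
digit is answered by a single yes-or-no question, which is all a two-state switch can ask.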
b. Digitizing Input. Computers process information that often does not seem to
have anything to do with numbers or logic. A computer can record sounds coming
through a microphone onto a special disk, monitor temperatures in laboratories, or
manipulate images on a television screen. To process this kind of input, the computer
must first "digitize" the information, that is, turn it into binary digits. To digitize music,
the computer takes periodic measurements of the sound waves and records each
measurement as a binary number.
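A minimal Python sketch of that digitizing step follows. The pure 440-hertz tone, the
8,000 measurements per second, and the 8-bit measurement scale are all assumed
values chosen for illustration, not figures given in the lesson:

    import math

    SAMPLE_RATE = 8000   # periodic measurements per second (assumed)
    FREQUENCY = 440.0    # pitch of the tone in hertz (assumed)

    # Take periodic measurements of a sound wave and record each
    # measurement as a binary number, here using 8 bits apiece.
    samples = []
    for i in range(8):  # first eight measurements only
        t = i / SAMPLE_RATE
        level = math.sin(2 * math.pi * FREQUENCY * t)  # wave height, -1.0 to 1.0
        measurement = round((level + 1) / 2 * 255)     # scale to the range 0..255
        samples.append(format(measurement, "08b"))     # record as 8 binary digits

    print(samples)

Each printed entry is one measurement of the wave's height at an instant in time,
recorded as eight binary digits; played back in order, such measurements reconstruct
the original sound.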