The desire to economize time and mental effort in arithmetical computations, and to eliminate human liability to error, is probably as old as the science of arithmetic itself. This desire has led to the design and construction of a variety of aids to calculation, beginning with groups of small objects, such as pebbles, first used loosely, later as counters on ruled boards, and later still as beads mounted on wires fixed in a frame, as in the abacus.
— Howard Aiken, father of the Mark I IBM computer[1]

Definitions

Computer Fraud and Abuse Act

A computer is

an electronic, magnetic, optical, electrochemical, or other high speed data processing device performing logical, arithmetic, or storage functions, and includes any data storage facility or communications facility directly related to or operating in conjunction with such device, but such term does not include an automated typewriter or typesetter, a portable hand held calculator, or other similar device.[2]

Computing

A computer is a machine which manipulates data according to a list of instructions.

General

In 1940, the word 'computer' was used as a job description for human workers who performed complex calculations for military and civilian organizations. Both technical companies and military organizations employed hundreds of human computers, who worked using either their own minds or mechanical adding machines and calculators.[3]

U.S. copyright law

A computer is

[a] programmable electronic device that can store, retrieve, and process data that is input by a user through a user interface, and is capable of providing output through a display screen or other external output device, such as a printer. 'Computers' include mainframes, desktops, laptops, tablets, and smart phones.[4]

Overview

Computers take numerous physical forms. The first devices that resemble modern computers date to the mid-20th century (around 1940-1941), although the computer concept and various machines similar to computers existed prior. Early electronic computers were the size of a large room, consuming as much power as several hundred modern personal computers.

Modern computers are based on comparatively tiny integrated circuits and are millions to billions of times more capable while occupying a fraction of the space. Personal computers in various forms are icons of the Information Age and are what most people think of as a "computer." However, the most common form of computer in use today is by far the embedded computer. Embedded computers are small, simple devices that are often used to control other devices — for example, they may be found in machines ranging from fighter aircraft to industrial robots, digital cameras, and even children's toys.

The ability to store and execute lists of instructions (called programs) makes computers extremely versatile and distinguishes them from calculators.

History of computing

It is difficult to identify any one device as the earliest computer, partly because the term "computer" has been subject to varying interpretations over time.

Originally, the term "computer" referred to a person who performed numerical calculations (a human computer), often with the aid of a mechanical calculating device. Examples of early mechanical computing devices included the abacus, the slide rule and arguably the astrolabe and the Antikythera mechanism (which dates from about 150-100 BC). The end of the Middle Ages saw a re-invigoration of European mathematics and engineering, and Wilhelm Schickard's 1623 device was the first of a number of mechanical calculators constructed by European engineers.


Jacquard loom

However, none of those devices fit the modern definition of a computer because they could not be programmed. In 1801, Joseph Marie Jacquard made an improvement to the textile loom that used a series of punched cards as a template to allow his loom to weave intricate patterns automatically. The resulting Jacquard loom was an important step in the development of computers because the use of punched cards to define woven patterns can be viewed as an early, albeit limited, form of programmability.


Difference Engine

In 1837, Charles Babbage was the first to conceptualize and design a fully programmable mechanical computer, which he called the "Analytical Engine." Due to limited finances, and an inability to resist tinkering with the design, Babbage never actually built his Analytical Engine.

Large-scale automated data processing of punched cards was performed for the U.S. Census in 1890 by tabulating machines designed by Herman Hollerith, whose Tabulating Machine Company later merged into the Computing-Tabulating-Recording Company, renamed International Business Machines Corporation (IBM) in 1924. By the end of the 19th century, a number of technologies that would later prove useful in the realization of practical computers had begun to appear: the punched card, Boolean algebra, the vacuum tube (thermionic valve) and the teleprinter.


Hollerith punch card machine

During the first half of the 20th century, many scientific computing needs were met by increasingly sophisticated analog computers, which used a direct mechanical or electrical model of the problem as a basis for computation. However, these were not programmable and generally lacked the versatility and accuracy of modern digital computers.

A succession of steadily more powerful and flexible computing devices was constructed in the 1930s and 1940s, gradually adding the key features that are seen in modern computers. The use of digital electronics (whose theoretical foundation was laid by Claude Shannon in 1937) and more flexible programmability were vitally important steps, but defining one point along this road as "the first digital electronic computer" is difficult. Notable achievements include:

  • EDSAC (1949) was one of the first computers to implement the stored program (von Neumann) architecture.
  • Konrad Zuse's electromechanical "Z machines." The Z3 (1941) was the first working machine featuring binary arithmetic, including floating point arithmetic, and a measure of programmability; it is often regarded as the world's first operational programmable computer.
  • The non-programmable Atanasoff-Berry Computer (1941), which used vacuum tube-based computation, binary numbers, and regenerative capacitor memory.
  • The secret British Colossus computer (1944), which had limited programmability but demonstrated that a device using thousands of vacuum tubes could be reasonably reliable and electronically reprogrammable. It was used for breaking German wartime codes.
  • The Harvard Mark I (1944), a large-scale electromechanical computer with limited programmability.
  • The U.S. Army's Ballistics Research Laboratory ENIAC (1946), which used decimal arithmetic and is sometimes called the first general purpose electronic computer (since Konrad Zuse's Z3 of 1941 used electromagnets instead of electronics). Initially, however, ENIAC had an inflexible architecture which essentially required rewiring to change its programming.

ENIAC

Several developers of ENIAC, recognizing its flaws, came up with a far more flexible and elegant design, which came to be known as the stored program architecture or "von Neumann architecture." A number of projects to develop computers based on the stored program architecture commenced around this time, the first of these being completed in Great Britain. The first to be demonstrated working was the Manchester Small-Scale Experimental Machine (SSEM) or "Baby". However, the EDSAC, completed a year after SSEM, was perhaps the first practical implementation of the stored program design. Shortly thereafter, the machine originally described by von Neumann's paper — EDVAC — was completed but did not see full-time use for an additional two years.

Nearly all modern computers implement some form of the stored program architecture, making it the single trait by which the word "computer" is now defined. By this standard, many earlier devices would no longer be called computers by today's definition, but are usually referred to as such in their historical context. While the technologies used in computers have changed dramatically since the first electronic, general-purpose computers of the 1940s, most still use the von Neumann architecture. The design made the universal computer a practical reality.

Vacuum tube-based computers were in use throughout the 1950s, but were largely replaced in the 1960s by transistor-based devices, which were smaller, faster, cheaper, used less power and were more reliable. These factors allowed computers to be produced on an unprecedented commercial scale. By the 1970s, the adoption of integrated circuit technology and the subsequent creation of microprocessors such as the Intel 4004 caused another leap in size, speed, cost and reliability. By the 1980s, computers had become sufficiently small and cheap to replace simple mechanical controls in domestic appliances such as washing machines. Around the same time, computers became widely accessible for personal use by individuals in the form of home computers and the now ubiquitous personal computer. In conjunction with the widespread growth of the Internet since the 1990s, personal computers are becoming as common as the television and the telephone and almost all modern electronic devices contain a computer of some kind.

Computer capabilities

Computer capabilities fall into seven main categories:

1. Data collection. When attached to various sensing devices, computers can detect and measure such external physical phenomena as temperature, time, pressure, flow rate, consumption rate, or any number of other variables. Also, computers can keep a record of transactions. For example, a computerized cash register can collect and store information about a sale that includes bookkeeping entries, taxes, commissions and inventory, and can even reorder stock. Some computer-based door locks require individuals to carry magnetic identity cards. Such locks not only control access but also create a record of whose card was granted access, when, and for how long.

Computers can also process visual and audio input, thus greatly increasing their applicability to data collection. They can also recognize human speech, directly read a variety of typewritten forms and handprinted texts, and detect patterns in video images.

2. Information storage. Computers can store large amounts of information for long periods of time in an electronically readable form that is easily and quickly recoverable. Depending on the particular application, the methods of storage vary widely. Memory technology allows trillions of bits of information to be stored conveniently and cheaply.

3. Information organization. Computers can be used to organize and rearrange information so that it is more suitable for particular applications. Computers can simplify and restructure vast amounts of raw data to assist people in drawing significant meanings or conclusions.

4. Calculations. Computers perform arithmetic calculations millions of times faster than human beings. They are used to make numerous simple calculations, such as those required in processing the payroll for a sizable organization; to make sophisticated statistical calculations on large amounts of data, such as those for social science research; or to perform highly complex scientific calculations, such as those needed for weather research or for modeling fusion energy systems.

5. Communication. Through connections over a telecommunication system, computers can transmit data around the world either to human users or to other computers, permitting the sharing of work and data among groups of linked computers (computer networks).

6. Information presentation. Computers can output information in a variety of forms. Through graphical display and voice response, they can make data readily understandable and useful to non-experts. It is possible to have data and computer schematics displayed on screens in a multicolored, three-dimensional format for design and analytical purposes. Also, data such as numbers and statistics can be organized by the computer in an easy-to-understand tabular presentation.

7. Control. Computers can be used to control a machine tool or a production line without human intervention. Many consumer devices, including microwave ovens, automated home thermostats, automobile engines, television sets, and telephones, incorporate computer controls.

Several of these capabilities can be combined, for example in computer-aided design of aircraft structures (or computer logic elements, for that matter) and computer-based modeling of the saltwater penetration in San Francisco Bay (a function of tidal action and ground water runoff). Both computer-aided design and computer modeling have found wide application and are illustrative of what is sometimes referred to as the “intelligence amplifying” capability of computers.

Stored program architecture

The defining feature of modern computers, which distinguishes them from all other machines, is that they can be programmed: a list of instructions (the program) can be given to the computer, and it will store them and carry them out at some time in the future.

In most cases, computer instructions are simple: add one number to another, move some data from one location to another, send a message to some external device, etc. These instructions are read from the computer's memory and are generally carried out (executed) in the order they were given. However, there are usually specialized instructions to tell the computer to jump ahead or backwards to some other place in the program and to carry on executing from there. These are called "jump" instructions (or branches). Furthermore, jump instructions may be made to happen conditionally so that different sequences of instructions may be used depending on the result of some previous calculation or some external event. Many computers directly support subroutines by providing a type of jump that "remembers" the location it jumped from and another instruction to return to that point.
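The interplay of sequential execution, conditional jumps, and subroutines can be made concrete with a short sketch. The following Python fragment is a minimal, illustrative interpreter for an invented toy instruction set (the instruction names, the single register, and the sample program are assumptions made for this example, not drawn from any real machine). It stores a program as a list of instructions and carries them out in order, branching and calling a subroutine along the way.

```python
# Minimal sketch of a stored-program interpreter. The instruction set
# (SET, ADD, JUMP, JUMP_IF_ZERO, CALL, RETURN, PRINT) is invented for
# this illustration and is not modeled on any real machine.

def run(program):
    registers = {"A": 0}     # a single general-purpose register
    call_stack = []          # remembers where each CALL jumped from
    pc = 0                   # program counter: address of the next instruction

    while pc < len(program):
        op, *args = program[pc]
        pc += 1                              # by default, continue in order
        if op == "SET":                      # SET reg, value
            registers[args[0]] = args[1]
        elif op == "ADD":                    # ADD reg, value
            registers[args[0]] += args[1]
        elif op == "JUMP":                   # unconditional jump
            pc = args[0]
        elif op == "JUMP_IF_ZERO":           # conditional jump
            if registers[args[0]] == 0:
                pc = args[1]
        elif op == "CALL":                   # a jump that remembers its origin
            call_stack.append(pc)
            pc = args[0]
        elif op == "RETURN":                 # return to the remembered address
            pc = call_stack.pop()
        elif op == "PRINT":                  # send a value to an external device
            print(registers[args[0]])

# A stored program: count down from 3, calling a small subroutine each time.
program = [
    ("SET", "A", 3),            # 0
    ("JUMP_IF_ZERO", "A", 5),   # 1: leave the loop once A reaches zero
    ("CALL", 6),                # 2: call the subroutine at address 6
    ("ADD", "A", -1),           # 3
    ("JUMP", 1),                # 4: loop back to the test
    ("JUMP", 8),                # 5: skip past the subroutine and halt
    ("PRINT", "A"),             # 6: subroutine body
    ("RETURN",),                # 7
]

run(program)                    # prints 3, then 2, then 1
```

Running the sketch prints 3, 2, 1: the conditional jump at address 1 ends the loop once register A reaches zero, and the CALL/RETURN pair shows how one kind of jump can "remember" where it came from.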

Structure of a computer

A general purpose computer has four main sections: the arithmetic and logic unit (ALU), the control unit, the memory, and the input and output devices (collectively termed I/O). These parts are interconnected by buses, often made of groups of wires.
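As a rough schematic of how these four sections cooperate, the Python sketch below models each one as a small class (the class names, the tiny three-instruction "machine language," and the sample memory contents are invented purely for illustration): the control unit repeatedly fetches an instruction from memory, decodes it, and directs the ALU, the memory, or the output device accordingly.

```python
# Schematic sketch of the four main sections of a general purpose computer.
# The class names and the three-instruction "machine language"
# (ADD, OUT, HALT) are invented purely for illustration.

class Memory:
    """Holds both the program and its data (the stored program idea)."""
    def __init__(self, cells):
        self.cells = list(cells)

    def read(self, address):
        return self.cells[address]

    def write(self, address, value):
        self.cells[address] = value

class ALU:
    """Arithmetic and logic unit: performs the actual calculations."""
    @staticmethod
    def add(a, b):
        return a + b

class OutputDevice:
    """Stand-in for an I/O device such as a display or printer."""
    def emit(self, value):
        print("OUTPUT:", value)

class ControlUnit:
    """Fetches instructions from memory, decodes them, and directs the
    ALU, the memory, and the I/O device accordingly."""
    def __init__(self, memory, alu, output):
        self.memory, self.alu, self.output = memory, alu, output
        self.pc = 0                                    # program counter

    def run(self):
        while True:
            instruction = self.memory.read(self.pc)   # fetch
            self.pc += 1
            op = instruction[0]                        # decode
            if op == "ADD":        # ADD src1, src2, dest (memory addresses)
                result = self.alu.add(self.memory.read(instruction[1]),
                                      self.memory.read(instruction[2]))
                self.memory.write(instruction[3], result)
            elif op == "OUT":      # OUT src: send a value to the output device
                self.output.emit(self.memory.read(instruction[1]))
            elif op == "HALT":
                break

# Program and data share one memory: addresses 0-2 hold instructions,
# addresses 3-5 hold data (the last data cell receives the result).
memory = Memory([("ADD", 3, 4, 5), ("OUT", 5), ("HALT",), 2, 40, 0])
ControlUnit(memory, ALU(), OutputDevice()).run()      # OUTPUT: 42
```

In this toy layout the program and its data share a single memory, echoing the stored program architecture described above; the method calls between the objects stand in, very loosely, for traffic on the buses.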

References

  1. Quoted in Zenon W. Pylyshyn & Liam J. Bannon, Perspectives on the Computer Revolution (1989).
  2. 18 U.S.C. §1030(e)(1).
  3. Capers Jones, "The Technical and Social History of Software Engineering" 44 (2014).
  4. Compendium of U.S. Copyright Office Practices, Third Edition, Glossary, at 3.

See also


This page uses Creative Commons Licensed content from Wikipedia (view authors).