Computer Science Career Field Background
Computer science careers can be divided into three broad categories, by operation and industry sector: hardware, software, and the Internet. Hardware refers to the physical equipment of a computer, such as motherboards, memory chips, and microprocessors. Software includes the programs that tell the hardware exactly what to do and how to do it. The Internet is composed of numerous global networks that are connected to each other.
Developed in Asia and widely used during the Middle Ages, the abacus can be considered the origin of modern computing devices. An abacus, composed of strings and beads representing numerical values, can be used for arithmetic. Today, Japanese children still learn to work on an abacus before being introduced to electronic calculators.
French philosopher Blaise Pascal invented the world's first digital calculator in the 17th century. His machine was based on a system of rotating drums controlled by a ratchet linkage. He originally intended it for use in his father's tax office, but the principles behind it are still used in modern automobile odometers. In honor of his early contributions to computer technology, the programming language Pascal was named after him in the 1970s. The German philosopher and mathematician Gottfried Wilhelm von Leibniz later improved Pascal's design, building a version small enough to hold in the hand, much like a modern pocket calculator. It never became commercially available, however.
The first significant automated data-processing techniques were applied to making fabric patterns, not to calculating numbers. French weaver Joseph-Marie Jacquard introduced a punch-card weaving system at the 1801 World's Fair. His system was straightforward: holes punched in cards controlled the pattern woven into the cloth. The introduction of these looms provoked riots by weavers who feared being replaced by machines.
After proposing in 1822 that mathematical tables could be computed by a steam-powered machine, Charles Babbage went on in 1833 to design the Analytical Engine, which contained the basic components of the modern computer. This earned him the title of father of the computer. He was aided greatly by Augusta Ada King, Countess of Lovelace, daughter of the poet Lord Byron, who is recognized as the world's first programmer. U.S. inventor and statistician Herman Hollerith put the punched-card system to use for the 1890 census. He discovered that perforated cards could be read electrically by machines. Each perforation could stand for some important piece of information that the machine could sort and manipulate. Hollerith's Tabulating Machine Company later became part of the Computing-Tabulating-Recording Company, which was renamed International Business Machines (IBM) in 1924. IBM is still a computer industry leader today.
In the mid-1940s, punched cards were also used on the Electronic Numerical Integrator and Computer (ENIAC), built at the University of Pennsylvania for the U.S. Army as the world's first all-electronic, general-purpose computer. The machine was enormous and relied on more than 18,000 vacuum tubes. In 1949, ENIAC's inventors introduced the Binary Automatic Computer (BINAC), which used magnetic tape, and they later developed the Universal Automatic Computer (UNIVAC I) for the U.S. census. UNIVAC I was the first digital computer to handle both numerical data and alphabetical information quickly and efficiently. In 1954, IBM introduced the 650 EDPM, one of the first mass-produced commercial computers, which was programmed in symbolic notation.
By the late 1950s, the transistor, invented 10 years earlier, had made the second generation of computers possible. Transistors replaced the bulky vacuum tubes and were lighter, smaller, sturdier, and more efficient.
The integrated circuits of the late 1960s introduced the solid-state technology that allowed transistors, diodes, and resistors to be carried on tiny silicon chips. These advances further reduced operating costs and increased speed, capacity, and accuracy. Minicomputers, much smaller than mainframes (large-scale computers) but of comparable power, were developed shortly afterward.
The next important advances included large-scale integration and the microprocessor chip. Microchips made even smaller computers possible and reduced costs while increasing capacity. The speed with which a computer processed, calculated, retrieved, and stored data improved significantly. Decreased costs allowed manufacturers to explore new markets.
In the mid-1970s, Steve Wozniak and Steve Jobs started Apple out of their garage. Their vision was to bring computers into every home in America and even the world. Toward that end, they developed a user-friendly computer offered at a reasonable price. User-friendliness was essential, since many people without computer skills would have to adapt to the computer system. Their eventual product, the Macintosh computer, was the first to give on-screen instructions in everyday language and to make successful use of a graphical interface. In addition, Apple popularized the mouse, which allows users to point and click on screen icons to enter commands instead of typing them in one by one.
IBM and the manufacturers that copied its designs were quick to enter the personal computer (PC) market once they recognized the tremendous sales potential. The result was a friendly debate among computer users over which is better: Macs or PCs. Regardless of personal preference, the two incompatible systems often led to problems when people tried to share information across formats. Software designers have since developed ways to make file conversions easier and software more interchangeable.
One major trend of the last decade has been the downsizing of computer systems, replacing big mainframe computers with client-server architecture, or networking. Networks allow users greater computing flexibility and increased access to an ever-increasing amount of data.
The second major trend of the last decade has been the rapid growth of the Internet and World Wide Web. Initially developed for the U.S. Department of Defense, the Internet is composed of numerous networks connected to each other around the world. Not surprisingly, this massive network has revolutionized information sharing. It is used for real-time video conferencing, electronic mail services, online research, help lines, and long-distance telephone calls. The World Wide Web usually refers to the body of information that is available for retrieval online, while the Internet generally refers to the back-end network system plus its various services.
Hardware companies are continually striving to make faster and better microprocessors and memory chips. Intel and Motorola have been the innovators in microprocessor design, striving for faster and more efficient processors. Such innovations allow computer manufacturers to make smaller, lighter, and quicker computers, laptops, and handheld models.
As processors get faster and memory increases, computers can run more sophisticated and complicated software. Advances in hardware technology have led directly to advances in software applications. As the developer of Windows, Microsoft has been the leader in the software industry. Windows is a user-friendly, graphical operating system. (An operating system is the interface between the user, the programs stored on the hardware, and the hardware itself.) The disk operating system (DOS) was one of the earliest PC operating systems; although it is still used, it requires more computer knowledge than its successors. The Windows and Mac systems let users point and click on icons and menus with a mouse to tell the computer what to do, instead of typing specific commands by hand, as DOS requires.
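For readers who want a concrete picture of that layering, the short Python sketch below is a minimal illustration (the file name notes.txt and the printed message are made up for the example). A program never touches the disk hardware directly; it asks the operating system to act on its behalf through calls that Python's standard library wraps, such as open() and os.listdir().

```python
import os

# Ask the operating system to create a file and write one line to it.
# The program does not drive the disk itself; the OS does.
with open("notes.txt", "w") as f:
    f.write("The operating system handled the disk access for this line.\n")

# Ask the operating system for the contents of the current directory.
# This is the same information a DOS user gets by typing DIR, or a
# Windows or Mac user gets by opening a folder window with the mouse.
for name in os.listdir("."):
    print(name)
```

Whether the request comes from a typed DOS command, a mouse click in Windows or on a Mac, or a program like this one, the operating system performs the same underlying work of translating it into hardware operations.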
See also:
- Computer Science Career Field Structure
- Computer Science Career Field Outlook