The History of the Computer

My five-year-old nephew had his own Mac by the time he was two or three years old. I was 12 when I was first introduced to computers in my research-driven, sixth-grade computer class in West Lafayette, Indiana. I wrote a story, complete with pictures that we drew by connecting lines between two coordinates on an invisible Cartesian graph. We couldn’t check our work without printing it out. That was in 1987.

No one would deny that computers shape the way we learn today. They shape the way we think, the way we interact with others — and the way we dream. (I wonder how many of you reading this dreamed in Nintendo Wii last night.) Do we even remember what the world was like before we used computers for almost every aspect of our lives?

Computers have advanced us exponentially in the business world. OK, warp speed is probably more accurate.



A few years ago, I assisted my friend Bob Sperry of Salt Lake City, Utah, in completing his personal history. He devoted one chapter to his experience working at IBM for the first part of his career. Sperry began working for IBM in 1963 in technical sales as the leader of a team of systems engineers who sold and installed computers for large clients. He dictated to me as I typed.

"At that time, computers were just starting to come on the scene. The initial training was for punch card equipment. (For my grandchildren, they need to understand that this is pre-computer equipment)," he explained. "Data would be entered in by a keyboard at a punch card machine, and you would pack as much information as you could into an 80-column card. Before computers, companies used IBM punch card machines such as the 403 and 407 accounting machines and sorters to process repetitive work such as payroll and inventory control."

"To program 403 and 407 accounting machines, you had several options that were selected using wires," Sperry continued. "To do payroll, for example, clerks would enter in information including the number of hours they worked each day and any overtime hours worked and their hourly pay scale. Sorts were arranged on these cards in alphabetical or numerical sequence by office, district, and region, for example. Once the cards had been sorted into the proper order, they would go to the 403 and 407 accounting machines, which would calculate all the payroll data and print the payroll checks and reports. Payroll, inventory, and sales analysis were three of the early applications."
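Sperry's two-pass workflow, sorting the cards and then tabulating them on an accounting machine, maps naturally onto modern code. Below is a hypothetical Python sketch of that flow; the field names, sample data, and time-and-a-half overtime rule are illustrative assumptions, not details of the actual IBM unit-record systems.

```python
# Sketch of the punch-card payroll flow: a "sorter" pass orders the card
# records, then an "accounting machine" pass computes gross pay per card.
from dataclasses import dataclass

@dataclass
class Card:            # stands in for one 80-column punch card
    region: str
    district: str
    office: str
    name: str
    hours: float       # regular hours worked
    overtime: float    # overtime hours worked
    rate: float        # hourly pay scale

def sort_cards(cards):
    """The sorter pass: order cards by region, district, office, name."""
    return sorted(cards, key=lambda c: (c.region, c.district, c.office, c.name))

def tabulate(cards):
    """The accounting-machine pass: compute gross pay for each card.
    Overtime at time-and-a-half is an illustrative assumption."""
    return [(c.name, c.hours * c.rate + c.overtime * c.rate * 1.5)
            for c in sort_cards(cards)]

cards = [
    Card("West", "D2", "SLC", "Jones", 40, 5, 10.0),
    Card("West", "D1", "SLC", "Adams", 38, 0, 12.0),
]
print(tabulate(cards))  # Adams sorts first: [('Adams', 456.0), ('Jones', 475.0)]
```

The point of the sketch is the separation Sperry describes: sorting and calculating were distinct mechanical passes over the same deck of cards.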

How did we function without computers? Certainly much more slowly and less efficiently, as Sperry's account shows. So much of our daily life is affected by computers that it seems archaic even to write this sentence. But seriously, how often do we stop and remember what the world was like before computers assisted us in every aspect of our lives?

I was curious to know how computers were developed. So I got on my computer and did a Google search. (Umm, otherwise I would have had to trek down to the library and pull a World Book off the shelf.)

My results directed me to Mary Bellis of About.com. Bellis provides a series of features on the developments that shaped computers as we know them today. (In the box embedded in this article, I list some of the significant advances, starting in 1936.)

Bob Sperry experienced several significant advances in the computer industry during his career with IBM. In the 1960s, a computer system cost millions of dollars to buy, or several thousand a month to lease, so only the largest companies could afford to house one. And by "house," I mean constructing a separate building for the larger systems — at least before the transistor arrived on the scene.

"Early computers were sold primarily to the Air Force, defense companies, and the Federal Aviation Administration (FAA)," Sperry explained. "The IBM 650 was an example of one of these computers. They used [vacuum] tubes instead of transistors. These tubes would get very hot. They would require large air conditioning systems to cool them down so that, basically, they would not blow up."

"I remember going to a NORAD center when I was in the Utah Air National Guard," he continued. "The computers there were used to track all airplanes in the air above the United States. This computer system was in a four-story building. Two stories were dedicated to the computer, and the other two were for air conditioning. To keep one of these systems going was a difficult task. IBM would have multiple people assigned that would be on-site full-time trying to keep these systems running. As the tubes didn't just stop working but would degrade over time, it was difficult to know which tubes to replace. Also, the tubes had a relatively short life. With thousands of tubes in each of these large computers, repair technicians were constantly changing out tubes."

Sperry also explained that "everything was wired by hand, and if the electricity went out, you would say 'bye-bye' to [all your data]."

Some of these technologies were not very stable and, out of necessity, were phased out quickly.

"The 1401 card system that I was responsible for installing at Kennecott Copper [a large copper mine near Salt Lake City] was a 32-kilobyte card system. These days, most PCs start at 256K and go up to several megabytes," Sperry noted. "After a short time, magnetic tape came on the scene. This was a huge development. They were able to get rid of all the card equipment and put it on magnetic tape. This necessitated upgrading the computer to a 64K or 128K memory. At the time, this was the largest commercial system in the Salt Lake Valley outside of the federal government."

"About three years later, they came out with the first disk machine," he continued. "I remember that we installed one large one, and within the first week, there got to be a little dust in the machine, and the whole disk went up into vapor. This caused a huge problem because we lost all that data; plus, the system also cost $300,000 to $400,000."

Sperry was a pioneer in installing process control systems. He assisted the U.S. Steel plant in Orem, Utah, with the installation of a system that significantly improved production.

"At U.S. Steel, I sold and installed a system that tracked all the steel plates that went through production. This was the first major use of barcode in our area," Sperry said. "I installed a system that used remote terminals scattered throughout the steel plant that were used to track the steel plates as they went through the plant. The major problem that U.S. Steel had was that they could not keep track of the location of various plates of steel after they were rolled…Each plate of steel is ordered with…specifications…[that] vary greatly depending on what the steel is going to be used for."

"The first time I walked through the plant, I had a difficult time getting around because there were stacks and stacks of steel plates piled 20 to 30 feet high all over the plant floor. These plates were typically ¼- to ½-inch thick, 12 feet wide, and 20 feet long," he recalled. "It turned out that they lost over 35 percent of the product they made. When they couldn't find it, they had to re-do it. This was a terrific expense to U.S. Steel. There was a huge justification to install the system because of that."
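The core of the system Sperry describes is simple: every time a terminal scans a plate's barcode, record where it was seen, so a lost plate becomes a lookup instead of a floor search. The sketch below is a minimal illustration of that idea; all names and identifiers are hypothetical, not taken from the actual IBM installation.

```python
# Minimal sketch of barcode-based plate tracking: remote terminals report
# each scan, and the last reported location answers "where is this plate?"
locations = {}   # barcode -> last reported location in the plant

def scan(barcode, terminal):
    """Called whenever a terminal reads a plate's barcode."""
    locations[barcode] = terminal

def find(barcode):
    """Return where the plate was last seen, if it was ever scanned."""
    return locations.get(barcode, "unknown - physical search required")

scan("PLATE-00417", "rolling mill")
scan("PLATE-00417", "yard stack B7")
print(find("PLATE-00417"))   # yard stack B7
print(find("PLATE-00900"))   # unknown - physical search required
```

Against a 35 percent re-make rate, even this trivial bookkeeping pays for itself, which is the justification Sperry cites.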

When Kennecott Copper modernized its facilities at a cost of $100 million and saw barely any increase in production, Sperry installed a train scheduling system for just $30,000 that improved production by over 30 percent. His innovations earned him a promotion at IBM to work in the process industry, which at the time included oil, gas, mining, steel, paper, rubber/tire, and the chemical industries. He was later promoted to industry marketing and eventually worked as a product line manager for IBM's World Trade Department, moving his family to New York state and then to Atlanta.

"[In 1975,] the PC concept was starting to come on the market. This was originally started with the 286 computers that couldn't do much of anything. With the advances in micro-circuitry, the stand-alone computer started to make sense," Sperry said. "IBM formed a task force, of which I was part. I went around the world gathering requirements and quantifying the [demand]…for a PC."

However, when Sperry's mother died, he chose to take a demotion at IBM and return to Salt Lake City to spend more time with his wife and five children. IBM had come out with the Series/1 computer, which replaced the Series/7. Sperry was tasked with marketing the Series/1 in the mountain states of Utah, Idaho, Nevada, and Montana, and he ended up being the number one Series/1 salesperson in the United States.

Being the entrepreneur that he is, Sperry ended up leaving IBM and working for a few start-up companies in the middle part of his career. For the last 15 years, he has invested in real estate and has worked as a financial advisor, which he does very well, since he is so people-oriented. Today he is semi-retired; he enjoys traveling with his wife and skis on a regular basis.

At the University of Utah, where Sperry received his bachelor's degree in the early '60s, he was working with 4-kilobyte, 50-microsecond computers. "Go from that to what we have today," Sperry said. "My desktop computer at my office today is 4-gigabytes. That would have cost hundreds of millions of dollars back [in the '60s]." And it would have been impossible to house in someone's office.

Today, transistors, the most basic manufactured units in an integrated circuit (IC), are not visible to the naked eye. The gate width, typically measured in nanometers, is the smallest feature size of a transistor. ICs are manufactured by shining beams of light onto a large silicon wafer about the size of an old vinyl record. Many of today's transistors are smaller than the wavelength of that light, so lithography relies on optical tricks to print them. A tiny wedding cake is a good way to visualize the insides of an IC: transistors at the bottom, with multiple layers of metal traces cutting through non-conducting material like the stacked layers of a cake, according to IBM.

While the cost of designing and developing a complex integrated circuit is quite high, when spread across typically millions of production units, the individual IC cost is minimized, which is good for business. ICs have consistently migrated to smaller feature sizes over the years, allowing more circuitry to be packed on each chip. This increased capacity per unit area can be used to decrease cost and/or increase functionality. One of Intel's founders, Gordon Moore, observed at the beginning of the IC era that the number of transistors in an integrated circuit doubles every two years. This is often referred to as "Moore's Law." In general, as the feature size shrinks, almost everything improves; the cost per unit and the switching power consumption go down — and the speed goes up, according to Wikipedia.
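Moore's observation, as stated above, makes for a quick back-of-the-envelope calculation. The sketch below projects transistor counts forward from the Intel 4004's commonly cited figure of about 2,300 transistors in 1971; the projection is illustrative arithmetic, not a record of actual shipped chips.

```python
# Moore's Law as a projection: transistor count doubles every two years.
def transistors(year, base_year=1971, base_count=2300):
    """Projected transistor count for a given year, assuming one
    doubling per two full years since the base year."""
    doublings = (year - base_year) // 2
    return base_count * 2 ** doublings

for year in (1971, 1981, 1991, 2001):
    print(year, transistors(year))
# 1971 -> 2,300; 1981 -> 73,600; 1991 -> 2,355,200; 2001 -> 75,366,400
```

Thirty years of doubling turns thousands of transistors into tens of millions, which is why feature-size shrinks dominate the economics described above.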

Among the most advanced integrated circuits are the microprocessors or "cores," which control everything from computers to cellular phones to digital microwave ovens. Digital memory chips and application specific ICs (ASICs) are examples of other families of integrated circuits that are important to the modern information society, according to Wikipedia.

Business, which in the early days supplied the enormous capital required to put computers to work, now sees far greater efficiency and accuracy in accounting and payroll, operations, communications, and word processing. With the cost of ICs continually dropping, we now see limitless applications that affect every facet of our lives, from cell phones to video games, from security systems to automotive innovations, and from kitchen appliances to toothbrushes.

So what was life like before computers? Perhaps this weekend, after I update my blog, I will take a nature hike. I live in Los Angeles County today, and I can retreat from the man-made world for a little while here, but not without my cell phone.

The Early Evolution of the Computer
  • Z1 Computer. Developed in 1936 by Konrad Zuse, the first freely programmable computer served as an automatic calculator with three elements: a control, a memory, and a calculator for arithmetic.

  • First Electronic-Digital (Binary) Computer. Developed by John Atanasoff and Clifford Berry in 1942, it used parallel processing, regenerative memory, and a separation of memory and computing functions.

  • Harvard Mark I Computer. Howard Aiken and Grace Hopper developed this in 1944. Imagine a giant room full of noisy, clicking metal parts, 55 feet long and eight feet high. The five-ton device contained almost 760,000 separate pieces.

  • Manchester Baby Computer and the Williams Tube. Developed by Frederic Williams and Tom Kilburn in 1948, it used a modified cathode-ray tube (the Williams tube) as its memory.

  • The Transistor. Developed by John Bardeen, Walter Brattain, and William Shockley in 1947-48, the transistor, although not a computer itself, greatly assisted in their evolution.

  • UNIVAC Computer. Developed by John Presper Eckert and John Mauchly in 1951 for the Census Bureau, the device was later used commercially for payroll.

  • IBM FORTRAN. John Backus developed the first successful high-level programming language in 1954.

  • MICR (Magnetic Ink Character Recognition). This technology was developed for the bank industry in 1955 by Stanford Research Institute, Bank of America, and General Electric to read checks.

  • Integrated Circuit or "Computer Chip." Developed independently by Jack Kilby (1958) and Robert Noyce (1959), it placed the previously separate transistors, resistors, capacitors, and all the connecting wiring onto a single crystal (or "chip") of semiconductor material.

  • ARPAnet. The precursor to the Internet went online in 1969, commissioned by ARPA under director Charles M. Herzfeld. (You mean Al Gore didn't invent the Internet?!)

  • Intel 1103 Computer Memory. The world's first dynamic RAM chip was developed in 1970.

  • Intel 4004 Computer Microprocessor. Developed by Federico Faggin, Ted Hoff, and Stanley Mazor in 1971.

  • Ethernet Computer Networking. Developed by Robert Metcalfe and Xerox in 1973.

  • MS-DOS Computer Operating System. Developed by Microsoft in 1981.


On the net: The History of Computers
inventors.about.com/library/blcoindex.htm

Integrated Circuit
en.wikipedia.org/wiki/Computer_chips

Copyright © 2024 EngineeringCrossing - All rights reserved.