History Of Computing And Computer Science
Let’s look at the history of computing and computer science. How did it all begin? What exactly is computer science? The earliest recorded computing aid was the abacus, a simple counting frame used to speed up arithmetic calculations. Mechanical computing in the modern sense began much later, with Charles Babbage, who in 1822 proposed a machine called the “Difference Engine” to calculate mathematical tables automatically.
Charles Babbage, a mathematician and inventor, designed the Difference Engine to compute and print mathematical tables using the method of finite differences. The project was never finished, and only a portion of it was built in his lifetime. Even so, it is regarded as a landmark on the road to the computer as we know it today.
Babbage went on to describe a far more ambitious machine, the Analytical Engine, in 1837. It had many of the parts found in later computers: a memory (which Babbage called the “store”), an arithmetic unit (the “mill”), and a mechanism for conditional branching, with instructions supplied on punched cards. Like the Difference Engine, it was never built, so it is viewed as a theoretical milestone and is sometimes called Babbage’s dream or his anticipation. The idea behind it was a general-purpose machine that could carry out any sequence of calculations mechanically, at high speed.
Ada Byron, later Countess of Lovelace, was the daughter of the poet Lord Byron and was very talented. Her mother insisted on a rigorous education in mathematics, and Ada developed a real knack for it. Her published notes on the Analytical Engine contain what is often described as the first computer program, which is why she is remembered as the first programmer.
The next milestone came from Konrad Zuse, whose Z3 was completed in 1941. (His very first machine, finished in 1938, was originally called the “V1”, for Versuchsmodell 1, and later renamed Z1.) The Z3 used electromechanical relays rather than vacuum tubes, and it read its programs from punched film, one program at a time. It is now widely regarded as the first working program-controlled computer.
The ENIAC, the first general-purpose electronic digital computer, was unveiled in 1946. It was used to calculate ballistic firing tables for the United States Army. ENIAC contained around 18,000 vacuum tubes, 70,000 resistors and 10,000 capacitors, filled a large room, and consumed about 150 kilowatts of power. It had no stored programs: switching to a new task meant physically rewiring the machine, a job that could take days.
The next step in computing history was EDVAC. It is considered one of the major milestones because it introduced ideas incorporated into nearly all later computers: a control unit, an arithmetic unit, and a single memory holding both programs and data, the stored-program design described in John von Neumann’s 1945 “First Draft of a Report on the EDVAC”. ENIAC, already built by the same team, served in effect as its prototype.
In 1971, the first commercial microprocessor, the Intel 4004, was created at Intel Corporation by a team that included Ted Hoff, Federico Faggin and Stanley Mazor. It was a 4-bit processor: an entire central processing unit on a single chip, with no need for the racks of separate components earlier machines required. It was used in machines like calculators.
The microprocessor was made possible by the earlier invention of the integrated circuit, which allowed many transistors to be placed on one piece of silicon. In computer history, the microprocessor counts as a major milestone, and successors followed quickly: the 8-bit Intel 8008 in 1972 and Intel 8080 in 1974, the Motorola 6800 in 1974, the MOS Technology 6502 in 1975, and various others one after another.
Nowadays microprocessors are vastly more capable, and complete systems with display units (LCD screens), storage and more are housed in a fraction of the space early machines required.
Computer history can be divided into different periods – which ones?
The era of mechanical computers runs roughly from 1801 (the year of Jacquard’s punched-card loom) to 1900. This was a transitional period in which many methods and ideas later central to computer science were first developed. The computer had not developed much: these machines were essentially mathematical instruments rather than computers in the modern sense.
The next transitional period covers the years between 1901 and 1955. These years were characterized by more complex machines, built first from relays and then from vacuum tubes, and the first electronic computers were invented in this era. Modern computing began to take shape with the Atanasoff–Berry Computer, whose prototype was demonstrated in 1939, although it was designed as an experimental, special-purpose device only.
This age is considered to be the period between 1956 and 1965, during which mainframe computers were widely used and developed further. It was also the era in which transistors replaced vacuum tubes: the first fully transistorized computers appeared in the mid-1950s, making machines smaller, cooler and far more reliable. (The MANIAC I, sometimes mentioned in this context, actually predates this generation: it was a vacuum-tube machine completed in 1952 at Los Alamos, built to a design based on John von Neumann’s stored-program architecture, not a transistor computer.)
It was in 1957 that the FORTRAN language was released, developed by a team led by John Backus at IBM. It was the first widely used high-level programming language, with variables, arithmetic expressions and logical operators (AND, OR etc.). Later on, different programming languages were developed for different tasks like CAD/CAM and so on.
The era of mainframe computers took off around 1960. These were bigger machines than those introduced earlier in this list, costing up to millions of dollars, and the early models still contained many thousands of vacuum tubes before transistorized designs took over.
The next breakthroughs built on the work of Jack Kilby, who demonstrated the first integrated circuit at Texas Instruments in 1958 (Robert Noyce independently developed a practical version at Fairchild). Integrated circuits made it possible to pack smaller, faster transistors into a computer, and the third-generation machines built with them in the development period between 1964 and 1969 were more efficient, took up less room than ever before, and cost less, with some models selling for around 100,000 dollars.
The end of this decade also brought a landmark in how people interact with computers. In 1968, Douglas Engelbart demonstrated his “oN-Line System” (NLS), which allowed the use of a mouse with a computer for the first time in history, alongside early forms of hypertext and windowed displays, in what is now remembered as “the mother of all demos”.
The early mainframe computers were mostly programmed using the FORTRAN and COBOL languages, both developed earlier, in the late 1950s. Later on the C programming language was developed (by Dennis Ritchie at Bell Labs in the early 1970s), and it is still in use now, even by professionals.
This is considered the era in which computers became more sophisticated and, above all, more user-friendly. It is the time when most desktop computers (those you see in people’s homes) were developed, with many different models produced by companies such as Apple Computer Inc, IBM and many others. Multimedia capabilities also arrived in this period, which greatly boosted the use of personal computers, also known as “PCs”.
A new era was brought about when the first mass-market personal computers were introduced in the early 1980s. A key new idea was running several programs side by side, one after another, making the computer very productive. PCs proved popular because they were cheaper to buy, smaller, and increasingly well equipped with memory; by the 1990s they could access the internet as well.
This period was characterized by rapid growth as computers got much faster and cheaper. It also marked the beginning of what some call “software-defined architecture”: a computer operating system or bundle of programs whose abstractions are defined by code rather than hardware.
The years between 1984 and 1989 are considered especially productive, with an explosion of new technology and new products. Another important development of this time was that personal computers were finally adopted on a massive scale.
The final era covered here is the period between 1990 and 1995. These years were characterized by more complex machines and rapid software change. Software became more sophisticated and graphical interfaces matured. The most important event of this period was the Internet reaching the general public in the mid-1990s, driven by the World Wide Web, introduced in 1991 (although the underlying network technology had been around since the late 1960s).
Right now, computers are very powerful and can solve extremely complicated mathematical problems as well as carry out very advanced tasks with ease, all thanks to electronic computing.