Tim Buchalka

History of the Computer Part 1 - Learn to Code Series

So while not strictly essential, knowing computer hardware can ultimately make you a better programmer, because you really start to understand how things work under the hood of a computer. For that reason, we're going to start this Learn to Code course with some basic hardware information.



We're going to cover the history of computers, Boolean operations, gates, circuits, and switches. Then we'll finish up by discussing the main components found in modern computers: input, output, the central processing unit (CPU), memory, and storage.


Did you know there's more computing power in a device you wear on your wrist or carry in your pocket than existed in the computers that went to the moon in 1969? So let's take a whirlwind tour of the history of computing, starting way back in the 14th century.


I'm Tim Buchalka from the Learn Programming Academy, and this is my Learn to Code series of blog posts, so subscribe to keep updated on new course releases. For now, though, let's take a short trip down computer memory lane.


Many of us are not avid history fanatics, but the history of the computer is absolutely fascinating. The computer's evolution has been a relatively short one, even though humans have used techniques to help them calculate and compute for thousands of years. Keep in mind that the computer is just the delivery vehicle; it's the software inside that actually lets us do the things we do with it, from sending a text message, making a phone call, surfing the web, and playing games, to monitoring our health via wearables, using robotics, artificial and business intelligence, expert systems, and much, much more.


There really is no end in sight to the evolution of computing in your lifetime, as researchers keep pushing the boundaries of what is possible. I've found it very interesting that during World War II, in the 1940s, the term computer referred to humans, mostly women, who did all kinds of calculations in huge rooms with dozens of other human computers for the war effort of that day.


Some often-referenced calculating devices of the last 700 years include the abacus, a calculating tool in common use by the 1300s that originally used beans or stones moved in grooves of sand, stone, or other material. Today's abacus tends to use beads that slide across thin wires.


The first slide rule appeared in the early 1600s, and Blaise Pascal designed and built a mechanical calculator in the mid-1600s. Charles Babbage designed and began building his difference engine in 1823; it could do addition, subtraction, multiplication, and division, and solve polynomial equations and other mathematical problems. He went further and designed an analytical engine in 1837 that he never built, but it included parts (the mill, store, operator, and output) that mirror those of modern-day computers.


Herman Hollerith used an existing punch card concept from the early 1800s to design and build a programmable card-processing machine that could read, tally, and sort data on punch cards for the United States Census Bureau's 1890 census, which had 80 questions, hence the 80-column punch card. The company Hollerith founded went on, through mergers and a renaming in 1924, to become IBM.


The 1940s touched off many advances in the computer; however, the machines were still referred to as calculating devices during this era, because the word computer still meant the humans doing massive calculations. During this decade, several calculating devices emerged that made use of vacuum tube technology. A vacuum tube is a glass tube that has had its gas removed, creating a vacuum. Vacuum tubes contain electrodes for controlling electron flow and can act as a switch or an amplifier.


I'll briefly describe a few of the most commonly cited calculating devices now. An early use of vacuum tubes in computers is attributed to John Atanasoff and Clifford Berry at Iowa State University in the United States, in the machine known as the Atanasoff-Berry Computer, or ABC for short. Colossus was built in 1943 at Bletchley Park for the British code-breaking effort that Alan Turing was famously part of. The Mark I Relay Calculator, completed in 1944, was an electromechanical device that used relays, magnets, and gears to process and store data.


The ENIAC (Electronic Numerical Integrator and Computer), completed in 1946, is said to be the first publicly known electronic computer; it was developed by John Mauchly and J. Presper Eckert at the University of Pennsylvania in the United States. John von Neumann then proposed a radically different computer design based on the notion of a stored program. His research group, again at the University of Pennsylvania, built one of the first stored-program computers, the EDVAC, in 1949.


Nearly all modern computers still use the von Neumann architecture. Many of these calculating and computing devices laid the foundation for what has become known as the computer, shifting a term first associated with humans onto a machine.
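
To make the stored-program idea a little more concrete, here's a minimal sketch of a toy machine, written in Python purely for illustration. The tiny instruction set (LOAD, ADD, PRINT, HALT) is invented for this example and doesn't belong to any real machine. The key point is that the program lives in memory just like data, and the machine repeatedly fetches, decodes, and executes instructions from that memory, which is the heart of the von Neumann design.

    # A toy stored-program machine: the instructions sit in memory,
    # and a fetch-decode-execute loop works through them one by one.
    # The opcodes here are invented purely for this illustration.

    memory = [
        ("LOAD", 1),   # address 0: put the value 1 in the accumulator
        ("ADD", 2),    # address 1: add 2 to the accumulator
        ("PRINT",),    # address 2: print the accumulator
        ("HALT",),     # address 3: stop the machine
    ]

    accumulator = 0       # a single working register
    program_counter = 0   # address of the next instruction

    while True:
        instruction = memory[program_counter]    # fetch
        program_counter += 1
        opcode = instruction[0]                  # decode
        if opcode == "LOAD":                     # execute
            accumulator = instruction[1]
        elif opcode == "ADD":
            accumulator += instruction[1]
        elif opcode == "PRINT":
            print(accumulator)                   # prints 3
        elif opcode == "HALT":
            break

Because the instructions are stored in the same memory as the data, running a different program is just a matter of changing what's in memory, no rewiring required, which is exactly what set the stored-program machines apart from the fixed-purpose calculators that came before them.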


And on a humorous note, the term computer bug, or just bug, used today mostly to refer to defects in software, actually dates back to 1947, when a real bug, a moth, was found dead in a relay of the Mark II calculator at Harvard University. The technicians removed the bug, the dead moth that is, and said in their report that they had "debugged" the calculator. A fascinating bit of history for you.


All right, so in the next post we'll briefly review the computer's history from the 1950s through to today, and even take a look at what's coming in the future. But to end this post: do you know what computer generation we're in right now? Find out the answer in the next blog. Thanks for reading, and see you in that next post.
