Tim Buchalka

Do Programmers Need to Understand Computer Hardware?

Does a modern programmer really need to understand the computer hardware and how it interacts with software? Let's talk about that today.



Today we're talking about whether modern programmers really need to understand computer hardware at a deep level.


I'll start by saying that I've met programmers at all levels. I've met programmers who understand hardware at a really deep level and are genuine experts with it, programmers who have a bit of hardware knowledge, and some who know absolutely nothing about it. The good news is that they've all got jobs, so in general you don't need to understand computer hardware to get a job. There are exceptions, which we'll get to later in this post, but I also think it's a good idea to have some understanding of computer hardware, and we'll talk about that too.


The reason you don't need that understanding today is that, to a large degree, the languages in common use, like Java, C# and Python, are high-level programming languages. There's an abstraction layer between the code you write and the computer hardware, so you're not dealing with chip-level access or anything of that nature, and you don't need to understand the hardware at that level. In years gone by, you really needed a fundamental understanding of the chip set and how to get the most out of it to make your software perform. It's a good thing you don't need that any more, because it means you can write a program in any of those languages and support a range of operating systems running on different hardware.
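To make that concrete, here's a minimal Python sketch of what that abstraction layer buys you. It uses only the standard library and nothing in it is specific to any one machine: the same source runs unchanged on Windows, Linux or macOS, and Python's integers grow past the CPU's native 64-bit registers without you doing anything.

    import platform

    # The interpreter reports the underlying OS and hardware,
    # but this code itself doesn't change between machines.
    print(platform.system(), platform.machine())

    # Python integers are arbitrary precision: this value won't fit
    # in a 64-bit hardware register, yet it "just works" because the
    # language manages the representation for you.
    big = 2 ** 100
    print(big)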


But getting back to certain jobs that do need that knowledge: if you're looking to get into something like embedded programming, where you're creating software for devices, you're writing code that actually resides on a chip, typically in assembly language or C. There, you probably would benefit from computer hardware knowledge.


If you want to, say, get into the Linux kernel and contribute to that open source project, some knowledge of computer hardware would be very beneficial, because the kernel is written in low-level languages: mostly C, with some assembly and so forth. So you really should have some hardware knowledge before getting into that.


Another example would be device drivers. If you're writing a video card driver or a printer driver or something like that, those are often written in assembly language or C, and having knowledge of the computer hardware your software will ultimately run on would be very beneficial.


In those cases, yes, you probably do need some computer hardware experience, and maybe it's mandatory; certainly that knowledge would be very desirable for those positions. With that said, even if you're not looking to get into those areas, I still think it's useful to have a bit of an understanding of computer hardware.


For example, if you're using a variable with a data type like an int or a byte in one of your programs, I think it's very useful to know how it actually gets stored in binary on the computer. What actually happens when you create that variable? How does it get stored in memory? That's useful knowledge to have, because it can help make you a more efficient programmer: you've got more of an understanding of what's going on under the hood, so to speak. The same goes for other types like strings and real numbers (doubles); understanding how they get stored in the hardware again gets you thinking and hopefully makes you a little more efficient as well.
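If you want to poke at this yourself, here's a small Python sketch using only the standard library. The specific layouts it reveals (little-endian integer bytes, IEEE 754 doubles, UTF-8 encoded strings) are how CPython happens to represent these values, but the broader point is simply that every variable ultimately ends up as bytes in memory.

    import struct

    # A small integer, shown as bits and as raw little-endian bytes
    # (little-endian is the byte order used on x86 hardware).
    n = 42
    print(bin(n))                      # 0b101010
    print(n.to_bytes(4, "little"))     # b'*\x00\x00\x00'

    # A double occupies 8 bytes in the IEEE 754 format, which is why
    # 0.1 can't be stored exactly.
    x = 0.1
    print(struct.pack("<d", x).hex())  # 9a9999999999b93f

    # A string is stored as a sequence of encoded bytes, here UTF-8.
    s = "hi"
    print(s.encode("utf-8"))           # b'hi'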


These days there's less need to focus heavily on how much memory your program is using, but it can still be an issue. You can run out of memory even on modern computers, which commonly come standard with 8GB of RAM or more. One reason is that today's operating systems allow so many programs to run concurrently that, even with plenty of RAM, the memory on a particular machine may already be claimed by lots of other programs. So it still pays to be efficient where possible, and being efficient is a good skill for any programmer to have, in my opinion.
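As a quick illustration (the exact numbers vary by Python version and platform, so treat these as ballpark figures), the standard library's sys.getsizeof and array module show how much the representation you choose can matter:

    import sys
    from array import array

    # A million integers as a Python list of full int objects...
    as_list = list(range(1_000_000))

    # ...versus the same values packed as raw 4-byte machine integers.
    as_array = array("i", range(1_000_000))

    print(sys.getsizeof(as_list))   # size of the list structure only;
                                    # each element is a separate object on top
    print(sys.getsizeof(as_array))  # roughly 4 bytes per value, all in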


To give you a bit of an example about RAM: back in the 1980s, I used a computer called the Amstrad CPC 464. This beauty didn't have a lot of memory; I think 64K was the standard memory on that particular machine. A disc drive came out for it, which was pretty revolutionary for the time, because before that we used cassette tape (that's how we stored our programs, so you can imagine how slow that was: a cassette is a sequential device, really slow for saving your programs and then loading them again). So they brought out a disc drive; I think it held about 180K per side, still pretty small, but compared to cassette tape it was fantastic. The catch was that it offered sequential access only, so it was sort of a glorified cassette tape: if you wanted data near the end of the disc, you basically had to read through everything before it. It was nuts; I don't know why they did that.


Anyway, I was a brash young kid of about 18 and I thought, "I can fix this". So I wrote some code in Z80 assembly language that enabled BASIC programmers (the machine came with a BASIC interpreter) to access the disc randomly. That meant they could write accounting programs and the like that needed to reach arbitrary parts of the disc directly, instead of reading through the data sequentially each time. And I had 2K to write it in: 2K of RAM to get the whole thing up and going.


So in that case, I did need to understand the Z80 CPU at a deep level. I needed to understand the registers and how it all worked, and it was great information to learn, though quite hard, because I jumped from BASIC directly to assembly language. The point of all this is that back then, it was really important to understand computer hardware; to write a program like that, I felt it was mandatory. There wasn't really any way around it.


Getting back to today's languages, with Python, C#, Java and so on you really don't need to work at that level, and it's easy to be a bit lazy and think, "Oh, I'll just create some more variables here, and I'll just create an array here, and I'm not worrying about memory". That can become a mindset, which is why I'm suggesting it's still worth learning a bit more about the hardware: it gets you thinking about being a bit more efficient in your programming, and that's always a good skill to have.
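One small Python example of that mindset shift: if you only need to walk over a sequence of values once, a generator expression processes them one at a time instead of materialising the whole array in memory first.

    # Builds the entire ten-million-element list in memory, then sums it.
    total = sum([x * x for x in range(10_000_000)])

    # Produces one value at a time; memory use stays flat.
    total = sum(x * x for x in range(10_000_000))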


So, to summarise: do programmers these days need hardware knowledge? For most jobs, as I've outlined, perhaps not. Some jobs do make it essential to have computer hardware experience and a genuine understanding of it, but in general, no, you don't need it. I do think, though, that basic computer hardware knowledge is beneficial and a useful skill for all programmers to have.


All right, so I hope that helped. If you've got any questions, feel free to leave a comment and I'll get back to you.
