Why do computers only understand 0s and 1s?

Computers are electronic devices. They evolved from vacuum tubes to transistors to integrated circuits.


Prabanjan M

3 years ago | 1 min read


Integrated circuits

An integrated circuit or monolithic integrated circuit is a set of electronic circuits on one small flat piece of semiconductor material, usually silicon.

Credit: Wikipedia

So what 🤔?

So the only thing a computer (or, really, any electronic device) can understand is High/Low, or On/Off. Put in simple human terms, there are only two letters in the electrical universe's alphabet.

As a result, we humans have agreed to denote them as 0 and 1, where 1 denotes 'High' or 'On' and 0 denotes 'Low' or 'Off'. That left us with no choice but to create an equivalent notation for every character using just 0s and 1s.
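To see this encoding in action, here is a tiny sketch using Python's built-in `ord` and `format` functions (the 8-bit width shown is an assumption; ASCII characters fit in 8 bits, though modern Unicode characters can need more):

```python
# Every character maps to a number (its ASCII/Unicode code point),
# and that number is stored as a pattern of 0s and 1s.
for ch in "Hi":
    code = ord(ch)              # the character's numeric code point
    bits = format(code, "08b")  # the same number written as 8 binary digits
    print(ch, code, bits)
# H 72 01001000
# i 105 01101001
```

Each 1 in that pattern corresponds to an 'On' state and each 0 to an 'Off' state inside the hardware.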

Extras 🎁✨

So everything you are currently looking at on your monitor is actually residing in your system as just a series of On and Off transistors in a strict order.

Everything, from the characters you see to images, videos, software packages, and programs, resides in your computer as a series of On and Off transistors. That should explain the answers to the following questions:

  1. Why do high-quality images take up a lot of memory?
  2. Why does your system heat up when you run heavy software packages?
  3. Why does your laptop slow down when it heats up to the max?
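To make the first question concrete, here is a rough back-of-the-envelope estimate. The resolution and color depth below are illustrative assumptions (a Full HD screen with 24-bit color, no compression), not figures from the article:

```python
# Every pixel needs its own pattern of 0s and 1s, so more pixels
# and more colors mean more bits to store.
width, height = 1920, 1080   # assumed Full HD resolution
bits_per_pixel = 24          # assumed: 8 bits each for red, green, blue

total_bits = width * height * bits_per_pixel
total_bytes = total_bits // 8
megabytes = total_bytes / 1024 / 1024
print(round(megabytes, 1))   # about 5.9 MB for one uncompressed image
```

That is for a single picture; formats like JPEG exist precisely to squeeze those bits down.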

Thank you for reading this through! 🤩

Please let me know your thoughts on this article. If you have any questions similar to this one, please let me know in the comments.

Have a nice day, 👋


Created by

Prabanjan M

I am a final-year student passionate about learning. I started this blog to help other students who are struggling; I share my knowledge on topics I learned the hard way so that others can learn them the easy way.
