fundamentally the computer is a calculator. but to understand programming, it's good to keep in mind that everything the computer does, it does by:
- converting things to numbers
- doing math on them
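to make this concrete, here is a minimal python sketch (python is just one convenient language for the demonstration): even turning a lowercase letter into an uppercase one is "convert to a number, do math, convert back."

```python
# in ascii, "a" is 97 and "A" is 65, so subtracting 32 uppercases a letter.
letter = "a"
number = ord(letter)     # convert the letter to its number (97)
uppercase = number - 32  # do math on it (65)
print(chr(uppercase))    # convert back: prints "A"
```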
fortunately you don't have to love math to love computing, because it does the math for you. it's still good to know that everything is a number to the computer:
- the key you pressed is a number
- the movement of the mouse is a number
- the signal that goes into the microphone, all numbers
- the picture it displays on the screen: one giant number
the computer doesn't always store everything (like a picture) in the same place; it can distribute pieces of it across different areas of the filesystem, or the ram. ultimately, though, the filesystem and ram can be considered one giant number, representing the “state” of the machine at a particular moment.
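a tiny python sketch of that idea: a toy "machine state" of a few bytes, glued together and read as one giant number.

```python
# three bytes of pretend memory: the characters "H", "i", "!"
memory = bytes([72, 105, 33])
# the whole thing, read as a single number
state = int.from_bytes(memory, "big")
print(state)  # 4745505
```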
again, it isn't usually necessary to get into the details of this (if you got into it deeply enough, you could design computers.) but if you want to understand how the computer works when you give it instructions, it's doing this:
- translating the instructions you give into number patterns
- translating those patterns into simpler patterns
- ultimately translating those simpler patterns into the number patterns that the hardware can use as numeric instructions
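you can actually watch the first step of this translation in python, using its standard dis module (python translates to bytecode for its interpreter rather than directly to hardware instructions, but the idea is the same):

```python
import dis

# translate the text "2 + 3" into a code object
code = compile("2 + 3", "<example>", "eval")
# the raw numbers the interpreter will run
print(list(code.co_code))
# the same numbers, labeled so a human can read them
dis.dis(code)
```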
the numeric instructions the computer can use are called “machine language” and really do just consist of numbers. you can imagine a person hitting the various buttons on a calculator, only doing it so quickly that it's a blur.
now, what about the non-number keys on a calculator? what about + and – and = and… off? well, think of it this way… you can assign a number to each letter of the alphabet, right? (and that's what ascii coding is.)
- a = 1
- b = 2
- c = 3
- d = 4
- and so on…
(but with ascii, lowercase “a” is 97. still…)
well, imagine that each button on the calculator has its own number. so the button that has a “1” on it could be 1, and the button that says “/” could be 47. what matters isn't which number each button gets, but that every button maps to a number that already means something to the computer.
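you can check these ascii numbers yourself in python:

```python
print(ord("1"))  # 49: the character "1" as a number
print(ord("/"))  # 47: the division key's number
print(ord("a"))  # 97: lowercase "a", as promised
print(chr(47))   # and back again: prints "/"
```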
once you have this numeric language (and it's built into the chip inside your computer,) all someone has to do is write a program in this language to translate other things (like a slightly easier language) into this tedious numeric one. and that's what assembly is! it's considered a “low-level” language, because it stays close (low) to the numbers the computer understands.
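a toy sketch of what an assembler does, in python, with entirely made-up mnemonics and opcode numbers (real instruction sets differ per chip):

```python
# hypothetical numbering, just to show the translation
OPCODES = {"load": 1, "add": 2, "store": 3}

def assemble(lines):
    """translate (mnemonic, operand) pairs into plain numbers."""
    program = []
    for mnemonic, operand in lines:
        program.append(OPCODES[mnemonic])
        program.append(operand)
    return program

print(assemble([("load", 5), ("add", 7), ("store", 0)]))
# [1, 5, 2, 7, 3, 0]
```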
but assembly isn't much fun for most people to use (it is for some,) so they write yet another language on top of it for most uses. by now, abstractions (translations) are happening that differ in concept or structure from the underlying language, making it a “higher-level” language– and now it starts to look maybe a little closer to something that's easy to teach.
if you look at a higher-level language, like fortran, you probably won't mistake it for english. you will recognize the line that says “end,” but the line with “write” and parentheses in weird places is harder to follow. still, this line of basic demonstrates the idea:
10 print "hello, world!"
a person reading this for the first time may not know why the 10 is there, or that the text will more likely go to the screen than a printed page– but those are details. the line prints “hello, world!” and if you say to a person “this line will print ‘hello, world!'” then at least it might stand to reason.
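the same idea in python, for comparison: no line number needed, and "print" definitely means the screen.

```python
print("hello, world!")
```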
you could go further, and design a computer language that says something like:
put this text on the screen:
and my own language started out as one of those. the reason you don't see many languages like that is not that they're difficult to create– they're almost always difficult to learn and use! (perhaps applescript was a worthy exception.)
sure, if you only had one phrase to remember: “put this text on the screen:” you could learn that easily. but if you wanted to remember 100 phrases, you would have to type them the same way every time (so the computer can understand them,) and if every phrase was as long as “put this text on the screen:” you would need to learn 600 words to do the work of just 100!
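a wordy language like that is genuinely easy to make. here is a toy interpreter for it in python (the phrase is made up, echoing the example above):

```python
def run(line):
    """handle one line of a very wordy toy language."""
    prefix = "put this text on the screen: "
    if line.startswith(prefix):
        message = line[len(prefix):]
        print(message)
        return message
    print("i do not understand:", line)
    return None

run("put this text on the screen: hello, world!")
# prints "hello, world!"
```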
so if you design a language that way, make it terse. some are very terse:
ls # this means “list the files”
it's easier to remember 50 abbreviations than 50 important phrases, but it's easier still to remember 50 words. in any case, you can put them on a piece of paper and look them up (you can keep a shorter list of the ones you use most.)
now getting back to numbers… these words do translate to numbers again, right?
yes. but ultimately, the computer (and the various languages that run on top of its built-in language) will have a place in its memory for each of these “commands” or, as i like to call them, “functions.” so each function has a numeric address, just like your street number, apartment number, post office box number, or phone number.
and what's at that address? instructions on how to do what the function does!
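you can even peek at something like this in python: every function is an object living somewhere in memory, and id() gives you a plain number identifying it (in the standard python implementation, that number is its memory address):

```python
def greet():
    return "hello"

print(id(greet))  # a plain number: where this function object lives
print(greet())    # the instructions at that address, carried out
```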
so if you want to really understand computing, keep in mind:
- a number is a number to the computer
- a letter is a number to the computer
- a word, even a paragraph, is a number
- a color is a number
- sound is a series of numbers, which can be a very large single number
- video is a series of pictures along with sound, and all of it can be expressed as a number
- any kind of file is a number
- even computer instructions are numbers
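one last python sketch to tie the list together: any pile of bytes, whatever it "means", reads as one number.

```python
# the word "hi" as bytes: 104, 105
data = "hi".encode("ascii")
# the same two bytes as a single number: 104*256 + 105
as_number = int.from_bytes(data, "big")
print(as_number)  # 26729
```

a whole file works the same way; it's just a much, much longer pile of bytes, and therefore a much bigger number.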
so when they say the computer is just a fancy calculator, they really mean it. when the computer puts words and pictures on your screen, it's doing everything with numbers.
even better: because of all the work people have done to create language translators, you can write instructions based on simple rules which bear enough similarity to english to make coding a reasonably easy task.
- license: creative commons cc0 1.0 (public domain)