Hi,
We know that computers (more specifically, processors) represent electrical signals as a 1 or a 0. However, something has to convert these zeros and ones into bits and bytes, i.e. useful information. This is done by programs written in a particular programming language. Now I wonder: shouldn’t the very first programming language have consisted of just these zeros and ones? And how was it written?
Best regards,
SW
Answer
In the beginning, people did indeed program with zeros and ones! The very first computers used so-called “punch cards”: pieces of sturdy cardboard with holes in them, where a hole represented a 1 and the absence of a hole a 0. All data and programs were written onto such punch cards and read in by the computer.
(ref: http://en.wikipedia.org/wiki/Punch_card and http://en.wikipedia.org/wiki/Computer_programming_in_the_punch_card_era)
Now, of course, those zeros and ones mean something. For example, you probably know that bits are grouped into bytes (8 bits each). Every processor (the part of your computer that executes instructions) supports a set of simple operations: fetch numbers from memory or write them back (also in bytes, of course), move numbers, compare or add numbers, jump over instructions, and so on. Each instruction is represented by one or more bytes; this numeric form is called machine code, and the first real programs were written directly in it. To make this easier, each instruction was soon given a short human-readable name (a mnemonic); that notation is called assembly language, and the program that translates it back into bytes is called an assembler. In this language (which therefore maps directly onto zeros and ones) you say literally what the processor has to do.
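To make this concrete, here is a small sketch of the idea in Python. The instruction set below is invented purely for illustration (real processors have far larger and more intricate ones): each instruction is one opcode byte, sometimes followed by an operand byte, and the "processor" simply reads the bytes one after another and acts on them.

```python
# Hypothetical opcodes for a toy one-register machine (not a real processor).
LOAD, ADD, HALT = 0x01, 0x02, 0xFF

def run(program):
    """Interpret a list of bytes as instructions, like a tiny processor."""
    acc = 0  # the machine's single register (the "accumulator")
    pc = 0   # program counter: index of the next instruction byte
    while True:
        op = program[pc]
        if op == LOAD:            # load the next byte into the accumulator
            acc = program[pc + 1]
            pc += 2
        elif op == ADD:           # add the next byte to the accumulator
            acc += program[pc + 1]
            pc += 2
        elif op == HALT:          # stop and return the result
            return acc

# A "program" written directly as bytes: load 2, add 3, halt.
print(run([0x01, 0x02, 0x02, 0x03, 0xFF]))  # prints 5
```

The list of bytes is the machine code; writing programs this way, byte by byte, is essentially what the earliest programmers did.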
Many microcontrollers are still programmed in assembly language because it is very powerful: you have complete control over exactly what happens. However, not all processors have the same instruction set (so an assembly program for one processor will not run on another), and it is also very laborious to develop large programs in assembly. That is why programming languages were (and still are) developed that let you tell your computer what to do at a higher level of abstraction. A compiler then translates what you write into assembly and finally into zeros and ones.
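The translation step described above can also be sketched in a few lines. This toy "assembler" (the mnemonics and opcode numbers are invented for illustration, not a real instruction set) turns human-readable assembly lines into the instruction bytes a processor would execute:

```python
# Invented mnemonic-to-opcode table for a toy instruction set.
OPCODES = {"LOAD": 0x01, "ADD": 0x02, "HALT": 0xFF}

def assemble(source):
    """Translate assembly lines like 'LOAD 2' into a flat list of bytes."""
    program = []
    for line in source.strip().splitlines():
        parts = line.split()
        program.append(OPCODES[parts[0]])           # mnemonic -> opcode byte
        program.extend(int(p) for p in parts[1:])   # operands -> operand bytes
    return program

print(assemble("LOAD 2\nADD 3\nHALT"))  # prints [1, 2, 2, 3, 255]
```

A real compiler does the same job one level higher: it translates statements like `a = 2 + 3` into assembly, after which an assembler produces the final zeros and ones.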
Answered by
ir. Joris Borms
Pleinlaan 2, 1050 Ixelles
http://www.vub.ac.be/