Interesting Question: What compiled the first compiler?

I was getting ready this morning and thought of this: how was the first compiler ever made, when there was nothing in existence to compile the compiler? Anyone want to get up on a soapbox and explain it? I'd love to hear it. From my quick research it had something to do with bootstrapping, or BCPL, or something. I'm not quite sure; it wasn't all that clear.
 
Could God write a program so long that not even he himself would have the memory to compile it in a single pass?
 
Well, what is the result of compiling and link-editing a program? Machine code. Whatever that result is will be what the first compiler was written in.

Think of a gigantic machine which is used to build bolts, where the bolts are used to build more of that same machine. How did the first bolts get made, before the first machine existed to build them? At its most basic level, someone had to do it by hand until the machine was developed to produce them mechanically.

Software development is the same way: until they developed something sufficiently complex to interpret something human-readable and change it into something usable by the target machine, people did it "by hand."
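
To make that concrete: a first-generation assembler is barely more than a lookup table from mnemonics to numbers, which is exactly the table a person used to apply with pencil and paper before the tool existed. Here's a rough sketch in C (the instruction set is invented for illustration, not any real machine):

[code]
#include <stdio.h>
#include <string.h>

/* Invented 4-bit instruction set, purely for illustration.
 * A "hand assembler" was a person applying this table on paper. */
static const struct { const char *mnemonic; unsigned char opcode; } table[] = {
    { "LOAD",  0x1 },
    { "ADD",   0x2 },
    { "STORE", 0x3 },
    { "HALT",  0xF },
};

/* Translate one mnemonic into its opcode; -1 if unknown. */
static int assemble(const char *mnemonic)
{
    for (size_t i = 0; i < sizeof table / sizeof table[0]; i++)
        if (strcmp(table[i].mnemonic, mnemonic) == 0)
            return table[i].opcode;
    return -1;
}

int main(void)
{
    printf("ADD -> 0x%X\n", assemble("ADD")); /* prints ADD -> 0x2 */
    return 0;
}
[/code]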

 
++ ^^

I think some are too spoiled by modern tools. The first OS I worked on was written in hand-coded assembly, and all the assembly language I wrote was in a very basic assembly monitor. People used to code like that.

In school I remember programming a microcontroller *on paper*: generating the opcodes, data, and branch/jump offsets by hand, then manually keying it in on a hex keypad to test and run it. :p That was old school.
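
For anyone who never had the pleasure, here's roughly what that paper exercise produced, transcribed as a C byte array (the bytes are loosely 6502-flavored, but treat them as illustrative; on paper, the hex column was all you had). The branch offset is the part you got to compute by hand:

[code]
#include <stdio.h>

/* Hand-assembled toy loop; opcodes loosely 6502-flavored,
 * for illustration only.
 * addr  bytes         mnemonic                              */
static const unsigned char program[] = {
    /* 00 */ 0xA9, 0x05,  /* LDA #5   load a counter          */
    /* 02 */ 0xE9, 0x01,  /* SBC #1   decrement (loop top)    */
    /* 04 */ 0xD0, 0xFC,  /* BNE -4   offset done by hand:
                             target 0x02 minus the address after
                             the branch (0x06) = -4 = 0xFC    */
    /* 06 */ 0x00,        /* BRK      stop                    */
};

int main(void)
{
    /* These bytes are what you keyed in on the hex keypad. */
    for (size_t i = 0; i < sizeof program; i++)
        printf("%02X ", program[i]);
    putchar('\n');
    return 0;
}
[/code]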
 
I think some are too spoiled by modern tools.
Heh... I'm sure some of the real old-timers (from back in the days when entering a machine instruction meant physical rewiring) would say the same thing about your fancy assembler :D
 
The first assembler I had was years later for the PC. My assembly monitor wasn't that fancy. :p
 
I'm no programmer, but wouldn't it just be directly written in machine code?
 
Well then answer this, how were the first machines told how to interpret machine code?

They weren't told anything. It's how they were built. It's actual physical circuitry. Describing exactly how the circuitry in a CPU works is enough to fill an entire college semester.
 
Well then answer this, how were the first machines told how to interpret machine code?

Their circuitry was designed so that 0111 on its four instruction wires meant JUMP, 0110 meant AND, 0101 meant OR, etc. (note: this is an arbitrary machine that may or may not have existed).

Did the machines actually KNOW what these instructions were? No, they just responded to the input as they were designed to and performed the only action that could result from it.
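
In software terms, that "understanding" amounts to nothing more than the sketch below (a C loop standing in for what is really combinational decode logic, using the arbitrary opcodes above):

[code]
#include <stdio.h>

/* The arbitrary 4-bit machine described above: 0101 = OR,
 * 0110 = AND, 0111 = JUMP. The machine doesn't "know" anything;
 * each bit pattern just selects the one action it was wired for. */
enum { OP_OR = 0x5, OP_AND = 0x6, OP_JUMP = 0x7 };

static void execute(unsigned opcode)
{
    switch (opcode & 0xF) {          /* only four instruction wires */
    case OP_OR:   puts("OR the operands");           break;
    case OP_AND:  puts("AND the operands");          break;
    case OP_JUMP: puts("load the program counter");  break;
    default:      puts("whatever the wiring does");  break;
    }
}

int main(void)
{
    execute(0x7);   /* 0111 arrives on the wires: JUMP */
    return 0;
}
[/code]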

So the first compiler wasn't machine-compiled. It was written by hand, using the basic bit manipulations required to make the processor perform the functions the programmer wanted.
 
The first assembler I had was years later for the PC. My assembly monitor wasn't that fancy. :p

I think that you meant you used machine code, then. I did the same thing, and agree that students should learn low-level architecture first. Floating around sixty layers up in their Java virtual machine, they've lost any concept of what a computer really does.

Well then answer this, how were the first machines told how to interpret machine code?
By their hardware design.

Y'all might be interested in reading Code by Charles Petzold, or Write Great Code: Understanding the Machine. These books explain some of these fundamentals in a very approachable way.
 
I think that you meant you used machine code, then.
Ah, I should have figured someone else would understand what a machine code monitor was (and the primitiveness of programming in one). :p But "assembly monitor" is more understandable to the masses who get the gist of it.
 
Hey mike... or anyone else, how much detail do those books go into on hardware? I've been having trouble keeping the hardware level as a behind-the-scenes, "it just works" kind of abstraction while learning the higher-level concepts. I did get a lot out of my digital design class last semester, and I've got assembly coming up in the fall, but there are gonna be some major gaps. Would that be what those books were written for?

I'm always trying to drag you off topic but I think this applies fairly well. :p
 
I think that you meant you used machine code, then. I did the same thing, and agree that students should learn low-level architecture first. Floating around sixty layers up in their Java virtual machine, they've lost any concept of what a computer really does.

I'm not arguing against that or anything, but I'm going to point out that it's difficult to do unless you're learning on your own time, and not everybody can learn successfully on their own. If you're a student, you can't just take an assembly programming class; you usually have to complete "Programming I" as a prerequisite, which typically means a high-level language like Java, Visual Basic, or Python.

Most high schools don't even offer low-level language classes, so high schoolers interested in taking a programming class are stuck learning a high-level language.
 
I don't mind -- it's the moderators you've got to worry about.

One of my friends runs a site called fpgcpu, where he discusses designing CPUs from scratch using FPGA parts. The site has become kind of stagnant, but has lots of great references and interesting links. One of the best is the Magic-1 CPU, which is a CPU and small computer that a guy built from TTL SSI parts. You can TELNET into it; seriously!

For textbooks, Modern Processor Design is pretty good, but the parts about fabrication and production are kind of crappy. I have another book but everything is in storage because of the remodel. I can't find it at Amazon, though; I think it is out of print. There's also Computer Organization and Design, but it might be one level higher than what you want.

Another thing you can do is look at data sheets for older processors. The IP isn't that precious anymore, so there's lots of documentation about how the processors really work internally. Companies that make replacement and compatible parts are publishing more about how their chips really work instead of hiding the details.
 
I'm not arguing against that or anything, but I'm going to point out that it's difficult to do unless you're learning on your own time,
I don't disagree with your point. Learning low-level architecture first is fine for a university student or another very motivated learner, but it's not realistic for many youngsters who are introduced to programming in various forms. Middle schools and high schools are lucky if they have a decent computer teacher at all, excluding some teach-to-the-test AP comp-sci teachers (I was lucky... I had an Atari computer nerd/hacker teacher in middle school one year and a very good advanced/AP computer programming teacher for 3 years in HS). The HS students who might benefit from it (the AP track) would reach it a year or two too late for it to come first, but even at that stage I don't think it's too late.

Further, for many languages it matters very little whether application developers know assembly or low-level hardware details, because those languages provide their own abstractions which by design don't necessarily map to the underlying hardware. Or the programming task allows little control over how low-level hardware resources are used once sufficient (high-level) good practices are applied.

But someone serious about hard-core development in a language that can take advantage of the knowledge would be well served to pick it up, even later. In a way, that's true of anyone who takes development seriously: Java, .NET, and scripting programmers should learn the details and pitfalls/cliffs of their chosen languages to improve their skills. For some developers, the required level just goes down even lower.

Motivated people who grew up programming in the early '80s and before did have that opportunity, though not just as a nicety. Poring over ROM function entry points and I/O registers and speed-optimizing assembly language to make effective use of the hardware was almost a necessity to do anything cool with the hardware. Who would have known that 26+ years later it would be considered a good way to start? :p
 
I'm not arguing against that or anything, but I'm going to point out that it's difficult to do unless you're learning on your own time, and not everybody can learn successfully on their own.

To go against the grain (no offense to you older posters), I don't think there's anything wrong with learning a higher-level language first; they are comparatively *very* easy to pick up, and they make the barrier to entry for programming much lower than it once was.

And in support of the grain: nowadays I don't really do any large-scale programming, but more than anything else I really appreciate the lower-level stuff I picked up, since it applies much more directly to what I do.
 
It's certainly true that individuals learn in different ways, though I'd wonder if someone who isn't able to learn through self-study should be involved in computer science.
 