|
|
I finished building the 15 basic chips. I'm curious, though: the Coursera course says it should take about 5 hours for this assignment. Was this anyone else's experience?
It took me about 16 hours in total. A handful of the more complex chips just did not come to me intuitively, although I did force myself to find the elegant/optimal solution rather than throwing a bunch of gates at the problems.
After contriving a few solutions, I looked them up online. In retrospect, the optimal solutions are so clear and simple, but I'm frustrated that I did not see them at the start.
|
Administrator
|
The experiences of different people vary widely. From what I've gathered from people on this forum and from the experiences of some of my students when I taught the course (which was also the first time I had ever heard of it), I don't think you are anywhere close to being the longest.
It's a very good thing that you slogged through it and came up with your own solutions. That puts you in a much better position to really appreciate the elegance of the more optimal solutions and to understand why they are elegant. That growth in your knowledge is well worth the sixteen hours, and it will position you to see better and simpler approaches to problems in the future -- and not just digital design problems, but problems involving logic in general.
|
|
Thanks for the encouragement! Man, I thought the ALU would be extremely complicated given its functionality, but it has been the easiest piece to build so far because all of the underlying components are already built.
Chapter 2 Complete!
You know, I graduated with my CS degree a little over a decade ago, but going back to the fundamentals is so worthwhile. It kinda broke my brain that someone on Reddit asked why, if I had been in cyber security for 20 years, I would need to bother with completing this book. I'm kinda blown away by their lack of understanding of how broad the career field is and how far a given specialty can take you away from the basics.
|
Administrator
|
You've hit on one of the key points of the course -- abstraction.
Imagine having to build the ALU (or, later, the entire CPU) from nothing but a bunch of Nand gates. The human mind has a very hard time working at both the high-level big-picture perspective and the low-level implementation perspective at the same time. We work best when we work at a level of perspective that has just enough detail to get the job done while being able to think in terms of the task immediately at hand.
We do this all the time without even thinking about it. When you set out to do "Spring cleaning", you don't think in terms of cleaning out the top-left dresser drawer. You think in terms of cleaning the bedroom, then the kitchen, then the bathroom. But when you get to the bedroom, you think about cleaning the closet, then the dresser, then the bed, then the desk. When you get to the dresser, that's when you think in terms of the individual drawers and, depending what's in them, you might break it down even more.
The same with the ALU. It's most natural to think of it in terms of the major functional blocks, like an adder, and something to select between different signals, and something to invert all the bits in a signal, and so on. So by implementing the smaller bits of logic so that you have parts that do those things, you have a match between the level of abstraction of the immediate problem and the level of the tools available to solve it.
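That composition is easy to see in miniature. Here's a rough Python sketch (not the course's HDL; 16-bit buses are modeled as masked integers) of how an ALU in the Hack style falls out of just those blocks -- an adder, a bitwise inverter, and a selector:

```python
MASK = 0xFFFF  # model 16-bit buses as integers masked to 16 bits

def not16(a):          # invert all the bits in a signal
    return ~a & MASK

def add16(a, b):       # 16-bit adder; carry out of bit 15 is discarded
    return (a + b) & MASK

def mux16(a, b, sel):  # select between two signals
    return b if sel else a

def alu(x, y, zx, nx, zy, ny, f, no):
    # zero and/or invert each input, as selected by the control bits
    x = mux16(x, 0, zx)
    x = mux16(x, not16(x), nx)
    y = mux16(y, 0, zy)
    y = mux16(y, not16(y), ny)
    out = mux16(x & y, add16(x, y), f)   # f selects And vs Add
    out = mux16(out, not16(out), no)     # optionally invert the result
    return out

# x+y is control bits 0,0,0,0,1,0 in the Hack scheme
print(alu(2, 3, 0, 0, 0, 0, 1, 0))  # 5
```

Every control bit just steers a mux between two already-built parts; no block needs to know anything about the big picture.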
This becomes all the more important as you walk up the software chain from the assembler to the virtual machine, to the compiler, to the operating system.
|
Administrator
|
a1ph4byte wrote
You know, I graduated with my CS degree a little over a decade ago, but going back to the fundamentals is so worthwhile. It kinda broke my brain that someone on Reddit asked why, if I had been in cyber security for 20 years, I would need to bother with completing this book. I'm kinda blown away by their lack of understanding of how broad the career field is and how far a given specialty can take you away from the basics.
Very true.
When I was first introduced to this course (i.e., when I learned that I would be teaching it a couple of weeks before the beginning of the semester), I made it through the first five chapters in an hour and a half and had the sixth done an hour after that. But my background lent itself to the hardware aspects. Even so, when I did the adder I thought I saw a fatal flaw: it had no carry-in bit, and how could you possibly do two's complement arithmetic without that? The solution was SO clever and, in figuring out exactly why it works, I learned a way to relate the arithmetic to the logic that had never occurred to me before and that I have used occasionally since.
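For anyone curious, the identity that (presumably) resolves the missing-carry-in puzzle is that in two's complement ~a = -a - 1, so x - y can be computed as ~(~x + y) using nothing but an adder and inverters. A quick Python sketch, again modeling 16-bit buses as masked integers:

```python
MASK = 0xFFFF  # 16-bit bus

def bnot(a):
    # bitwise NOT on a 16-bit bus; in two's complement, ~a == -a - 1
    return ~a & MASK

def sub(x, y):
    # x - y with no carry-in:
    # ~(~x + y) = -((-x - 1) + y) - 1 = x - y
    return bnot((bnot(x) + y) & MASK)

print(sub(7, 5))   # 2
print(sub(5, 7))   # 0xFFFE, i.e. -2 in 16-bit two's complement
```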
The assembler was nothing, but the VM took quite a bit longer even though I was familiar with the notion of stack frames and how arguments and local variables are handled by functions. Actually implementing the full details really cemented it in place -- and so did seeing how it could be implemented on a processor that has absolutely zero native support for it.

Then the compiler again introduced the details that made stuff I already "knew" much more solid -- plus seeing how it leveraged the VM implementation instead of generating native processor code directly really drove home the value of well-layered abstractions (even though that had been a pretty strong bedrock of my problem-solving skills already).

Implementing the OS was a LOT of fun. I especially liked implementing the dynamic memory allocation -- that took all the mystery out of it (at least in its simplest version). Most people get a lot out of implementing the character map, but that was something I had already dealt with on a smaller scale when implementing code to drive a custom LCD module from a brain-dead PIC microcontroller years ago. The graphics and math functions also turn on a lot of people, but I already had the relevant experience to do them handily, though I still learned a few clever tricks.
While I can't say enough good things about this course, I really want to see a Part II follow-on that focuses on the hardware we don't see here, namely the keyboard driver, the display driver, some kind of external storage, and a basic terminal program akin to the old DOS/BIOS days. I know some people have done those things, and they are best done with actual hardware, which makes it a lot harder to design a course that would have wide acceptance.
I'd also like to see a SuperHack processor that has some additional bells and whistles that would make it suitable for a compiler course that wants to get into things like register assignment and various optimization strategies. I don't think it would take much.
|
|
I think those who teach get a unique perspective from completing this course. It really is well laid out. I teach cyber security principles part-time at a university and have done quite a bit of course development. It would be interesting to see if a few folks would want to develop an unofficial follow-on to this course for the subjects you describe. In particular, I too think that the device driver implementation would be very rewarding to tackle.
|
|
I'm very cognizant of the concept of abstraction; without it, it would be impossible to teach IP network communications. But man, this project brings it to life. Just finished Chapter 3. The simplicity of scaling up the RAM chips is truly a beautiful thing.
|
Administrator
|
a1ph4byte wrote
I'm very cognizant of the concept of abstraction; without it, it would be impossible to teach IP network communications. But man, this project brings it to life. Just finished Chapter 3. The simplicity of scaling up the RAM chips is truly a beautiful thing.
Doesn't it, though?
Depending on how you implement it, a DFF has about 20 transistors in it. 32K words of RAM has over half a million DFFs in it, so that's over ten million transistors!
And someone physically making a RAM chip builds it up in just this way, in steps where each step increases the size of the building block by just a small factor. The beauty of exponential growth at work.
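Spelled out as a back-of-the-envelope Python sketch (the ~20 transistors per DFF is the rough figure from above, and the chip names follow the book's RAM8/RAM64/... progression):

```python
# One DFF per stored bit: 32K words of 16 bits each.
WORD_BITS = 16
TRANSISTORS_PER_DFF = 20  # rough figure; depends on the implementation

dffs = 32 * 1024 * WORD_BITS
print(dffs)                        # 524288 -- over half a million DFFs
print(dffs * TRANSISTORS_PER_DFF)  # 10485760 -- over ten million transistors

# The build-up by small factors: each chip is 8 copies of the previous one.
size = 1
for chip in ["Register", "RAM8", "RAM64", "RAM512", "RAM4K", "RAM32K"]:
    print(chip, "holds", size, "word(s)")
    size *= 8
```

Six factor-of-8 steps take you from a single register to the full 32K address space, which is the exponential growth at work.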
|
|