Writing your own CPU Emulator as an extra exercise ?


Writing your own CPU Emulator as an extra exercise ?

iSol
Hello,

I was wondering if it makes any sense to re-implement my own version of the CPU Emulator in order to better understand how the Hack machine language works. I was thinking of using C++ and SDL2.

My idea is to write a fast emulator that will let you run Hack games much faster than the current Java implementation. Obviously a fast emulator doesn't need tools to inspect the RAM or to see which instruction is currently running. For debugging purposes one could use the original Java CPU Emulator, which is well suited for this task.

As an extra improvement I could add a colour display (using the same video memory) but treating each group of 4 bits as a palette index for 16 colours. Obviously the actual display area will be reduced, or we can simply scale it up. The idea is to add small improvements while maintaining full compatibility with the current Hack computer specification.
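
To make the palette idea concrete, here is a minimal sketch of how the emulator might unpack one 16-bit word of video memory into four pixels. The palette contents and the nibble order are assumptions for illustration, not part of the Hack spec:

#include <array>
#include <cstdint>

// Hypothetical 16-entry palette (0xRRGGBB values); the actual colours are a free choice.
std::array<uint32_t, 16> palette{};

// Unpack one word of video RAM into four pixels, lowest nibble first (an assumed convention).
void unpackWord(uint16_t word, uint32_t* out) {
    for (int i = 0; i < 4; ++i) {
        uint8_t index = (word >> (4 * i)) & 0xF;   // 4-bit palette index
        out[i] = palette[index];
    }
}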


Re: Writing your own CPU Emulator as an extra exercise ?

iSol
I have a working emulator of the CPU that can run all included examples from chapters 4 and 6. The nice thing about the new implementation is that it seems a bit faster and doesn't flicker as much as the original (SDL2 uses OpenGL under the hood).

If there is enough interest in this I could make it open source. For now the emulator works exactly like the machine model described in the book.

I'm thinking of extending the machine while maintaining compatibility with the original model. The idea is to add an extra register (or use R15) as a settings register: when it is 0 the machine runs in compatibility mode, and when it is non-zero its bits select various settings.

I have a few ideas about extending the machine, with a colour screen for example. I'm thinking of using the EGA model of 320x200 pixels with 16 colours. Basically the machine could run in 2 modes:

- compatibility mode: a 512x256-pixel B/W screen, as in the book
- enhanced mode: 320x200 pixels with 16 colours (we could use an extra 16K RAM module, making the total RAM size equal to the ROM size).
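
A quick size check for the enhanced mode, assuming 4 bits per pixel packed four pixels per 16-bit word:

// 320 * 200 pixels * 4 bits/pixel = 256,000 bits = 16,000 words,
// which fits in a 16K (16,384-word) RAM module with room to spare.
constexpr int kColourScreenWords = 320 * 200 * 4 / 16;   // 16,000
static_assert(kColourScreenWords <= 16 * 1024, "fits in a 16K RAM module");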

I will leave this here to hopefully get some feedback while I work through the remaining chapters of the book.

Re: Writing your own CPU Emulator as an extra exercise ?

cadet1620
Administrator
Sounds like a great project!

If you want to give your emulator a workout, try Bichromia.hack from this post.

--Mark

Re: Writing your own CPU Emulator as an extra exercise ?

iSol
Thanks, I will try it.

Do you know if there is any flag in the machine model that prevents the screen from refreshing at fixed intervals? In my code I simply turn a register (R15) on and off and draw only when it is on. This convention seems to reduce the flicker when the screen is redrawn.
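
Roughly, my emulator's main loop does something like this (just a sketch; step(), renderScreen() and the use of R15 are my own conventions, not part of the Hack spec):

#include <cstdint>

uint16_t ram[65536];     // the emulator's data memory
bool running = true;

void step();             // execute one Hack instruction (implemented elsewhere)
void renderScreen();     // copy video RAM to the SDL2 texture (implemented elsewhere)

// Only redraw when the program signals a finished frame by writing to R15.
void run() {
    while (running) {
        step();
        if (ram[15] != 0) {      // frame-ready flag set by the running program
            renderScreen();      // draw the frame
            ram[15] = 0;         // clear the flag until the next frame
        }
    }
}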

I should probably finish the course first and improve my simulator later; otherwise I risk introducing some incompatibility. I finished the first 6 chapters and I'm working on chapter 7 now.


Re: Writing your own CPU Emulator as an extra exercise ?

cadet1620
Administrator
You are going to see in chapter 7 that all of the memory locations 0-15 have dedicated uses in the standard VM-to-ASM translation. R13-R15 are reserved for the VM translator to use as scratchpad memory when implementing VM commands.

I'd suggest implementing screen refresh control as a memory-mapped I/O port, similar to the KBD input port. This port would also be a place where you could select the standard display vs. the color display.
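
For instance (a sketch only; the port address and the bit meanings below are arbitrary choices, not anything standard), the emulator could watch one reserved RAM word as a display-control port:

#include <cstdint>

// Hypothetical display-control port; 24577 (the word after KBD) is unused
// in the standard Hack memory map.
constexpr uint16_t kDisplayCtrl = 24577;
constexpr uint16_t kRefreshBit  = 0x0001;   // program requests a screen refresh
constexpr uint16_t kColorBit    = 0x0002;   // 1 = color display, 0 = standard B/W

void pollDisplayCtrl(uint16_t* ram, bool* colorMode, bool* refresh) {
    uint16_t ctrl = ram[kDisplayCtrl];
    *colorMode = (ctrl & kColorBit) != 0;
    if (ctrl & kRefreshBit) {
        *refresh = true;
        ram[kDisplayCtrl] = ctrl & static_cast<uint16_t>(~kRefreshBit);   // acknowledge
    }
}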

Also note that the keyboard's address is 24576, immediately following the 8K screen buffer. This will make it problematic to extend the screen buffer.

One way to get a larger screen buffer might be to use addresses 32768 - 65535. This gets a bit weird in Hack since everything is signed. That screen would appear to be based at -32768.

--Mark

Re: Writing your own CPU Emulator as an extra exercise ?

iSol
cadet1620 wrote
One way to get a larger screen buffer might be to use addresses 32768 - 65535. This gets a bit weird in Hack since everything is signed. That screen would appear to be based at -32768.
Good idea. Actually the code could be pretty clean from the user's perspective, as long as the compiler magically translates numbers larger than 32767 into their negative signed 16-bit equivalents. The machine code will simply carry a negative number, and the CPU emulator could use something like:

#include <cstdint>
#include <cstdlib>

// Map a signed 16-bit Hack value to an unsigned index into the 64K address space.
uint16_t toIndex(int16_t s) {
    if (s >= 0) {
        return static_cast<uint16_t>(s);
    }
    return static_cast<uint16_t>(65536 - std::abs(s));   // e.g. -32768 maps to 32768
}

From the user's perspective you work with unsigned 16-bit ints for indexes. It is actually quite nice: as long as you have the base of an array expressed as its signed equivalent (say -25536 for unsigned address 40000), you can move forward simply by adding 1, and because int16_t arithmetic wraps around you get the negative indexes in the correct order. The burden of converting negative indexes into positive ones resides with the CPU Emulator.
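
For example, continuing the sketch above (with the array assumed to start at unsigned address 40000):

// Walking an array based at unsigned address 40000 (signed base -25536):
int16_t base = -25536;
for (int16_t i = 0; i < 3; ++i) {
    uint16_t addr = toIndex(static_cast<int16_t>(base + i));   // 40000, 40001, 40002
}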

Re: Writing your own CPU Emulator as an extra exercise ?

cadet1620
Administrator
The Jack language, which you will learn in chapter 9, write a compiler for in chapters 10 and 11, and use to write your OS/system library in chapter 12, only supports signed integers.

It is also untyped and does not detect overflow, so it is fairly easy to deal with numbers > 32K-1 by just letting them overflow. Division of numbers >= 32K is a bit tricky, though.

I'd suggest not spending too much time on this until you've completed the book. Your perspective will change. You will find that most significant games written in Jack are runnable only on the VM Emulator because they are too big for ROM after translation to ASM. Programs running on the VM Emulator also run much faster and smoother than they do when translated to ASM.

If you read through the entire post about Bichromia, you'll see what was required to get its translation to fit in ROM.

--Mark