I am slightly confused about how the conversion happens, so I am trying to work through some instructions by hand to understand how the assembler works.
From my understanding, each line of assembly gets converted to a 16-bit binary number.
In chapter 6, all the predefined symbols are given hex values, which convert to 16-bit binary numbers. So I can put these symbols and their corresponding values in my symbol table and look them up as I read them in.
Here is where I get confused. The dest and jump fields are each given as 3-bit binary numbers, and comp is 6 bits. That leaves 3 bits at the front; if it is a C-instruction, those are just all ones.
So if I have D;JMP, that should be: prefix 111, comp 001100, dest 000, jump 111.
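To make the problem concrete, here is a small Python sketch of the concatenation as I understand it, using the field values from my example above (the field widths are my assumption and may be where I am going wrong):

```python
# My attempted encoding of D;JMP, with the field widths as I understand them.
prefix = "111"     # C-instruction marker
comp   = "001100"  # comp bits for D (6 bits in my table)
dest   = "000"     # no destination
jump   = "111"     # unconditional jump (JMP)

instruction = prefix + comp + dest + jump
print(instruction, len(instruction))  # comes out to only 15 bits
```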
That leaves me one bit short, and I can't figure out where it comes from. At first I thought it was a 1 or 0 for the semicolon, but that should be ignored. Any suggestions, or is there something I am overlooking in the book?