ALU implementation: Wrong input signal for control bits?

ALU implementation: Wrong input signal for control bits?

maverickdove
Hello,

I seem to have a strange error when implementing the ALU component. While the hardware simulator and the output file indicate that zx, nx, etc. are set to 1, I don't seem to receive those values inside my chip. I tried multiple approaches to figure out what I was doing wrong: I restarted the simulator and changed my code, without success. The code snippet below can be used to reproduce the problem.

Assume the first test input:

|        x         |        y         |zx |nx |zy |ny | f |no |       out        |
| 0000000000000000 | 1111111111111111 | 1 | 0 | 1 | 0 | 1 | 0 | 0000000000000000 |

I expect the hardware simulator to display debugzx as 0.
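
For context, that table row corresponds to test-script commands roughly like the ones below. This is only a sketch of what the supplied ALU.tst does for its first line; the exact formatting and output-list may differ.

set x %B0000000000000000,
set y %B1111111111111111,
set zx 1, set nx 0,
set zy 1, set ny 0,
set f 1,  set no 0,
eval, output;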


// This file is part of www.nand2tetris.org
// and the book "The Elements of Computing Systems"
// by Nisan and Schocken, MIT Press.
// File name: projects/02/ALU.hdl

/**
 * The ALU (Arithmetic Logic Unit).
 * <abbreviated for clarity ... >
 */
CHIP ALU {
    IN  
        x[16], y[16],  // 16-bit inputs        
        zx, // zero the x input?
        nx, // negate the x input?
        zy, // zero the y input?
        ny, // negate the y input?
        f,  // compute out = x + y (if 1) or x & y (if 0)
        no; // negate the out output?

    OUT
        out[16], // 16-bit output
        zr, // 1 if (out == 0), 0 otherwise
        ng; // 1 if (out < 0),  0 otherwise

    PARTS:
    Not(in=zx, out=debugzx);
}

Instead I see debugzx = 1.

Can anyone point me in the right direction?

Re: ALU implementation: Wrong input signal for control bits?

WBahn
Administrator
I notice that your debugzx signal doesn't go anywhere. Some simulators would trim this logic (though I wouldn't expect this one to do that -- it's too brain-dead). But to eliminate that as a possibility, add it to the output variable list (or just use one of the existing outputs temporarily).

See what that does.
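
For example, something like this (a sketch only; connecting to ng is a temporary trick to force the simulator to evaluate the pin, and should be removed afterwards):

// Drive an existing chip output as well as the internal pin, so the
// simulator cannot skip evaluating this Not gate.
Not(in=zx, out=debugzx, out=ng);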

Re: ALU implementation: Wrong input signal for control bits?

maverickdove
Hey WBahn,
I'm sorry for the late response. I could reproduce the problem as follows using ALU.hdl:

--

zx=1
Not(in=zx, out=debugzx); -> debugzx = 1

--

zx=1
Not(in=zx, out=debugzx, out=ng); -> debugzx = 0

--

It seems your suspicion about the optimization was correct. This behaviour is especially tricky when the "trimmed" chip is used further down in the logic, as it is in my ALU implementation:

--

zx=1
Not(in=zx, out=debugzx); -> debugzx should be 0
And(a=debugzx, b=true, out=debugzy); -> debugzy should be 0 (0 AND 1 = 0) but is 1.

--

Though it all changes when I connect, say, the ng output to the final And gate.
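
Concretely (a sketch), wiring the end of the chain to a declared chip output makes the whole chain evaluate as expected, presumably because the simulator now has to compute everything that feeds ng:

zx=1
Not(in=zx, out=debugzx); -> debugzx = 0
And(a=debugzx, b=true, out=debugzy, out=ng); -> debugzy = 0 and ng = 0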

In the end, this was human error. After carefully checking my design, it was obvious that there is no output pin "result", yet that is the pin I was driving in my final stage. After I changed it to out, everything worked as expected. Hurray!
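
In other words, the bug looked roughly like this (an illustrative sketch; fOut and notFOut are placeholder names for the internal pins of my actual final stage):

// Broken: "result" is not a declared output pin of the ALU, so the HDL
// quietly treats it as a dangling internal pin and the real out pin is
// never driven.
Mux16(a=fOut, b=notFOut, sel=no, out=result);

// Fixed: drive the declared output pin out instead.
Mux16(a=fOut, b=notFOut, sel=no, out=out);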

Re: ALU implementation: Wrong input signal for control bits?

WBahn
Administrator
Glad you got it figured out -- and learned something about the simulator in the process.