I've kind of just skimmed over the fact that you can set individual pins to true or false so far, but now that I've actually used that functionality in the incrementer, I was just wondering how this is justified physically - like, what would the physical implementation of setting a pin to "true" or "false" be?
Setting an input to false is done by connecting it to 0 volts, normally called "ground". Setting an input to true is done by connecting the pin to the positive power supply voltage, usually 5V or 3.3V, depending on the system design.
It's interesting that you can connect something to nothing [0 volts] and have the result be meaningful like that.
First, some basics:
Voltage, sometimes called Electromotive Force (EMF), is a force that causes electrons to move (or at least want to, if they can). It is normally measured in volts.
Current is moving electrons, hopefully in a wire; rather spectacular when they escape like lightning. Current is normally measured in Amperes (Amps).
The amount of current that flows for a given voltage depends on how good a conductor the wire carrying the current is. How hard it is to drive electrons through a conductor is called its resistance and is measured in ohms.
The relationship between voltage (E), current (I), and resistance (R) in a simple circuit is called Ohm's Law:

E = I R
I = E / R
R = E / I
(Note that physicists stole V from us EEs a long time ago and stuck us with E for voltage. I don't know why current is I.)
Power is how much work it takes to move the electrons, normally measured in watts when talking about electricity. Power is the product of voltage and current. Using Ohm's Law gives us these relationships:

P = I E = I² R = E² / R
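If it helps to see the formulas with numbers plugged in, here's a quick sanity check. The 5 V supply and 1 kΩ resistor are just illustrative values I picked, not anything from the discussion above:

```python
# Ohm's Law and the power relationships, checked numerically
# with a hypothetical 5 V supply driving a 1 kOhm resistor.
E = 5.0      # voltage in volts
R = 1000.0   # resistance in ohms

I = E / R    # Ohm's Law: current in amps
P = I * E    # power in watts

# All three power formulas should agree:
assert abs(P - I**2 * R) < 1e-12
assert abs(P - E**2 / R) < 1e-12

print(f"I = {I*1000:.1f} mA, P = {P*1000:.1f} mW")  # I = 5.0 mA, P = 25.0 mW
```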
Now, on to the answers.
Connecting an input pin to 0V causes current to flow out of the input (the current came into the IC from its V+ power pin) and this outbound current is what the IC actually detects as FALSE. The reverse happens when an input pin is connected to 5V. Current flows into the input (and out the IC's ground pin) and is detected as TRUE.
Probably a bit off-topic, but why would 5V or 3.3V be chosen, specifically? Is it because of special properties that these voltages have, or just convention?
Since power is related to the voltage squared, it is very advantageous to run at lower voltages. When ICs were being developed in the 1960s, they could be made to work reliably at 5V. Advances in IC technology now allow lower operating voltages. The reduction from 5V to 3.3V is a 2/3 voltage ratio, so power usage drops to (2/3)² ≈ 44% - roughly halved.
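The "power goes with voltage squared" point is easy to verify with a one-liner (the only inputs are the two supply voltages from the paragraph above):

```python
# Power scales with the square of the voltage (P = E^2 / R),
# so dropping the supply from 5 V to 3.3 V cuts power by more
# than the 2/3 voltage ratio alone would suggest.
ratio = (3.3 / 5.0) ** 2
print(f"Power at 3.3 V is {ratio:.0%} of power at 5 V")  # about 44%
```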
Very cool! Thanks for the primer. I've never really done much with anything lower level than pointers in C, so getting down to the very nuts and bolts of everything has been quite enlightening.
If you'll continue to humor me for a bit longer: when you say that the chip "detects" the flow of current to be 0 or 1, what exactly does that mean - what is the detection mechanism? I imagine a transistor of some sort?
Hi, you wrote: "Connecting an input pin to 0V causes current to flow out of the input (the current came into the IC from its V+ power pin) and this outbound current is what the IC actually detects as FALSE. The reverse happens when an input pin is connected to 5V. Current flows into the input (and out the IC's ground pin) and is detected as TRUE. "
Can you please elaborate more on this? I am not sure I fully understand it. What do you mean by outbound current, and how does the IC detect an outbound current? I can "see" how voltage would cause electrons to circulate, but that's about as far as I see.