blackfishgm wrote
It's interesting that you can connect something to nothing [0 Volts] and the result be meaningful like that.
First, some basics:
Voltage, sometimes called Electromotive Force (EMF), is a force that causes electrons to move (or at least want to, if they can). It is normally measured in Volts.
Current is moving electrons, hopefully in a wire; it's rather spectacular when they escape, as in lightning. Current is normally measured in Amperes (Amps).
The amount of current that flows for a given voltage depends on how good a conductor the wire carrying the current is. How hard it is to drive electrons through a conductor is called its resistance and is measured in ohms.
The relationship between Voltage (E), Current (I), and Resistance (R) in a simple circuit is called Ohm's Law:
E = I R     I = E / R     R = E / I
(Note that physicists stole V from us EEs a long time ago and stuck us with E for voltage. I don't know why current is I.)
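A quick numeric sketch of those three forms, using made-up values (a 5 V supply driving a 1 kilohm resistor; any Python prompt will do):

# Ohm's Law with assumed example values: a 5 V supply across a 1 kilohm resistor
E = 5.0        # volts
R = 1000.0     # ohms
I = E / R      # amps
print(I)       # 0.005 A, i.e. 5 mA
print(I * R)   # 5.0, back to the voltage
print(E / I)   # 1000.0, back to the resistance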
Power is how much work it takes to move the electrons, normally measured in Watts when talking about electricity. Power is the product of voltage and current. Using Ohm's Law gives us these relationships:
P = I E = I² R = E² / R
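Continuing the made-up example above (5 V across 1 kilohm), all three forms give the same answer:

# Same assumed values as above: 5 V across 1 kilohm, so 5 mA flows
E = 5.0             # volts
R = 1000.0          # ohms
I = E / R           # amps, from Ohm's Law
print(I * E)        # 0.025 W
print(I**2 * R)     # 0.025 W
print(E**2 / R)     # 0.025 W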
Now, on to the answers.
Connecting an input pin to 0V causes current to flow out of the input (that current came into the IC through its V+ power pin), and this outbound current is what the IC actually detects as FALSE. The reverse happens when an input pin is connected to 5V: current flows into the input (and out through the IC's ground pin) and is detected as TRUE.
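To put rough numbers on that (the internal pull-up value here is purely an assumption for illustration, not taken from any particular IC's datasheet): if an input were pulled up to V+ through, say, 4 kilohms inside the chip, grounding the pin gives

# Hypothetical input pin: pulled up inside the IC to V+ through R_pullup.
# These values are assumptions for illustration, not from any real datasheet.
V_plus = 5.0       # volts on the IC's V+ power pin
R_pullup = 4000.0  # ohms, assumed internal pull-up resistance
I_out = V_plus / R_pullup   # current flowing out of the input pin when it is tied to 0V
print(I_out)       # 0.00125 A, i.e. about 1.25 mA, which the IC reads as FALSE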
Probably a bit off-topic, but why would 5V or 3.3V be chosen, specifically? Is it because of special properties that these voltages have, or just convention?
Since power is related to the square of the voltage, it is very advantageous to run at lower voltages. When ICs were being developed in the 1960s, they could be made to work reliably using 5V. Advances in IC technology now allow lower operating voltages. The reduction from 5V to 3.3V is a ratio of about 2/3, and since power scales with the square of that ratio, (2/3)² ≈ 0.44, the power usage is approximately halved.
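A one-line sketch of that scaling, assuming the same load resistance at both supply voltages:

# Power scales with the square of the supply voltage (same resistance assumed at both supplies)
ratio = (3.3 / 5.0) ** 2
print(ratio)   # about 0.44, a bit less than half the power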
--Mark