JavaJack wrote
Why is pixel 0,0 the LEAST significant bit of the word at address 16384? In other words, what was the authors' rationale for this approach, as opposed to pixel 0,0 being the MOST significant bit of that word?
I'm just curious whether there's a scientific basis for this, or whether both approaches are equally valid and one could more or less flip a coin to decide the design.
An age-old hardware tradition, going back to mechanical teletypes, is to serialize data starting with the least significant bit. TV CRT screens scan in rows, left to right, top to bottom. So the first bit shifted out of a word, the least significant one, becomes the first pixel displayed: the upper-left corner of the raster.
--Mark
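
To make the convention concrete, here is a minimal sketch in Java of the addressing scheme the thread is discussing. The base address 16384 and the LSB-first bit order come from the posts above; the 512x256 screen geometry and 32-words-per-row figure are the standard Hack screen layout and are assumed here, and the class and method names (HackScreen, wordAddress, bitMask) are purely illustrative, not any official API.

// Sketch of the Hack screen memory map: one bit per pixel, 16 pixels per
// 16-bit word, 32 words per row of a 512x256 display, starting at 16384.
// Within each word, the leftmost of its 16 pixels is the LEAST significant bit.
public class HackScreen {
    static final int SCREEN_BASE = 16384;  // first word of the screen memory map
    static final int WORDS_PER_ROW = 32;   // 512 pixels / 16 bits per word (assumed Hack layout)

    // RAM address of the word containing pixel (row, col).
    static int wordAddress(int row, int col) {
        return SCREEN_BASE + row * WORDS_PER_ROW + col / 16;
    }

    // Bit mask for pixel (row, col) within its word: LSB-first,
    // so pixel (0,0) is bit 0 of the word at 16384.
    static int bitMask(int row, int col) {
        return 1 << (col % 16);
    }

    public static void main(String[] args) {
        // Pixel (0,0)  -> address 16384, mask 1 (the least significant bit).
        System.out.println(wordAddress(0, 0) + " " + Integer.toBinaryString(bitMask(0, 0)));
        // Pixel (0,17) -> address 16385, bit 1 of the second word in row 0.
        System.out.println(wordAddress(0, 17) + " " + Integer.toBinaryString(bitMask(0, 17)));
    }
}

Under this LSB-first convention, walking a word's bits from bit 0 upward moves left to right across the screen, matching the raster scan order described above.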