Why is pixel 0,0 the LEAST significant bit of 16384?

JavaJack
Why is pixel 0,0 the LEAST significant bit of the word at address 16384? In other words, what was the authors' rationale for this approach, as opposed to pixel 0,0 being the MOST significant bit of that word?

I'm just curious if there's a scientific basis for this, or if both approaches are equally valid and one could more or less flip a coin to decide the design.

Re: Why is pixel 0,0 the LEAST significant bit of 16384?

cadet1620
Administrator
JavaJack wrote:
Why is pixel 0,0 the LEAST significant bit of the word at address 16384? In other words, what was the authors' rationale for this approach, as opposed to pixel 0,0 being the MOST significant bit of that word?

I'm just curious if there's a scientific basis for this, or if both approaches are equally valid and one could more or less flip a coin to decide the design.
It's an ages-old hardware tradition, going back to mechanical teletypes, to serialize data starting with the least significant bit. TV CRT screens scan in rows, left to right, top to bottom. Thus the first pixel displayed, in the upper-left corner of the raster, is the least significant bit.
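
A rough Python sketch of the resulting mapping, assuming the standard Hack screen map (512x256 pixels, 32 sixteen-bit words per row, starting at address 16384), showing how a pixel's row and column map to a word address and a bit position:

SCREEN_BASE = 16384    # base address of the screen memory map
WORDS_PER_ROW = 32     # 512 pixels per row / 16 pixels per word

def pixel_location(row, col):
    """Return (word_address, bit_index) for pixel (col, row).

    Bit 0, the least significant bit of each word, holds the leftmost
    of that word's 16 pixels, matching the LSB-first raster order."""
    return SCREEN_BASE + row * WORDS_PER_ROW + col // 16, col % 16

def set_pixel(ram, row, col, on=True):
    """Set or clear one pixel in a simulated RAM (a list of 16-bit words)."""
    address, bit = pixel_location(row, col)
    if on:
        ram[address] |= 1 << bit
    else:
        ram[address] &= ~(1 << bit) & 0xFFFF

print(pixel_location(0, 0))    # (16384, 0)  -- pixel 0,0 is the LSB of the word at 16384
print(pixel_location(0, 15))   # (16384, 15) -- the 16th pixel of row 0 is that word's MSB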

--Mark

Re: Why is pixel 0,0 the LEAST significant bit of 16384?

JavaJack
Maybe I'm misremembering, but it seemed reversed from the bitmaps you'd find in the Windows DIB format, TIFF files, and (going back a ways) the old redefinable character sets on Commodore 8-bit computers, where the leftmost pixel sits in the most significant bit. Most explanatory texts display the MSB leftmost, too.
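
A quick sketch of the two packing conventions being contrasted here, assuming 1-bit-per-pixel data (monochrome DIB, TIFF with the default fill order, and C64 character definitions put the leftmost pixel in the most significant bit of each byte, while the Hack screen puts it in the least significant bit of each 16-bit word):

def msb_first_bit(col, unit_bits=8):
    # Leftmost pixel -> highest bit (monochrome DIB / default TIFF / C64 chargen)
    return (unit_bits - 1) - (col % unit_bits)

def lsb_first_bit(col, unit_bits=16):
    # Leftmost pixel -> bit 0 (Hack screen words)
    return col % unit_bits

print(msb_first_bit(0))   # 7 -- column 0 is the MSB of its byte
print(lsb_first_bit(0))   # 0 -- column 0 is the LSB of its word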