
[Solved]: Is the order of bits in a byte really not a concern?

Problem Detail: 

What I can't wrap my head around is the statement, repeated everywhere I look, that the order of bits within a byte is not important (not my concern as a programmer). My question is: is there any possibility that it does make a difference?

For example, suppose I create a binary file containing just the byte 0x1 (represented on my machine as 00000001). What keeps another machine from reading that same byte as 128 (10000000)? Is there a standard for MSB placement in files and memory that guarantees compatibility, or am I missing something trivial/obvious?
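To make the scenario concrete, here is a minimal sketch (my illustration, not part of the original post) that writes the byte 0x01 to a file and reads it back. The file name one.bin is hypothetical; any conforming C implementation will read the value 1 back, because the language and the I/O stack expose only byte values, never the physical ordering of bits.

    /* Write a single byte 0x01 and read it back; the value is preserved
     * regardless of how the hardware physically orders the bits. */
    #include <stdio.h>

    int main(void) {
        unsigned char out = 0x01;

        FILE *f = fopen("one.bin", "wb");   /* hypothetical file name */
        if (!f) return 1;
        fwrite(&out, 1, 1, f);
        fclose(f);

        unsigned char in = 0;
        f = fopen("one.bin", "rb");
        if (!f) return 1;
        fread(&in, 1, 1, f);
        fclose(f);

        printf("wrote 0x%02X, read back 0x%02X\n", out, in);  /* 0x01 both times */
        return 0;
    }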

EDIT: Thanks to dirk5959's answer, I found out that my machine is little-endian for bytes and also for the bits within a byte. An additional question: is that a rule, or are there architectures that behave differently?

Asked By : zubergu

Answered By : Kyle Jones

Bytes are transferred from memory to disk using an I/O protocol (e.g. SCSI) that specifies the bit order of transmission in the case of a serial protocol, or, for parallel protocols, which pin each bit of a byte is transmitted on. For bytes moved from memory over the network, the link-level protocol (e.g. Ethernet) specifies the bit order. In either case, an application programmer need not concern herself with the details; the operating system, in concert with the disk or network controllers, maintains the correct bit order end-to-end, so the correct values are transmitted/received or stored/retrieved.
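A small sketch of why this works out for the programmer (my illustration, not part of the original answer): in C, bit operations are defined on numeric values, so a test like "is the lowest bit set?" behaves identically everywhere, independent of how the hardware or an I/O protocol physically orders the bits of a byte.

    #include <stdio.h>

    int main(void) {
        unsigned char b = 0x01;

        /* Shifts and masks operate on the value 1, not on a physical bit layout. */
        printf("lowest bit:  %d\n", b & 0x01);         /* always 1 */
        printf("highest bit: %d\n", (b >> 7) & 0x01);  /* always 0 */
        return 0;
    }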

N.B.: byte order is another matter altogether.
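To illustrate that distinction (a hedged sketch, not from the original answer): the in-memory layout of a multi-byte value differs between little- and big-endian machines, so portable code serializes such values byte by byte (or uses helpers like htonl) instead of dumping raw memory.

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        uint32_t value = 0x11223344;
        unsigned char buf[4];

        /* Portable big-endian ("network order") serialization via shifts. */
        buf[0] = (value >> 24) & 0xFF;
        buf[1] = (value >> 16) & 0xFF;
        buf[2] = (value >> 8)  & 0xFF;
        buf[3] = value & 0xFF;
        printf("serialized: %02X %02X %02X %02X\n", buf[0], buf[1], buf[2], buf[3]);

        /* Inspecting the raw memory reveals this machine's own byte order. */
        const unsigned char *p = (const unsigned char *)&value;
        printf("in memory:  %02X %02X %02X %02X\n", p[0], p[1], p[2], p[3]);
        return 0;
    }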

Best Answer from Stack Exchange

Question Source : http://cs.stackexchange.com/questions/35455
