
In computing, endianness refers to the order in which the bytes of a word are stored in computer memory. In simple terms, a "word" is one unit of data (for example, a number), typically 4 bytes long. If memory is viewed as an array of bytes, and a word to be stored consists of multiple bytes, an order is needed for placing these individual bytes in memory. Big-endian systems are systems in which the most significant byte of a word is stored at the lowest-numbered memory location. In contrast, little-endian systems are those in which the least significant byte of a word is stored at the lowest-numbered memory location.
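The two orderings can be illustrated with a short sketch in Python, whose built-in `int.to_bytes` method accepts an explicit byte order (the example value `0x0A0B0C0D` is chosen here purely for illustration):

```python
# The same 32-bit value laid out in both byte orders.
value = 0x0A0B0C0D  # most significant byte is 0x0A, least significant is 0x0D

big = value.to_bytes(4, byteorder="big")
little = value.to_bytes(4, byteorder="little")

print(big.hex())     # 0a0b0c0d  (MSB 0x0A at the lowest address)
print(little.hex())  # 0d0c0b0a  (LSB 0x0D at the lowest address)
```

The bytes of `big` read in the same order as the written number, while `little` reverses them, which is exactly the difference between the two conventions.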

Say a word is "ABCD", with A, B, C, D being its 4 bytes, A the most significant byte and D the least significant, and memory locations 01, 02, 03, 04 are used to store it, 01 being the lowest-numbered location. In a big-endian system, A would be placed at 01, B at 02, C at 03 and D at 04. In a little-endian system, D would be placed at 01, C at 02, B at 03 and A at 04. There are many examples of both types of systems, with the principal reason for choosing either format being the underlying design of the given system. Intel x86 processors are a common example of little-endian systems, while Motorola 68000 processors are big-endian. Mixed forms are also possible: for instance, the ordering of bytes within a 16-bit word may differ from the ordering of 16-bit words within a 32-bit word. Such cases are rare and are sometimes referred to as mixed-endian or middle-endian.
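The "ABCD" example above can be reproduced with Python's `struct` module, where the format characters `>` and `<` force big-endian and little-endian packing of an unsigned 32-bit integer (`I`):

```python
import struct

# Build the "ABCD" word: A is the most significant byte, D the least.
word = (ord('A') << 24) | (ord('B') << 16) | (ord('C') << 8) | ord('D')

# '>' packs big-endian, '<' packs little-endian.
print(struct.pack('>I', word))  # b'ABCD' -> A stored first (lowest address)
print(struct.pack('<I', word))  # b'DCBA' -> D stored first (lowest address)
```

The packed byte strings correspond directly to the memory layouts described above: locations 01 through 04 receive A, B, C, D on a big-endian system and D, C, B, A on a little-endian one.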

Endianness is important as a low-level attribute of a particular data format. Data written by a big-endian system may be read byte-reversed within each word by a little-endian system, and vice versa. This is especially problematic when communicating across the Internet between systems that do not know each other's byte order. The term big-endian originally comes from Jonathan Swift's satirical novel Gulliver's Travels, by way of Danny Cohen in 1980.[1]
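Internet protocols sidestep this problem by standardising on big-endian ("network byte order"). A minimal sketch, again using Python's `struct` module, where `!` means network byte order (the value `0x01020304` is an arbitrary example):

```python
import struct

# Convert a value to network byte order before transmission; the sender's
# and receiver's native byte orders no longer matter.
host_value = 0x01020304
net_bytes = struct.pack('!I', host_value)    # '!' = network (big-endian)
print(net_bytes.hex())                       # 01020304 on every platform

# The receiver unpacks with the same format and recovers the value.
received = struct.unpack('!I', net_bytes)[0]
print(received == host_value)                # True
```

Because both sides agree on the wire format, the round trip yields the original value regardless of whether either host is big- or little-endian.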