2021-11-27 18:50:20
Sometimes I write posts on digital archaeology, for example, about the origin of keyboard layouts, the discovery of the R-pentomino, or April Fools' RFCs. Today I will write a little about the etymology of various computer terms.
Everyone knows that the word modem comes from a combination of modulator and demodulator, the two functions used to convert digital information into a form suitable for transmission over analog networks and back. The word codec ([en]coder + decoder) and the less widely known slang terms balun (balanced + unbalanced) and serdes (serializer + deserializer) have a similar origin.
Similar to codec in spelling and sound, the name of the Kodak company, registered in 1888, is of a different nature. The company's founder, George Eastman, wanted to invent a new word: short, easily recognizable, and easy to pronounce in different languages. According to legend, he used a set of letters from the game Anagrams (a precursor of Scrabble). One of Eastman's criteria was the use of his favorite letter K, which makes up 40% of the result. The idea of coining a brand-new word was not entirely successful: in 1896, readers of the Amateur Photographer magazine argued on its pages about the word's origin. It turned out, for example, that in Hindustani (where it arrived from Persian) the word means "boy," and one reader pointed out its similarity to the Hebrew Kahdak.
The word bit, in the sense of a minimum amount of information, was first publicly used in Claude Shannon's 1948 article "A Mathematical Theory of Communication." Shannon himself attributed the word to the mathematician John Tukey, who had used bit as an abbreviation of binary [information] digit in internal Bell Labs documents. The word byte (a deliberate respelling of English bite, as in a piece) denotes the minimum amount of information processed at one time or directly addressable. Werner Buchholz first used it in 1956 in the design documentation for the IBM Stretch system. On different systems, bytes came in various sizes, for example, 4, 6, or 9 bits (the size of a byte could even be variable). To refer unambiguously to an 8-bit byte, the term octet is commonly used.
For engineering reasons, it is more efficient for computers to work with numbers that are powers of two. Engineers therefore often understand the word kilobit as 1024 bits (2^10), but in some contexts it means 1000 bits (10^3, as with other units of measure, such as meters). For example, the 1968 edition of the Encyclopedia of Library and Information Science states, on the same page, that 1 kilobit is 1000 bits and 1 kilobyte is 1024 bytes. The same story goes for the prefixes mega, giga, and so on. All this confusion lasted until the end of 1998, when the International Electrotechnical Commission finally stepped in and fixed it (it did not). Since then, according to international standards, kilobit should mean 1000 bits, and for 1024 bits the term kibibit should be used. Not everyone agrees, however: the Russian "Regulations on the units of quantities" of 2009 still fixes the term kilobyte at 1024 bytes.
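To make the difference concrete, here is a minimal Python sketch contrasting the two conventions (the function name human_size and the unit lists are my own illustration, not from any standard library):

```python
# Illustrative only: format a byte count with SI (decimal) and IEC (binary) prefixes.
SI_UNITS = ["B", "kB", "MB", "GB"]      # steps of 1000 (kilo-, mega-, ...)
IEC_UNITS = ["B", "KiB", "MiB", "GiB"]  # steps of 1024 (kibi-, mebi-, ...)

def human_size(n_bytes: int, base: int, units: list[str]) -> str:
    value = float(n_bytes)
    for unit in units[:-1]:
        if value < base:
            return f"{value:.2f} {unit}"
        value /= base
    return f"{value:.2f} {units[-1]}"

n = 1_000_000
print(human_size(n, 1000, SI_UNITS))   # 1.00 MB    (decimal megabyte)
print(human_size(n, 1024, IEC_UNITS))  # 976.56 KiB (the same bytes in binary units)
```

The same million bytes reads as 1.00 MB or 976.56 KiB depending on the convention, which is exactly the gap the kibi-/mebi- prefixes were meant to close.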
To measure data transfer rates, in addition to the familiar kilobytes and kilobits per second, engineers sometimes use terms based on the word baud (for example, kilobaud). In modern communication, baud usually means the number of signal (symbol) changes per second, so if, for example, the carrier uses two signal levels, then 1 baud equals 1 bit per second. Strictly speaking, these are gross bits, i.e., they include any overhead information, such as error-correction data. Bauds are named after Jean Maurice Émile Baudot, a French engineer who in 1870 invented the basic telegraph encoding (also known as International Telegraph Alphabet No. 1).
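The relationship generalizes to more signal levels: each symbol carries log2(levels) bits, so the gross bit rate is baud × log2(levels). A minimal sketch, with purely illustrative figures rather than any particular modem specification:

```python
# Gross bit rate from symbol rate: each symbol carries log2(levels) bits.
import math

def gross_bit_rate(baud: float, signal_levels: int) -> float:
    return baud * math.log2(signal_levels)

print(gross_bit_rate(300, 2))    # 300.0  bit/s: 2 levels  -> 1 bit per symbol
print(gross_bit_rate(2400, 16))  # 9600.0 bit/s: 16 levels -> 4 bits per symbol
```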
Also, it turns out that the word android is almost three times older than the word robot (which turned 100 years old last year).