Why a computer buffer is called a buffer

Why is a chunk of working memory called a “buffer”?

The word ‘buffer’, by the way, comes from the meaning of the word as a cushion that deadens the force of a collision. In early computers, a buffer cushioned the interaction between files and the computer’s central processing unit. The drums or tapes that held a file and the central processing unit were pieces of equipment that were very different from each other, working at their own speeds, in spurts. The buffer made it possible for them to work together effectively. Eventually, the buffer grew from being an intermediary, a temporary holding place, to being the place where work is done. This transformation is rather like that of a small seaport that grew into a great city: once it was merely the place where cargo was warehoused temporarily before being loaded onto ships; then it became a business and cultural center in its own right.

From An Introduction to Programming in Emacs Lisp. The same book explains two meanings of “argument”.
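The speed-matching role described above — a temporary holding place that lets a bursty device and a steadily running processor cooperate — can be sketched in modern terms. This is my own illustration, not from the book: a bounded queue standing in for the buffer, with hypothetical producer/consumer roles playing the tape and the CPU.

```python
# A minimal sketch of the speed-matching idea: a fixed-size buffer lets a
# bursty "device" and a steady "CPU" work together without either one
# depending on the other's exact timing.
from queue import Queue
from threading import Thread

def demo_buffer(items, capacity=4):
    """Pass `items` from a producer thread to a consumer through a bounded buffer."""
    buf = Queue(maxsize=capacity)   # the buffer: a temporary holding place
    received = []

    def producer():                 # stands in for the drum or tape, writing in spurts
        for item in items:
            buf.put(item)           # blocks only when the buffer is full
        buf.put(None)               # sentinel: no more data

    def consumer():                 # stands in for the CPU, reading at its own pace
        while (item := buf.get()) is not None:
            received.append(item)

    t1, t2 = Thread(target=producer), Thread(target=consumer)
    t1.start(); t2.start()
    t1.join(); t2.join()
    return received

print(demo_buffer(list(range(10))))  # prints [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
```

Neither side here ever addresses the other directly; each only talks to the buffer, which is exactly the intermediary role the quotation describes (before Emacs promoted the buffer to the place where work is done).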

7 thoughts on “Why a computer buffer is called a buffer”

  1. I encountered the word ‘buffer’ in chemistry before I did in software. In chemistry, a buffer is a solution which maintains a certain pH value even when you add an acid or base to it (up to a certain limit, of course).

  2. I, too, thought that the use of “buffer” in computing was based on the use of “buffer” in chemistry. That is, something (solution or memory) that maintains a certain condition (pH or working status) unless overloaded.

  3. A buffer in chemistry also “protects” against shocks or changes, just like the buffers on trains. The train buffers seem to be more or less the original ones. m-w.com claims this origin: “buff, verb, to react like a soft body when struck
    First Known Use: 1835”.

  4. In some translations of computer science texts into Spanish, the word “buffer” gets translated as “tampón”, whose definition matches that of “ink pad”.

  5. I don’t know about Spanish, but in French a “tampon” is the stamp tool rather than the ink pad.
    And it is the translation for “buffer” too, i.e. something that absorbs shocks.

  6. I dug a bit into how “buffer” was used in early computer history, since I think any explanation must either fit that early use or explain how a later interpretation replaced it. The earliest use was for a ‘logical-or’ circuit, and within 10 or 15 years the word became more closely associated with I/O-mapped registers.

    In “A REPORT ON THE ENIAC — Part I, Chapter 1” from http://web.archive.org/web/20150322233942/http://ftp.arl.mil/mike/comphist/46eniac-report/chap1.html is the section “1.2.1.1. Buffers and Cathode Followers”, which says “Buffers and cathode followers are normally non-conducting tubes with a single input and a single output. … When the outputs of a number of buffers or cathode followers are connected together to a common load resistor, the resulting circuit provides for the logical “or” since when any one of the buffers or cathode followers receives a positive signal, the circuit emits a negative or positive signal respectively.”

    Archive.org has more original documents about the ENIAC. For example, page 67/1012 of the PDF at https://ia800509.us.archive.org/10/items/ReportonENIACEl00Moor/ReportonENIACEl00Moor.pdf says, “Receiver cathode follower buffer output lines. These are to be used to program the product acc. to receive the partial products.” I think this suggests that a buffer was used as a temporary storage spot, but I found the technology of the era too hard to understand to make sense of it.

    There’s more about it in the 1950 book “High-speed computing devices”, p44 in the section “Inverse Gates, or Buffer” at http://babel.hathitrust.org/cgi/pt?id=uc1.b4107180;view=1up;seq=60 which says “These *logical or* circuits are referred to as anti-Rossi circuits, isolating circuits, or simply buffers. …. Buffers are commonly used in the input circuits of an electronic register, e.g., in the decade rings of a device like the ENIAC.”

    Note also that page 207 ( http://babel.hathitrust.org/cgi/pt?id=uc1.b4107180;view=1up;seq=223 ) of the same book, concerning the SDC computer, uses “buffer” in a different context. It says there are 32 acoustic-delay-lines for main storage and “a buffer reservoir housed with each of the four external storage units.” I don’t think these two uses of ‘buffer’ are connected.

    Lastly, there’s an easy-to-read book by Prudential Insurance Company of America titled “An introduction to electronic computers” from 1954. Page 202 at http://babel.hathitrust.org/cgi/pt?id=uc1.b4232868;view=1up;seq=226 defines buffer as “An electrical circuit allowing the reception on one line of signals from two or more other lines without feeding back signals on the other input lines; thus a logical “or” circuit.”

    Interestingly, the index entry for “Buffer” in the same book says “see also “Static register””, which is on page 91 (http://babel.hathitrust.org/cgi/pt?id=uc1.b4232868;view=1up;seq=115 ). That paragraph says “It is impossible, therefore, to have a direct read-write process. There must be an intermediate step in which the time scale is changed. This usually involves loading the data from the external memory into a “static” or non-moving register at the low speed of the reading device. It is then read out of the static register at the high speed of the internal memory. In a sense, the static register is a buffer. It is a memory device which can operate at any speed.”

    This sounds similar to the description in the Emacs Lisp document, except that the buffer in 1954 seems more like port-mapped I/O where the CPU could read it directly, rather than ‘being an intermediary’ to the CPU.

    I don’t have a good conclusion. It looks like “buffer” was used in multiple different ways, all tied to memory or I/O. It could be that “buffer”, originally the term for the logical-or used in input circuits, came to mean a place for accumulating input data, then memory-mapped I/O, then intermediate storage generally. It may also be that the term in software took a different path than in hardware, so that rather than evolving from (say) the 1970s hardware use, the software term actually evolved from the 1950s use.

    Resolving that would require a lot more research. My caution is that the etymology seems trickier than the Emacs Lisp documentation suggests.
