|A 2012 parody depiction of the Shannon bandwagon from American electrochemical engineer Libb Thims' article “Thermodynamics ≠ Information Theory: Science’s Greatest Sokal Affair.” |
Bandwagoners tend to cite Claude Shannon, Norbert Wiener, Leon Brillouin, Ludwig Bertalanffy, and Warren Weaver as the classical bandwagon instrument players.
Historically, Leo Szilard (1929) and Gilbert Lewis (1930), and in some apocryphal cases Ludwig Boltzmann (1894) and William Thomson (1851), sometimes enter the mix via the citation “over-reading” method.
Semi-modern bandwagon theorists or connective writers include: James Coleman, Jay Teachman, Jeremy Campbell, Jerome Rothstein, Johan Galtung, Horton Johnson, Luciano Floridi, Olivier Beauregard, Richard Raymond, Seth Lloyd, Kenneth Bailey, Terry Bynum, Seda Bostanci, Robert Doyle, Stanley Salthe, Hubert Yockey, Stephen Coleman, Loet Leydesdorff (2001) (Ѻ), to name a few.
In 1940, Claude Shannon, at the Institute for Advanced Study, was working on a probability-based reformulation of Ralph Hartley’s 1928 “Transmission of Information” model, wherein the logarithm, in the form x = y log z, specifically of the “number of possible symbol sequences,” was the best “practical measure of information,” particularly in regard to a telegraph operator sending 1s (electrical current HIs) and 0s (electrical current LOs) in a telegraph transmission.
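The contrast between Hartley’s count of possible symbol sequences and Shannon’s later probability-based reformulation can be sketched numerically as follows (a minimal illustration; the function names are ours, and Hartley’s measure is stated here in bits, i.e. base-2 logarithms):

```python
import math

def hartley_information(n_symbols: int, alphabet_size: int) -> float:
    """Hartley's (1928) measure: n * log2(s) bits for a message of
    n_symbols, each drawn from an alphabet of alphabet_size symbols."""
    return n_symbols * math.log2(alphabet_size)

def shannon_entropy(probs) -> float:
    """Shannon's (1948) per-symbol entropy: -sum(p_i * log2(p_i))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A telegraph operator sending 8 binary symbols (1s and 0s):
print(hartley_information(8, 2))        # → 8.0 bits

# With equiprobable symbols, n times Shannon's per-symbol entropy
# recovers Hartley's count exactly:
print(8 * shannon_entropy([0.5, 0.5]))  # → 8.0 bits

# Biased symbol probabilities carry less information per symbol,
# which is the generalization Hartley's formula cannot express:
print(shannon_entropy([0.9, 0.1]))      # → ~0.469 bits
```

The point of the sketch is only that Shannon’s formula reduces to Hartley’s when all symbols are equally likely; nothing in it bears on thermodynamic entropy.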
In fall 1940 to spring 1941, Shannon consulted John Neumann (see: Neumann-Shannon anecdote) on what “name” to give to his new Hartley-stylized information formula. In response, Neumann joked that he should call his formula by the name “entropy,” per reason that Leo Szilard, in his 1929 paper “On the Decrease in Entropy in a Thermodynamic System by the Intervention of Intelligent Beings,” had done something similar with logarithms, namely he expounded on the relationship between Maxwell’s demon and the lack of information (in a piston and cylinder) about the location of atoms. Shannon took this advice to heart, and in 1948 confabulatedly called his new communications-engineering information measure by the name of the thermodynamic heat state function quantity, entropy, therein situating the implicit, albeit incorrect, view that the two quantities are one and the same.
In 1948, Shannon's new information theory was published. Shortly thereafter, because the entropy of thermodynamics is a universal function, i.e. applicable to every body in the universe, people naively began to think that Shannon's formulations were likewise universal; therein, applications began to be made in fields outside of communication by radio or wire.
At the first London symposium on information theory, held in 1950, six out of the twenty papers presented were about psychology and neurophysiology. This number increased to eight by the time of the second symposium.
In 1955, in the midst of this growing application of the new telegraph communication mathematics, L.A. De Rosa, chairman of the newly formed Professional Group on Information Theory (PGIT), published the following query memo, attempting to get a handle on which direction the PGIT was headed, in regard to research, funding, and the types of publication accepted:
This editorial prompted a number of response articles. PGIT members were divided. Some believed that if knowledge and application of information theory were not extended beyond radio and wire communications, progress in other fields could be delayed or stunted. Others, however, insisted on confining the field to developments in radio, electronics, and wire communications. The two points of view were hotly debated over the next few years.
By the third symposium, held in 1956, the scope was so wide that it included participants with backgrounds in sixteen different fields: anatomy, animal welfare, anthropology, computers, economics, electronics, linguistics, mathematics, neuropsychiatry, neurophysiology, philosophy, phonetics, physics, political theory, psychology, and statistics. The balloon depiction, shown below, gives an idea of this ‘ballooning effect,’ as Shannon would later describe it.
This growing “bandwagon” usage of Shannon's transmission-of-information theory, by wire or radio, outside of communications engineering proper prompted the infamous 1956 editorial memorandum “The Bandwagon,” wherein Shannon pleads with everyone to stop using his new so-called information entropy theory outside of communications engineering proper:
Reaction articles, in bandwagon style, followed in the aftermath of this editorial, including one from Norbert Wiener entitled “What is Information Theory?”.
Elias | Sokal affair
In 1958, American electrical engineer Peter Elias (1923-2001) published “Two Famous Papers,” a parody of Shannon's bandwagon, in what we would now, retrospectively, call a Sokal affair (Ѻ), i.e. an inside-joke discussion of a made-up theory that looks real on the surface.
See the 2001 MIT Project History article “Information Theory and the Digital Age” for an inside look at the early bandwagon years. 
|The “information theory balloon,” on how the mathematical theory of 1s and 0s has ballooned out of proportion into a disparate number of fields, so much so that the balloon is ready to pop (2012). |
The following are related quotes:
“It will be all too easy for our somewhat artificial prosperity to collapse overnight when it is realized that the use of a few exciting words like information, entropy, redundancy, do not solve all our problems.”— Claude Shannon (1956), “The Bandwagon”, Mar
1. (a) Aftab, O., Cheung, P., Kim, A., Thakkar, S., and Yeddanapudi, N. (2001). “Information Theory and the Digital Age” (§: Bandwagon, pgs. 9-11), Project History, Massachusetts Institute of Technology.
(b) Thims, Libb. (2012). “Thermodynamics ≠ Information Theory: Science’s Greatest Sokal Affair” (url), Journal of Human Thermodynamics, 8(1): 1-120, Dec 19.
2. Raymond, Richard C. (1950). “Communication, Entropy, and Life” (abs), Am Sci. 38: 273-78; In: Modern Systems Research for the Behavioral Scientist (ch. 19, pgs. 157-), Aldine Pub. Co., 1969.
3. (a) Thims, Libb. (2012). “Thermodynamics ≠ Information Theory: Science’s Greatest Sokal Affair” (url), Journal of Human Thermodynamics, 8(1): 1-120, Dec 19.
(b) Sokal affair – Wikipedia.
4. (a) Shannon, Claude. (1956). “The Bandwagon”, IRE Transactions: on Information Theory, 2(1):3, March.
(b) Mitra, Partha, and Bokil, Hemant. (2008). Observed Brain Dynamics (§1.3.1: Reversible and Irreversible Dynamics; Entropy, pgs. 9-; Appendix A: The Bandwagon by C.E. Shannon, pgs. 343-44; Appendix B: Two Famous Papers by Peter Elias, pgs. 345-46). Oxford University Press.
7. Wiener, Norbert. (1956). “What is Information Theory?”, IRE Transactions on Information Theory, 48, June.
6. (a) Elias, Peter. (1958). “Two Famous Papers” (pdf), IRE Transactions: on Information Theory, 4(3):99.
(b) Mitra, Partha, and Bokil, Hemant. (2008). Observed Brain Dynamics (Appendix A: The Bandwagon by C.E. Shannon, pg. 343; Appendix B: The Two Famous Papers by Peter Elias, pg. 345). Oxford University Press.
7. (a) Blachman, N. (1956). “A Report on the Third London Symposium”, IRE Transactions (pg. 17), March.
(b) Aftab, O., Cheung, P., Kim, A., Thakkar, S., and Yeddanapudi, N. (2001). “Information Theory and the Digital Age” (§: Bandwagon, pgs. 9-11), Project History, Massachusetts Institute of Technology.
8. De Rosa, L.A. (1955). “In Which Fields Do We Graze?”, IRE Transactions on Information Theory, 1(3):2, Dec.
9. Thims, Libb. (2012). “Thermodynamics ≠ Information Theory: Science’s Greatest Sokal Affair” (url), Journal of Human Thermodynamics, 8(1): 1-120, Dec 19.
● Fox, Jeremy. (2011). “Now THAT’S the way to Stop a Bandwagon!”, OikosJournal, WordPress.com, Sep. 26.