There is no denying science has provided humanity with countless benefits, from electricity and modern medicine to a greater understanding of the wondrous universe and our place within it. However, as Charles Kettering poignantly noted, “99.9 percent of success is built on failure”. In a fitting example of natural selection at work, the history of science is the history of one outdated and incorrect idea being gradually consumed and replaced by another. While not always indicative of failure, and still capable of providing insights and focus for future discoveries, some of these flawed historic understandings nevertheless ranged from the immensely short-sighted or foolish to the downright dangerous and moronic.
Here are 20 insane scientific theories that people from history actually believed:
20. The dominant scientific theory throughout Renaissance Europe, preformationism held that humans existed in miniature but complete form in male sperm and were subsequently enlarged within the female womb
Believed to have originated in Ancient Greece, with Pythagoras credited as one of the earliest individuals to advance the notion, “spermism” dictated that the father contributed the essential characteristics of a human offspring whilst the mother merely provided an oven in which the child grew. Transmitted into European scientific understanding by Aristotle, and subsequently elaborated by Galen, it was not until the 17th century that the belief started to face serious scrutiny. However, the core question of how unorganized matter could transform into organized life – the process known today as epigenesis – remained vexing for scientists of the past, resulting in a stunning conclusion being reached.
In contrast to the modern scientific understanding of embryonic development, naturalists claimed instead that humans must exist in a miniature preformed state. The idea was developed into a widespread theory by Antonie van Leeuwenhoek, with the Dutch scientist determining “in no full-grown body are there any vessels which may not be found likewise in semen”. Producing the theory of “preformation”, Leeuwenhoek reasoned human sperm comprised tiny but nonetheless complete humans which were transmitted into the womb for incubation. Preformation remained the dominant theory of generation during the 18th century, and it was not until Dalton’s atomic theory of matter that it was dealt a decisive blow in the early 19th century.
19. Despite repeated confirmations from explorers that it was a peninsula, European cartographers continued to depict California as an island well into the 18th century
Although successive explorers confirmed California to be a peninsula, the concept of a mythological island of California had nonetheless become sufficiently entrenched that these informed real-world opinions were unable to change wider perceptions. Shown correctly only briefly on a handful of maps from the Age of Discovery, most notably the Mercator map, California continued to be depicted inexplicably as an island by cartographers throughout the early 17th century. The error persisted long after human visits to the region had become more common, and it was not until the mid-18th century that Jesuit missionary-explorers finally laid to rest the incorrect geographical notions concerning the peninsula.
18. Proposed in its modern form only two hundred years ago, the debunked field of homeopathy claims to treat patients according to the principle of “like cures like” at extreme dilutions
Initially suggested by Hippocrates around 400 BCE, who introduced the concept of “like cures like” by prescribing small doses of mania-inducing mandrake root to treat mania itself, the medical idea was adopted centuries later by Paracelsus, who concluded “what makes a man ill also cures him”. Rejecting the traditional methods of mainstream medicine, notably bloodletting, as irrational, in 1796 Samuel Hahnemann invented his doctrine of alternative medicine. Coining the term “homeopathy” in print in 1807, Hahnemann’s thesis was predicated on the similar belief that whichever substance causes the symptoms of a disease in a healthy individual could equally cure said symptoms in a sick person.
Denoting the causes of disease as “miasms”, Hahnemann developed the process of homeopathic dilution, wherein the chosen substance is repeatedly diluted and the containing vessel struck against an elastic material to release its supposed healing potential. Although these dilutions extend far beyond the point at which any molecules of the original substance remain, Hahnemann’s medical practice exploded into popularity in the United States following its introduction in 1825. Exported globally by fanatical American adherents, homeopathy has endured fluctuating levels of acceptance in the decades since but today is widely regarded as a pseudoscience with no medical benefit.
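The arithmetic behind that claim is easy to check. A minimal back-of-the-envelope sketch (not from the article; the starting quantity of one mole is an assumption chosen for illustration) shows why a common homeopathic potency such as “30C” – thirty successive 1:100 dilutions – leaves essentially nothing of the original substance behind:

```python
# Illustrative arithmetic: expected molecules of active substance
# remaining after a homeopathic "30C" dilution.
# Assumption (hypothetical): we start with a generous one full mole.

AVOGADRO = 6.022e23            # molecules per mole
starting_moles = 1.0           # assumed starting quantity
dilution_factor = 100.0 ** 30  # thirty successive 1:100 dilutions = 10**60

expected_molecules = starting_moles * AVOGADRO / dilution_factor
print(f"Expected molecules remaining: {expected_molecules:.1e}")
# The expectation is on the order of 1e-37 molecules - i.e. there is
# effectively zero chance even a single molecule of the original
# substance survives into the final preparation.
```

Since Avogadro’s number (~6×10²³) is dwarfed by the 10⁶⁰ dilution factor, any remedy diluted to 30C is, statistically, pure solvent.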
17. Claiming to identify an individual’s character through external appearances, physiognomy fell into disrepute during the Middle Ages before a resurgence during the Early Modern Period
Stemming from ancient notions of a relationship between an individual’s external appearance and internal characteristics, the theory of physiognomy counted Aristotle, who contended “it is possible to infer character from features”, among its earliest known proponents. Widely believed during the Middle Ages, the theory linking personality to outward appearance fell into disrepute during the Late Middle Ages, with Henry VIII of England outlawing its teaching and Leonardo da Vinci notably dismissing physiognomy as without “scientific foundation”.
Revived during the late 18th century by Johann Kaspar Lavater, the Swiss pastor built heavily upon the work of English philosopher Sir Thomas Browne from a century earlier. Asserting “there are mystically in our faces characters that carry in them the motto of our Souls”, Lavater’s renewal of the debate surrounding physiognomy was initially met with mixed reactions. It grew in popularity, however, during the remainder of the 18th century and into the 19th, with physiognomy even being used in the foundations of criminology as well as for the purposes of scientific racism. Nevertheless, with the advent of the Modern Age, physiognomy has increasingly fallen into disuse and is broadly rejected by the scientific community as both inaccurate and unscientific.
16. Placing Earth at the center of the Universe, geocentrism remained the dominant theory of astronomy for thousands of years until being challenged by Copernicus in the sixteenth century
Found in pre-Socratic philosophy and proposed by Anaximander, the geocentric model of the Universe offers a description of the planetary system placing the Earth at the center, with the Sun, Moon, stars, and other planets surrounding and orbiting our celestial body. With the stars appearing fixed, rotating once each day upon an axis through the geographic poles of the Earth, it seemed entirely rational for ancient observers to conclude these objects were the ones moving and not the seemingly static Earth. Becoming mainstream scientific theory by the fourth century BCE, supported by both Aristotle and Plato, geocentrism was – contrary to popular belief – historically combined with the concept of a spherical Earth.
The model developed into the Ptolemaic system, which offered minor alterations and clarifications, and some of the earliest suggestions of dissent stemmed from Islamic astronomers during the 10th century CE. Geocentrism was seriously challenged for the first time in the Common Era with the publication of On the Revolutions of the Heavenly Spheres by Copernicus in 1543, in which the Polish scientist proposed the Earth and other planets instead rotated around the Sun. Gradually crumbling under the ever-increasing weight of empirical evidence, the development of the telescope in 1609 and observations made by Galileo in the early 17th century eviscerated remaining support for the archaic interpretation of the cosmos.
15. Despite the absurdity of the concept, it was widely believed a race of plants from Central Asia was capable of literally growing lambs upon their sprouts like flowers
A legendary creature believed to be both an animal and plant simultaneously, the Vegetable Lamb of Tartary was once believed to have been indigenous to Central Asia. Said to be connected to its plant by an umbilical cord, the creature is thought to have possibly originated from misconceptions in the ancient world. The Greek historian Herodotus wrote of trees in India “the fruit whereof is a wool”, referring to the then-unknown cotton plant, whilst Jewish folklore from the 1st millennium CE references a creature known as the Yeduah, which, like a lamb, sprouted from the earth. Similarly, Chinese mythology contains the legendary “watersheep”, a combination of plant and animal connected to the ground via a stem which, if severed, would prove fatal to the lamb.
In spite of the patent absurdity of growing sheep like fruit, from the 14th century the legendary creature entered the public consciousness in Europe. During the mid-16th century Sigismund, Baron von Herberstein, a Carniolan diplomat, offered a descriptive and persuasive account of the animal. Eventually, in 1683, German physician Engelbert Kaempfer embarked on an expedition to Persia to ascertain the truth. Finding no evidence of the lamb-plant, Kaempfer concluded no such being existed and resolved to educate misinformed European audiences of their foolishness.
14. Holding that diseases such as cholera were transmitted via “bad air”, miasmatic theory – one of the foremost medical traditions of the pre-modern era – blamed powerful miasmas for ill health
A longstanding, and today obsolete, medical opinion, miasmatic theory asserted that diseases and related epidemics, such as cholera or the Black Death, were caused by a “miasma”. A noxious form of bad air, miasmas were believed to emanate from rotting organic matter and spread harmful diseases to those who inhaled the tainted air. Predating the start of the Common Era, during the first century BCE the Roman writer Vitruvius described the effects of potent miasmas, recounting how “the morning breezes blow toward the town at sunrise…they bring with the mists…the poisonous breath of creatures of the marshes to be wafted into the bodies of the inhabitants”.
Remaining a popular theory throughout the Middle Ages, even by the 1850s miasmatic theory was still used to explain outbreaks of cholera in both London and Paris. By attributing the transmission of infections not to person-to-person contact but to the environment, the prevalence of said belief, supported famously by Florence Nightingale, drastically hindered the introduction of basic hygienic standards. Eventually disproved and replaced by germ theory following the discovery of bacteria, as well as the conclusion of John Snow – the father of epidemiology – that cholera was waterborne and not airborne, it was not until the late nineteenth century that unscientific belief in miasmas finally diminished.
13. A pseudo-scientific field of human anatomical study, phrenology claimed to be able to identify and explain character traits through measurements and examinations of the skull
Offering a revolutionary alternative interpretation of bodily primacy, Hippocrates and his followers transformed human understanding by shifting priority from the heart to the brain. Perpetuated by Galen, the belief that mental activity – and thus the animal soul – inhabited the brain prompted investigations of character to focus upon the cranial organ. In the 1770s Johann Kaspar Lavater’s Physiognomische Fragmente argued the thoughts and souls of a human were intimately connected to an individual’s physical frame. Asserting a perpendicular forehead to be a sign of intellectual deficiency, Lavater’s work was quickly adopted and advanced by others.
Most notable among these was Franz Joseph Gall, who would become the leading exponent of the field of phrenology, establishing what he claimed to be scientific determinations of a relationship between the skull and a person’s character. Extrapolating findings beyond any reasonable standard of scientific inquiry, phrenology was rapidly employed for the purposes of scientific racism and used to persecute ethnic and social minorities. Lacking any scientific foundation, however, phrenology was increasingly rejected by the late 19th century, although it was notably used in the 1930s by Belgian authorities in Rwanda to justify their advancement of the Tutsis over the Hutus.
12. An incorrect scientific theory still commonly taught in schools across the United States, the “tongue map” wrongly claims the tongue has separate taste centers for different sensations
A common misconception still taught today in some schools, the tongue map – also known as the taste map – asserted different sections of the human tongue are exclusively responsible for different basic tastes. Believed to originate from a poor translation by Harvard psychologist Edwin Boring of a German paper written in 1901 by D.P. Hänig, Zur Psychophysik des Geschmackssinnes, the original German paper had merely identified minute differences in threshold detection across the tongue. Taken out of context, these tiny variations in sensitivity were transformed via the poor translation into the erroneous suggestion that each part of the tongue was selectively responsible for a particular taste.
Becoming a core part of both popular and academic understanding of the human sense of taste, it was not until 1974 that the false theory was finally laid to rest. Investigating the claim, Virginia Collings, a researcher from the University of Pittsburgh, found that although there was a slight difference in the concentration of taste receptors, akin to the German findings decades earlier, the overall effect on taste was negligible. However, despite her disproving the theory by demonstrating a full taste range exists on all parts of the tongue, the tongue map remains a frequent misconception, often perpetuated via misinformed American popular culture and education.
11. Until the discovery of genetics, it was presumed a fetus could be mentally and physically affected or harmed by the thoughts and feelings of its mother
Commented upon at length by Roman author Pliny the Elder, belief in a prenatal maternal impression upon unborn babies was a widely held medical opinion throughout the ancient world. Stipulating a strong mental bond to exist between a pregnant mother and the developing fetus, the theory of maternal impression contends the mother’s mind is capable of physically and psychologically affecting the child she is carrying. Generating sustained fear among medieval communities, women went to great lengths to induce a positive impression upon their children and to avoid the stigma and blame of having caused their babies to be tainted by misdeeds whilst pregnant.
For example, it was widely suggested that the cause of the Elephant Man’s disfigurement was his mother having been frightened by an elephant whilst pregnant, imprinting the animal’s features upon the child. Similarly, mental illnesses were commonly attributed to the manifestation of hysteria by a pregnant woman. Serving to explain birth defects and cognitive disorders, which were otherwise inexplicable acts of misery, belief in maternal impression lasted into the twentieth century. Dealt a fatal blow by the advent of genetic theory, today the pseudo-scientific theory has been broadly abandoned.
10. The history of scientific opinion regarding both tobacco and radium demonstrates how eager our species is to embrace an alleged curative without due diligence to ensure safety
Discovered in 1898 by Pierre and Marie Curie, radium and its seeming ability to treat previously incurable conditions was hailed as a medical miracle. However, in the rush to embrace the supposed scourge of lesions and carcinoma, medical experts wantonly overlooked the immense harm high levels of radiation have upon the human body. Quickly transforming from careful application to being freely accessible to the general public, radium became a staple ingredient in everyday household products across the United States, including bath salts, toothpastes, and face-creams, all for the alleged therapeutic benefits of the radioactive and potentially lethal metal.
Possessing a failure-to-success rate of 100 to 1 in treating cancers, delusional support for radium treatments continued well into the 1920s in the United States before dissipating at last in the 1930s after health scares. Offering a similar story, tobacco, despite being harmful to the human body, was likewise presented as a positive and medicinally beneficial treatment. Claimed to provide numerous health benefits, tobacco smoke enemas were even widespread throughout Europe – especially London – during the late 18th century. It was not until the Nazis launched the first modern anti-smoking campaign that independent scientific attention became increasingly focused on the negative effects of the drug.
9. Not entirely devoted to the transmutation of base metals into gold, the historical science of alchemy was widely accepted for thousands of years despite an absence of reliable results
Spanning at least four millennia, alchemy was an ancient branch of natural philosophy and science practiced by human civilizations from around the world. Although today commonly associated with efforts to transmute base metals, for example lead, into noble metals such as gold, the aims of alchemy were historically more diverse. Originating, by many accounts, in Ancient China, the purpose of alchemy was not transmutation but rather obtaining the Grand Elixir of Immortality, a universal panacea later known in the West as the philosopher’s stone. It is thought gunpowder was an unintentional by-product of alchemists seeking this legendary elixir.
Introduced to Latin Europe during the 12th century via Arabic scholars, it was not until Paracelsus in the 16th century that Europeans increasingly shifted focus back towards the original Chinese purpose of medicine. Although alchemists sought to combine minerals and plants for treatments, only limited discoveries were made to the benefit of mankind, and the discipline increasingly became debunked in the public consciousness. Declining in Europe during the 18th century, coinciding with the emergence of chemistry as a defined scientific discipline, alchemy was increasingly distanced from the fledgling field as unscientific and fraudulent. Although enjoying a brief revival in association with supposed occult sciences, today alchemy is regarded as a pseudoscience.
8. Despite the obvious harm caused by unnecessary consumption and interaction, for most of human history feces was a common component of medical treatments
Facing incurable diseases, medieval doctors were prepared to turn to any source for ideas. Unfortunately for our ancestors, one of the many unscientific and even harmful cures of the past was the widespread application of feces. Known to date at least from Anglo-Saxon medicine, the excrement of goats, sheep, calves, oxen, and pigeons was a commonplace component of a physician’s arsenal. Mixed with household ingredients, a recipe of pigeon feces combined with wheat flour and egg white, for example, was recommended as a treatment for a severe headache. Remaining prominent throughout the Renaissance, during the 17th century Robert Boyle treated cataracts by sprinkling powdered human excrement upon the affected areas.
Not confined only to ignorant European doctors, Chinese physicians during the fourth century CE were similarly applying feces to treat a host of ailments, with at least one doctor – Li Shizhen – continuing to do so twelve hundred years later to combat abdominal diseases. Equally, camel feces is a widely recognized Bedouin folk cure, documented by the Afrika Korps during the Second World War. Today, fecal medicine does still exist, but in a far more precise form than the liberal consumption of the past, involving the careful donation of healthy gut bacteria and its implantation in small doses into a patient unable to develop their own.
7. Serving as the predominant medical opinion for almost one hundred years, the belief that infant humans could not feel pain was not finally debunked until 1985
Prior to the late 19th century, babies were widely considered to possess greater pain sensitivity than adults, with Felix Würtz reasoning in 1656 “if a new skin in old people be tender, what is it you think in a newborn Babe?” However, from the late 19th century medical opinion shifted, with incoming medical practitioners trained with an understanding that infants could not, in fact, feel pain at all. Believing that pain was merely reflexive, and pointing to the immature development of a newborn’s brain, it quickly became accepted fact that pain was an experience beyond the range of a baby, and thus anesthesia, muscle relaxants, and pain relief were deemed unnecessary in the treatment of infants.
It was not until 1985 that this opinion was finally challenged, when infant Jeffrey Lawson underwent open heart surgery. Discovering her son had been operated upon without anesthesia, Jill Lawson initiated a public campaign which spawned several independent medical studies. Within only a couple of years these studies, measuring the pain responses of young children, concluded the received opinion was entirely false and that pain evidently could be felt. In fact, studies have since shown infants feel far more pain than adults, with inadequate treatment to moderate the immense agony endured capable of causing long-term psycho-physiological harm to children.
6. Asserting an inherent spark separates living things from the inanimate world, vitalism was one of the guiding principles of natural science for virtually all of human history
The belief that “living organisms are fundamentally different from non-living entities because they contain some non-physical element”, vitalism asserts all living beings innately house a principal element necessary for life. Often equated with the concept of a soul, this “vital spark” is purported by vitalists to be that building block, governing life by different natural mechanics than those of inanimate objects. Dating from the ancient world, the earliest philosophies and sciences of Egypt and Greece were founded upon vitalist conceptions, as was the Far Eastern conception of “chi”, and these ideas continued to maintain theoretical dominance well into the modern period.
Surviving the emergence of rival scientific theories, notably the proposition of epigenesis in 1781 by Johann Friedrich Blumenbach, the notion of a core living principle separating living beings from the material world remained central to scientific understanding. Affecting medical practices for centuries, rather than treating symptoms per se, physicians habitually focused on efforts to re-balance the mythical life force. Gradually abandoned during the twentieth century, vitalism was struck a fatal blow by the discovery of the structure of DNA by Watson and Crick in 1953, with Crick himself later taunting remaining vitalists as “cranks”. Today, without empirical evidence in its support, vitalism has become an abandoned relic of history.
5. Imposing millennia of misogyny, science routinely produced unsubstantiated conclusions concerning the female sex, most notably spurious determinations relating to female genitalia
Enduring oppression throughout the ages, female medical treatment and knowledge invariably lagged well behind their male counterparts. Unable to properly distinguish female anatomy from male, in part because of a lack of actual examination and scientific understanding – Leonardo da Vinci’s woefully inaccurate drawings of the female reproductive system, for instance, reflect this misinformation – physicians throughout the ancient world assumed female genitalia equally dispensed sperm. Perhaps most representative, however, of archaic and misogynistic attitudes, throughout the Middle Ages the uterus was perceived as a “sewer”, existing to spread disease and allowing illnesses to course throughout the female body and cause hysteria.
Even as recently as the late 19th century, the attachment of mental illness to women was a core component of scientific studies of the female body. Most famously argued by Sigmund Freud, the father of psychoanalysis, clitoral orgasms were considered unnatural and harmful. Incredibly, it was accepted medical fact that stimulation of the clitoris was a sign of mental illness within a woman and a direct cause of lesbianism (as well as spiritual damnation). Perhaps this should be unsurprising, given that in 1873 Harvard Medical School concluded women’s brains to be less developed than men’s and that women, especially during menstruation, were consequently unable to handle the rigors of higher education.
4. Building upon the commonly held belief during the Renaissance and Early Modern periods that all planets housed life, William Herschel went further by proposing the Sun was also home to lifeforms
One of the most prominent astronomers of his age, William Herschel cataloged more than five thousand celestial objects across decades of observations, including discovering in 1781 the first new planet since antiquity: Uranus. However, despite this immense accomplishment, which earned him a position as court astronomer, Herschel, like many of his contemporaries, subscribed to the scientific belief that all other planets in the solar system contained life. A position commonly held by Renaissance scientists, although rejected by Church authorities, Herschel took his belief in life beyond Earth even further by asserting life also existed and thrived on the Sun.
Arguing in 1795 in his essay “On the Nature and Construction of the Sun and Fixed Stars”, Herschel contended our central star was merely a giant planet. Building on the works of similarly confused astronomers during the 18th century, it was broadly asserted that sunspots were evidence the Sun merely housed a luminous atmosphere, beneath which an ordinary planetary surface existed along with lifeforms. Today, of course, we know our Sun, with a surface temperature of approximately 5,505 degrees Celsius and gravity capable of crushing even the densest of matter, to be completely incapable of supporting any life.
3. A now-discredited theory of climatology previously popular in the American West, belief that “rain follows the plow” was commonplace in the United States
Originating during the late 1860s and 1870s amid the westward expansion of American settlement, the sudden greening of previously yellow and dry vegetation provoked sustained scientific inquiry and speculation. Noting a seeming correlation between increased migration and rainfall, theorists led by noted climatologist Cyrus Thomas attempted to connect the two occurrences. Concluding an uptick in soil cultivation coincided with the augmented rainfall, Thomas reasoned – in violation of the fundamental scientific principle that correlation does not equal causation – that the two events must be related.
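Thomas’s error is easy to reproduce. In this minimal sketch (entirely invented numbers, not historical data), acreage under the plow and rainfall both merely trend upward over time, yet a naive Pearson correlation between them comes out strongly positive – illustrating how a shared trend can masquerade as causation:

```python
# Illustrative sketch of the "rain follows the plow" fallacy:
# two series sharing an upward time trend correlate strongly
# even though neither causes the other. All numbers are invented.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

years = range(1865, 1880)
# Hypothetical steady growth in cultivated land as settlers arrive.
acres_plowed = [100 + 12 * t for t in range(len(years))]
# Hypothetical wet spell: rainfall trending up with year-to-year wobble.
rainfall = [14 + 0.5 * t + (-1) ** t * 2 for t in range(len(years))]

r = pearson(acres_plowed, rainfall)
print(f"correlation: {r:.2f}")  # ~0.73: strong, despite no causal link
```

The correlation here is an artifact of the shared trend alone; detrending either series (or comparing year-over-year changes) would dissolve it, which is precisely the check Thomas never performed.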
Arguing the plowing of the soil exposed the moisture beneath to the sky, Thomas’s proposition became adopted as widespread fact within only a few years, serving as a natural affirmation of Manifest Destiny. Resulting in the mass dynamiting of the Great Plains during the 1870s in the hope of provoking greater rainfall, Thomas’s supposition is now widely regarded as one of the archetypal logical fallacies to learn from. Prompting only short-term and localized climatological changes due to sudden inversions of terrain, these fanatical attempts to produce more rain are often credited with worsening the Dust Bowl of the early twentieth century and with depriving other regions of rain rather than generating more in total.
2. Dominating medical understanding for more than two thousand years, humorism dictated that four primary bodily fluids had to be kept in perfect balance to ensure perfect health
Believed to have originated from either Ancient Egypt or Mesopotamia, humorism was a system of medicine adapted by Hippocrates more than two thousand years ago. Denoting the existence of four vital bodily fluids – blood, yellow bile, phlegm, and black bile – Hippocrates posited an extreme excess or deficiency of any of these humors was the root cause of illness, arguing “these are the things that make up its constitution and cause its pains and health…health is primarily that state in which these constituent substances are in the correct proportion to each other”. Accepted by his successors, Hippocrates’ theory oriented the entire practice of medicine towards the appropriate balancing of these presumed bodily fluids.
Resulting in the mass application of unscientific, and even harmful, medical practices, bloodletting, emetics, and purging became chief components of treatment in order to rectify an imbalance. Remaining the dominant medical conception for physicians throughout the Western and Islamic worlds, it was not until 1543 that Andreas Vesalius seriously challenged humoral theory. Nevertheless, continued belief in humorism dominated medicine until the advent of cell theory and microbiology during the 18th and 19th centuries, after which it diminished into obscurity; today the practice is viewed as pseudo-scientific and dangerously nonsensical.
1. Stipulating living creatures could arise from nonliving matter, spontaneous generation was eventually discredited with the discovery of microbial life during the 19th century
Attempting to provide a natural and scientific explanation for the phenomenon of life without resorting to divine agency, Anaximander was the first known individual to propose a primal chaos from which, by elemental means unknown, life is generated. The idea was coherently organized and expanded upon by Aristotle who, although recognizing some forms of life are spawned by natural reproduction, proposed that living things might also come from nonliving entities. Suggesting an expansive theory of spontaneous generation, Aristotle asserted the interaction of elemental matter and heat could equally produce life.
Although lost to Western Europe following the fall of Rome in the 5th century, Aristotle’s ideas received renewed support after their reintroduction via Islamic scholars. Despite conflicting with some biblical and religious opinions, whose adherents saw all life as created by divine will and not by natural forces, spontaneous generation remained a leading scientific theory throughout the Middle Ages. Increasingly disputed and discredited, the issue was finally settled by Louis Pasteur in 1859. By demonstrating the existence of microbial lifeforms – tiny organisms that, unseen to the human eye, had provided the circumstances underpinning belief in spontaneous generation – Pasteur’s conclusive findings shredded the ancient theory.