The English, like other Western nations, were once plagued by rickets—a softening of the bones that leads to fractures and deformity, particularly in children. Initially rare, the disease became much more frequent after 1600 and had reached epidemic levels by the turn of the 20th century (Gibbs, 1994; Harrison, 1966; Holick, 2006; Rajakumar, 2003). One survey, at the Great Ormond Street Hospital, found symptoms of rickets in one out of three children under 2 years of age; another, at Clydeside in 1884, found symptoms in every child examined (Gibbs, 1994). Elsewhere in the late 19th century, autopsy studies in Boston and Leiden (Netherlands) showed rickets in 80-90% of all children (Holick, 2006).
The currently accepted explanation was developed in the late 1880s by Dr. Theobald Palm. For Palm, rickets seemed to correlate with lack of sun. The illness was more common in northwestern Europe, particularly England, where sunlight was naturally weaker. It was also more common in urban areas where “a perennial pall of smoke, and … high houses cut off from narrow streets a large proportion of the rays which struggle through the gloom” (Hardy, 2003). His hypothesis was strengthened in 1919 by the finding that ultraviolet light can cure rickets by releasing a chemical, eventually identified as vitamin D, that enables the gut to absorb calcium and phosphorus from food (Gibbs, 1994). Such health benefits, together with the anti-microbial action of UV light, led to the ‘sunshine movement’ of the 1920s—a vast effort to boost sun exposure by redesigning our clothes, streets, parks, and buildings, and even our notions of fun and recreation. This movement gave us much of the look and feel of modern life.
By the mid-20th century, rickets was again rare, thus vindicating not only UV therapy but also the view that lack of sun had been the cause (Harrison, 1966). This view nonetheless remains unproven. Although vitamin D does help the body absorb more calcium and phosphorus, no one knows for sure whether the epidemic was due to low levels of this vitamin; the blood test that could have settled the question did not yet exist. People may have developed rickets because something was immobilizing calcium or phosphorus in their bodies, in which case they would have required more vitamin D. This possibility is hinted at by Harrison (1966):
… a number of cases are on record of children with marked rickets who have received an amount of vitamin D ordinarily sufficient to prevent rickets and who do not have manifestations of intestinal malabsorption. When these children are given increased amounts of vitamin D, several thousand units per day, the biochemical manifestations of vitamin D effect result, and roentgenograms show healing of the rickets. The basis for this increased requirement is not known.
At the height of the epidemic, one physician did suggest that something was immobilizing phosphorus in people with rickets. Dr. John Snow (1857) observed that the illness was most frequent in London and the south of England where industrial bakeries used alum to make bread look whiter. It was rare in the north where bread was normally home-baked. He reasoned that this alum combined with phosphorus in the body to form insoluble aluminum phosphate, thus depleting the reserves of phosphorus needed for strong bones.
Snow pointed out that London bakeries would add about one and a half ounces of alum per four-pound loaf. Since manual laborers met 70% of their energy requirements by eating bread, they would have been ingesting 20 g of alum daily, or 4 g of aluminum hydroxide (Dunnigan, 2003). A recent case study describes an infant who developed rickets after consuming 2 g of aluminum hydroxide daily (via antacids) over five to six weeks (Pattaragarn & Alon, 2001). There have been many other reports of antacid-induced rickets (Boutsen et al., 1996; Cooke et al., 1978; Pivnick et al., 1995; Shetty et al., 1998; Spencer & Kramer, 1983).
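Snow’s daily-intake figure can be checked with a quick back-of-envelope calculation. The sketch below uses the alum-to-bread ratio given above; the labourer’s daily energy requirement (~3000 kcal) and the energy density of bread (~2.4 kcal/g) are assumed values supplied for illustration, not figures from the source:

```python
# Back-of-envelope check of Snow's alum figures.
OZ_G = 28.35   # grams per ounce
LB_G = 453.6   # grams per pound

alum_per_loaf_g = 1.5 * OZ_G               # ~42.5 g of alum per loaf
loaf_g = 4 * LB_G                          # ~1814 g of bread per loaf
alum_fraction = alum_per_loaf_g / loaf_g   # ~2.3% alum by weight

# Assumed (not in the source): ~3000 kcal/day for a manual labourer,
# bread at ~2.4 kcal/g.
daily_bread_g = 0.70 * 3000 / 2.4          # ~875 g of bread per day
daily_alum_g = daily_bread_g * alum_fraction

print(f"{daily_alum_g:.1f} g of alum per day")  # close to the 20 g cited
```

Under these assumptions the result lands near the 20 g of alum per day cited by Dunnigan (2003), so the figure is at least internally consistent.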
Snow’s hypothesis was forgotten and has been dusted off only in recent years (Dunnigan, 2003; Hardy, 2003; Paneth, 2003). This renewed interest is partly due to the realization that rickets can be caused not only by lack of vitamin D but also by ingested substances that make phosphorus or calcium unusable. Alum is one, as seen in reports of rickets induced by antacids. Another is phytic acid in cereal grains (Sandberg, 1991). The acid binds to calcium and makes it unavailable to the body, as shown when dogs develop rickets on an oatmeal diet (Harrison & Mellanby, 1939). It is this calcium depletion, via foods like unleavened bread or chapatti, that now causes rickets in the Middle East and South Asia (Berlyne et al., 1973; Harinarayan et al., 2007).
We may never disprove the view that lack of sun caused the rickets epidemic of a century ago. But we can point to some inconsistencies. First, rickets was much less frequent in northern England and absent in northwest Scotland—the area of Great Britain with the weakest solar UV (Gibbs, 1994). Second, it was not really a disease of cities with dark narrow streets and smoke-filled skies, as Snow (1857) himself observed.
The usual causes to which rickets are attributed are of a somewhat general nature, such as vitiated air, want of exercise and nourishing food, and a scrofulous taint. These explanations, however, did not satisfy me, as I had previously seen a good deal of practice in some of the towns in the north of England, where the over-crowding and the other evils above mentioned were as great as in London, whilst the distortion of the legs in young children was hardly present; moreover, I noticed that the most healthy-looking and best-nourished children often suffered most from curvature of the bones of the legs, owing to their greater weight; and I afterwards found that this complaint was quite common in the villages around London as well as in the metropolis itself.
Lack of sun also fails to explain why the epidemic initially broke out within a small geographic area. Indeed, the evidence points to a highly localized origin, essentially southwest England in the early 17th century. Rickets was completely new to observers at the time, including the College of Physicians president Francis Glisson. In 1650, he wrote:
The disease became first known as near as we could gather from the relation of others, after sedulous inquiry, about thirty years since, in the counties of Dorset and Somerset … since which time the observation of it hath been derived unto all the southern and western parts of the Kingdom. (Gibbs, 1994)
Gibbs (1994) attributes this rapid growth to that of England’s home-based textile industry, which by 1600 had become the country’s main export. “Whole families worked from before dawn until after dusk in their homes and, whether the children were too young to work or old enough to assist in home production, they would have lived their lives predominantly indoors.” This explanation is hard to accept because textile cottage industries developed primarily in the Midlands and around London; the southwest trailed the rest of England in this regard. In addition, family workshops were normally off-limits to children below the age of apprenticeship. Such children would have been left with elderly relatives or told to play outside.
But there may have been an indirect link with the growth of England’s textile industry, namely the parallel growth in the use of alum to fix the colors of cloth. England imported this substance from the Italian Papal States until imports were banned in 1667, when newly exploited Yorkshire shales became the main source (Balston, 1998; Jenkins, 1971). The port of entry would have been Bristol—in Somerset county, southwest England. This may have been where English bakers first learned to whiten bread with alum.
When did bakers stop using alum? The practice seems to have died out after the turn of the century, with tougher enforcement of food adulteration statutes in the United Kingdom and elsewhere (Kassim, 2001). By the mid-20th century, “the use of alum in bread was only occasionally encountered” (Hart, 1952). Eliminating this additive likely did much to eliminate rickets. Probably just as important was bread’s declining share of working-class diets as affluence increased.
But this is not what the history books say. As Steve Sailer tells us, history is written by those who like to write, and much more has been written about the sunshine movement and its presumed benefits for humanity.
Balston, J. (1998). “In defence of alum – 2. England”, In: The Whatmans and Wove Paper: Its invention and development in the West, West Farleigh.
Berlyne, G.M., Ari, J.B., Nord, E., & Shainkin, R. (1973). Bedouin osteomalacia due to calcium deprivation caused by high phytic acid content of unleavened bread, The American Journal of Clinical Nutrition, 26, 910-911.
Boutsen, Y., Devogelaer, J.P., Malghem, J., Noël, H., & Nagant de Deuxchaisnes, C. (1996). Antacid-induced osteomalacia, Clinical Rheumatology, 15, 75-80.
Cooke, N., Teitelbaum, S., & Avioli, L.V. (1978). Antacid-induced osteomalacia and nephrolithiasis, Archives of Internal Medicine, 138, 1007-1009.
Dunnigan, M. (2003). Commentary: John Snow and alum-induced rickets from adulterated London bread: an overlooked contribution to metabolic bone disease, International Journal of Epidemiology, 32, 340-341.
Gibbs, D. (1994). Rickets and the crippled child: an historical perspective, Journal of the Royal Society of Medicine, 87, 729-732.
Hardy, A. (2003). Commentary: Bread and alum, syphilis and sunlight: rickets in the nineteenth century, International Journal of Epidemiology, 32, 337-340.
Harinarayan, C.V., Ramalakshmi, T., Prasad, U.V., Sudhakar, D., Srinivasarao, P.V.L.N., Sarma, K.V.S., & Kumar, E.G.T. (2007). High prevalence of low dietary calcium, high phytate consumption, and vitamin D deficiency in healthy south Indians, American Journal of Clinical Nutrition, 85, 1062-1067.
Harrison, D.C., & Mellanby, E. (1939). Phytic acid and the rickets-producing action of cereals, Biochemical Journal, 33, 1660-1680.
Harrison, H.E. (1966). The disappearance of rickets, American Journal of Public Health, 56, 734-737.
Hart, F.L. (1952). Food adulteration in the early twentieth century, Food Drug Cosmetic Law Journal, 7, 485-509.
Holick, M.F. (2006). Resurrection of vitamin D deficiency and rickets, The Journal of Clinical Investigation, 116, 2062-2072.
Jenkins, R. (1971). “The alum trade in the fifteenth and sixteenth centuries, and the beginnings of the alum industry in England,” in: Links in the history of engineering and technology from Tudor times: the collected papers of Rhys Jenkins, pp. 193-203, Newcomen Society (Great Britain), Published by Ayer Publishing.
Kassim, L. (2001). The co-operative movement and food adulteration in the nineteenth century, Manchester Region History Review, 15, 9-18.
Paneth, N. (2003). Commentary: Snow on rickets, International Journal of Epidemiology, 32, 341-343.
Pattaragarn, A., & Alon, U.S. (2001). Antacid-induced rickets in infancy, Clinical Pediatrics, 40, 389-393.
Pivnick, E.K., Kerr, N.C., Kaufman, R.A., Jones, D.P., & Chesney, R.W. (1995). Rickets secondary to phosphate depletion: a sequela of antacid use in infancy. Clinical Pediatrics, 34, 73-78.
Rajakumar, K. (2003). Vitamin D, cod-liver oil, sunlight, and rickets: a historical perspective. Pediatrics, 112, 132-135.
Sandberg, A.S. (1991). The effect of food processing on phytate hydrolysis and availability of iron and zinc. Advances in Experimental Medical Biology, 289, 499-508.
Shetty, A.K., Thomas, T., Rao, J., & Vargas, A. (1998). Rickets and secondary craniosynostosis associated with long-term antacid use in an infant, Archives of Pediatrics & Adolescent Medicine, 152, 1243-1245.
Snow, J. (1857). On the adulteration of bread as a cause of rickets, Lancet, ii, 4-5. (Reprinted in International Journal of Epidemiology (2003), 32, 336-337.)
Spencer, H., & Kramer, L. (1983). Antacid-induced calcium loss, Archives of Internal Medicine, 143, 657-659.