Survival in the Context of Urbanization and Environmental Change in Medieval and Early Modern London, England
Published in Lori Jones, Disease and the Environment in the Medieval and Early Modern Worlds, 2022
Medieval and early modern populations in England experienced repeated, widespread, and severe famines and disease epidemics.1 Famine is the widespread shortage of or restricted access to food. It may have natural causes, such as the underproduction of crops resulting from drought conditions or the destruction of crops by plant diseases. It also can be artificially produced by human actions (i.e., it can be anthropogenic), such as the withholding of or purposeful destruction of food during times of conflict (Sen 1981). Epidemics are sudden increases in the number of cases of a disease within a population relative to the typical baseline number of cases under normal circumstances. Famine and infectious disease epidemics are often synergistically connected: malnutrition can reduce the effectiveness of human immune responses (Scrimshaw 2003), and thus famine conditions can increase susceptibility to, and the severity of, various diseases. As a result, infectious diseases are the primary cause of death during famines (Mokyr and Ó Gráda 2002).
Roots and Tubers
Published in Christopher Cumo, Ancestral Diets and Nutrition, 2020
Populations rose faster where the potato became a staple than where grains remained entrenched.191 It reduced famines’ incidence and severity and diminished deaths from undernutrition and attendant diseases.192 Potatoes produced the surplus labor that fueled the Industrial Revolution and the masses that emigrated to North America; gave Britain, France, Germany, and Russia vast armies that precipitated World War I’s horrors; and made Europe a colonial power.193 These developments prompted German economist Friedrich Engels (1820–1895) in 1884 to judge the tuber humanity’s greatest innovation since iron’s discovery.194
Nutrition
Published in Jan de Boer, Marcel Dubouloz, Handbook of Disaster Medicine, 2020
Major disasters, whether due to forces of Nature or of human origin, commonly bring impairment of food supplies and consumption, and consequent suffering, poor health and high rates of morbidity and mortality. Famine – defined as a late stage of food scarcity causing significant excess mortality – often stems from drought, severe flooding or agricultural catastrophes, but also from war, civil conflicts and other causes of displacement of populations. Malnutrition, in its many tragic forms, all too often characterises emergency situations, especially long-lasting ones. General malnutrition rates are often very high, affecting 90% or more of the population, with 10% or more of children under 5 years of age being severely and acutely malnourished. Micronutrient malnutrition is equally ubiquitous, and devastating in its consequences.
Drought-related cholera outbreaks in Africa and the implications for climate change: a narrative review
Published in Pathogens and Global Health, 2022
Gina E. C. Charnley, Ilan Kelman, Kris A. Murray
Relying on agriculture can become tenuous during droughts, reducing food security through crop failures and livestock losses [17,28]. For example, during 1991–1992, 370,000 cattle were lost in Zimbabwe, crop production in Namibia fell by 70% and Botswana’s maize crop failed [28]. This leads to subsequent famine and malnutrition, decreasing host immune response and heightening the risk of cholera and other infectious diseases [26,36]. Drought and subsequent water scarcity lead to using different sources of food and water. For example, in Mali millet gruel is commonly eaten and acidified with curdled goat milk to prevent contamination, but in times of drought goat milk is often not available, along with several other acidifying ingredients such as lemon, tamarind, and vinegar. Famine foods are also often cooked less to preserve fuel [17]. The lack of available food increases reliance on roadside food vendors [30], which have been shown to increase cholera transmission in other outbreaks [37], often due to poor food hygiene practices, poor regulation, and no enforcement of bans.
How many premature deaths from pesticide suicide have occurred since the agricultural Green Revolution?
Published in Clinical Toxicology, 2020
Ayanthi Karunarathne, David Gunnell, Flemming Konradsen, Michael Eddleston
The Green Revolution in the 1950s and 60s introduced new farming techniques and high-yield varieties of crop plants, particularly wheat and rice, into low- and middle-income countries [1,2]. Yields increased markedly over the following decades, permitting food production to keep pace with population growth and reducing the risk of famine. However, the Green Revolution had downsides, with the new crop varieties being dependent on pesticides and fertilisers, producing harmful effects on human health and the environment [1,3,4]. Subsequently, intensified agricultural pesticide use has resulted from a shift from subsistence farming to cash-crop and monocrop farming, a need to increase yield per area of land, a limited focus on developing alternatives to pesticides, and an overall increasing focus on input-dependent agriculture [5,6]. Alternative farming approaches that do not use pesticides lack the heavy marketing and lobbying support provided by the agrochemical industry [7,8].
The Developing Genome: An Introduction to Behavioral Epigenetics
Published in Psychiatry, 2018
Toward the end of World War II, the Dutch staged an unsuccessful rebellion against the Nazi occupation. The Nazis responded by withholding food from the major population centers in Holland. This event is now known in Holland as the Famine of 1944–1945. Over time, it became apparent that the children conceived and born during this famine had higher than normal risk for schizophrenia and depression, more atherogenic lipid profiles, double the rate of coronary artery disease, worse performance on cognitive tasks, a greater incidence of certain cancers, and a greater incidence of type II diabetes mellitus; for women, there was a higher incidence of breast cancer. Roseboom, Painter, van Abeelen, Veenendaal, and de Rooij (2011) published a description of this problem distressingly titled “Hungry in the Womb.” However, it was not simply that these children had developmental anomalies due to their mothers’ lack of food at critical developmental times; that would not be a surprise. The surprise was that ensuing research on these children revealed that the effects were persistent over multiple generations (Heijmans et al., 2008). Retrospective studies of multiple generations of inhabitants of the Swedish town of Överkalix, based on meticulous health records kept since 1799, have revealed direct connections to cardiovascular disease from grandfathers to grandchildren (Kaati, Bygren, Pembrey, & Sjostrom, 2007).