By Shenggen Fan, Sivan Yosef, and Rajul Pandya-Lorch
Agriculture is the single most important innovation in human history. Over the course of thousands of years, it has staved off hunger, allowed populations to leave their hunter-gatherer lives behind, and freed up time for other pursuits (like inventing writing and the wheel!) that have propelled societies forward. As recently as the 1970s, the Green Revolution – a global push to improve and produce more wheat and rice – brought India back from the brink of mass famine. The Green Revolution improved the lives of one billion people around the world. This number is all the more impressive when considering that the world population was four billion at the time.
For a while now there has been a trend on social media of photographing what you’re about to eat – whether to brag about the fancy restaurants you visit or to show off your cooking skills – with hashtags such as #Eatingfortheinsta, #foodie and #foodporn. But food photography could play a useful role in helping dietitians to measure more accurately what people are eating.
16 October is World Food Day (#WFD2016); this year’s theme is ‘Climate is changing. Food and agriculture must too.’ Jennifer Cunniff, plant scientist in CABI’s editorial team, looks at how harnessing crop diversity is vital for us to meet the challenge.
Of the wide variety of edible plant species growing on our planet, it’s amazing how few of them we actually include in our diet. Around 30 000 edible plant species are known, yet only 30 of these feed the world, and we are heavily reliant on a handful of cereals – rice, bread wheat, maize, millets and sorghum – which provide 60% of the energy intake of the world population (FAO). This narrowing of our food base largely started with the advent of farming – before then, there is plentiful archaeological evidence that shows we were foraging across a much wider breadth of plant species (e.g. Weiss et al. 2004; Fairbairn et al. 2006). Once we formed settled societies we began to focus on crops that offered the best level of return and were best adapted to the cultivated environments we created. Furthermore, even though multiple accessions1 of our widely grown cereal species exist (naturally and through breeding), only a few dozen are grown on a wide scale. This strategy has consequences – genetic variability for adaptation to future climate change is lost.
1 A single collected variety or cultivar. It could be a wild variety, a landrace or a bred cultivar.
I recently attended the International Sugar Organization’s annual conference in London, hoping to hear Dr. Francesco Branca of the World Health Organization explain the rationale for the WHO’s recommendations on how much sugar people should eat, and to see what response he got from the assembled sugar industry representatives and how he dealt with it. As a reasonably independent observer (CABI publishes Sugar Industry Abstracts, but does much work on nutrition and health as well) I was looking forward to this. Unfortunately, he was unable to attend due to other commitments and sent a video presentation instead.
With the global population estimated to reach 9 billion by 2050, there has been much debate around the issues of nutrition and food security. Amid these concerns, a report published on May 6 by the International Union of Forest Research Organizations (IUFRO) calls for greater consideration of the use of forests as a food source as well as for biodiversity conservation. The report, titled “Forests, Trees and Landscapes for Food Security and Nutrition”, was presented at the UN Forum on Forests and is the result of a collaboration among more than 60 scientists from around the world.
In my March 2013 blog “Eat less salt but make sure it contains iodine!”, I described the problems of addressing iodine-deficiency diseases in Pakistan and the worrying rise in iodine deficiency in the UK, linked to a shift in eating patterns away from dairy and oily fish, our traditional sources of iodine. Whereas other developed countries had relied on introducing a national supply of iodised salt, we had got away without it. But even countries using iodised salt now have to watch out, as salt-reduction campaigns to tackle rising cardiovascular diseases are allowing iodine deficiency to recur, albeit at a low level compared with the high level of iodine deficiency found in developing countries.
Now there is further support for re-emerging iodine deficiency in the UK: this time a study on pregnant women published in the Lancet. The researchers identified changes in the IQ of primary-school children born to mothers with low-level iodine deficiency: IQ goes down 3 points and reading age is reduced. For more information, read the BBC article “Iodine deficiency 'may lower UK children's IQ'” and the Lancet study itself.
Need I say more? In the March blog, which featured on the Global Health Knowledge Base and on CABI – Handpicked & carefully sorted, I covered the spectrum of iodine-deficiency diseases which can occur in children born to mothers with iodine-poor diets, leaving the children with permanent physical and mental impairments.
Almost daily, it seems, the case is being made for introducing iodised salt into the UK and for advising women planning pregnancy to ensure their diet contains not only folic acid but also adequate iodine (but not through seaweed supplements). Pregnant mothers who rely on organic milk should be aware that it contains less iodine than conventional milk, and they will need to increase their iodine intake to compensate.
We do indeed “have a new challenge to addressing iodine deficiency in both developing and developed countries”.