You don’t need a crystal ball to know Australia’s rural industries will face significant change at global, national and local levels over the coming decades. This will create opportunities and challenges for small and large farms, and will affect rural lifestyles, agricultural landscapes and Australia’s society and economy.
In a new report, we describe this future through a series of interlinked “megatrends” set to hit Australia over the coming 20 years. As we describe below, each prompts some serious questions (or “conversation-starters”, as we have termed them) for Australian farmers. We don’t yet know the answers, but we do know they will be crucial for how the industry fares in the future.
The world will get hungrier
We know that the world is going to require more food as populations grow – about 70% more by 2050, according to the United Nations. This will come primarily from increasing yields, along with some expansion of agricultural land.
The target is achievable but should not be taken for granted. There are competing uses of land for biofuels and urbanisation; in some places land is degrading; and we don’t have good predictions yet of the effect of climate change on agriculture. As a significant exporter of food, Australia has a vital role to play in supplying world food markets and buffering supply shocks.
We are well positioned — both in terms of geography and comparative advantage — to supply overseas markets. And while Australia can’t hope to feed Asia or the world, with astute R&D investment it can increase production and exports. How well we step up to that challenge depends largely on our ability to maintain a price competitive position and continue to improve yields. So the key questions are:
Will farms be able to scale up production and performance to meet this challenge?
What is a sensible investment in innovation, and how should it be funded?
The world will get wealthier
Some 1.02 billion people will move out of poverty and into the middle classes in the developing Asia region alone by 2040. Along with wealth comes the ability to diversify food choices – wealthier households will consume more meat, dairy and vegetable oils.
This presents an opportunity for Australian rural industries to identify new food types and connect to new markets. A diversified rural export base is likely to be more resilient to supply-and-demand shocks in markets.
Is Australia better off focusing on commodity markets that have provided solid export earnings, or should it be working hard to respond to the demand for a more diverse range of boutique, luxury and niche food and fibre goods?
Does Australia have the infrastructure and the persistence to get a wider range of desirable agricultural products into Asian markets competitively?
Customers will get pickier
The consumer of the future will be increasingly able and motivated to choose food and fibre products with certain characteristics. This has impacts both within and beyond the farm gate. Information technology will increasingly enable the consumer to access, share and validate information about products along the whole supply chain from farm to fork.
Health is likely to become a particularly prominent driver of food choice and consumption patterns – be that from a desire for food safety or to help prevent chronic disease. Many people’s lives are being cut short by poor diets, and at current trajectories government budgets could become crippled by unsustainable growth in healthcare expenditure.
The issues of environment, provenance and ethics will also play a vital role. The consumer of the future will have greater expectations for these qualities in the food and fibre products they choose to buy. Consumers will be “information-empowered” and rural industries stand to gain or lose market share based on this increase in consumers’ knowledge.
In the face of soaring diet-related health costs, will governments increase control of the components of food and diets?
How does agriculture in Australia build and safeguard its clean, green reputation?
Technologies will transform farm life
Advances in digital technology, genetics and materials science will change the way food and fibre products are created and transported.
Many plant productivity breakthroughs will come from gene technology. Big data systems and digital technologies will bring better risk-management approaches to Australian agriculture; weather and yields will be much more predictable and farmers will have sophisticated tools to assist with decision making.
Knowledge about land use and farming practices will increasingly move into the public domain as remote monitoring, be it from drones or satellites, makes available new data in a highly interconnected world. Business and capital models will change with the introduction of “disruptive” technologies such as peer-to-peer lending.
Will market perceptions hold back Australian agriculture by restricting access to advanced technologies being used by our major competitors?
How will farmers manage a higher level of scrutiny of their operations?
The rollercoaster of risks will get bumpier
Risk is an ever-present characteristic of Australian agriculture. However, the coming decades will see changes in the global climate, environmental systems and the world economy which will create new and potentially deeper risks for farmers.
Australian agriculture has shown a strong capacity to adapt and respond to risks in the past. But as trade globalises and we rely more on imported inputs such as fertiliser and fuel, the risk of supply chain shocks increases.
More international trade and passenger travel brings greater biosecurity risks. Climate change impacts are not well understood, and the need to cut greenhouse gas emissions will set up competing land uses for both biofuels and carbon storage.
Do we understand the likely implications of a global price on carbon of US$50-100 per tonne?
Is the agriculture sector at risk of complacency and underinvestment when it comes to risk management?
Overall, there is a bright future for Australian agriculture, laden with deep and diverse opportunity. The future outlined above will be a challenge for some producers and industries but an opportunity for others. The effectiveness with which Australian agriculture captures these opportunities and avoids the risks will largely come down to innovation.
Through centuries past, repeated innovation has allowed Australian farmers to expand into new land areas, develop water resources and increase crop and pasture yields. As we look to the decades ahead, innovation becomes ever more important. In a world of exponential growth in both technology and global trade, it’s about working smarter, not just harder.
Changing wildlife: this article is part of a series looking at how key species such as bees, insects and fish respond to environmental change, and what this means for the rest of the planet.
As the world warms, animals and plants will shift their ranges to keep pace with their favoured climate. While the changing distributions of species can tell us how climate change is affecting the natural world, it may also have a direct impact on us.
One good example is the disease carried by insects.
Those small, familiar flies called mosquitoes are responsible for much human suffering around the globe because of their ability to transmit diseases.
Could climate change cause these diseases to spread? While this is an extremely important health question, the answer is far from simple.
Complicated life cycle
The life cycles of mosquitoes and their viral parasites are particularly complicated.
Only adult females consume blood, and the immature stages (larvae) live in fresh or brackish water, filtering out small organic particles.
The virus undergoes certain parts of its lifecycle inside particular mosquito organs, but also requires other organs in the vertebrate host to complete its life cycle. And to get into a vertebrate, such as us, it relies on a hungry blood-sucking insect.
These viruses always have other hosts besides humans, which may include native and domestic animals. The pathway that these viruses take to infect humans is often via our domestic animals, which are also bitten by the same mosquitoes that feed on us.
In addition, rates of virus transmission to humans are affected by the human-built environment and by human behaviour.
Because mosquitoes breed in water, changes in rainfall patterns are likely to change the distribution and abundance of mosquitoes, and therefore could affect disease transmission.
The Australian climate is characterised by its variability; however, we have experienced a general trend towards increased spring and summer monsoonal rain across northern Australia, and decreased late autumn and winter rainfall in the south.
Kunjin virus is mainly transmitted in Australia by a small mosquito called Culex annulirostris, the common banded mosquito. We are lucky because human infection rarely causes disease, even though Kunjin and the common banded mosquito are widespread in Australia.
Kunjin’s close relative, the US strain of West Nile virus, is much more virulent, causing more human disease. These viruses are well known for their ability to mutate quickly, so they keep medical authorities on their toes.
Higher-than-average rainfall and flooding in eastern Australia in the second half of 2010 and 2011 provided ideal breeding conditions for the common banded mosquito, and in 2011 a dangerous strain of Kunjin appeared that caused acute encephalitis (swelling of the brain) in horses. The disease has so far been detected in only one human; however, this mosquito feeds on both humans and horses.
This new virulent strain of Kunjin also appeared in new areas east of the Great Dividing Range, suggesting other unknown changes in transmission.
As temperatures increase, mosquito activity will begin earlier in the season, reach higher abundance sooner and maintain larger populations for longer. All of these factors are likely to increase the rate of transmission of Kunjin to both humans and animals.
While flooding may have helped spread Kunjin, drought may have helped another mosquito-borne virus.
It would be simple to assume that drought would reduce mosquito populations by reducing the larval habitat (water), and thereby reduce the incidence of mosquito-borne disease in Australia.
However, this is not necessarily the case. Another Australian mosquito, Aedes notoscriptus, the striped mosquito, is responsible for transmitting Ross River and Barmah Forest Virus in Australia.
The striped mosquito is unusual in comparison to its cousins because it breeds in small containers of water, such as tree holes in natural environments. The main carrier of Dengue in Australia, Aedes aegypti, shares this habit.
These small container habitats abound in Australia’s urban backyards, with water features, water and food bowls for pets, and various toys providing such breeding places.
With the drought, Australians became much more water wise, and installed various water storage devices in their gardens, ranging from buckets left out in a storm, to professionally installed rain tanks. All these are potential habitat for the striped mosquito to breed.
In this case, drought has increased the abundance of a virus-carrying mosquito because of a change in human behaviour.
The return of Dengue?
Dengue fever is transmitted in Australia by the mosquito Aedes aegypti. The mosquito is restricted to Queensland, and Dengue fever transmission is restricted to coastal northern Queensland.
Recent modelling predicts that moderate climate change would extend the Dengue risk zone to Brisbane, exposing much larger human populations to risk.
However, before the 1930s, Dengue fever transmission occurred almost as far south as Sydney, and Aedes aegypti was found throughout mainland Australia except in the deserts.
Both the mosquito, and the disease, have retreated to Queensland since then, and we don’t know why. What is clear is that we don’t really understand what controls the distribution of Aedes aegypti or Dengue in Australia, but given the contraction of the disease in historical time, it is unlikely that a warming climate will produce a simple response in the insect or the disease.
Australian insects will be affected by climate change, but simple predictions based on increasing average temperatures and changing rainfall patterns miss the important effects of complex biological interactions.
In addition, we are only just beginning to use models that are sophisticated enough to consider how insects might evolve under changing climate.
Investing in a deeper understanding of these complex biological webs, and their outcomes for human society, will result in great returns. Our predictions of the future state of Australian plants and animals will become more accurate and we will also improve human health and manage our biodiversity more sustainably into the future.
More than 350 million people worldwide suffer from type 2 diabetes. The condition is already rampant in several Western countries and numbers are now rising fast in emerging economies, such as India and China. But the right kind of dietary changes could dramatically reduce the impact of the illness on both patients and economies.
Alongside the impact of the disease and its associated complications on the lives of patients and their families, diabetes’ cost to health-care systems is huge. In Australia, for example, the total economic impact of type 2 diabetes is estimated at A$10.3 billion, while in the United States it is likely to exceed US$174 billion.
There are many ways to beat diabetes or reduce its impact; the key is making changes to your diet and lifestyle that you then follow for life. Indeed, lifestyle modification – eating a healthy diet and exercising regularly – is the cornerstone of any effective diabetes-management plan.
More than sugar
For decades now, the general recommendation has been for everyone to cultivate a high-unrefined-carbohydrate, low-fat diet. More recently, reducing the intake of sugar, one of the most popular carbohydrates, has been receiving a lot of attention. But a healthy eating plan for diabetes is not just about cutting out sugar. And scientific opinion is now turning in favour of lower carbohydrate diets – for everyone.
While excessive sugar will no doubt increase blood sugar levels, especially if you’re having sweetened drinks, any source of carbohydrate will have the same effect. This includes anything that contains flour, rice or pasta, as well as fruit and potato.
Carbohydrate foods with a low glycaemic index (GI), such as oats and legumes, on the other hand, will dampen down the blood sugar response. That’s why careful carbohydrate selection is now recommended for everyone, especially people who have type 2 diabetes.
New data from high-quality nutrition research now strongly suggests that restricting carbohydrates even further, while moderately increasing protein and unsaturated fat intake, may have further benefits for controlling type 2 diabetes and reducing the risk of complications.
What we did and found
Based on these ideas, our research teams have been studying the effects of a “Mediterranean” diet – low in carbohydrate, high in protein, and rich in vegetables, nuts, lean meats and healthy fats – in combination with an exercise plan. We wanted to see how much we could improve the health of people with type 2 diabetes.
We assigned 115 adults with type 2 diabetes to one of two weight-loss programs. One group followed a very low-carbohydrate and high-protein diet for 24 weeks. The other had a higher carbohydrate, but still low GI, diet.
Early results have been ground-breaking; our diet is better at improving diabetes control compared to traditional weight-loss diets. But its most striking benefit is that it reduces the amount of medication someone with diabetes has to take by half. This reduction was three times greater than that achieved by people who followed the lifestyle program incorporating a traditional high-carbohydrate diet plan.
Our very low-carbohydrate diet also improved blood cholesterol profile by increasing the levels of good (HDL) cholesterol and decreasing triglyceride (blood fat) levels to a greater extent than the traditional high-carbohydrate, low-fat diet. Both diets achieved similar reductions in bad (LDL) cholesterol levels – often a concern with some low-carbohydrate diets.
Variation of blood glucose levels through the day is emerging as a strong independent risk factor for diabetes complications. In our study, the very low-carbohydrate diet was also more effective in reducing the number and levels of blood glucose variations over a 24-hour period.
In 2008-09, of the estimated A$1,507 million spent on the health care of diabetes in Australia, A$490 million was spent on diabetes-related medications. Our findings suggest that, by implementing a lifestyle program incorporating a healthy low-carbohydrate, high-protein, high-unsaturated-fat diet at a national level, the country could save up to A$250 million annually through reductions in diabetes-related medication alone.
This does not even account for any additional cost savings that could be generated from the marked improvements in diabetes control and patients’ well-being. It is these costs – related to the complications of diabetes and patients’ ability to contribute to the economy – that account for most of the economic impact of type 2 diabetes.
Our research shows evidence from the latest nutrition science can guide dietary approaches to tackling one of the most serious global health challenges of this century.
Chris Proud is Theme Leader, Nutrition and Metabolism at South Australian Health & Medical Research Institute.
Grant Brinkworth is Senior Research Scientist in Human Nutrition at CSIRO.
Manny Noakes is Professor of Nutrition & Research Director for the Food and Nutrition Flagship at CSIRO.
This blog was originally published on the Total Wellbeing Diet website.
Fans of intermittent fasting programs – think the 5:2 diet – often find they have success with weight loss, so today we are taking a look at the pros and cons of this kind of diet.
While fasting technically refers to not consuming any food or liquid at all, intermittent ‘fasting’ diets, like the 5:2 diet, do involve a minimal kilojoule intake on the fasting days – we’re talking around 2,000 kilojoules all day, compared to the recommended daily intake of around 10,000 for men and 8,700 for women. These diets run on the premise that you fast for two days of the week and consume as many kilojoules as you like on the non-fasting days.
While 5:2 is the most popular configuration, others find they have more success following a 4:3 or 6:1 ratio of non-fasting to fasting days.
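The arithmetic behind these schedules is straightforward. As a rough sketch using the figures above (around 2,000 kJ on a fasting day against an 8,700 kJ baseline, and assuming no compensation on non-fasting days, roughly what the early research suggests), the weekly shortfall for each split works out as:

```python
# Rough weekly kilojoule arithmetic for intermittent fasting schedules.
# Figures from the article: ~2,000 kJ on a fasting day, ~8,700 kJ guideline
# daily intake. Assumes non-fasting days stay at the guideline intake.

NORMAL_KJ = 8700   # guideline daily intake (kJ)
FAST_KJ = 2000     # approximate intake on a fasting day (kJ)

def weekly_deficit(fast_days):
    """Weekly kilojoule shortfall versus eating NORMAL_KJ every day."""
    return fast_days * (NORMAL_KJ - FAST_KJ)

for fast_days in (1, 2, 3):          # the 6:1, 5:2 and 4:3 patterns
    deficit = weekly_deficit(fast_days)
    percent = 100 * deficit / (7 * NORMAL_KJ)
    print(f"{7 - fast_days}:{fast_days} -> {deficit} kJ/week (~{percent:.0f}%)")
```

On these assumptions a 5:2 pattern trims the week's intake by about 13,400 kJ, roughly a 22% reduction – comparable to a moderate continuous-restriction diet, which may help explain why the weight-loss results look similar.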
The surprising news is that studies suggest these diets are successful in achieving weight loss. Even more surprisingly, Dr Manny Noakes, Research Director of our Food and Nutrition Flagship, says research is revealing that people don’t eat more than they usually would on the non-fasting days – which is what many experts expected to see.
The research is still limited, but Dr Noakes says animal studies have been promising. Some of these studies have shown intermittent fasting can fend off illnesses including cancer, diabetes, heart disease and neurodegenerative disorders, and may improve insulin sensitivity.
Dr Noakes says she herself would not discourage someone who is seeing success on such a diet, though she cautions there is still a lot to learn before it gets the seal of approval.
“If people who are overweight have struggled to lose weight following other diets, and they find this works for them, then that is great. Weight loss, particularly belly fat, has many health benefits – visceral fat is involved in disrupting blood-sugar regulation and is associated with high cholesterol levels. It’s also a risk factor for developing Type 2 diabetes and heart disease.”
On the flipside, Dr Noakes says what we don’t yet know about intermittent fasting is what these diets mean for long term health.
“If the person is simply losing weight because they are effectively cutting a lot of kilojoules from their weekly intake, but they are still eating poorly, then I’d have to argue they still need to address their eating habits for longer-term health gain.”
She says while restricting your kilojoule intake is a guaranteed way to lose weight, cutting back indiscriminately can lead to an unbalanced, unhealthy diet, and recommends a more balanced approach. “It’s important not to cut key food groups including dairy, grains and cereals – you’ll be missing out on some important nutrients essential for good health.”
To summarise the pros and cons:
ON THE PRO SIDE:
- Loss of body fat/weight for overweight people is of health benefit in general.
- Early research shows contrary to what scientists expected to see, people do not consume more kilojoules on the non-fasting days.
- Intermittent fasting diets seem to be as effective as calorie restricted diets for weight loss.
- There is early research to suggest it is effective in curbing cravings.
- It provides an easier weight-loss plan than standard kilojoule-restricting diets – there is no weighing and there are no ‘forbidden’ foods to worry about. On fasting days the limited kilojoules are accounted for very quickly, and there are no restrictions on non-fasting days.
ON THE CON SIDE:
- Fasting diets don’t change the way you eat – there is no evidence at this stage that people eat healthier food than they did prior to starting the diet. While maintaining a healthy body weight is important for good health, a nutritious diet also provides important vitamins and minerals.
- There is limited research on the long term effectiveness – or any long term health issues related to intermittent fasting.
- This lack of research means we don’t know who the diet works for and who it might not – for example, what medications or illnesses it may interact badly with.
- Unlike diets that make healthy lifestyle changes – like the Total Wellbeing Diet – fasting diets do not provide advice on how to eat for optimal health, in a way that is sustainable in the long run.
These days, massive volumes of data about us are collected from censuses and surveys, computers and mobile devices, as well as scanning machines and sensors of many kinds. But this data can also reveal personal and sensitive information about us, raising some serious privacy concerns.
Data are routinely collected when we shop, use public transport, visit our GP or access government services in person or online. There’s also data from using our smart phones and fitness monitoring devices.
These data are generally collected for a purpose, called the “primary purpose”: for example, having purchased goods delivered, catching a bus from home to work, having a health check, obtaining a Medicare refund, navigating or searching our local area, or logging our fitness regime.
But in addition to being used for such primary purposes, many data are stored and used for other purposes, called “secondary purposes”. This includes research to help inform decision-making and debate within government and the community.
For example, data from Medicare, the Pharmaceutical Benefits Scheme and hospitals can be used to identify potential adverse drug reactions much faster than is currently possible.
What about privacy?
But these data can also reveal highly sensitive information about us, such as about our preferences, behaviours, friends and whether we have a disease or not.
Given the rapid change in the volume and nature of data in the digital age, it is timely to ask whether the existing ethics frameworks for the secondary use of such data are still adequate. Do they address the right ethical issues associated with research using the data? In particular, how will an individual’s privacy be protected?
There have been two important responses to these issues. A group of researchers, supported by the University of Melbourne and the Carlton Connect Initiative, explored these issues through workshops, desk research and many consultations.
They produced the Guidelines for the Ethical Use of Digital Data in Human Research. It’s a work in progress, requiring ongoing practice and revision, rather than a definitive set of prescriptions.
A team at CSIRO and the Sax Institute also addressed the deeper ethical issue of protecting privacy in the secondary use of health data. This work will be developed into Guidelines for Confidentiality Protection in Public Health Research Results.
Ethical issues for digital data
The first set of guidelines identifies five key categories of ethical issues as highly relevant to digital data, each requiring additional consideration when such data are used.
- Consent: making sure that participants can make informed decisions about their participation in the research
- Privacy and confidentiality: privacy is the control that individuals have over who can access their personal information. Confidentiality is the principle that only authorised persons should have access to information
- Ownership and authorship: who has responsibility for the data, and at what point does the individual give up their right to control their personal data?
- Data sharing: assessing the social benefits of research, including data matching and the re-use of data from one source or research project in another
- Governance and custodianship: oversight and implementation of the management, organisation, access and preservation of digital data.
The voluntary guidelines were developed to help people conducting research and to assist ethics committees to assess research involving digital data.
Without such guidelines, there is a risk that new ethical issues involving digital data will not adequately be considered and managed by researchers and ethics committees.
Privacy risks from the data
Traditionally, the data custodians responsible for granting access to data sets have sought to protect people’s confidentiality by only providing access to approved researchers. They also restricted the detail of the data released, such as replacing age or date of birth by month or year of birth.
More recently, data custodians are increasingly being asked for highly flexible access to more and more details about individual persons from an expanded range of data collections.
Custodians are responding by developing a new flexible range of access modes or mechanisms, including remote analysis systems and virtual data centres.
Under remote analysis, a researcher does not have access to any of the data but submits queries and receives analysis results through a secure webpage.
A virtual data centre is less restrictive than a remote analysis system. It enables researchers to interact directly with data, submit queries and receive results through a secure interface.
But the results of statistical analysis as released by a virtual data centre may still reveal personal information. For example, if a result such as an average is computed on a very small number of people then it is probably very close to the value for each of those people.
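The small-group problem is easy to demonstrate. In the hypothetical sketch below (all names and figures invented for illustration), an analyst can recover one person's exact value by differencing two legitimate-looking released averages – which is why custodians commonly suppress results computed on fewer than some minimum number of people:

```python
# Hypothetical illustration of statistical disclosure from released averages.
# All names and incomes here are invented for the example.

incomes = {"alice": 52000, "bob": 61000, "carol": 300000}

def released_mean(names):
    """An average as a remote analysis system might release it."""
    return sum(incomes[n] for n in names) / len(names)

# Two innocuous-looking queries:
mean_all = released_mean(["alice", "bob", "carol"])   # average of 3 people
mean_two = released_mean(["alice", "bob"])            # average of 2 people

# Differencing attack: the two averages pin down Carol's income exactly.
carol_recovered = 3 * mean_all - 2 * mean_two
print(carol_recovered)   # 300000.0

# A common safeguard: refuse to release results computed on small groups.
MIN_CELL_SIZE = 5

def safe_mean(names):
    if len(names) < MIN_CELL_SIZE:
        return None          # result suppressed
    return released_mean(names)
```

Minimum cell-size rules like this are one of the simpler confidentiality protections; more sophisticated systems also perturb outputs or restrict overlapping queries so that differencing attacks of this kind cannot be composed.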
By following such voluntary guidelines, researchers can maintain confidentiality while ensuring that society can benefit from their work.
The rapid technological advances in our society are creating more and more data archives of many different types. It’s vital that we continue to assess the ethical and privacy risks from secondary use of this data if researchers are to reap the potential benefits from access to the information.
Bushfires are highly chaotic natural events, dangerous to people and homes in their path and even more dangerous to those brave enough to fight them.
Australia is all-too-familiar with tragedy caused by bushfire, with days such as Ash Wednesday and Black Saturday ingrained into public and personal memories. The costs in a bad bushfire season can run into billions of dollars, although nothing can truly account for the lives and communities affected by these events.
Bushfires are hard to predict for two reasons. No-one can be sure where or when they will start, although well-educated guesses can be made.
Weather conditions conducive to the outbreak of bushfires are well known and serve to prompt total fire bans to reduce the chance of accidental ignitions. Unfortunately, some of the most frequent causes – lightning strikes and arson – are inherently unpredictable.
Once a bushfire has started it is also difficult to predict precisely where it will go.
While all bushfires do follow well-understood physical laws, fine-scale variations in factors such as the weather, topography and distribution of fuel mean that a bushfire may appear to behave erratically.
Sudden shifts in the wind direction may cause a quiescent flank to burst to life, creating a new wider fire front. A single tree next to a road or river may enable the fire to jump across an otherwise impassable barrier.
Fighting and controlling fires is a major difficulty for emergency services due to this level of uncertainty. Even deciding the best evacuation routes in uncertain fire conditions can be challenging.
Studying bushfire behaviour
This apparent unpredictability has not deterred fire scientists. Since the early part of the last century these scientists have been carefully studying the behaviour and spread of fires in different conditions.
The results have been collected and tabulated into mathematical formulae to predict how fast a fire will spread. These have been used in Australia for many years for early warning and planning purposes.
But the speed of a fire depends on a wide range of factors. These range from large scale effects, such as the weather or slope of the land, to the small scale, such as whether the fire is burning through leaf litter or grass. The resulting mathematical calculations are complicated, as all of these factors must be included.
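To give a flavour of such formulae: one widely cited fit of the McArthur Mk5 forest fire danger meter (Noble, Bary and Gill, 1980) expresses a fire danger index, and from it a rate of spread, in terms of exactly these kinds of factors. The coefficients below are quoted from that fit as best remembered, so treat this as an illustrative sketch rather than an operational tool:

```python
import math

def forest_fire_danger_index(drought_factor, humidity, temp_c, wind_kmh):
    """Approximate McArthur Mk5 Forest Fire Danger Index (Noble et al. fit).
    drought_factor: 0-10; humidity: %; temp_c: deg C; wind_kmh: 10 m wind."""
    return 2.0 * math.exp(-0.450 + 0.987 * math.log(drought_factor)
                          - 0.0345 * humidity + 0.0338 * temp_c
                          + 0.0234 * wind_kmh)

def rate_of_spread_kmh(ffdi, fuel_load_t_ha):
    """Flat-ground forest rate of spread; slope and fine-scale fuel
    variation (the factors discussed above) would modify this further."""
    return 0.0012 * ffdi * fuel_load_t_ha

# A hot, dry, windy afternoon with 20 t/ha of fuel (illustrative values):
ffdi = forest_fire_danger_index(drought_factor=10, humidity=30,
                                temp_c=35, wind_kmh=30)
print(round(ffdi), round(rate_of_spread_kmh(ffdi, 20), 2))
```

Even this simplified form shows why the full calculations get complicated: every input varies in space and time, and real systems must layer slope corrections, fuel maps and forecast weather on top.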
Fire science, like many other science disciplines, has benefited from the recent growth in computer processing and data storage. These advances mean meteorological models can now give weather forecasts at very fine scales.
Improvements in computer algorithms have led to newer, more powerful, models to represent spreading fires. Growth in data storage has allowed the creation of detailed maps of terrain and vegetation.
Spark: a new insight into bushfire spread simulation
Fire spread simulation is an intersection of a number of disciplines including ecology, geography, physics, meteorology, mathematics and computer science. When simulating fires, each of these must work together.
To do this most effectively, a new way to bring all of these parts together was needed. This led to the creation of a new software system called Spark.
Spark is a bushfire prediction framework containing all the parts needed to process fine-scale weather and fuel data, run advanced fire simulations and depict the results. The system will be released today at the Australia New Zealand Disaster Management Conference on the Gold Coast.
The parts that make up Spark can also be connected together in whichever way best suits the user. This also has the advantage that as new models come along, the older parts in the system can simply be replaced.
The system enables scientists from multiple disciplines to collaborate. Currently, fire scientists are working to improve fire behaviour models, computer scientists are building new ways to simulate perimeter propagation and software engineers are developing the system on the latest computational hardware.
Spark has been built with the uncertainty of fire behaviour foremost in mind. For predictions of ongoing fires, multiple different cases can be run for slightly different weather forecasts.
The system contains statistical components that allow the results to be combined into maps showing the likelihood that the fire will reach a given location by a given time.
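The statistical idea behind such likelihood maps is simple: run many simulations under perturbed forecasts, then, for each location, report the fraction of ensemble members in which the fire has arrived by a given time. A minimal sketch, with made-up arrival times standing in for real simulation output:

```python
import random

random.seed(1)

# Pretend each ensemble member returns an arrival time (hours) for one
# location, or None if the fire never reaches it in that run. In a real
# system these would come from fire-spread simulations under perturbed
# weather forecasts.
def simulate_arrival():
    if random.random() < 0.2:        # 20% of runs: fire misses the location
        return None
    return random.gauss(6.0, 1.5)    # otherwise arrives ~6 h after ignition

runs = [simulate_arrival() for _ in range(1000)]

def arrival_likelihood(runs, by_hour):
    """Fraction of ensemble members in which the fire arrives by `by_hour`."""
    hits = sum(1 for t in runs if t is not None and t <= by_hour)
    return hits / len(runs)

for hour in (4, 6, 8):
    print(f"P(arrived by {hour} h) = {arrival_likelihood(runs, hour):.2f}")
```

Repeating this for every cell in a grid yields a probability map rather than a single predicted fire perimeter, which is a more honest product given the uncertainty in the inputs.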
Other current research involves improving fire predictions by using a range of conditions, some likely and others very unlikely.
These predictions can be combined with real-world measurements of the fire, using a statistical method to feed the observations back into the model. This allows the model to respond to changing conditions, including highly unlikely events, providing better predictions of future fire behaviour.
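One simple way to feed observations back into an ensemble is a particle-filter-style update: weight each prediction by how well it agrees with what was actually observed, then resample. This is an illustrative sketch only; the statistical method used in practice may differ.

```python
import math
import random

def reweight(ensemble, observation, obs_std=1.0):
    """Minimal particle-filter-style update (illustrative only).

    Each ensemble member is a predicted fire-front position (km).
    Members are weighted by a Gaussian likelihood of the observed
    position, then resampled, so predictions that agree with the
    observation survive more often."""
    weights = [math.exp(-0.5 * ((pred - observation) / obs_std) ** 2)
               for pred in ensemble]
    total = sum(weights)
    weights = [w / total for w in weights]
    return random.choices(ensemble, weights=weights, k=len(ensemble))

ensemble = [2.0, 3.5, 5.0, 8.0]   # predicted fire-front positions (km)
updated = reweight(ensemble, observation=3.2)
```

After the update, the ensemble clusters around predictions consistent with the measured fire, so the next forecast step starts from a better estimate of the current state.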
Bringing the latest fire science to the fireground
The collaborative approach behind Spark means that services and agencies using the system will benefit from the latest advances in fire science.
The system can be fully customised and can be integrated with existing systems. Spark can also be built into any number of applications, such as evacuation planning or fire regime tools.
Spark can also be used for land management and planning, fire mitigation analysis, real-time fire prediction, risk analysis or reconstruction and analysis of fire events.
James Hilton is Research scientist at CSIRO.
Andrew Sullivan is Research Team Leader, Bushfire Behaviour and Risks at CSIRO.
Mahesh Prakash is Principal Research Scientist, Fluid Dynamics at CSIRO.
Ryan Fraser is Research Manager at CSIRO.
Australia’s CSIRO has come up with some pretty amazing inventions over the past 86 years of research, from polymer banknotes to insect repellent and the world-changing Wi-Fi. But we can also lay claim to something a little more esoteric – we actually invented a whole new word.
The word is “petrichor”, and it’s used to describe the distinct scent of rain in the air. Or, to be more precise, it’s the name of an oil that’s released from the earth into the air before rain begins to fall.
This heady smell of oncoming wet weather is something most Australians would be familiar with – in fact, some scientists now suggest that humans inherited an affection for the smell from ancestors who relied on rainy weather for their survival.
Even the word itself has ancient origins. It’s derived from the Greek “petra” (stone) and “ichor” which, in Greek mythology, is the ethereal blood of the gods.
But the story behind its scientific discovery is a lesser known tale. So, how is it that we came to find this heavenly blood in the stone?
“Nature of Argillaceous Odour” might be a mouthful, but that was the title of the paper, published in Nature on March 7, 1964, by CSIRO scientists Isabel (Joy) Bear and Richard Thomas, that first described petrichor.
Thomas had for years been trying to identify the cause of what was a long-known and widespread phenomenon. As the paper opened:
That many natural dry clays and soils evolve a peculiar and characteristic odour when breathed on, or moistened with water, is recognised by all the earlier text books of mineralogy.
The odour was particularly prevalent in arid regions and was widely recognised and associated with the first rains after a period of drought. The paper went on to say:
There is some evidence that drought-stricken cattle respond in a restless manner to this “smell of rain”.
The smell had actually been described already by a small perfumery industry operating out of India, which had successfully captured and absorbed the scent in sandalwood oil. They called it “matti ka attar” or “earth perfume”. But its source was still unknown to science.
Joy and Richard, working at what was then our Division of Mineral Chemistry in Melbourne, were determined to identify and describe its origin.
By steam distilling rocks that had been exposed to warm, dry conditions in the open, they discovered a yellowish oil – trapped in rocks and soil but released by moisture – that was responsible for the smell.
The diverse nature of the host materials has led us to propose the name “petrichor” for this apparently unique odour which can be regarded as an “ichor” or “tenuous essence” derived from rock or stone.
The oil itself was thus named petrichor — the blood of the stone.
Bring on the humidity
The smell itself comes about when increased humidity – a precursor to rain – fills the pores of stones (rocks, soil, etc) with tiny amounts of water.
While it’s only a minuscule amount, it is enough to flush the oil from the stone and release petrichor into the air. This is further accelerated when actual rain arrives and makes contact with the earth, spreading the scent into the wind.
According to the Nature paper:
In general, materials in which silica or various metallic silicates predominated were outstanding in their capacity to yield the odour. It was also noted that the odour could be obtained from freshly ignited materials rich in iron oxide, with or without silica.
It’s a beautiful sequence of events, but one that may be hard to visualise.
Thankfully, in a testament to the ongoing scientific fascination with this finding, a team of scientists at the Massachusetts Institute of Technology have just this year released a super slow motion video of the petrichor process in motion.
Using high-speed cameras, the researchers observed that when a raindrop hits a porous surface, it traps tiny air bubbles at the point of contact. As in a glass of champagne, the bubbles then shoot upward, ultimately bursting from the drop in a fizz of aerosols.
The team was also able to predict the amount of aerosols released, based on the velocity of the raindrop and the permeability of the contact surface, which may explain how certain soil-based diseases spread.
There’s a small body of research and literature on petrichor that’s fascinating in its own right, including Thomas and Bear’s subsequent paper, “Petrichor and Plant Growth”, published a year after they first named the smell.
So what happened to Joy Bear and Richard Thomas?
Richard had actually retired from CSIRO in 1961, when he was the first Chief of the Division of Mineral Chemistry. He died in 1974, aged 73.
Joy, aged 88, a true innovator and pioneer in her field, retired from CSIRO only in January this year, after a career spanning more than 70 years.
The joint discovery of petrichor was just part of a truly remarkable and inspiring career which culminated in 1986, with Joy’s appointment as a Member of the Order of Australia for services to science.
We are thankful to both for the lasting legacy of giving a name to the smell of rain, and to Joy for the role model she has been to so many women in science.
This is part of a series on CSIRO Inventions.
This article was originally published on The Conversation.