On Wednesday 11 February 2026 the United Nations marks the 11th International Day of Women and Girls in Science.
This year, the day is framed by the theme ‘Synergizing AI, Social Science, STEM and Finance: Building Inclusive Futures for Women and Girls’.
To explore this theme, we reflect on how gender bias in geoscience is mirrored by modern technologies, particularly artificial intelligence (AI). As the UN warns, “AI offers powerful tools for data analytics, health diagnostics, climate modelling and more; however, without targeted interventions, its benefits risk bypassing women and girls”.
Artificial intelligence is the simulation of human intelligence by computer systems, including learning, reasoning, and self-correction. Its use in everyday life is becoming ever more prevalent, from digital assistants like Siri and Alexa to personalised recommendations, targeted advertising, and media generation.
AI has also been increasingly used in geoscience to improve data analysis, modelling techniques, and decision-making. Some specific examples of the use of AI across the geosciences are:
- Remote sensing and image analysis: AI algorithms can analyse large volumes of satellite data to identify patterns and anomalies that may be difficult for humans to detect (see the sketch after this list). This has applications in natural resource exploration, disaster response, and land management.
- Geological modelling: AI can be used to create 3D models of geological features, such as faults, fractures, and mineral deposits. These models help geoscientists better understand subsurface geology and make more accurate predictions about its characteristics and behaviour.
- Climate modelling and prediction: AI can be used to model and predict climate patterns, such as sea surface temperature, precipitation, and extreme weather events. This can aid in the development of climate change adaptation and mitigation strategies, as well as geoscientists' understanding of our planet.
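To make the remote sensing item above a little more concrete, here is a minimal, purely illustrative sketch of unsupervised anomaly detection on synthetic ‘satellite’ band values, assuming Python with NumPy and scikit-learn. It is not a real remote-sensing pipeline, and every value in it is made up.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Synthetic reflectance values for 10,000 "pixels" across 4 spectral bands.
pixels = rng.normal(loc=0.3, scale=0.05, size=(10_000, 4))

# Inject a small cluster of unusual pixels, standing in for a spectral
# signature of interest (e.g. exposed mineralisation or flood water).
pixels[:50] += 0.4

# Fit an Isolation Forest and flag roughly the most isolated 1% of pixels.
detector = IsolationForest(contamination=0.01, random_state=0)
labels = detector.fit_predict(pixels)  # -1 = anomaly, 1 = normal

print(f"Flagged {int((labels == -1).sum())} of {len(pixels)} pixels as anomalous")
```

The same pattern, fit a detector and flag the most unusual samples, scales up to real imagery, but what gets flagged depends entirely on what the training data contains, which is exactly where the biases discussed below can creep in.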
We asked ChatGPT (an AI chatbot built on a large language model that generates conversational responses) to tell us about the impact gender bias in AI could have in the geosciences.
“Gender bias in artificial intelligence (AI) can have significant impacts on the geosciences, as it can perpetuate and exacerbate existing gender imbalances in the field. AI systems are trained on large datasets, and if these datasets contain gender biases, the AI algorithms will learn and replicate those biases.
One area where this bias can be seen is in the use of AI in hiring and promotion practices. For example, if an AI system is trained on historical data that shows men as more successful in the geosciences, the algorithm may give preference to male candidates, perpetuating the gender imbalance.
This can result in women being overlooked for positions or promotions, leading to a lack of diversity in the field.
Another area where gender bias in AI can affect the geosciences is in research and data analysis. If an AI system is trained on data that is biased or incomplete, it may produce results that reflect those biases. For example, if a dataset only includes data collected by male researchers or only includes data from certain regions, the AI algorithm may produce biased results that do not accurately represent the entire population or the full range of geological phenomena.
The use of AI in natural disaster response and risk assessment is another area where gender bias can have significant consequences. If an AI system is trained on data that does not account for the experiences and vulnerabilities of different genders, it may produce inaccurate risk assessments or inadequate responses to disasters, putting women and other marginalized groups at greater risk.
To mitigate the impact of gender bias in AI on the geosciences, it is important to ensure that datasets used to train AI algorithms are diverse, inclusive, and representative of the entire population. Additionally, AI systems should be designed and tested with a gender-inclusive lens, taking into account the experiences and vulnerabilities of different genders. By addressing gender bias in AI, we can help to create a more equitable and inclusive geosciences field.”
Interestingly, ChatGPT acknowledges its own weaknesses, recognising the limitations that data bias can impose. Data bias occurs when systematic errors or distortions in a dataset skew results and lead to inaccurate or unfair conclusions. In AI, models trained on such data learn and amplify those distortions, hindering efforts to create more inclusive and equitable systems.
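As a simple illustration of how this happens, the sketch below (assuming Python with NumPy and scikit-learn; the data and groups are entirely synthetic) trains a single model on a dataset in which one group is badly underrepresented and follows a different pattern from the majority. The model reproduces the majority pattern and performs far worse for the minority group.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Group A: 950 samples where a positive feature value means a positive outcome.
x_a = rng.normal(size=950)
y_a = (x_a > 0).astype(int)

# Group B: only 50 samples, and the feature-outcome relationship is reversed.
x_b = rng.normal(size=50)
y_b = (x_b < 0).astype(int)

# Pool everything, as a naive pipeline might, and fit a single model.
X = np.concatenate([x_a, x_b]).reshape(-1, 1)
y = np.concatenate([y_a, y_b])
model = LogisticRegression().fit(X, y)

# The model learns the majority pattern, so it fails far more often
# for the underrepresented group.
print("accuracy, group A:", round(model.score(x_a.reshape(-1, 1), y_a), 2))
print("accuracy, group B:", round(model.score(x_b.reshape(-1, 1), y_b), 2))
```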
While AI can speed up decision-making, it is important to remember that the technology is only as good (and as equitable, informed, and reliable) as the data available to it. Anyone relying on AI to support decision-making should consider where the data has come from, who generated it, whose voices have been included, and whose are missing.
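One practical starting point is simply to audit who is represented in a dataset before using it. The short sketch below does this for a hypothetical metadata table with pandas; the column names and values are illustrative only.

```python
import pandas as pd

# Hypothetical metadata about who collected a set of field observations.
# The columns and values are illustrative only.
metadata = pd.DataFrame({
    "region": ["Europe", "Europe", "Europe", "N. America", "Africa"],
    "lead_researcher_gender": ["male", "male", "female", "male", "male"],
})

# Who is represented, and in what proportions? Anything absent here simply
# cannot be reflected in a model trained on this data.
print(metadata["region"].value_counts())
print(metadata["lead_researcher_gender"].value_counts(normalize=True))
```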
The absence of any mention of intersectionality in ChatGPT’s answer highlights a potential challenge: how much data actually exists on the power dynamics and variables that shape experience within gender itself, from gender identity to ethnicity, culture, religion, and class. Leaving out intersectionality can lead to a narrow and incomplete understanding of gender issues and perpetuate the marginalisation and exclusion of people with multiple marginalised identities. It can also result in policies and initiatives that fail to address the needs and concerns of all individuals, and may even exacerbate existing inequalities. Recognising and addressing the complex, interconnected ways in which gender intersects with other factors is essential to creating more inclusive and equitable solutions.
Despite best efforts, bias can seep into data through each researcher’s own history, culture, experiences, and interests, even down to the areas they choose to study. This means that, while underrepresentation persists in geoscience research roles and careers, the needs and experiences of underrepresented groups are less likely to be captured in research, increasing the likelihood of bias in the AI algorithms trained on it.
Overall, AI has the potential to greatly improve our understanding of the Earth and our ability to make informed decisions about natural resource management, disaster response, and climate change adaptation and mitigation. However, it is important to ensure that the use of AI in geoscience is ethical and inclusive, taking into account potential biases and considering the impacts on all stakeholders.
Note: This article was originally posted on the Geological Society’s blog on 8 March 2023.