We’re living in a time when two powerful forces are reshaping our world: climate change and rapid advances in technology. Faced with rising heat, floods, and extreme weather, governments and businesses are turning to data and artificial intelligence (AI) for answers. But as these tools begin to shape major decisions, we have to ask: whose data are we using? Whose voices are being left out? And who gets to decide what counts as a ‘solution’? For organisations like CAG, which have spent decades championing environmental justice and citizen participation, this is both a challenge and an opportunity: a chance to rethink how evidence is used, not just to predict the future, but to shape it in ways that are fair, inclusive, and rooted in the needs of people and the planet.
The Power of Evidence in Policy Advocacy
Since its inception, CAG has grounded its advocacy in data and evidence. Whether showing how residents are adopting sustainable practices in energy, water, waste and transport, researching climate-induced migration patterns that reveal hidden social costs, or assessing climate literacy among school children and policy makers, CAG has used data to make invisible realities visible and to demand action. These efforts show how evidence, when collected ethically and analysed thoughtfully, can bridge the gap between citizens and the state.
But data alone does not create change. It is the democratisation of data — ensuring that communities participate in defining problems, generating evidence, and crafting solutions — that transforms information into action. CAG’s work has consistently prioritised this people-centred approach, in which data about communities is not extracted from them but produced with them.
AI Enters the Equation: Promise and Democratic Dilemmas
The arrival of AI in public policy offers remarkable possibilities. From modelling flood risks and optimising renewable energy distribution to forecasting crop failures and managing transportation systems, AI has the potential to improve the precision and responsiveness of governance. In India, AI is already being integrated into early warning systems for floods, agricultural monitoring, and urban planning.
Yet beneath this promise lies a set of democratic dilemmas. AI systems are only as fair as the data they are trained on, and that data often reflects the biases and blind spots of existing power structures. Marginalised communities, informal workers, and vulnerable groups are frequently absent from official datasets. When these blind spots become encoded into algorithms, the result can be unjust policies and invisible exclusions. As governance becomes more data-driven, there is a real risk that decision-making authority becomes concentrated among technocrats and data scientists. Then comes the opacity problem. Many advanced AI systems, particularly those using deep learning, function as black boxes, offering little transparency into how decisions are made. This makes accountability harder to enforce and public trust more difficult to build.
Moreover, AI-driven decision-making can distance governance from the people it serves. When algorithms determine resource allocation or prioritise infrastructure investments without public deliberation, we risk losing the democratic muscle that keeps governance accountable and participatory. AI can also widen digital divides, since access to data, technical expertise, and digital infrastructure remains highly uneven.
Climate Justice in the Digital Era
Consider just transitions in an era pockmarked by extreme climate events. Climate change is not only a scientific issue; it is a justice issue. Its impacts are uneven, hitting the poorest and most vulnerable the hardest. As AI becomes embedded in climate-related decisions, from where to build flood defences to how to distribute renewable energy, it is essential to ask: who benefits, and who is left behind?
AI models that optimise electricity distribution in urban centres, while efficient on paper, may overlook the lived experiences of informal settlements, where energy access is precarious and irregular. Without grounding AI systems in local realities, we risk designing "smart" systems that fail to serve the people most in need.
The Role of Civil Society: Navigators, Bridge-Builders, and Champions of Justice
In a world where data and technology increasingly influence how policies are made, civil society must move beyond reacting; it must help shape the direction we take. Organisations like CAG have a vital role in:
- Promoting transparency and accountability in the use of AI and data, ensuring systems are designed and deployed responsibly.
- Championing inclusive data practices that reflect the full diversity of lived experiences, especially from communities that are often underrepresented.
- Strengthening digital capacity at the grassroots level, so that individuals and communities are equipped to understand, engage with, and question decisions that affect their lives.
- Collaborating to create ethical standards for technology use, particularly in environmental governance, so that innovation serves the public good.
This moment isn’t just about adapting to change; it’s about actively shaping a digital future that respects human dignity, protects our environment, and deepens democratic participation. At its core, CAG works to ensure that technology supports, rather than replaces, community voices and democratic accountability.
Evidence with Empathy: CAG’s Approach
At CAG, we continue to explore how technology can serve justice, not just efficiency. Our pilot projects on climate literacy in schools, for example, combine data collection with behavioural insights, helping us understand not just what children know about climate change, but how they feel about it, and what they are ready to act on.
Similarly, our research on distributed renewable energy and solid waste systems does not stop at policy recommendations. It examines how households, informal workers, and communities experience these systems on the ground. In an age where numbers dominate narratives, we strive to bring empathy into evidence.
As we cautiously explore the role of AI tools, whether for data visualisation, translation, or analysis, we remain anchored in our core values: transparency, inclusion, and equity.
Justice Needs More Than Code
For AI and data to truly serve climate justice, they must be governed by principles of democracy, not just algorithms. This means asking hard questions about ownership, consent, access, and accountability. It means ensuring that civil society has the resources, tools, and space to engage with these technologies critically and constructively.
We call on funders, governments, and technologists to back this mission, not just with rhetoric, but with real investments in digital readiness, ethical AI research, and community engagement. Because building a just future cannot be outsourced to code.
In the end, data alone doesn’t create justice. People do.