
How the Global South Is Shaping Inclusive Tech: Lessons from Payal Arora’s ‘From Pessimism to Promise’

Lessons from the Global South on Designing Inclusive Tech

Discover how optimism in the Global South is redefining digital culture and AI design. Explore Payal Arora’s key lessons on building inclusive, pleasure-positive, and sustainable tech that empowers marginalized voices and supports a green economy. Learn actionable strategies for designing technology that truly benefits diverse communities and the planet.


Recommendation

People in the West tend to be anxious about the digital future. They worry technologies like artificial intelligence (AI) will manipulate them and undermine democratic institutions. Western policymakers focus on breaking up large tech companies and regulating social media and AI. But according to digital anthropologist Payal Arora, young people outside the West, and especially in the Global South, are optimistic about, and even excited by, the digital future. For them, the digital world is a source of pleasure and entertainment — and offers hope for a better future.

Take-Aways

  • Western users are pessimistic about digital culture — but young people in the Global South are optimistic.
  • AI design should focus on relationships and how people can develop technology for positive ends.
  • Think of AI and its algorithms in the context of a creative economy that will engage young people.
  • Design digital technologies to encourage pleasure and intimacy.
  • Digital technologies can be used for surveillance in a police state but can also be used to care for others.
  • Design AI in ways that support a green economy.
  • AI cannot stand apart from the physical, multicultural world.
  • AI design must move past the Global South’s colonial past.

Summary

Western users are pessimistic about digital culture — but young people in the Global South are optimistic.

People in the West today have remarkably negative attitudes toward digital technologies. They see digital platforms as traps, crafted to rob people of their attention and their will to take action in the world. They view the algorithms that drive search engines and social media as manipulative and inherently biased. This perspective has bred a widespread “pessimism paralysis”: Westerners are more likely to accept the status quo — despite railing against it — than to try to change it. Moreover, they ignore or discount the ways that digital culture supports human connection and expression, especially in parts of the world controlled by oppressive regimes.

“Why do we feel the need to cancel the positive with the negative?…Much like any good marriage, society’s relationship with technology needs work. We need to draw boundaries, watch for imbalances, and ensure that the union stays on track.”

For young people in the Global South — including China and India — digital culture is a source of joy, inspiration and liberation. Nearly 90% of youth worldwide live in the Global South, and, unlike their Western peers, they are optimistic about the future and eager to use digital platforms to change their lives and the world in which they live. Current algorithms may be trained on data that encodes various forms of racial and gender bias, yet vast numbers of young people in the Global South are opting in to WhatsApp, Messenger, TikTok, and Instagram. They are using digital media to push beyond the boundaries their societies have set for them. Indeed, during the COVID-19 pandemic, those apps were many young people’s principal means of self-expression.

AI design should focus on relationships and how people can develop technology for positive ends.

Everyone knows the kinds of problems artificial intelligence can bring with it, such as algorithmic biases. But many tech leaders want to use AI as a force for good — offering AI-enabled solutions for the problems that poor and marginalized communities in the Global South face. The question is, what does “doing good” actually entail? When British forces occupied India in the 1800s, they framed colonialism as beneficial to “the Natives.” Even assuming Britain had noble intentions, it’s difficult to argue, in hindsight, that the result of colonialism was not oppressive.

“AI for Good projects driven by tech philanthropy spread the narrative that tech can solve problems of ‘the natives’ by tapping into their aspirational yearnings for freedom, stability, and inclusion.”

It’s not that “AI for Good” projects, as applied to the Global South, will inevitably reproduce colonial oppression. But in order to avoid negative effects, it’s important to be mindful of the diversity and specificity of the contexts and people involved. The tech industry, and Silicon Valley in particular, has become infamous for promising to solve problems with new innovations, only to create new problems with those novel technologies — which they then attempt to solve with the next round of developments. Tech designers need to focus less on grandiose plans to “save the world” and more on the relationships between real people, concrete circumstances, and actionable policies. People need to stop looking at things in black-and-white or “binary” ways. The real world is messy. Technology that will benefit humans in their daily lives requires designing with the on-the-ground contexts and experiences of users in mind.

Think of AI and its algorithms in the context of a creative economy that will engage young people.

Some two decades ago, urban studies thinker Richard Florida came up with the concept of the “creative class.” In Florida’s original vision, members of the creative class were highly educated, freewheeling, disruptive innovators who mostly lived in urban centers like Los Angeles, San Francisco, New York, and London. But at this point, Florida’s conception is dated. The next big movement in digital culture and the creative economy isn’t going to come from Europe or North America but from the tens of millions of young people in the Global South. Weighed down by social and political pressures, young people in the Global South are eager to define their digital futures. They don’t have the choice to do otherwise. They can’t just sit around and wait for their societies to change.

“We are witnessing a creativity turn in the Global South as Indigenous innovators are looking at their people as creative assets, legitimate markets, and content partners to help build their data products and services.”

Social media apps like TikTok and Bigo did well in the Global South because they focused on the poor and marginalized, allowing those most overlooked by their societies to gain greater visibility in the public sphere. The market in the Global South is highly competitive, in part because users are so eager for change. And since most users in the Global South live in closed, repressive, non-democratic countries, they have to be more innovative — if only to evade the regimes they live under. If the creative economy in the West is all about the urban elite, in the Global South, it’s all about the underclass or the “marginalized majority.” They are hungry for attention. They want to be visible. The creative economy and digital culture offer them a way to chart a better future.

Design digital technologies to encourage pleasure and intimacy.

Policymakers and technology professionals need to adopt a “pleasure-positive” approach to online sexuality. Though the technology industry avoids acknowledging data on customers’ consumption of pornography, that consumption is crucial to many business models: Some 35% of internet downloads are in some way connected to porn, and one-third of the porn-viewing audience is women. Health publications steer clear of discussing sex as a core human need, and educational and other public institutions offer young people little meaningful information on sexuality. Thus, it’s unsurprising that young people — especially those living in cultures where open discussion of sex and sexuality is taboo — turn to digital pornography as a form of sex education.

“We need to shift the social mindset of a war on vulgarity by making peace with sex.”

Countries in the Global South, like India, are especially proactive about placing controls on people’s access to online sexual content. Indeed, in Muslim countries in the Middle East, so much as browsing a sex site can be a criminal offense. Even so, young people in largely Muslim countries like Egypt find ways to access content on love and sex, often in encrypted ways or in contexts in which people can conceal their identity. While restrictive governments often decry their efforts, NGOs have embraced social media campaigns and created apps to help broaden access to sex education in places like Pakistan, where, even between mothers and daughters, discussion of reproductive functions is largely taboo.

Despite these developments, public-facing views on sex and sexuality in the Global South are unlikely to change anytime soon. So, some digital innovators are exploring ways for young people to build ties online that prioritize romance and relationships over sex. One algorithm-driven website, Soulgate, which is not a dating site, allows young men and women to interact freely and playfully through avatars. The Indian dating app FRND, which doesn’t allow real pictures, targets smaller towns and villages — and by 2021 had over five million users. It seeks to create an open, pleasurable, and safe environment for young men and women to connect. In the lonely world of the 21st century, it’s important to have digital venues that promote desire and pleasure while valuing care, compassion, and intimacy.

Digital technologies can be used for surveillance in a police state, but can also be used to care for others.

People constantly observe each other but are often paranoid about surveillance, especially in its digital form. In the midst of the COVID-19 lockdowns, Zoom, WhatsApp, and Facebook became ways for isolated individuals to safely connect with circles of friends and sympathetic, like-minded groups. Still, the data generated by social media can be used by intrusive state institutions and by companies seeking more customers. Yet “social surveillance,” in its various digital forms, could instead serve principally as a means of healing the fragmented, often alienated social fabric of post-pandemic societies.

“To rebuild social trust, society needs a surveillance system of care — one that moves away from watching each other as a form of policing to watching over one another as a form of recognition and compassion.”

For public spaces to be safe, there need to be “eyes on the street.” In Mexico City, where murders of women rose by more than 100% in 2021, women have joined Facebook and WhatsApp groups whose members track one another’s locations and alert the group if they feel scared or are in trouble. During the worst phases of the COVID-19 pandemic in China, mothers were able to support one another and share their problems on TikTok.

In a world facing serious, sometimes dangerous mental health problems, apps can help people improve their self-care. The Global South leads in using apps to respond to emotional and mental health crises: identifying serious risks, managing and monitoring symptoms, and helping users seek help. Virtual Hope Box, for instance, provides tips on coping with stress along with calming visuals and music. All of this is lucrative, too: The global mental health app market was worth over $30 billion in 2021, and it’s expected to keep growing.

Design AI in ways that support a green economy.

Anything that’s bad for the Earth is probably also bad for human beings, and whatever is good for the planet is likely good for human life. Design, whether of buildings or of digital technologies, needs to align with people’s environmental and social goals, and those basic principles should be translated into real professional standards. According to a 2012 European Commission report, well over half of a product’s environmental impact is determined by decisions made at the design phase.

“Indigenous communities across the Global South have, for centuries, related to nature as a sentient being with its own moods, feelings, intelligence, and rights.”

Western designers can model their approaches on what Indigenous communities in the Global South have been doing for generations. These approaches fall under four categories: frugality, collectivity, subsistence, and repair. A culture of frugality in design concerns both what gets designed and from which materials, and how it gets consumed; it should also recognize that current consumption practices tend to exacerbate social inequalities inherited from the colonial past. A culture of collectivity in design means giving local communities agency over the design and reimagining and redefining power relations between people. A culture of subsistence in design means affirming the value of diversity — for example, the way diverse ecologies make agriculture more sustainable. Women have long provided leadership on these issues. A culture of repair in design isn’t just about fixing things so they last; it also means making it possible to repurpose, renew, and regenerate what already exists. Repairing small parts of the world can change the whole world.

AI cannot stand apart from the physical, multicultural world.

The media tend to assume that the adoption of digital technologies, and especially AI, will instantly change everything about the way people interact and live. But design doesn’t immediately create positive or negative social change. Design is a human activity, and if a particular design survives, it gets folded into the way people live their lives.

“AI is the air we breathe, the water we drink, the land we stand on. It manifests in underwater cables, data centers, and solar panels. The hunger of the cloud is fed by our planet.”

People get the impression that digital technologies like artificial intelligence are cerebral and abstract. Once people understand that digital technologies are actually physically embodied, it will be easier and more natural to incorporate sustainable, environmental practices into design approaches. For sustainable, green design to move forward, it will be crucial to evaluate the material costs of incorporating AI and various forms of digital automation into human life. The digital future must be one that promotes the health of the environment and human societies. In addition, the vast number of users in the Global South is going to diversify digital culture — and the datasets and algorithms that come with it. Companies, entrepreneurs, and designers are all going to have to make a multicultural and even multilingual approach their default mode.

AI design must move past the Global South’s colonial past.

AI and digital culture in the Global South and, ultimately, everywhere are going to have to “decolonize.” Everyone knows that the injustices and inequalities of the past are inscribed into AI and its datasets and algorithms. To move forward, AI will have to become more inclusive. For this to work, businesses and institutions will have to do more than state moral and political principles; they will have to build a decolonizing agenda into their day-to-day work.

“If an organization wants to get on board to build hope through their AI-enabled designs and policies, they need to buy into the fact that culture, local and global, truly matters.”

Organizations will also need to work equitably and establish rich relationships with civic groups, artists, and community leaders. They need to be open to others and learn from them. And finally, companies need to be accountable for their products’ effects on society, with people in place to monitor their products’ societal impact. Social change happens only slowly. Still, pessimism isn’t an option.

About the Author

Payal Arora is a digital anthropologist and consultant. She is the author of The Next Billion Users and is a professor of Inclusive AI Cultures at Utrecht University and co-founder of the feminist future of work initiative FemLab.