IT Insights

Podcast

Human-centrism and the cultural context of technology with Payal Arora

Michał Grela

Relationship Manager at Future Processing

Payal Arora

Professor, Author and Speaker

Our guest is Payal Arora, co-founder of FemLab.Co and Professor and Chair in Technology, Values and Global Media Cultures at Erasmus University Rotterdam, and author of several books, including the award-winning “The Next Billion Users”, published by Harvard University Press. Her expertise lies in inclusive design and user experiences among low-income communities worldwide.

In the podcast we discuss human-centrism and the cultural context of technology. We also explore what a digital anthropologist does, as well as the connection between creative design in technology and human-centred solutions.

Michał Grela (MG): Hello and welcome to yet another episode of IT Insights by Future Processing. My name is Michał Grela, and today I’m happy to talk with Payal Arora, co-founder of FemLab.Co and a professor and chair in technology, values and global media cultures at Erasmus University in Rotterdam. Payal is an author of several books, including the award-winning The Next Billion Users with Harvard University Press. Her expertise lies in inclusive design and user experience among low-income communities worldwide. Today, we’re going to discuss human-centrism and the cultural context of technology. Thank you for joining the podcast, Payal.
Payal Arora (PA): Yeah. Happy to be here.
MG: Can you introduce yourself and share a few words about your background, please?
PA: Sure. So as you said, my background is really about digital anthropology. I’m a digital anthropologist, and I’ve been doing this for the last two decades. What this really means is that I study how people use digital media and make sense of it in their everyday lives, and in very different contexts, from Brazil to India to Africa. The underlying question driving this is: are we, as societies and groups of people, distinctly different in the way we express ourselves and in the way we make meaning of technology? And how do we take that information forward in designing systems that can genuinely enable human flourishing? So that’s been my driver for the last two decades.
MG: That’s very interesting. I noticed that Forbes named you the next billion champion and the right kind of person to reform tech. Congrats on that, that’s a big thing. You refer to yourself as a digital anthropologist, and I’m super keen on understanding who digital anthropologists actually are, and what your place is in the world of software development.
PA: Yeah. So digital anthropology is a fancy term, but what it really means is studying how people use technologies in their everyday lives. How do they integrate them? How do they make sense of them? How do they value them? Because we all know that many of us feel technology as an extension of ourselves, but what does that actually mean? And when you talk about an extension of yourself, you have to put that self in the context of the social group you’re in, the context within which you’re living. Those can shape your behaviors in ways which are distinctly different. Whether you’re a woman in Brazil or a young boy in Namibia, you may have distinctly different experiences and yet, perhaps, unifying universal qualities that bring us together.
MG: I guess this anthropological study of the relationship between humans and digital-era technology seems to be of growing importance when it comes to designing software solutions as well. Is that the case?
PA: Absolutely, because in the last few years you increasingly see terms like human-centered design, inclusive design, responsible design. There’s this whole notion that we need to put the user at the center again, because generic applications don’t quite fit. And it’s partly also because we can, technologically speaking, thanks to the capacity for personalization, which enables us to aggregate insights through deep learning technologies. So, in some ironic way, the more advanced technology gets, the more we realize we finally have the capacity to do something right, which is to tailor-make and personalize technologies so they can work for everybody. Diversity shouldn’t be looked upon as an obstacle but rather as a market opportunity.
MG: These are definitely buzzing concepts, and there are plenty of conversations around how to create technology that would be human-centered and put stress on that aspect. From your perspective, is the business actually there yet? Are these concepts actually being implemented, or is it just big talk at the moment?
PA: Well, there’s an attempt to do it, but there are a couple of obstacles, because we still go by the default normative, which is typically male, white, middle class, in the western, Anglo-Saxon context, partly because it’s within reach and partly because it’s the market you know. If you’re a typical company, you are functioning in a saturated market space, say in the United States or Europe, and yet it seems far more risky to get out of the west to look at these audiences and potential new user markets, because they appear extraordinarily alien and exotic and thereby high risk. I’ve been arguing that it is actually much more risky to stay in a saturated market, because then you’re competing with everybody else and you’re ignoring the majority of the world’s users, who are now online, are doing some phenomenally interesting things with technology, and are even leading the way in which technologies can be seen and used in the future. So you are putting your company and your teams at risk by ignoring what is basically the new normative.
MG: That’s very interesting, and I was just about to ask about that. Is there something like, let’s say, universal cross-cultural design, or would that unification be too risky, perhaps even exclusive? Is there any way to tap into this larger pool of users? Of course, there’s no magic wand, but maybe a silver lining. Is there any way to prepare something that would be more universal?
PA: Yeah, it’s a great question about a typical tension that has been posed: that if we go universal, it comes at the cost of personalization, it comes at the cost of culture. So it seems as if they’re two opposing ends, but that’s partly because we tend to construe culture as either identity, like cultural groups, or nationalism, like the west and the rest, the global south and the global north. But actually, if you look at culture as practice, then I think we will find the two of them coming into sync.
And let me give you an example. If you’re designing for someone who has only one arm, you would say, “Okay, I need to design this mobile phone for people with one arm, so for disability groups.” Then, as a company, I’d say, “Well, that’s rather limited, because that’s such a tiny niche. It’s not viable.” But if you ask a different question, how many people typically have just one arm free while they’re using their mobile phone, for a variety of different reasons? A mother who is carrying her child, who’s holding a pram, or who’s cycling in the Netherlands, I notice, with flowers in the front, groceries in one hand, trying to text. Or teenagers: half the teenagers you see cycling around are eating their sandwiches, cycling and texting at the same time.
In fact, the one-arm phenomenon is much more common, so it’s a universality of practice, even though the groups that benefit will differ based on their different needs. It could be disability. It could be a young person trying to multitask. It could be a mother who often doesn’t have a choice and has to multitask. There’s a whole variety of reasons why people need to use only one arm. And that’s just one example.
There are so many other examples, like battery life. If you look at why mobile phones took off, particularly in the global south, one of the big success stories was the long-lasting battery, because people there don’t typically have reliable access to electricity. For mobile companies to really take off, whether in Africa or Asia, they had to offer long-lasting batteries. So there was a lot of investment, and actually a lot of innovation coming from the global south, that enabled the telecom sector to take off. And obviously, everybody benefited from long-lasting battery life. That’s just another example, and I could go on. As you can see, if we can shift from an identity focus to a process focus, we can bring the universal and the particular together in a way that can work.
MG: I definitely see that you’re into challenging stereotypes and the status quo, and that’s really good. In one of your works, The Next Billion Users, you write that most of our assumptions regarding, for example, internet users in some regions are wrong. Can you elaborate on that a bit?
PA: Yeah. When I say that we get the assumptions wrong, I mean the typical assumptions we have about the so-called exotic other, which is the majority of the world’s population. For decades, we’ve been designing for a very tiny sliver of the human population, which is, as I said, male, white, middle class, in the Anglo-Saxon context. And that is fine as long as, of course, you belong to one of those populations, but we’re talking about two-thirds of the world’s population being totally neglected. You can’t just apply your understanding of one group to another seamlessly.
So if you look at what’s happening in the global south, there’s a variety of different practices, particularly driven by one of the key drivers, which is leisure. They are very much like you and I. They are seeking entertainment, sociality, gaming, even pornography. And these are typical practices in the west as well, but we are not allowing them to be examined in the same way, because we have a double standard: when we look at people in the global south, we look at them in a much more instrumental way. How are these populations using technology for healthcare? Because that’s where a lot of venture capital funding goes; they’re like, “Well, healthcare is big business,” whether it’s fertility apps in Africa, or population control in these contexts, or banking and education. Those are very, very instrumental ways of looking, which means that you’re constraining the market and the user practice, and not allowing them to be the norm.
But would that mean they are universal, like you and I? Yes and no. On the surface, of course, they’re universal in terms of seeking leisure, but the reasons and the way in which they acquire that digital leisure are different, because they are limited and constrained by socio-cultural factors. If you’re a woman, you can’t just put yourself on display online. It’s not about constantly showcasing how nice you look and being the fashionista. A number of cultures actually condemn that. There are fatwas about even revealing your face on your profile. So in fact, it’s safer for many women to be anonymous, to have locked profiles. It’s quite contrary to the way in which we say, “Oh, look at these typical teenagers who are constantly feeding off the attention economy.” On the contrary, here they’re escaping the attention economy. That’s just one example.
The other example is friending on Facebook. There were a lot of declarations that the expectation of the internet as a global village, where we would start friending strangers, was completely wrong. That was what The Economist said a few years ago, declaring that social media is all about friending acquaintances and known people, with less than 4% of profile friends being strangers. But if you look at people in the global south, the majority of their friends are actually strangers. So the friending practices are completely different.
And I can bring up a number of other examples, but the fact is that these behaviors are driven by a need to break out of your social circles. The more desperate your circumstances, the more you need to break out of the social circles you were born into, because the internet allows you a potential escape, a new form of affiliation with completely unknown and global groups, which is also a beautiful thing about the internet. So in a way, the expectation of the internet in the early days has come true, and it hasn’t.
MG: Well, these are very interesting examples. You briefly touched on the internet itself, and I was just wondering whether its future means it simply won’t be the same worldwide. Is that your take as well, that it doesn’t have to be the same everywhere? What do you think about it?
PA: Well, it depends whether you approach this question top-down or bottom-up. We’ve been talking about users all this time, and obviously diversity is part of the beauty of human ingenuity. Who wants the same of everything? We get bored if the same story is told to us a hundred times. Everybody wants something novel, something fresh, something unexpected. So storytelling is by nature diverse. It’s human. It’s relatable. It’s cross-cultural. But in terms of top-down measures, there are changes happening which are going to be directed at unifying us as well as dividing us.
For example, there’s sovereignty politics going on, where you see data localization laws, whether in India or elsewhere, which are going to treat platforms as typical broadcasting platforms, which means they will be subjected to the same censorship laws in the nation concerned. It means that if you’re in Pakistan, you have to subscribe to the blasphemy law. If you are in India or many other contexts in Africa, you have to subscribe to the sedition law, which basically comes from colonial times, where you can’t create disharmony. If you look at what’s happening with China, they have just come out with sweeping reforms on how AI needs to be regulated and how the onus falls on the platforms, making them in fact responsible for the content, which is quite unprecedented.
So each nation is coming up with its own way of regulating and shaping the internet, which will, of course, shape user practice. For example, a number of countries have banned TikTok, including India, which already de-platformizes people and pushes users to move across platforms. So you have that kind of digital fragmentation based on national interests, sovereignty and the need for control. And then you have the opposite: a need for solidarity and global cooperation, whether it’s to do with fintech, where there’s an effort to close the loopholes of financial tax havens and you see a big push with the partnership between the US and the EU, or the need for GDPR to go global, so that privacy and data security can become much more rigorous. And with the Ukraine war and the constant cyber hacking, there’s a renewed vigor for global solidarity in strengthening our systems so they can be less vulnerable to these kinds of attacks.
So we are seeing both in parallel. And I think what’s really important is can we get the right formula? It’s always going to be both, but can the formula be correct in a way which is really geared towards people and their wellbeing and not towards national and political interests? And so let’s hope for the best for that.
MG: Definitely. I do believe, hopefully, that the importance of inclusive design is growing in the consciousness of tech companies, and that the big organizations, because they are spearheading the market, are and will be open to users from these very different communities, different in terms of income, cultural context and so on. But when it comes to designing technology with that in mind, how do you even approach it? How do you perceive the connection between what we just talked about and, let’s say, creative design in technology or human-centered solutions? Is it even interconnected?
PA: Oh, absolutely, because, look, if you are a designer or an entrepreneur in this space, you are embedded in the context within which you’re designing, which means that being human-centered is just part of the approach. But if you are constrained by certain regulations, you need to know the playing field. You can’t design the most inclusive application for TikTok if TikTok is already banned in that context. So you will try to see what the nation allows and what the leanings of the context you have in mind are.
You will also look at the trend of multiple apps converging into what they call super apps. And why is that? Why are people, particularly in resource-constrained settings, so drawn towards these super apps? Because every time you go outside that app’s ecosystem, it requires more data. Every hyperlink, every additional download of a separate app that takes you out of the main app, takes effort and time. It takes data, and you can lose the connection because electricity is unstable. So if you want to build a viable, competitive product, you need to keep it simple. You need to keep it clean. You need to keep it multi-purpose for resilience’s sake, because you need to understand that the environment is volatile. It could be changing. So you need to have a plan B and a plan C.
And it also needs to bring together a number of aspects: the financial stream, the entertainment, the verification, the security. You need to take all of this into consideration when you’re designing. So you’re not just a designer in terms of aesthetics and usability; you need to think far beyond that, in a design-thinking way, because otherwise your app may not survive and your design may not be used, however beautiful, efficient and glorious it is. A mediocre app may be more successful if it takes these other aspects into consideration.
MG: Definitely, I couldn’t agree with you more. I know there’s another hat that you wear, related to AI, global ethics and the AI for good movement, and how to leverage that to create a more inclusive digital public commons. How do you see that developing in the future? Where is it heading when it comes to AI and the AI for good trend?
PA: Yeah, I have a huge disdain for that trend, AI for good. It’s a terrible framing, and here’s why: it comes from the notion that it is altruistic. Around 80% of the funding in the name of AI for good is coming from Silicon Valley, but what this actually translates to is: I want to pilot and test all sorts of weird, interesting ideas and use users in the global south as guinea pigs. Take Worldcoin and its Orb, I don’t know if you’re aware of that. There’s a particular cryptocurrency whose founders came up with the idea of the Orb, literally a black globe, which can seamlessly capture all kinds of data just by being moved around among people. You can already see the privacy violations taking place, but they are somehow able to circumvent the GDPR laws in the name of innovation and experimentation.
So when you do AI for good projects, it is basically on the premise that because it’s altruistic, we need to circumvent typical standards for security and privacy, we need to circumvent net neutrality. Take Facebook’s Free Basics, for example, which came under a lot of heat; it called itself internet.org, misleadingly promising the internet when it was actually Facebook and a limited set of apps. That is basically an entryway into these markets. And worse yet, it is not an altruistic act, because this is a market venture.
When Facebook, or rather Meta today, is laying down the undersea cables around Africa, they’re not doing it out of the goodness of their heart. They’re attempting the hyper-monopolization of an entire region where the young people of the future are going to live. Sub-Saharan Africa and South Asia are where the future’s young people will be. So of course they’re looking at their markets. And what better strategy than to become the very infrastructure upon which all applications of an entire country, or even an entire region, get built?
So AI for good can be very, very misleading, and dangerously so. Let’s call it what it is: a market strategy and an investment strategy for first-mover advantage, which is a classic business take.
MG: Well, you definitely have some bold views, but it’s good to raise your voice and speak up for those who are perhaps not heard that often. Thank you for sharing all of these very interesting insights. It was super interesting and super valuable for me personally to hear about all these aspects that we usually don’t take into consideration. Putting them in the spotlight is definitely valuable and a practice that we should all, hopefully, align with.
That was IT Insights with Payal Arora. Thank you.
PA: Thank you.
