Alejandro: Hello, hello.
Alejandro: How are you doing?
Kriti: I’m good. How are you?
Alejandro: Good, thank you. Where are you? You’re in London?
Kriti: I am in London, yes. I am jumping from meeting to meeting. It’s been a very busy time, that’s why I apologize…
Alejandro: No, it’s okay. No worries.
Kriti: It’s just rather…
Alejandro: I’m trying to gauge what time it is over there. What is it?
Kriti: It’s five o’clock. 5PM.
Alejandro: Oh, okay. All right.
Kriti: I’ll just charge my computer. Yes, great. How long do you anticipate this to take?
Alejandro: I’d say we can hopefully keep it to 45 minutes.
Kriti: Okay, cool. Yes. I have a hard stop at 6 but that’s an hour. It should be good.
Alejandro: Perfect. Do you have headphones or…?
Kriti: I do, yes.
Alejandro: Okay, cool.
Kriti: I have been running from one meeting place to another. And London is… I don’t know if you’ve been but it’s a very big city. It can take an hour to go from central to central.
Alejandro: Yes. I grew up there… Not grew up there. I lived there for about four months. And it was beautiful. And it was actually at a good time, I think March, right when it gets a wee bit warmer.
Kriti: Yes. And how old were you at that time?
Alejandro: This was… I was 21, 22.
Kriti: Oh, okay.
Alejandro: So I’d just graduated. Yes, a couple of years after college. It was beautiful. I loved it. People in Hyde Park, any ray of sunshine, you see 20 bodies like stacked up…
Kriti: In shorts, wearing shorts. It doesn’t matter how cold it is but if there’s sun…
Alejandro: Yes. Sun is very much in demand.
Kriti: That’s right.
Alejandro: All right. Kriti, thank you for your time and for being able to share your story. I always begin the podcast by asking two similar questions. And they are: is there a morning routine that helps you get your day started in the right mindset?
Kriti: When I’m not travelling, and I do travel a lot. When I’m not travelling, I do like to take some time, you know, just have a nice cup of coffee and plan the day a little bit, the big priorities that I need to tackle. It’s quite easy to fall into the trap of trying to do a hundred things at the same time, but it’s just as important to take a step back and reflect on what are the two or three things that, if you achieved them today, would be progress towards the bigger goals. Not just cleaning up my inbox or responding to all those people and returning those messages, but what about the bigger picture? And I think that reflecting time helps me quite a bit.
Alejandro: That’s great.
Kriti: When I’m travelling, I get jetlagged quite a lot. I’d say that’s a constant struggle. When I’m travelling, I don’t sleep very well, that time from waking up and getting on with the day.
Alejandro: In terms of travelling, is it across Europe, like international travel, or is it a lot of travelling domestically?
Kriti: No, more international.
Alejandro: Oh, okay. So that definitely hits you in terms of time difference. That’s hard.
Kriti: [Inaudible 00:05:4].
Alejandro: You haven’t what? You haven’t mastered it yet?
Kriti: I haven’t done that yet, no. I have mastered travel packing, though. I can pack a carry-on for nine days.
Alejandro: Oh, that’s good.
Kriti: And better but time-wise, no. Well anyways…
Alejandro: [Inaudible 00:06:00]. That’s for sure. So what about stress? I mean, we talked about getting the right mindset, but in terms of stress, when you feel you have had a very long day, what are the things that you do to calm your nerves down?
Kriti: Mathematical puzzles.
Alejandro: Mathematical puzzles.
Kriti: Yes, it really works for me. I love solving mathematical puzzles. It calms my brain. It just relaxes me a lot. It helps me focus.
Alejandro: Is there an app? Or is there somewhere you go for puzzles, mathematical puzzles? Or do you just kind of do it yourself?
Kriti: Books, apps, a combination. I prefer books because it’s good to be analog sometimes. I do difficult Sudoku puzzles on my phone sometimes if I have nothing else around me, but it really helps me concentrate. I studied engineering and computer science, and math has always been something that I love and want to keep improving my skills in. And it also calms me down. I think my brain is just wired to solve mathematical puzzles.
Alejandro: I love it. So tell me, where did you grow up? If you can paint us a picture of the city, the community, the neighbourhood, that would be wonderful.
Kriti: Great! I grew up in a city called Jaipur, in the north of India in Rajasthan. And it’s a very dense city, but I think it shaped me into who I am in many ways because there were a lot of issues, like infrastructure. Pollution is pretty bad. But there were some that really hit hard for me. When I was a very young girl, the first thing I remember my parents telling me was to not walk out alone on the streets. Not even during the day. It wasn’t safe. Not just for me but for women or girls in general. Safety is a real issue.
And I was a very imaginative kid. I always thought it was a bit strange. How could things be changed? Because it’s unfair to have to worry about whether you’re allowed to, or whether you feel safe, walking out. And then a lot of… yes?
Alejandro: I’m sorry to interrupt. Was this because of being a woman, a female? Or in general was it not safe to walk out?
Kriti: I think it’s more for women and girls, where there’s a much higher crime rate against them and physical safety is a huge issue. A lot of these conversations have become normalized. Being harassed on the streets is fine, it happens to everybody. That’s the kind of normalized attitude towards it.
So to me, that was something I kind of struggled with absorbing. I was very fortunate to have extremely supportive parents and the rest of the environment could be quite conservative in many ways.
For example, the average age for girls to get married is 16 in the state, but the legal minimum age is 18. So there’s still a lot of inequality, and traditions that perpetuate or reinforce inequality towards women and girls.
I had some amazing role models. My mother, she was a journalist in the 80s and 90s in [inaudible 00:09:47], which is quite cool. And my grandmother, she had two degrees in literature, which at that time was very unusual. So I had amazing role models, and it was normal that I had to achieve my potential and do things that I love regardless of my gender. But I saw a lot of inequality around me. And I think in many ways, a lot of what I do today was shaped by observing that in various forms.
Alejandro: What did your grandmother or your mother…? What were their beliefs in terms of this inequality? Did they just say, this is the way it is, just be careful? Or what were their thoughts around that?
Kriti: I don’t think they ever talked to me about it until much later, until now, over the last few years that I have become an activist for equality. Now we talk about it a lot more, but at the time, it was more learning by seeing them doing things in action. Say, my mother was a journalist. Everyone else around her was men, and she was reporting on social topics, on religious topics and cultural topics. Everyone she talked to would be a bunch of dudes. And I saw how supportive my dad was. At the time, my mother was more famous than my dad. She earned more money than him and he had absolutely no issues with that.
And that’s what I assumed a normal world would be, until I broadened my horizons and saw how divisions do exist. Like, there is gender pay inequality in the world. Women can’t rise to the top. Whereas where I grew up, I just saw these people around me kicking ass every day, so I was a bit surprised. I think I was very lucky to be in a more literate environment in many ways.
Alejandro: That’s really cool, what you mentioned about your father being able to lead the way by being supportive of your mother, and how that seems normal. And that should be normal, but even nowadays for many men, having a wife who is the breadwinner, it’s incredible how old school it still is for people, how most men don’t know what to do, or how to react, or how to support, and what’s expected of them from society and all that. That’s incredible for your father, during that time and in that community.
So in that community, you mentioned the fabric of your family was very much different from most families within the community.
Kriti: Yes. I mean, my parents went through a journey of understanding how to… eventual journey of understanding the potential and how to give back to society in their own ways. So that was great. But then around me, I saw the traditions that exist in any society, they can be quite limiting. So I’ve seen both sides, is what I’m saying. And I feel very privileged to be in an environment where they supported me. I know it doesn’t sound like a lot but the fact that they let me do whatever I wanted to do was everything to me.
Kriti: You might think, what does that mean, that they let you do what you wanted? But in the world where I grew up, that is a huge advantage, a privilege.
Alejandro: And did you know from very early on what you wanted to do?
Kriti: No, I had no clue. There was a lot of freedom to do whatever I wanted, but I had no idea what I wanted to do. When I was a kid, I was just building computers and robots because I was fascinated by machines. [Inaudible 00:13:59] computer. I didn’t have one, so I thought, what do I do? I read a book and figured out how to build it. That was quite fun. And then I thought, I can build a computer, I can make a robot. My computer cost $50. It wasn’t like an Apple Mac, but it was very helpful in teaching myself how to code. And then I built some robots because it was quite fun. And then slowly I started to see my path. I started to enjoy engineering, but more importantly, I liked solving problems.
I started to learn. I’m 30 years old now, and I’ve met several people over the last few years, particularly through my public engagement and activism work. We tend to find two kinds of people in the tech world: those who are fascinated by solving puzzles and just geek out over details, and others who are interested in solving bigger problems around society, or equality, or business models, or sustainability. And I find myself as a person who can connect to both sides. In the morning, I’m tired. I want to solve puzzles and nothing else and not think about the world. But during the day, I’m interested in large-scale issues around healthcare, around democracy, around building ethical technology and responsibility. I think that has really come from being exposed to different challenges at a very young age.
Alejandro: What was your first robot? What did you build?
Kriti: It’s quite embarrassing. It fetched Snickers, the chocolate from the snack bar. It’s a very important problem for society. I was a teenager. I do better things now in my life.
Alejandro: That’s hilarious. That’s great. So what was it? It would grab the Snickers bars?
Kriti: Yes and bring it back.
Alejandro: That’s funny.
Kriti: It learned to navigate and got smarter over time.
Alejandro: You got smarter over time. I love it. So you ended up going to the University of St Andrews in Scotland. What did you study there, and why…? What drove you to Scotland?
Kriti: Definitely it wasn’t the weather. I flew from New Delhi to Edinburgh, and New Delhi was 48 degrees and then in Edinburgh it was 8 miles per hour wind. I did think twice, gosh, what did I do? But I went to study Computer Science and I specialized in graph-based AI and data structures. It was more advanced Computer Science at the time. It was really cool. I did my Masters there.
I had a different plan for myself. I always wanted to do a PhD in Computational Neuroscience, which these days you could say informs neural networks in AI, but using some of the biological principles. So I wanted to do my PhD learning how the brain works and simulating that to teach computers to do similar things.
Alejandro: Got it.
Kriti: And that was the plan. I got into my dream course at Oxford, but towards the end of my Masters program [inaudible 00:17:45]. I could not take any more. I wanted to do things beyond research.
So I decided to not complete a PhD at Oxford. But that’s fine. That’s one of the things you learn is…
Alejandro: What got you to say, “I don’t want to do this?”
Kriti: As much as I love computing, I thought that I wanted to work in industry and make an impact much quicker, and just build consumer products. I got really interested in that. I wanted to do it. And also, I think it was a little bit the weather. I told myself I’ll go back and do it, and it has been a few years. I think it’s…
Alejandro: You can still say you’ll go back. You can just move forward.
Kriti: But what I would say though is, if I go back now, I wouldn’t do my PhD in the same area. If I had to go back, I would learn something that’s not Computer Science, because I have skills in that area, but I believe that society needs people who can bring various fields together. And the future of our society, the future of our planet and the world, should not be decided or designed just by geeks like me. It has to be a collaboration with people from different skill sets. I think I’d go back and study something different.
Alejandro: What would you study?
Kriti: I think learning how to be more human. Talking, communicating with people would be helpful.
Alejandro: You would think, now you can create an intelligent computer, now you’ve got to go back to, like, philosophy.
Kriti: I think policy is interesting. Policy and politics. I got introduced to this fascinating world a couple of years ago when I was asked to give testimony to the UK Parliament, to a select committee on AI, on how algorithms can be biased and how they can impact the future. And I thought, well, I’m a computer scientist talking to politicians. I didn’t know. And then I really enjoyed it, because it is quite overwhelming being asked to give that. It wasn’t James Comey versus Trump. It wasn’t [inaudible 00:20:08], but it was really interesting because you start to see the impact of your work beyond just yourself and your product or your users. You start to see the impact of technology on society, not just the users you can reach but the fabric of our democracy.
Alejandro: Did this take place after the Anita Borg Institute awarded you the Systers Pass-It-On Award? This was already afterwards?
Kriti: Yes. The Anita Borg Pass It On award was when I was still in Rajasthan. And I was running a lot of outreach activities and programs to encourage more girls from Rajasthan into engineering and technology careers because I saw the inequality there on girls from disadvantaged backgrounds or marginalized backgrounds. I also did some interesting innovation that got Google very interested in what I was doing. And then I got the Google award which they give to the top 10 women in engineering.
Alejandro: Wow. And the Systers Pass-It-On award, was that before or after finishing your studies at Oxford or St Andrews? Where did that fall in line?
Kriti: During my undergrad. I did my undergrad in India and then my Masters at St Andrews. And the Oxford PhD I never started. I will though, in politics.
Alejandro: I love it. I love it. So after Oxford, you decided, “I’m not going to pursue my PhD.” What did you do next? Was that with Barclays Africa or…?
Kriti: With Barclays in London. It was the beginning of the mobile banking movement. In the UK, banks were starting to think about how they could reach customers in new ways, and mobile banking was the thing. And I remember some conversations at the time where people said, “Oh, this mobile thing is never going to take off. People in the UK will never trust their phones to make…”
And I was like, no, I grew up in India. That’s how we always banked, because it’s a very mobile… They were low-end experiences, but you would just text a number and you would get your balance. That’s how everybody would…
Alejandro: How long had that been going on in India?
Kriti: Quite some time actually. Quite a while. Several years. And I think that’s the beauty of some of these emerging economies where they sometimes leapfrog experiences that the rest of the world is still trying to catch up on like cashless transactions or cardless mobile payments. They’ve been in parts of Africa for a very long time.
Alejandro: Why do you think that was happening in India a lot earlier? What were some of the reasons you think caused that?
Kriti: I would call it a leapfrogging opportunity. For example, in the UK, if you think about access to services, going from walking into a branch to getting online is a huge advancement. Doing things online versus doing them on your phone is incremental [inaudible 00:23:43]. Moving people from doing online banking on their computers to their phones is not that much value added. It doesn’t save you that much more time. Whereas in many countries, people don’t have laptops. They never got PCs. They never got computers. They moved straight to mobile. Mobile was their introduction to the internet.
These days, I do a lot of work in rural India with young people on sex education and reproductive health programs. And they don’t have computers. They’ve never had one. Mobile is how they’ve come online for the first time in their entire life, in the last 12 months.
Alejandro: And before we get into the sex education and all the incredible things you’re doing there, you had mentioned that at Barclays, they did not believe this mobile revolution in the fintech space would be anything. That was the space that you were in at that time. So what occurred within that team? What were you doing? What did you end up doing on that Barclays team? And I believe it was in Africa, or it was concentrated…
Kriti: Initially in the UK, and then I went to Africa. I wouldn’t say they didn’t believe it would be a thing. There were a lot of people, early detractors I’d say, who thought this might not work or would take forever. And I think the reason I got the opportunity to do it straight out of college is they were like, “Okay, here is a millennial. Let’s see if the millennial can do this.” I’ve oversimplified it. I think it was a great opportunity actually.
I got the opportunity to own the product, the mobile banking solution, when I was 23 years old, straight out of college, for a large bank with millions of customers. And within the first six weeks, we had a million users using the product. It was the first mobile payment solution in Europe at the time. It was really, really exciting. I learned a lot, and I became very comfortable with building and shipping large products in complex organizations, at that scale, doing it in a controlled, regulated environment.
Alejandro: Was that product what led to Pegg, the software that you created? Or was that Pegg?
Kriti: No, that was a completely different product. It’s a Barclays product which won Apple App Store App of the Year in 2012 and really started the mobile banking phenomenon in the country. After that, I got really interested in going back to my roots in data and machine learning, and also increasingly fascinated by what was happening in parts of Africa in terms of people coming online, generating more data, and the opportunities that could be created. So I went to live in South Africa for some time and built the data capability there, more advanced analytics, machine learning and such. I had a great time and also saw similar patterns to where I grew up, in terms of some of the opportunities to create a better world.
Alejandro: This attitude of, “I’ll just go to Africa and check it out and work on this completely new project.” Where did you get this? I mean, was it easy? Was it just as easy for you to make that decision of leaving the UK, going to Africa, spending some time there and tackling these projects? Where do you believe you got this type of, I guess you could call it motivation, or perspective, in terms of taking on challenges that for other people might seem pretty scary, let’s say?
Kriti: I think I’m fascinated by challenges and that felt like a challenge. And opportunities. That’s how I’d say it. I think it’s a huge opportunity to create new solutions for a market and a segment of society that would appreciate it a lot more. And many things we take for granted don’t work out quite that way. We assume high speed internet at all times. It might not be the reality. We assume everybody has access to bank accounts when there’s a huge population of people who are not in the banking system and they’re going to loan sharks because they may not have IDs or any kind of formal identity. That stuff.
I think for me, it was more a challenge and also the opportunity of doing something in that environment. It isn’t necessarily a technical challenge, but more a challenge of applying the technical skills you have to solve those problems, which I find quite interesting.
Alejandro: That’s great. In Africa, was that when you came up with Pegg?
Kriti: No. Africa was very much about building the data and machine learning capabilities there. Looking at, for example, credit risk or banking services through a completely new lens, because traditional models might not always work for people who are not part of the system, the financial processes or financial institutions really. So it was a very interesting time.
And Pegg was something… I came back to the UK and I got hired by the largest software company in the UK, called Sage. It’s B2B, business-to-business software, predominantly for small to medium-sized businesses. As a technologist, when you think of software, you think of tech companies, but we tend to forget there’s a whole bunch of other businesses out there who are driving the economy. They are creating the most jobs, and they don’t always have access to technology, or they don’t think about it. These are your mom-and-pop shops and your restaurants, the local businesses that are growing the economy and creating jobs. And I thought there was a great opportunity to build solutions for them.
So we launched a product called Pegg, which is like a smart personal assistant, a CFO. The same way a large-company CEO doesn’t have to do accounts, or taxes, or compliance, or boring processes or journal reports, [inaudible 00:30:41] technology and implementation, we should be able to offer the same luxury to other businesses. That product was quite successful, and it really benefited businesses that were spending 120 days a year doing admin activities. In 2018, 2019, you shouldn’t have to spend 120 days a year just doing administrative tasks. You should be building the business. You should be growing the economy. You should be hiring people. So yes, that was that.
Alejandro: As you progressed through creating Pegg and having that be extremely successful, you then continued your… I don’t want to call it fight, but continued your journey in terms of bringing equality and having technology that’s ethical. You were then invited by Obama. How did that take place? What happened? When were you reached out to? Was it exciting? When you heard this, where were you? Because you got to meet Obama.
Kriti: He’s a pretty cool guy. Well, throughout this time of working on projects and bringing services to millions of small businesses and consumers and society, I’ve always been troubled by the fact that we can still only reach a handful of people, and that even when we do reach these people, there might be a potential for biases in technology. As someone who has been on the receiving end of it, I know it way too well.
For example, to give an example of a bias: if you’re a woman searching for jobs online, you’re less likely than men to see adverts for jobs that pay more than $200,000. And that comes from algorithmic bias. It’s the same when you’re applying for a loan: in the credit decision process, based on the year you were born, your postcode, your education, your income levels, you might get a different set of options than someone else. It’s the same with insurance. Algorithms are making decisions about who gets invited to a job interview, whether you get your loan approved or not. In fact, even in the criminal justice system, machines are making decisions about whether you’re more likely to commit a crime or be a repeat offender. And if these machines are learning from historical biases, based on gender, or race, or background, that’s wrong. We wouldn’t allow humans to be racist and sexist, but algorithms could somehow get away with it.
Alejandro: So you’re saying that because of those historical biases… With AI, we have the power to create these artificially intelligent machines, and if we have biases, those will automatically, just through osmosis, get passed on to…
Kriti: To the data, the data you use to train the algorithm. Say, for example, Joy Buolamwini did a study at MIT on consumer facial recognition software. And she found that facial recognition systems, which you might be using anywhere, for passport control or as your identity, have a less than 1% error rate for light-skinned men but over a 35% error rate for darker-skinned women, and can often fail at recognizing even someone like Michelle Obama. That’s pretty messed up.
You see also other…
Alejandro: How is that possible?
Kriti: Because of the underlying dataset, the data that the algorithm is being trained on, and because the teams that are creating it are predominantly men. They’re testing it on similar users. If the data is biased, the knowledge of the machine, the decision-making, will be biased.
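[Editor's note: the mechanism Kriti describes here, a model trained on biased historical decisions reproducing that bias, can be sketched in a few lines. This is a hypothetical toy example with made-up data, not any real system discussed in the episode.]

```python
# Toy illustration: a "model" that learns per-group approval rates from
# biased historical decisions will reproduce that bias in new decisions.
from collections import defaultdict

def train(history):
    """Learn per-group approval rates from past (group, approved) records."""
    counts = defaultdict(lambda: [0, 0])  # group -> [approved, total]
    for group, approved in history:
        counts[group][0] += int(approved)
        counts[group][1] += 1
    return {g: approved / total for g, (approved, total) in counts.items()}

def predict(rates, group, threshold=0.5):
    """Approve if the group's historical approval rate clears the threshold."""
    return rates.get(group, 0.0) >= threshold

# Historical decisions that were themselves biased: group A was approved
# 80% of the time, group B only 20%, regardless of individual merit.
history = ([("A", True)] * 8 + [("A", False)] * 2 +
           [("B", True)] * 2 + [("B", False)] * 8)
rates = train(history)
print(predict(rates, "A"))  # True  - bias in the training data...
print(predict(rates, "B"))  # False - ...becomes bias in the decisions
```

The point of the sketch is that nothing in the code is explicitly "racist" or "sexist"; the skew lives entirely in the historical records the model learns from, which is exactly why auditing training data matters.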
Alejandro: That’s such a humongous difference. That’s incredible, those numbers.
Kriti: That’s right. And this is why we need to… I’ve been raising awareness of this topic and creating solutions on how to avoid it, or counteract it, or fix it, for a very long time. And I guess that caught the attention of people in influential policy-making positions. I think it’s a growing movement now. When I first started working in this field, it wasn’t talked about very much. It was a new thing, but now governments are taking action. And for me, even if [inaudible 00:35:42]. The UK government last year decided to set up an independent body which is specifically looking at data and AI [inaudible 00:35:53] in the modern world and to inform potential regulation and policy decisions around that. So there’s a lot of action happening now and we are, as a society, moving forward on that.
Alejandro: So you’ve been able to do all this work, and you got reached out to by Obama. What was this particular event? Was it at the White House, with a number of individuals who were all in a similar space? Or was this you individually going and saying hello to Obama? What did that actually look like?
Kriti: It was after he left office, and in Chicago [inaudible 00:36:41]. It’s part of his activities now, the Obama Foundation that he has set up. And I was invited to be a part of it, the Inaugural Summit. It was a very surreal experience. Of course, yes, meeting… And this was a post-Brexit, post-President-Trump-getting-elected world. I think he started to appreciate a different world a lot more. [Inaudible 00:37:21].
Alejandro: I was going to say, you sound very diplomatic.
Kriti: Well, I do get a lot of online internet trolls, so I’m just thinking the next word I say might create a whole [inaudible 00:37:36].
Alejandro: No worries, but I get what you’re saying.
Kriti: But it’s fine. I think it’s okay. People have different political opinions. However, to me, it was quite important to look at a leader’s… A lot of what [inaudible 00:37:48] the Foundation is working on now is lifting up the work of young people around the globe who are driving real change. They are not as famous as Elon Musk or Mark Zuckerberg yet, but they are working on driving real change. And I think it’s really commendable that their effort is being spent on lifting up the good work of people like me, for example, who are trying to make change.
Alejandro: I was going to say, there’s an organization called AI For Good, and this is something that you co-founded.
Kriti: That’s right. I set up AI For Good UK last year with the idea of solving… I look at where AI is being used in reality, and to be brutally honest, it’s being used in three areas. First, making people click more ads, and that’s your Facebook and Google kind of model. They’ve definitely used AI for a long time. Second, by banks, trading, hedge-fund kinds of models, to make more money. And lastly, digital addiction, to bring you back, recommending the next video to watch, the next product to buy. And I thought we could do better than that as a society. I did not want to look back in time thinking the most powerful technology of our time was being used for these activities.
So I have set up the organization as a social enterprise. It works with organizations who are on the front line of some of these social justice challenges. It’s not tech saviourism. We do not claim that technology is going to solve domestic violence in South Africa, or that it’s going to get sex education to every child in India. We absolutely don’t want to be that. I don’t think technology alone can do it. However, it’s the humble realization that there are amazing activists and organizations on the ground, and there are policies and frameworks in place to make this change happen. And technology could help make it happen faster and, hopefully, in a safer way.
Alejandro: You mentioned sex education in India. Can you go a little bit deeper into that, highlighting in particular some projects that AI For Good is tackling there?
Kriti: Yes. So we work with a partner called the Population Foundation of India. They’ve been working on population management for almost five decades. If you look at India, it’s a fascinating example. There are 240 million young people of adolescent age, between 16 and 24. They’re going to be the biggest contributor to the next wave of population growth in the country. India is on track to be the largest country by population in the entire world in the next few years. And what’s important is for these young people to have choices. And that can be difficult when you are in a very traditional, potentially conservative environment, with different political systems in place, where getting access to what your choices or options are can be very difficult. It’s the things that we take for granted. And a lot of these young people, as I mentioned earlier, have come online for the first time in the last 12 months. This data [inaudible 00:41:25]. And their primary source… I fear that if you’ve just come online, you can be very easily influenced by information you see online. And it might not always be true.
So for many of them, the primary source of information [inaudible 00:41:45], and that’s not the place we want them to go to. So we’re using an AI-powered technology platform, which is designed by these young people in rural India, to give them access to sex education and information, and reproductive health advice. We’ve covered topics like menstruation, contraceptive information, those kinds of hard-hitting topics like domestic violence and healthy relationships, which are difficult for young people to get access to. It’s not something your parents would teach you or your school would tell you.
And there’s also a lot of provider bias. So currently there’s a model of healthcare workers who go out face to face and disseminate this information to their communities. That’s how information is supposed to [inaudible 00:42:40]. There are cases where healthcare workers might not want to give you contraceptives because they believe you’re young, you’re married and you don’t have kids, so you should have kids. Or they have their own biases. We’re seeing…
I was in a meeting earlier today in another country where there were issues around giving young people access to information about abortion services. People might have their own views, and as the service provider, they might project them onto others, who would then never get access to information that might be right for them, or get to make that choice for themselves. We do not claim by any means that this technology is going to be better than the humans helping, but I think it can help people make more informed choices and provide quality information.
Alejandro:: Is there a place… Where would you go if you wanted to find more information about this?
Kriti: I would welcome people to contact us through the website, www.aiforgood.co.uk. Or just drop me a line on Twitter. I’m @sharma_kriti.
Alejandro:: Okay. Is AI For Good what you’re currently working on, and are there other projects happening? Because I know that you’re constantly creating and looking at solving a lot of major problems. What are you currently doing now?
Kriti: So a couple of different things. I am working on AI For Good, as you mentioned, which is doing a few different projects. One is sex education for young people in India, the premise being that we take the awkwardness away from talking about sex with the grownups around you.
And we’re doing an incredible project in South Africa called Rainbow, which helps women in the country who face domestic violence, abuse, sexual assault, and rape, and gives them access to unbiased, non-judgmental information. Because in a lot of these cases, victims can be blamed instead of the perpetrators. And the scale of the problem is huge, so we’re doing a huge amount of work there. It’s a very successful service already. It’s 100 days old, but within those first 100 days, Rainbow has had over 200,000 conversations with women and young girls in South Africa about the complexities around domestic violence and some of the issues related to it, like HIV, which has a very high rate in South Africa, and the impact of violence on children.
Alejandro:: If I were in South Africa, is it a number you call? You reach out, and then someone answers, and that someone is either human or not? How does that actually work?
Kriti: So it’s an issue of domestic violence, which means abuse and violence from your own partner. Around the world, one in three women face domestic violence, which is assault by their own intimate partners. It could be their husband, boyfriend, or some family members. This is a huge issue facing society, and people don’t talk about it because it’s something that happens to you in your own home. On average, the first time they actually go in and report it, it’s the 35th attack. So the first 34 times, they do not report. The reporting numbers are already very, very low. Fewer than 5% of cases ever get reported. And when they do get reported, it’s the 35th attack.
What we’re trying to do is make it easy for you to understand what abuse is. When you’re in an abusive relationship, you’re not Googling, “What is domestic abuse?” You are thinking, my partner’s quite controlling, or gets too angry, or sometimes shows his passion in different ways. You don’t think domestic violence. And from our analysis of the 200,000 conversations, over 99% of them did not use the words domestic violence or abuse, because that’s not how humans think. That’s how policies are designed. That’s how programs are managed, but not people. Also, because of the perpetrators, it’s not safe to pick up the phone and call, because they can be overheard. So it’s a discreet messaging service where they send a text. You can go to www.highrainbow.org and get started, talk about how you can get access to your [inaudible 00:47:44], information about, for example, the fact that marital rape is illegal now, which people might not know about, and different forms of abuse. Psychological abuse is abuse, and we’re making that conversation easier. And we find that when there isn’t a human on the other side, people tend to open up a lot more, because they don’t feel that there is judgment or bias, or that people in the community might find out, because it’s a discreet number service.
Alejandro:: Are there future goals, whether it’s for AI For Good or for yourself? Where would you like to be a number of years from now? What would you like to be tackling? Is it continuing to improve and solve ethical technology? Or sex education in India? Or sexual abuse in South Africa? It seems like there are so many important challenges and so many important problems taking place all over the world. How do you see yourself being able to tackle all this, and what would be helpful?
Kriti: I don’t think I can do it all, and this is why what I’m now trying to focus my time and energy on is working with policy makers, with large donors, and with governments to really embed the right practices and the right frameworks, because I’m only one person. Maybe I can start a movement, maybe not. I’m on that journey. But if we establish the right policies, the right legal frameworks, and the right regulation if needed, then we should be able to prevent a lot of these unethical technologies or irresponsible actions from happening.
And the second thing I’m really passionate about is creating the next generation of people who will be creating the future. Because I don’t think just problem-solving the same thing over and over can go very far. But if you think about it as a program to build capacity and create a new way for people to learn these skills, then you can achieve your goal much faster and at a bigger scale.
So I’m doing a project with young people, teenagers really, to help them prepare for jobs in the future that don’t exist today. That involves teaching them algorithms, but also bringing in the non-coder elements, because I think we as a society are too obsessed with teaching everyone to code. Everybody must learn to code. While that’s good, there are also incredible people who are non-coders who can bring so much value to society by bringing the two worlds together. This is what I’m really encouraging: training young people to think about the possibilities in the future. They will be doing new jobs that don’t exist today. You could be an activist working in sexual health, but in your day-to-day job you might be working with technology, or machines, or training AI to get smarter so it can answer more questions from people, or whatever that might be. Or just bringing people together from different fields and opening up their minds. And when I encourage those young people to imagine new possibilities, new jobs, or different forms of future jobs, I think we’ll create more scalable, sustainable change.
Alejandro:: I love it. Well Kriti, I, my audience, and everybody out there will be really excited to continue seeing all the incredible things that you’re going to do now and in the future. And I want to thank you for your time. Is there anything else I did not ask that you’d love to share? Anything that you wanted to just get out there?
Kriti: If you’re interested in this topic, I just did a TED talk, which will be live on www.ted.com come Friday. If you’re interested in hearing more about what you can do to take action and what the possibilities are in the future, check it out. And also, I believe that the next phase in the movement is going to be for all of us to start taking action and holding people to account. The impact of technology on our society is huge. Lawmakers and policy makers can quite often be playing catch-up, because technology moves so fast. And I think we as members of society, as voters, can hold them to account and make it a real agenda: the way these technologies are impacting our society needs to be investigated. It needs to be done in a responsible way. We deserve to know. We deserve better.
And also, what’s going to happen to our future economy, society, and jobs? That needs to be a big conversation topic.
Alejandro:: Wonderful. Thank you. Thank you, Kriti.