“You can’t live life without an algorithm touching it in some way.”
In this edition of Women in Tech, we talk with Leila Seith Hassan, Head of Data Science and Analytics at Digitas UK. Leila talks about the problems with bias in tech, why AI is not always the answer and the advice she would pass on as a successful woman in data and tech.
Tell us about your job
I’m the head of data science and analytics at Digitas UK. When I am describing this to people who are not computer literate or particularly interested in data, I describe it as ‘understanding what people do and why they do it, and using data and computers to predict what they’re going to do next.’
In this role I oversee a few different kinds of teams, including data scientists and coders. We also have teams of analysts, so there are a lot of specialists.
Public perception of data work varies, and people are concerned about privacy and data ownership. Does this come into your role?
There are different groups of people. Some are not bothered; they feel that they have nothing to hide and it’s just a way for data to be useful. Sometimes these algorithms really do make life easier. I like the way that Netflix learns my tastes and makes using it simpler for me.
But I also think there’s a broad change in public mindset, especially with documentaries and information available on what happens when data use goes wrong. People are starting to understand that it’s not all about convenience and it's not all good. At the same time, I don’t think people know remotely enough about when things go really wrong, because they don’t always understand that data is everywhere. People don’t realise the extent to which data science models and algorithms influence their lives.
This is the most important part of my role. Shedding light on where things can go wrong and what we can do to prevent that from a practitioner, regulatory and organisational perspective is key. Today, you can’t live your life without having an algorithm touch it in some way, unless you live completely off grid.
What do you love most about what you do?
I am always learning. When I started out, the computational power and the types of data sets that existed were chalk and cheese compared with today. A lot of the algorithms then were based on mathematical models invented decades ago, but implementing them at scale has only become possible because of the increased capacity of computers to store and process data. I don’t see a world where I will ever stop needing to evolve and learn. If I did, I simply couldn’t do this role.
I also like the ability to change the world, which sounds slightly ridiculous, but data is so ever-present and it offers the possibility to drive change. I am quite fortunate in understanding that when this goes wrong it often affects people like me. That changes the way I think, and in a world where not everyone gets that chance, it’s something that really motivates me.
When did you work out that you’re a lover of numbers?
I’ve always been good at maths and really liked it. Not so much English; it didn’t come naturally to me. When I think about numbers and maths, I think of the rational part, not just the abstract. When people say data doesn’t really exist, it does to me; it’s very real. It feels like I can touch numbers. You can be creative with numbers. It’s not obvious to everyone, but it gives you a framework to solve problems and do the unexpected. I really like that.
What’s your mission?
I am very much involved in helping clients and their businesses become more successful. Part of my role revolves around using data for positive outcomes and preventing the harm that data can do. The goal is to focus entirely on using data for good and stopping some of the harm it could do.
How would you describe your journey to this point?
I went to university and through that, landed my first role in the industry. That was the path that I took, and I was lucky to be able to do that. Getting into this field, you have to be good with numbers and problem solving. You absolutely need training, but it doesn’t have to be based on tertiary education. You can learn through boot camps and a lot can be self-taught. For anyone who is thinking about this, there are many online courses and free ways to dabble in data and code to find something you really enjoy.
You don’t have to go to university. It’s a great option for some, not everyone – it might not suit the way they learn, and it can be prohibitive from a cost perspective. I’m pleased it’s no longer seen as compulsory.
Have gender and race impacted your career?
When I started out, I didn’t see a lot of people that looked like me, and university was an extension of that. Now that my expectations are a little higher, I think I’d be frustrated going through that process today. It was all incredibly white and male-oriented. I was lucky in that I didn’t feel like an outsider until I was much older. I was protected by a mother who stood up for me. She pushed back, so I was insulated, even though I was such a minority as this little mixed-race girl at school in Australia – there were three of us in a thousand, and the other two were twins.
I was also lucky in that it was before the internet as we know it today, and before there were a lot of people like me on TV in Australia. I was shielded until I moved to the US, which is where I realised how bad it was. It was such an eye opener. It was not debilitating, but my confidence took a hit.
It was not until two years ago when George Floyd was murdered and it all erupted that I felt I had the words for what I had been experiencing. I had a ton of ingrained self-hate that I’d internalised growing up through subliminal messaging about things like ‘beautiful people not looking like me’, or that someone like me ‘looks like a criminal’ or ‘isn’t good for anything’. I took a knock but for young people coming up now, it must be very hard.
I don’t have the answers to make it all better, but I know it made me want to do more. This is why I care so much about using data for good and stopping some of the harm it could do, has done and still does today.
In my role today, I’m successful, I’ve broken a glass ceiling and achieved a lot, but things still sometimes feel difficult to navigate. There’s being a woman and constantly dealing with this self-doubt many of us experience, and there’s being a woman of colour, and everything that comes with that. Going back and forth between the two is tough.
I think younger people are better at navigating this. They are saying that this is not ok. The generation before me were not yet at the stage where they could push back and point out what is unacceptable; there were only a few who did, and they were incredibly brave.
I believe my organisation has responded really well to the events of the last two years in recognising that they could have done more sooner and actually stepping up to do more. We’re also working on this with our clients, showing them that to be ready for the future, this issue is something they have to credibly address.
Have you worked on things that help organisations be better?
One of the things we’ve done is create a tool that anonymises CVs. It strips out all the details that can trigger bias in the hiring process, such as where the candidate went to school, career breaks and their gender. We also take out gendered terms in our job descriptions so we attract more people.
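Digitas hasn’t published how its tool works, so the snippet below is only a minimal sketch of the idea, assuming a simple rule-based redaction approach; every pattern, placeholder and function name (such as anonymise_cv) is a hypothetical illustration rather than the actual implementation.

```python
import re

# All patterns below are hypothetical illustrations; the real tool's rules are not public.
TITLE_PATTERN = re.compile(r"\b(mr|mrs|ms|miss)\.?\s+", re.IGNORECASE)
SCHOOL_PATTERN = re.compile(r"\b(university|college|school) of [\w ]+", re.IGNORECASE)
DATE_RANGE_PATTERN = re.compile(r"\b(19|20)\d{2}\s*[-–]\s*((19|20)\d{2}|present)\b", re.IGNORECASE)
PRONOUNS = {r"\bshe\b": "they", r"\bhe\b": "they", r"\bher\b": "their", r"\bhis\b": "their"}


def anonymise_cv(text: str) -> str:
    """Redact details that can trigger bias: gendered titles and pronouns,
    school names, and employment dates (which can reveal career breaks)."""
    text = TITLE_PATTERN.sub("", text)
    text = SCHOOL_PATTERN.sub("[EDUCATION]", text)
    text = DATE_RANGE_PATTERN.sub("[DATES]", text)
    for pattern, replacement in PRONOUNS.items():
        text = re.sub(pattern, replacement, text, flags=re.IGNORECASE)
    return text


print(anonymise_cv("Mrs Jane Doe, University of Sydney, 2008-2012. She managed her team's analytics."))
# -> "Jane Doe, [EDUCATION], [DATES]. they managed their team's analytics."
```

A production tool would need far more robust natural-language processing than a handful of regexes, but the sketch shows the kind of detail being stripped out before a CV reaches a hiring manager.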
Many of our clients use AI and we always work with them to think about why they’re using it, what it actually is, and what the outcome will be. Some people assume that there’s no harm in using AI in marketing and advertising, but there are situations, like targeted ads on social media, where you have to be cautious.
On the gender side, if you consider how some platforms approve ads for men's health versus women’s health, it’s fundamentally different. They still won’t approve a lot of ads around period products; they won’t let you show blood. They’re very reactive around nudity, even with pregnant women, because historical data has linked naked women and blood to pornography and abuse. But if you don’t normalise things around periods or getting a breast exam, there can be horrific knock-on effects. Big advertisers don’t have problems with this, but start-ups have to lobby to get their ads through.
Another example is from 2016, when Beauty.AI, the first global beauty competition judged by artificial intelligence, was launched. Nearly all the winners were white because the AI had been trained on data that established lighter skin as a sign of beauty. It’s just marketing, but in both of these cases the outcomes and impacts are detrimental. This is why we work hard to make our clients aware of the possible outcomes and work rigorously with them to make sure they are making the right choices about technology.
It’s not just about getting a bad reputation for not thinking these things through; it’s a business decision, where getting it wrong can have a financial impact or create legal risk.
Does the perception of science need to be changed to attract and retain women?
There are studies that suggest that by the time girls get to a certain age, they’ve already started to be convinced that STEM is not for them. It becomes ingrained because we give little boys toolboxes and Lego and we give girls dolls, yet this is exactly where problem solving should be nurtured. All kids love these things, but if we keep reinforcing the message that it’s not for girls, we perpetuate the problem, and it’s no wonder girls reach higher education thinking they don’t want to do a STEM subject.
There are some great programmes for kids and teens that can help get them interested in coding and data topics, such as Girls Who Code and GoldieBlox, which is targeted at really young girls. Then we need to make sure that teenagers see women in these roles so they can be inspired.
What can be done to eliminate bias?
I believe three things need to happen. Firstly, it starts with data practitioners, because we can identify the biases that exist in data. I think young people, too, care about the impact of their work a lot more. It’s a knock-on effect of not wanting to do harm in the world, but also of knowing that if your output and data set are full of bias, it’s not a good model.
Secondly, non-practitioners need to learn too. When they ask their data teams for something to do with AI, they might not grasp the consequences. Not everyone in the C-suite needs to be able to code or build something themselves, but they do need to understand the implications of what they are asking for and what they will be putting out into the world.
Thirdly, government regulation. Technology moves fast and is hard to regulate, but we can do more. The EU has developed some guidance and the UK is catching up, but it is everyone’s responsibility to eliminate bias in tech. No one can do it alone; it can’t just be the government, or just practitioners or companies. Everyone has to get on board to stop harmful things from happening.
What advice would you give to women and girls looking for a career in tech?
Do it. It’s great, you can get an amazing job and it’s really interesting. You get to use your brain and have fun. You get to be creative and solve all sorts of problems, from smaller ones like purchase decisions to big ones like helping to detect cancer. Don’t think about it, just jump in, and if you don’t like it, you can do something else. Search for those free coding programmes and learn a little bit to get a feel for it. If anyone tells you that girls don’t have a predisposition for it, tell them to jog on.