Futuriceans coming together to address gender bias in tech
Futurice took part in the Girls Take Over Day by Plan Finland and organised a Solution Creation Jam for its employees
I was excited when I heard that Plan Finland had approached Futurice to partner with them for the Girls Take Over Day. Designing products with men as the default user, while treating women as an exception (or ignoring them altogether!), had been a topic close to my heart ever since my university days. It was about time I got to do something about it in my working life! As I had time in between projects, I helped to plan a Solution Creation Jam for Futurice employees as part of the Girls Take Over Day. Futurice made a commitment to take one of the outcomes from the jam and turn it into a real pro bono project. To choose which idea would become a real project, we decided to make it a pitching competition. Plan shared with us reports on a selection of issues related to girls and technology, including references and some solution suggestions. We synthesized the research reports into three briefs for the Futuriceans who signed up to take part and formed groups to tackle the issues.
Male-majority / All-male teams creating services and products for diverse users
Up to this point, men have been steering the development of new digital solutions that influence society and culture, with limited contributions from women.
Gender biased data & algorithms
Bias in, bias out: the outputs of algorithms can only be as good as the data they are based on and as unbiased as their creators.
Technology - a girl’s dream job?
Why are there so few female developers in the tech industry? We are often told that “There are not enough qualified candidates.” Is it really so?
Here is a little synthesis of the problem areas:
Male-majority teams creating services and products for diverse users
Have you ever heard of the ‘pink tax’? This phenomenon sums up the issue rather well: objects are designed for men, and when that design is altered for women the price increases and the product colour changes to pink. An example would be a ‘women’s version’ of a power tool that ignores the fact that women tend to have smaller hands, and simply offers the same measurements in pink instead of redesigning the tool so it is comfortable to hold and operate. The phenomenon isn’t as new as one might think: back in the 1950s, Dodge tried to appeal to female consumers with a pink car that came with a matching raincoat, an umbrella and a makeup case. The design didn’t take into account genuinely important female needs like safety. To this day, women are still 47% more likely than men to be seriously injured in a car crash, as most of the crash test dummies used are based on male bodies. (Invisible Women, Caroline Criado Perez, 2019)
Research conducted at the University of Washington on Google’s widely used speech-recognition software found that it is almost 70 percent more likely to accurately recognize male speech than female speech. When Siri was first launched, it also couldn’t recognize female voices as well as male ones. When the Apple Health app was first launched, it didn’t include period tracking. Women are more likely to feel nauseous when using VR devices because male and female balance systems differ, and the list goes on. (Invisible Women, Caroline Criado Perez, 2019)
Forgetting female needs is also bad for business. Who would buy a game that makes them nauseous? Why would you invest in a device with speech recognition if you have the frustrating experience of repeating yourself many times before being understood?
If new technology is primarily developed and tested by men, it is likely to serve men’s needs better. This results in more and more young girls feeling that technology is not for them, putting them off studying tech-related subjects - and so we have a vicious circle.
Gender biased data & algorithms
The lack of diversity in technology can have a serious multiplier effect as data, algorithms and AI become influential in day-to-day life. Artificial intelligence is now used to automate decision-making across the board, from the healthcare industry to the legal system, and may be responsible for making choices that affect people’s whole life trajectory, such as which medical treatment they receive, whether they are eligible for life insurance or a loan, or which jobs they are invited to interview for. When deep learning systems are trained on data that contain gender biases, these biases are reproduced in the software. Amazon’s AI recruiting software, for example, was found to downgrade resumes that contained the word ‘women’s’, as in ‘women’s chess club captain’, because it had been trained on men’s resumes. (Reuters 11.10.2018)
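To make the ‘bias in, bias out’ mechanism concrete, here is a deliberately naive toy sketch in Python - not Amazon’s actual system, and not anything built during the jam. It imagines a word-count ‘screening model’ trained only on historically accepted résumés; because the historical data is male-dominated, the word ‘women’s’ alone is enough to lower a candidate’s score, even though nothing about the candidate’s skills has changed.

```python
# Toy illustration of "bias in, bias out" - an invented, oversimplified example.
from collections import Counter

# Hypothetical historical data: (resume text, was the candidate accepted?)
historical_resumes = [
    ("chess club captain, software engineering", True),
    ("men's rowing team, backend developer", True),
    ("women's chess club captain, software engineering", False),
    ("women's coding society lead, data analyst", False),
]

# "Training": count which words co-occur with acceptance in the skewed history.
accepted_words = Counter()
rejected_words = Counter()
for text, accepted in historical_resumes:
    target = accepted_words if accepted else rejected_words
    target.update(text.replace(",", "").split())

def score(resume: str) -> int:
    """Higher score = more similar to previously accepted resumes."""
    words = resume.replace(",", "").split()
    return sum(accepted_words[w] - rejected_words[w] for w in words)

# Two otherwise identical candidates: the word "women's" alone lowers the score,
# because the model has simply memorised the historical pattern.
print(score("chess club captain, software engineering"))          # higher
print(score("women's chess club captain, software engineering"))  # lower
```

Real recruiting models are of course far more complex, but the failure mode is the same: the model faithfully learns whatever pattern the historical data contains, including its biases.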
Let’s take a closer look at healthcare and medical data. If the collected data doesn’t take gender into account, the consequences can be fatal for women. According to research conducted in the UK, women are 50% more likely than men to have their heart attacks misdiagnosed. In 2016, the British Medical Journal published research finding that young women are almost twice as likely as men to die in hospital after a heart attack. Women are more likely to have their conditions misdiagnosed because their symptoms are not ‘typical’ - and ‘typical’ means typical male symptoms. Their symptoms might not be judged severe enough to warrant treatment, because what counts as severe differs between male and female bodies, and research is generally conducted on males. Typical female symptoms aren’t as widely taught or recognised. Women’s bodies also react to vaccines and medication differently. Still, human and animal males are used more in medical trials, even when researching medications for illnesses that occur more often in women. This leaves women more likely, for example, to experience arrhythmia from medication. How will a woman’s experience of getting a diagnosis and appropriate medication change as doctors lean more and more on biased data and algorithms when making medical decisions? Improving the quality of data in this kind of context could vastly improve women’s lives and potentially even save them. (Invisible Women, Caroline Criado Perez, 2019)
Girls don't see tech as their dream job
The gendered marketing of technology, and of items thought of as technical, starts already in childhood. A walk through the boys’ and girls’ clothing or toy sections of a store is a wake-up call once you start thinking about how certain things are aimed at certain children. Adults subconsciously (or consciously) place different expectations on boys and girls, and children are quick to pick up on that. There is plenty of data showing that closer to puberty, girls stop being interested in mathematics. Gender stereotypes don’t only hurt girls; they also leave young boys with a narrow norm to fit into, which might even lead them to give up friendships. (HS Poikien puolella 14.4.20)
Research shows that by adulthood women have learned to subconsciously filter out certain job posts as not suitable for them based on masculine language. (Invisible Women, Caroline Criado Perez, 2019) How do we talk about tech as a career? How is it marketed? Facebook, for example, uses racial and gender stereotypes in its targeted marketing, which might lead to a situation where a young man and a young woman see very different advertisements for career options. (CNBC 2019) The Swedish Youth Barometer study (Ungdomsbarometern, 2015), which surveyed 8,000 young people about their self-image in relation to technology, found that only 7% of technology-interested girls describe themselves as technical, compared with 36% of technology-interested boys - yet 23% of the technology-interested girls describe themselves as creative. Creativity is a key part of problem solving, and creating digital technologies is by its very nature a creative process.
If a girl does choose STEM studies and gets a place at a technical university, the chances are still high that she will not end up working in tech. Research has shown that women drop out of STEM studies because of discrimination, competitiveness and a lack of female peers. (EQUALS & UNESCO, 23-24) Recent news about the culture of some technical university societies in Finland has highlighted several of these issues. It is perhaps not surprising, then, that many women do not feel welcome in institutions where older male students try to get the female freshers drunk to have a laugh at their expense. (HS, Nyt 2.12.20)
As Criado Perez writes in her book, once a woman actually starts a career in tech, the struggle isn’t over. People are more likely to hire people who seem to fit the current crowd, who are already known in the network, and whom the recruiters feel connected or similar to. This often leaves women worse off in interviews. Research has also found that people who believe they are objective and not sexist are actually more likely to be neither. Despite multiple research findings showing that recruiting without knowing the candidate’s gender leads to hiring more women, interviewing is still the typical way to select employees. Coding used to be considered women’s work in the defence industry; when it became more associated with mathematics and the appreciation and salaries started to rise, men took over the careers. (Invisible Women, Caroline Criado Perez, 2019)
Women also leave tech jobs more often. According to research conducted in the USA, 40% of women working in technology resign within 10 years, compared to 17% of men. According to the Center for Talent Innovation, women didn’t leave the field because they didn’t enjoy the work, but because of the atmosphere at work, discouraging managers or a lack of career progression. (Invisible Women, Caroline Criado Perez, 2019) Furthermore, according to the State of European Tech report (2018), as many as 46% of the women surveyed reported having experienced discrimination in the tech sector. Women are also more likely than men to cite gender bias, discrimination and harassment as their reason for leaving the field. (EQUALS & UNESCO, 25) (Plan 2019, 22) According to research conducted in the USA, in which job evaluations from 248 American IT companies were analysed, women face personal criticism that men don’t. Several studies show that white men are more likely to get pay rises and bonuses than women or employees from ethnic minorities. (Invisible Women, Caroline Criado Perez, 2019) Transparency in pay rises, levels and bonuses is key to solving this problem, so that women don’t leave because of comparatively lower salaries. Having processes in place to support women who experience harassment at work is crucial to wellbeing.
The solutions
So, these were the challenges we were presented with. How about the solutions? We had hard-working teams representing different competences ideating solutions to the briefs and validating them with a teenage female user, followed by a pitching session where our two CEOs - Teemu Moisala and the 18-year-old CEO-for-a-day Jessica Komulainen - asked questions and selected the winner. Even with the jam held remotely using video calls, Miro boards and other collaboration tools, the energy and motivation remained high throughout the day.
The winning solution was an answer to the problem of all-male teams creating services and products for diverse users. As the Futurice Lean Service Creation toolkit is already widely in use, the team proposed edits to it that prompt solution creators to consider whether their research and testing pools are diverse enough, and whether they might have missed insights or problems by not taking diversity into account. The next step would be to create a diversity and inclusion toolkit, and then a platform for matching potential users and solution creators.
Our other team focused on the challenge of getting more girls to study for, and take up, careers in tech. Their idea was to create a platform that showcases the many roles available in the field of technology, helping girls see how their interests could lead to an inspiring career in the field. Both concept ideas were handed over to Plan as open source, so that anyone can take them forward.
As the next step, Futurice will review and update the canvas sets. All of us involved in the Plan Girls Take Over Day certainly learned a lot! I would like to encourage any company to critically think about how they are working on improving diversity and inclusion. Especially if you think that you’re already doing great ;)
Special thanks to Plan and Päivi Korpela for putting together the research for our solution creation jam.
References
Invisible Women: Data Bias in a World Designed for Men by Caroline Criado Perez, 2019
CNBC 2019: Facebook delivers ads based on race and gender stereotypes, researchers discover. https://www.cnbc.com/2019/04/04/facebook-targets-ads-based-on-race-and-gender-stereotypes-study.html
Harvard Business Review (HBR), 2019: 4 Ways to Address Gender Bias in AI. https://hbr.org/2019/11/4-ways-to-address-gender-bias-in-ai
Longreads, 2019: Technology is as biased as its makers. https://longreads.com/2019/05/14/technology-is-as-biased-as-its-makers/
EQUALS & UNESCO 2019: I’d blush if I could. Closing gender divides in digital skills through education. https://unesdoc.unesco.org/ark:/48223/pf0000367416/PDF/367416eng.pdf.multi.page=74
UNESCO 2020: Artificial intelligence and gender equality. https://unesdoc.unesco.org/ark:/48223/pf0000374174
Young Josie 2019: Why we need to design feminist AI. Speech at TEDxLondonWomen. https://www.youtube.com/watch?v=E-O3LaSEcVw
ITU 2019: Measuring Digital Development. Facts and figures 2019. https://www.itu.int/en/ITU-D/Statistics/Documents/facts/FactsFigures2019.pdf
Reuters 2018: Amazon scraps secret AI recruiting tool that showed bias against women. Jeffrey Dastin, 11.10.2018. https://www.reuters.com/article/us-amazon-com-jobs-automation-insight-idUSKCN1MK08G
HS 2020: Poikien puolella. Katri Jaalamaa, 14.4.2020. https://www.hs.fi/perhe/art-2000006470462.html
Plan International 2019: Programmed out: The Gender Gap in Technology in Scandinavia. https://plan-international.org/publications/programmed-out-gender-gap-technology-scandinavia
Plan International 2018: Digital Empowerment of Girls. https://plan-international.org/publications/digital-empowerment-of-girls#download-options
HS Nyt 2020: Natalia Salmela jakoi kokemuksensa teekkariyhteisöstä 13 vuoden takaa ja sai sadat opiskelijat puhumaan seksuaalisesta häirinnästä. Pirittä Räsänen & Hilla Körkkö, 2.12.2020. https://www.hs.fi/nyt/art-2000007656614.html
Futurice Lean Service Creation: https://futurice.com/lean-service-creation/
- Laura Aho, Senior Service Designer