Invisible Women

Published:

Invisible Women: Data Bias in a World Designed for Men

By: Caroline Criado Perez

Allison’s Rating: :star: :star: :star: :star: :star:

Author Caroline Criado Perez takes the reader on a six-part journey highlighting how a world that has become increasingly data driven has also become systematically blind to bias against women. This data gap is a “consequence of the type of unthinking that conceives of humanity as almost exclusively male” (Perez), which leaves women at an unfair disadvantage when healthcare providers, employers and governments make decisions that are rooted in a foundation of big data. Women have more hormonally complex bodies that are statistically more difficult to study. However, properly researching and conducting studies on women often takes more time and, consequently, more money.

Throughout the book, she indicates that this is often because of two main issues: the default assumption that the human is ‘male’ and the lack of sex-disaggregated data. She underscores the importance of this issue with examples of how algorithms that are obfuscated from the public and do not take this male-default bias into account endanger women’s ability to care for their families, excel in the workplace and, most importantly, stay alive. She challenges individuals in positions of power to consider their unconscious bias and the consequences of leaving women out of significant decisions, and to help create change.

The first underlying issue Perez weaves throughout the chapters is that when “women aren’t seen or heard the male data becomes the majority and is what is understood as ‘universal’ and female as niche” (Perez, 97).

For example, when individuals see author names, they are more likely to assume that a single initial stands for a male author than a female author. Harry Potter author J.K. Rowling (Joanne) chose that ambiguous pen name because she didn’t think boys would want to read a book written by (or about) a female. This is similarly highlighted in examples of anatomy books showing primarily male bodies, or air force flight suits not fitting women. Another fitting example from recent news is that the first all-female spacewalk was cancelled because NASA didn’t have enough spacesuits that fit women’s bodies. NASA’s fiscal year 2019 budget is $21.5 billion, and yet it didn’t have two spacesuits to fit women. As Perez quips, “Why can’t a woman be more like a man?” (Perez, 215).

The second issue, the lack of sex-disaggregated data, stems from the common practice in statistical data analysis of leaving sex out of the study altogether. In a world where the default is “male”, this has a high potential to produce data-driven decisions that do not benefit almost 50% of the population. One relatable example of this is the design of the iPhone’s size. Reading the book on my phone, it made me chuckle that the size of the newest phones carries a “statistically significant gender difference in the impact of phone size on women’s hand and arm health” (Perez). However, that smile quickly faded when the same lack of sex-disaggregated data meant that car airbags and headrests were not properly tested on women. While the fact that Alexa will never recognize my voice, or the snow plowing of sidewalks, might be ‘funny’ examples of this issue, the fact that women are 47% more likely to be injured in a car accident and 17% more likely to die is a serious issue that must be addressed by those using data to make decisions.
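
To make the idea of sex-disaggregated analysis concrete, here is a minimal, hypothetical sketch (the numbers and column names are invented for illustration and are not from the book) of how a pooled, ‘sex-blind’ summary can hide a difference that appears as soon as the same results are grouped by sex:

```python
# Hypothetical illustration of sex-disaggregated analysis.
# The numbers below are invented for demonstration only; they are NOT from the book.
import pandas as pd

crash_tests = pd.DataFrame({
    "sex":     ["male"] * 6 + ["female"] * 4,   # unbalanced sample, as is common in practice
    "injured": [0, 0, 1, 0, 1, 0, 1, 1, 0, 1],  # 1 = serious injury observed in the test
})

# Pooled ("sex-blind") analysis: one overall injury rate for everyone.
print("Pooled injury rate:", crash_tests["injured"].mean())

# Sex-disaggregated analysis: the same data, simply grouped by sex.
print(crash_tests.groupby("sex")["injured"].mean())
```

In this toy dataset the pooled rate looks unremarkable, while the disaggregated view shows a much higher injury rate for the female group; collecting and reporting the sex variable at all is what makes that second, more honest view possible.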

What I liked the most

I loved this book and never wanted it to end. I found myself reading it in coffee shops and saying “YES” out loud or sighing loudly because of how much I could relate the bias to my everyday life. The section that stood out to me the most was Part II: The Workplace, particularly the chapter “The Myth of Meritocracy”. This chapter was riddled with examples and unfortunate truths I have also faced in the workplace. I have been told that I am “bossy, abrasive, strident, aggressive, emotional and irrational” (Perez, 93), but unfortunately at the time I didn’t realize that universal gender roles and the male-default thinking behind data bias were what led to those words being said. On a positive note, I also liked that this chapter gave tangible examples of actionable steps that can be taken, particularly around how algorithms that are designed and engineered to remove human bias are actually created with a bias against women (and several other minority groups). As a graduate student pursuing data analytics, this gives me a concrete example of potential bias and a prompt to consider how underlying assumptions about gender might play a role in future projects.

What Next?

While I loved this book overall, I do find it difficult and overwhelming to know what actionable steps can be taken, particularly when society has a deep-rooted bias that women are “too complicated to measure” (Perez, 30). If women start an upheaval to demand more square footage in bathrooms (based on valid data), this might further the ingrained notion that women are just the weaker sex and should not be catered to when addressing the ‘typical’ end user. How can steps be taken to remove this sexist bias from data analysis without having negative side effects? While Perez urges for data to be sex-disaggregated, my question is: what next? Is this something that should be dealt with at the legal level (most likely slow) or through a grassroots movement (which might not impact all companies)? Overall, this book has left me with lasting questions, but most of all it has encouraged me to share what I have learned with practically everybody I meet. I recommend this book to everyone, regardless of gender, to help challenge bias.