“Deep machine learning, which is using algorithms to replicate human thinking, is predicated on specific values from specific kinds of people- namely, the most powerful institutions in society and those who control them.”
- taken from Algorithms of Oppression by Safiya Umoja Noble
Thanks to Simba + Sloane for sending me a copy of this book.
When was the last time you asked someone to “Google it”?
And don’t you find it fascinating how the word Google has now become a verb, often used interchangeably when talking about knowledge and information-seeking?
Author Safiya Umoja Noble finds this newfound verb less fascinating than alarming and deeply disturbing when it comes to racialised groups; she details why in her book Algorithms of Oppression.
My story with this book begins with reservations about Google’s near-monopoly status after reading Jamie Bartlett’s The People vs Tech. Already concerned about implicit human bias in technology, I found that Noble’s book offered a Black feminist perspective, one that very much reads as The People vs Google.
Noble’s well-researched arguments challenge the idea that the internet is the democratic and equitable place it is often lauded to be, showing how search engines like Google reinforce racism in our society and why we should care. I was left shocked, stunned and appalled.
With Google’s dominance, it’s hard to overlook the fact that Google is not a neutral technology: it is a profit-driven, advertising-first corporation with its own goals and priorities. This is often forgotten.
As such, algorithms are not benign or neutral either; the mathematical formulas that automate these decisions are written by human beings.
“Human and machine errors are not without consequence”
The book investigates this myth of neutrality and urges people not to outsource all of their knowledge needs to commercial search engines. A result’s popularity on the front page does not make it credible.
One thing I found intriguing is that New York University Press publishes the book under ‘Race & Ethnicity’. Why not ‘Technology’?
More needs to be done to educate those in the technological driving seat about how racism and sexism can be created and maintained through algorithmic search. Noble rightfully argues for this in the book as a way to avoid bias in classification and cataloguing. She calls for greater awareness of bias among those designing technology for all people, because without it, grave mistakes can be made with grave consequences.
This book was extremely thorough in its arguments and I took a lot from it, from the genuinely shocking case studies and examples to the revelatory chapter “Searching for Black Girls.”
More awareness is needed of the implicit bias in search results, and this book does a great job of rallying the troops for the cause. The author pushes for increased regulation of commercial companies such as Google, as well as public, non-commercial search engine alternatives. But a move to more ethical practices is a collective endeavour, one that starts with individuals recognising our reliance on these technologies. Would recommend.