
Anna Mae Yu Lamentillo

May 30, 2025

Decolonizing the algorithm

Digital colonization isn’t a metaphor — it’s an unfolding reality encoded in every algorithm that shapes our lives. As artificial intelligence systems permeate decisions on credit, policing, healthcare, hiring, and beyond, they risk entrenching the same inequities and power imbalances that defined historical empires. Left unchecked, these digital systems standardize norms drawn from narrow cultural, racial, and socioeconomic perspectives, effectively “colonizing” minds, markets, and societies around the globe.


At its core, digital colonization describes the extraction of data—often from the Global South or marginalized communities—to fuel AI models designed and owned by a small cadre of Western tech giants. These systems are trained on vast troves of online content that over-represent English-speaking, affluent voices while under-representing indigenous languages, non-binary identities, and vernacular traditions. The consequences extend far beyond awkward translations or mislabelled selfies; they reshape reality in the image of the powerful few, reinforcing the cultural and economic dominance of those who already hold sway.


Consider the world of credit underwriting, where automated lending platforms learn from historical data that undervalues borrowers from low-income areas and systematically offer them worse loan terms, an echo of centuries-old redlining. In criminal justice, predictive-policing algorithms assign higher recidivism risk to defendants from Black and Brown communities, feeding cycles of over-policing and incarceration. Surveillance systems driven by facial-recognition technology misidentify darker-skinned and female faces far more often than those of light-skinned men, leading to wrongful stops and eroding trust in public safety.

In hiring, résumé-screening AI favors profiles resembling past hires, often male and majority-ethnic, silencing qualified women and minorities before a human ever reviews their application. Even in healthcare, medical AI models trained primarily on Western patient records can misdiagnose conditions in populations with different genetic backgrounds or disease profiles, delaying critical treatment for those who need it most. Social media platforms that flag “inappropriate” content according to Western norms routinely censor indigenous languages, cultural expressions, and activist voices, erasing local knowledge from the global conversation. And political micro-targeting systems, skewed toward urban, affluent user data, exclude rural and marginalized voters from crucial civic messaging, deepening the digital divide in democratic participation.


Ensuring diversity and equality in AI isn’t a feel-good add-on; it’s an existential necessity. When algorithms determine who receives credit, who gets justice, and whose health is prioritized, bias scales rapidly, compounding discrimination at every turn. True decolonization of AI demands action on multiple fronts. We must first audit and enrich training datasets by forging partnerships with local communities, linguists, and civil-society groups to capture a genuine spectrum of human experience. Next, development teams themselves must embody diversity: engineers, ethicists, sociologists, and community advocates from varied backgrounds should have equal voice in design decisions, so that critical questions such as “Whose reality are we encoding?” and “Who might be harmed by this decision?” are never left unasked. Finally, robust regulatory oversight is essential: mandatory bias audits, impact assessments, and transparency requirements should compel companies to disclose how their models were trained, what data they used, and how they perform across demographic slices.
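To make the idea of a bias audit concrete: the sketch below is a minimal, hypothetical illustration (the group labels and lending decisions are invented, not drawn from any real system) of checking how a model’s approvals break down across demographic slices. It uses the widely cited “four-fifths” disparate-impact ratio, one simple screening test among many, not a complete audit.

```python
from collections import defaultdict

def approval_rates(decisions):
    """Compute per-group approval rates from (group, approved) pairs."""
    totals, approved = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        approved[group] += ok
    return {g: approved[g] / totals[g] for g in totals}

def disparate_impact(rates):
    """Ratio of the lowest to the highest group approval rate.
    Values below 0.8 (the 'four-fifths rule') flag potential bias."""
    return min(rates.values()) / max(rates.values())

# Hypothetical audit data: (demographic group, loan approved?)
decisions = [("A", 1), ("A", 1), ("A", 1), ("A", 0),
             ("B", 1), ("B", 0), ("B", 0), ("B", 0)]

rates = approval_rates(decisions)
print(rates)                    # group A approved 75%, group B only 25%
print(disparate_impact(rates))  # about 0.33, well below the 0.8 threshold
```

A real audit would go further, checking error rates as well as approval rates and testing intersections of groups, but even this simple slice-by-slice disclosure is the kind of transparency the paragraph above calls for.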


Digital colonization is not inevitable. By insisting that AI systems reflect the rich tapestry of global cultures, languages, and identities, we reclaim technology as a tool for empowerment rather than extraction. In the fight for equality and justice, algorithms must become instruments of inclusion, not invisible chains that replicate centuries of domination. The future of our societies — and of democracy itself — depends on it.



Night Owl: The Nationbuilder's Manual


© 2017-2024 Night Owl by Anna Mae Yu Lamentillo

Distributed by Manila Bulletin Publishing and Little Ninja, Inc.
