Join us for Episode 8 of The Confessions of Angry Programmers podcast! In this episode David and Woody talk about:
- Dogfooding: Woody talks about debugging AWS Lambdas.
- WTF Were They Thinking?: David discusses sites like Sticker Mule testing text messages on unsuspecting live users.
Guest
We are joined by our guest Adnan Masood. Adnan discusses ethical biases that can show up in artificial intelligence (AI) data sets and how they can affect end users.
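As a rough illustration of the kind of dataset bias discussed in the episode, one common check is the "disparate impact" ratio (the four-fifths rule from US employment guidelines): compare favorable-outcome rates between a protected group and a reference group. This is a minimal sketch with hypothetical data and names, not anything from the episode itself.

```python
def disparate_impact(outcomes, groups, protected, reference):
    """Ratio of favorable-outcome rates: protected group vs. reference group.

    outcomes: parallel list of 1 (favorable) / 0 (unfavorable) labels
    groups:   parallel list of group labels
    A ratio below 0.8 is commonly flagged as potential adverse impact.
    """
    def rate(group):
        labels = [o for o, g in zip(outcomes, groups) if g == group]
        return sum(labels) / len(labels)
    return rate(protected) / rate(reference)

# Hypothetical loan-approval labels for two applicant groups.
outcomes = [1, 1, 1, 1, 0, 0, 1, 1, 0, 1]
groups = ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"]

# Group A approval rate: 0.8; group B: 0.6 → ratio 0.75, under the 0.8 bar.
ratio = disparate_impact(outcomes, groups, protected="B", reference="A")
print(f"Disparate impact ratio: {ratio:.2f}")
```

A ratio like this only flags a disparity in the data; it says nothing about *why* the disparity exists, which is where the ethical questions Adnan raises come in.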
Adnan Masood, Ph.D., is an artificial intelligence and machine learning researcher, software architect, and Microsoft MVP (Most Valuable Professional) for Artificial Intelligence. As Chief Architect of AI and Machine Learning at UST Global, he collaborates with the Stanford Artificial Intelligence Lab and the MIT AI Lab to build enterprise solutions.
Author of the Amazon programming-languages bestseller “Functional Programming with F#,” Dr. Masood teaches data science at Park University and has taught Windows Communication Foundation (WCF) courses at the University of California, San Diego. He is a regular speaker at academic and technology conferences (WICT, DevIntersection, IEEE-HST, IASA, and DevConnections), local code camps, and user groups. He also volunteers as a STEM (Science, Technology, Engineering, and Math) robotics coach for elementary and middle school students.
A strong believer in giving back to the community, Dr. Masood is a co-founder and president of the Pasadena .NET Developers group and a co-organizer of the Tampa Bay Data Science Group and the Irvine Programmer meetup. His recent talk at the Women in Technology Conference (WICT) in Denver highlighted the importance of diversity in STEM and technology, and was featured by a variety of news outlets.

Bill Wolff is an independent consultant, trainer, and architect specializing in Microsoft development technologies under the name Agility Systems.
He served as the SharePoint Practice Director at Capax Global and as a Solutions Architect in the Microsoft Practice at Unisys Corporation, and he ran the Microsoft Alliance at LiquidHub. He also ran the consulting firm Wolff Data Systems for 15 years and directed armies of consultants in the dot-com world. Bill is the founder and president of the philly.NET user group, a previous INETA board member who served as Vice President of the Speaker Bureau, and is involved in several other user communities. Bill was a contributing author on several books. His certifications include trainer, systems engineer, developer, and Microsoft MVP.
Resources
- David McCarter’s Books on Amazon
- Is Quality Part of Open-Source Projects Your App Is Using?
- Tweet from SmugMug (mentioned by David)
- AI Books
- Algorithms of Oppression – How Search Engines Reinforce Racism by Safiya Umoja Noble
- Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor by Virginia Eubanks
- Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy by Cathy O’Neil
- Technology and the Virtues: A Philosophical Guide to a Future Worth Wanting by Shannon Vallor
- How to fight bias in algorithms: https://www.ted.com/talks/joy_buolamwini_how_i_m_fighting_bias_in_algorithms#t-403514
- Fighting Algorithmic Bias in A.I.: https://medium.com/smileidentity/the-impact-of-algorithmic-bias-in-a-i-f69bacf04f5b
- On Countering Bias—in People and Algorithms: https://about.flipboard.com/inside-flipboard/on-countering-bias-in-people-and-algorithms/
- Controlling machine-learning algorithms and their biases: https://www.mckinsey.com/business-functions/risk/our-insights/controlling-machine-learning-algorithms-and-their-biases
- Human bias is a huge problem for AI. Here’s how we’re going to fix it: https://thenextweb.com/artificial-intelligence/2018/04/10/human-bias-huge-problem-ai-heres-going-fix/
- Machine learning techniques keep creating racist algorithms: https://theoutline.com/post/1571/the-fight-against-racist-algorithms?zd=1&zi=bb2e7dch
- Turns Out Algorithms Are Racist: https://newrepublic.com/article/144644/turns-algorithms-racist
- Are we programming gender bias into our future?: https://www.ey.com/en_gl/workforce/are-we-coding-gender-bias-into-our-future
- Even Imperfect Algorithms Can Improve the Criminal Justice System – A way to combat the capricious and biased nature of human decisions: https://www.nytimes.com/2017/12/20/upshot/algorithms-bail-criminal-justice-system.html
- Data Sheet—How to Fight the Growing Scourge of Algorithmic Bias in AI: https://www.affectiva.com/news-item/data-sheet-how-to-fight-the-growing-scourge-of-algorithmic-bias-in-ai/
- The Risk Of Machine-Learning Bias (And How To Prevent It): https://www.oliverwyman.com/our-expertise/insights/2018/mar/the-risk-of-machine-learning-bias–and-how-to-prevent-it.html
- Machine Bias – There’s software used across the country to predict future criminals. And it’s biased against blacks: https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing
- Biased Algorithms? Find out how they work and how to beat the system: https://www.psychologytoday.com/us/blog/culture-mind-and-brain/201709/biased-algorithms
- Want a More Diverse Workforce? How AI is Combating Unconscious Bias: https://www.delltechnologies.com/en-us/perspectives/want-a-more-diverse-workforce-how-ai-is-combating-unconscious-bias/
- Is AI capable of tackling biased opinion?: https://hackernoon.com/is-ai-capable-of-tackling-biased-opinion-8c764be8c4f3
- The bigot in the machine: Tackling big data’s inherent biases: https://www.irishtimes.com/business/technology/the-bigot-in-the-machine-tackling-big-data-s-inherent-biases-1.3094978
- UK report urges action to combat AI bias: https://techcrunch.com/2018/04/16/uk-report-urges-action-to-combat-ai-bias/
- Removing gender bias from algorithms: https://phys.org/news/2016-09-gender-bias-algorithms.html
- These Entrepreneurs Are Taking on Bias in Artificial Intelligence: https://www.entrepreneur.com/article/319228
- AI is the future of hiring, but it’s far from immune to bias: https://qz.com/work/1098954/ai-is-the-future-of-hiring-but-it-could-introduce-bias-if-were-not-careful
- Is your software racist?: https://www.politico.com/agenda/story/2018/02/07/algorithmic-bias-software-recommendations-000631
- Ghosts in the Machine: http://www.pbs.org/wgbh/nova/next/tech/ai-bias/
- Removing BIAS from Machine Learning: https://www.graphcore.ai/posts/removing-bias-from-machine-learning
- The Foundations of Algorithmic Bias: https://www.kdnuggets.com/2016/11/foundations-algorithmic-bias.html
- Technology Is Biased Too. How Do We Fix It?: https://fivethirtyeight.com/features/technology-is-biased-too-how-do-we-fix-it/
- What You Need to Know About Racial and Gender Bias in Artificial Intelligence: https://nudest.co/naked-truths/bias-in-ai