CM 066: Cathy O’Neil on the Human Cost of Big Data




Curious Minds: Innovation in Life and Work show

Summary: Algorithms make millions of decisions about us every day. They determine our insurance premiums, whether we get a mortgage, and how we perform on the job.<br> More alarming still, data scientists also write the code that fires good teachers, drives up the cost of college degrees, and lets criminals evade detection. Their mathematical models are biased in ways that wreak deep and lasting havoc on people, especially the poor.<br> Cathy O’Neil explains all this and more in her book, <a href="https://www.amazon.com/Weapons-Math-Destruction-Increases-Inequality/dp/0553418815">Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy</a>. Cathy earned a Ph.D. in mathematics from Harvard, taught at Barnard College, and worked in the private sector as a data scientist. She shares her ideas on the blog <a href="http://mathbabe.org">mathbabe.org</a> and appears weekly on the <a href="http://www.slate.com/articles/podcasts/slate_money.html">Slate Money</a> podcast.<br> Here are some of the things that came up in our conversation:<br> <br> The shame she felt as a data scientist working for a hedge fund during the financial crisis<br> How most of us trust and fear math to the point where we stop asking questions<br> How a faulty algorithm cost a high-performing teacher her job<br> How value-added models of evaluation miss the mark<br> How a mathematical model is nothing more than an automated set of rules<br> The fact that every mathematical model has built-in blind spots<br> How what is hard to measure typically does not get included in an algorithm<br> The cost to colleges and applicants of leaving price out of college ranking algorithms<br> How crime prediction models can fail because of incomplete data<br> The big error in the findings of the A Nation at Risk report and how we still pay for it<br> How poverty lies at the 
heart of the achievement gap<br> What allows big data to profile people efficiently and effectively<br> Where we may be headed with individual insurance costs because of big data<br> Why we need rules to ensure fairness when it comes to health insurance algorithms<br> Why it is a problem that data scientists have become de facto policymakers<br> The set of questions all data scientists should be asking<br> The fact that Facebook serves up an echo chamber of emotional content to hook us<br> How data is just a tool to automate a system that we, as humans, must weigh in on<br> Why healthy algorithms need feedback loops<br> Why we have a problem when we cannot improve a model or reveal it as flawed<br> Why we need to stop blindly trusting algorithms<br> Questions we should be asking to demand accountability of algorithm designers<br> <br> Important Links<br> <a href="http://twitter.com/mathbabedotorg/status/797088966281412608">@mathbabedotorg</a><br> <a href="https://mathbabe.org/">https://mathbabe.org/</a><br> <a href="https://www.washingtonpost.com/blogs/answer-sheet/post/firing-of-dc-teacher-reveals-flaws-in-value-added-evaluation/2012/03/07/gIQAtmlGxR_blog.html?utm_term=.df2ddf2f83bf">Sarah Wysocki</a><br> <a href="http://www.usnews.com/education/best-colleges/articles/ranking-criteria-and-weights">U.S. News &amp; World Report college ranking system</a><br> <a href="http://www.predpol.com/">PredPol</a><br> <a href="https://www.amazon.com/Rise-Robots-Technology-Threat-Jobless/dp/0465097537">Rise of the Robots by Martin Ford</a><br> <a href="http://www2.ed.gov/pubs/NatAtRisk/">A Nation at Risk</a><br> <a href="https://en.wikipedia.org/wiki/Achievement_gap_in_the_United_States">The Achievement Gap</a><br> If you enjoy the podcast, <a href="https://en.wikipedia.org/wiki/David_Steindl-Rast">please rate and review it on iTunes</a> – your ratings make all the difference. For automatic delivery of new episodes,