Perverse Incentives Are a Big Risk for the Growth of ML

Marcus Samuel
2 min read · May 25, 2022

I have been looking deeper into how machine learning was used in Allegheny County, and into how CompStat, the statistics-driven approach to policing, has worked out. We thought we were doing good work, but then some videos started showing up in my feed that made me question some of the things RGI has been doing…

Photo by Markus Spiske on Unsplash

Perverse incentives mean that for a police force whose performance is measured by crime statistics, the simplest way to reduce crime is to just not report it. There is a systemic motivation at play that pushes a police force to downgrade the severity of crimes when filing reports, so that by the very metrics used to judge them, crime appears to be falling. Imagine if a college announced that cheating had stopped, just because it stopped looking for it. We all know that’s not true!

Thinking about our own algorithms in STOP-aIT, I wonder in what ways they might be biased, or just misused. It’s scary to think that I could build a perfect ML algorithm with the best data I can get, and there would still be ways it incentivizes behaviours and outcomes that hurt people. Imagine what can happen if we have bad data and rush through development and testing while we chase growth.

Should I feel responsible for the way my code is used? I don’t want to feel like I’m causing bad things to happen, but I don’t know what to do — I just bought a condo!

What do you think? Leave me a reply if you have any ideas on how to do this whole AI + tech thing without hurting people. My bosses at RGI seem to think it’s not our problem, that regulators and governments need to figure it out, but there’s a union forming at Google and a whole slew of documentaries questioning what we’re doing and how we’re doing it. I just don’t know.

Marcus

Marcus Samuel

New hire at The Rapid Growth Institute. AI Enthusiast. UKD Alumnus.