When it comes to deciding someone’s fate, machine learning isn’t the first tool that comes to mind. Or so I would like to think. The U.S., the country with the world’s highest number of incarcerated people, is leaning more and more on AI to relieve its civil systems of the sheer volume of admin that comes with 1 in 38 adults being under some form of correctional supervision. To be exact, 2.2 million adults were behind bars in 2016, with a further 4.5 million under community supervision such as probation or parole. But when a nation historically ridden with racial inequality and the mass incarceration of people of colour turns to machine learning to make its most human and moral decisions, further injustice is only to be expected.
Since their introduction into the justice system over the past few years, AI tools have drawn protests from numerous prison reform and criminal justice advocates. These tools are used in courtrooms to, supposedly, help judges make fair decisions about moving incarcerated people between state or federal prisons, or to identify criminals through facial recognition. Yet despite claims of being unbiased, they have been shown time and time again to disproportionately pick out people of colour and, worse, to be wildly inaccurate, “even mistaking members of Congress for convicted criminals”, as reported by MIT Technology Review’s Karen Hao.
But while this type of AI ‘aid’ perpetuates injustice and racial bias on a criminal scale, facial recognition tools are, sadly, not the worst of it. Hyperbolically titled ‘criminal risk assessment algorithms’, these machine learning tools are used by law enforcement when a suspect is first arrested. The algorithm digests the defendant’s profile—information such as background, family history, ethnicity, geolocation and so forth—and just as quickly regurgitates what’s called a ‘recidivism score’: a number estimating how likely the defendant is to commit another crime in the future.
What’s done with the score is exactly what you’d imagine. Judges use it to decide what type of correctional facility the defendant will be sent to, how severe their sentence will be, and whether they will be held until trial without bail, released on bail, or saddled with an extortionate bail fee. The idea behind the tool is that it both reduces or eliminates bias in judges and helps predict criminal behaviour, so that defendants can be offered the right services and facilities for rehabilitation. The only issue is that the algorithm has been built on millions of racially biased and often inaccurate files, spanning decades, from a country that has long been throwing people of colour into prison for minor offences or outright wrongful convictions.
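The mechanics of how biased records become a biased score can be sketched with a deliberately simplified toy model. The data, neighbourhood names and scoring rule below are entirely hypothetical and are not taken from any real tool such as COMPAS; the point is only that a “neutral” rule trained on skewed records reproduces the skew.

```python
# Toy sketch of a "recidivism score" learned from historical records.
# If one neighbourhood was over-policed, the learned score inherits that
# bias even though the scoring rule itself looks neutral.
from collections import defaultdict

# Hypothetical records: (neighbourhood, rearrested within two years?).
# Neighbourhood B was patrolled more heavily, so minor offences there
# ended up recorded as rearrests far more often.
history = [
    ("A", 0), ("A", 0), ("A", 1), ("A", 0),
    ("B", 1), ("B", 1), ("B", 0), ("B", 1),
]

def train_score(records):
    """Estimate P(rearrest | neighbourhood) from past records."""
    counts = defaultdict(lambda: [0, 0])  # neighbourhood -> [rearrests, total]
    for hood, rearrested in records:
        counts[hood][0] += rearrested
        counts[hood][1] += 1
    return {hood: r / n for hood, (r, n) in counts.items()}

scores = train_score(history)
print(scores)  # {'A': 0.25, 'B': 0.75}
# Two defendants with identical behaviour get very different "risk" scores
# purely because of where they live: the bias lives in the data, not the rule.
```

The scoring function never mentions race or policing intensity, yet its output encodes both, which is the crux of the argument above.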
The real problem with the algorithm is that it tries to fix a rotten system of injustice and mass incarceration by treating only the tip of it, while feeding fresh ingredients into the toxic rot at the bottom. These AI tools are anchored in an administrative rationale: relieving federal and state justice workers of the sheer mass of admin that comes with a country that refuses to reform its justice system.
“We are not risks, we are needs,” said Marbre Stahly-Butts, executive director of Law for Black Lives, “a national network of radical lawyers, law students, and legal workers of color committed to building the power of the Black Lives Matter movement”, as described on its website. Marching straight ahead with such AI tools is not only dangerous but utterly criminal and irresponsible. The U.S. in particular, but other countries too, cannot rely on data gathered over the past 30 or 50 years to build supposedly unbiased algorithms, because neither in history nor today have we been unbiased, especially when it comes to criminal justice.
When the first 3D printers appeared, people daydreamed about printing their own furniture; some went as far as 3D-printing whole villages. Very few expected the technology to add to the U.S.’ gun problem—and yet here we are. In 2012, Cody Wilson founded Defence Distributed, a 3D-printed gun company considered by many to be the driving force behind this niche industry. In September 2018, Wilson was arrested and charged with sexual assault of a minor, forcing him to step down from the company.
Defence Distributed died slowly after that, but not without a bang, and the company still faces many ongoing legal battles. Why? Because it uploaded and shared 3D-printed gun blueprints online, enabling anyone with a 3D printer to own a gun—now illegal in the U.S. if the gun is made entirely of plastic, which renders it invisible to metal detectors. Last year, when Defence Distributed was submerged by lawsuits left, right, and centre, everyone—the American government included—eased up. The headquarters were shut down, and the leader put behind bars. What could go wrong now?
But what if there were no headquarters, no trademarks, and no real leader? Then the government would have no one to trace the gun blueprints back to. That’s exactly the idea behind Defence Distributed’s successor. Named Deterrence Dispensed, it uploads files individually to media-hosting sites underpinned by the LBRY blockchain—decentralised platforms owned by their users. Not only are the members of Deterrence Dispensed not waiting for any government’s approval of their blueprints, they’re also modifying old ones and offering downloaders more choice.
In an interview with Wired, a member of the group known as ‘Ivan the Troll’ explained how Deterrence Dispensed is more than a big fuck you to the U.S. government, saying, “Even if there was no government telling me I couldn’t do this, I think that I would still do it. I like spending hours and hours drawing stuff in computer-aided design (CAD) software.” Ivan the Troll does more than “drawing stuff”, though; he creates gun designs, adding to the threat that guns already pose in America.
3D-printed guns are made of plastic, which makes them single-shot, disposable devices; if not printed perfectly, they can misfire and injure the shooter. Printers that handle metallic parts are starting to appear, but we’re still far from being able to download a file for any kind of gun, press a button, and let the printer do its job. That’s exactly the reasoning pro-gun supporters lean on, but plastic or not, a gun is still a gun.
Mass shootings, gun-related deaths, terrorist attacks… Do we really need more guns, especially in the U.S.? To support his argument, Ivan pointed to the many police shootings of unarmed black men in America, implying that if you can get shot by the police for no reason, you should also own a gun. But research from Harvard University shows that where there are more guns, there are more murders—it’s as simple as that. Sorry Judge Jeneane.
Apart from Deterrence Dispensed, there are thousands more 3D-printed gun enthusiasts worldwide doing exactly the same on a smaller scale, and there is no way to stop this file-sharing disease. So where do we go from here? We need to talk about gun violence and why this can’t be our new normal—in the U.S. or anywhere else. The uncertainty that surrounds the gun debate is exactly what stops it from going anywhere. Then again, some might argue that guns are not the problem; people are.