By J. D. Heyes
The coming tsunami of devices equipped with "artificial intelligence" (AI) technology may seem, to some, like a welcome addition to all the things that make life easier on planet Earth.
But to a growing number of critics, AI represents the next opportunity for big government to encroach upon our daily lives, and even to prosecute and imprison us based on what a machine thinks we might do.
As NextGov reports, AI-capable cameras said to be able to detect 'behaviors' that lead to crime are coming as the next evolution in the surveillance technology that cities and governments deploy to monitor citizens 24/7/365.
"Imagine it were possible to recognize not the faces of people who had already committed crimes, but the behaviors indicating a crime that was about to occur," NextGov noted, adding:
Multiple vendors and startups attending ISC West, a recent security technology conference in Las Vegas, sought to serve a growing market for surveillance equipment and software that can find concealed guns, read license plates and other indicators of identity, and even decode human behavior.
A company called ZeroEyes, out of Philadelphia, markets a system to police departments that can detect when a person is entering a given facility carrying a gun. It integrates with any number of closed-circuit surveillance systems.
Under the guise of 'ending mass shootings' (it's always about gun control with the authoritarian Left), the AI community seeks to develop 'predictive' technology that can allegedly spot not only behaviors indicating a gun crime is about to take place, but other violent criminal activity as well.
At present, the precursors to predictive-crime technology, facial recognition and license-plate readers, are being met with stiff resistance from privacy advocates. And in some cases, at least, judges have sided with them in limiting the use of such technology in the public square.
As such, there can be no doubt that AI-driven pre-crime 'recognition' will be met with similar resistance from pesky constitutionalists who still believe in privacy rights and our founding legal principle of innocent until proven guilty.
And why not? In addition to the probability of false positives, how long will it be before predictive crime technology leads to pre-incident arrests for criminal activities some machine thinks someone is about to engage in? (Related: "Minority Report" PRE-CRIME now real in Colorado as it becomes latest state to pass 'red flag' gun law.)
Farfetched? Hollywood has already made a movie about it, and not recently ("Minority Report," 2002). As for pre-arrests for pre-crimes, Hollywood has that covered as well ("Pre-Crime," 2017).
In the latter, according to the series' official trailer, "A pre-emptive arrest is made of someone before they perform an act," all based on a computer algorithm powered by predictive artificial intelligence.
"Adoption of pre-crime tech is beginning to trend in the U.S. PredPol, one of the leading systems on the market, is already being used by law enforcement in California, Florida, Maryland and other states," reports Bleeping Computer. "Aside from civil liberties concerns, however, a flaw found in the design of the type of software used indicates that predictive algorithms are to blame for a whole new set of problems."
After U.S. researchers analyzed how PredPol actually works to predict crime, they discovered that the software initiates a "feedback loop" that leads to cops being repeatedly directed to certain neighborhoods regardless of what the actual crime rates in those neighborhoods really are.
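The dynamic the researchers describe can be illustrated with a toy simulation. To be clear, this is hypothetical code, not PredPol's actual algorithm: it simply assumes that patrols are dispatched to wherever the most crimes have been recorded, and that crimes are only recorded where patrols are present. Under those assumptions, two neighborhoods with identical real crime rates end up with wildly different policing levels.

```python
import random

def simulate_feedback_loop(days=200, seed=42):
    """Toy model of a predictive-policing feedback loop.

    Neighborhoods A and B have the SAME underlying crime rate.
    Each day the 'algorithm' sends the patrol to whichever
    neighborhood has more recorded crimes so far, but crimes are
    only recorded where the patrol actually goes, so an early
    imbalance in the data is self-reinforcing.
    """
    rng = random.Random(seed)
    true_rate = 0.3                    # identical in both neighborhoods
    recorded = {"A": 1, "B": 0}        # one early arrest tips the data
    visits = {"A": 0, "B": 0}
    for _ in range(days):
        # "Predictive" step: go where the historical data says crime is.
        target = "A" if recorded["A"] >= recorded["B"] else "B"
        visits[target] += 1
        # Crime only enters the dataset where police are present.
        if rng.random() < true_rate:
            recorded[target] += 1
    return visits, recorded

visits, recorded = simulate_feedback_loop()
print(visits)    # neighborhood A gets every single patrol
```

Because the patrol never visits B, no crimes there are ever recorded, so the data "confirms" that A is the problem neighborhood, despite both having the same true rate. That is the feedback loop in miniature.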
The problem? Human decisions go into designing software, including AI software, so flaws are inherent in them.
Still, based on trial and error, researchers are bound to come up with a technology that functions on a certain level. The problem then becomes what to do about being able to predict crime before it happens, and that will inevitably lead to pre-arrests.
A version of this story first appeared at NewsTarget.