AI is bringing a tech revolution to policing. But AI policing software can also reinforce existing biases and threaten our right to privacy. DW lays out the pros and cons of the technology.
Transcript
00:00Using AI to fight crime? Sounds great, right?
00:03Facial recognition could help catch criminals faster
00:06and predictive policing might even stop crimes
00:09before they actually happen.
00:10But it's not that simple.
00:12More security or a threat to privacy?
00:15Let's weigh the pros and cons of AI in policing.
00:19A recent study found that 75% of European citizens
00:22support the use of AI by police and military,
00:25for example, in surveillance.
00:27That's surprising, given how many innocent people
00:30have been harmed by faulty AI.
00:32Now, don't get me wrong,
00:33AI can sift through large amounts of data quickly,
00:37like databases of wanted persons or crime statistics.
00:40It can also draw conclusions faster than any police officer.
00:43But AI does make mistakes and can be misused,
00:47for example, in facial recognition systems.
00:51Some 75% of Argentina's capital, Buenos Aires,
00:54is under video surveillance.
00:56The city rolled out a massive facial recognition program
00:59in 2019.
01:01Within months, the government claimed
01:02nearly 1,700 wanted criminals had been caught.
01:06But dozens of errors were also made,
01:08leading to unjustified police checks and even arrests.
01:12One resident, Guillermo Ibarrola,
01:14was wrongfully detained for six days.
01:16Data protection activists sued the city,
01:19which led to the system being shut down in 2022.
01:23And it's been in limbo ever since.
01:25Activists and city representatives
01:27are still debating a legal framework,
01:29because there are further concerns.
01:31The investigation found data not just on criminals,
01:34but also on politicians, activists, and journalists.
01:37Were police using the system to track people illegally?
01:40An even bigger concern with facial recognition
01:43is that it can be used for ethnic profiling.
01:45China, for example, has used this tech
01:47to monitor and detain the Muslim Uyghur minority.
01:50And facial recognition also has a general flaw.
01:54It doesn't work equally well for everyone.
01:56Studies show it's less accurate for people of colour,
01:59women, and non-binary individuals.
02:01So there's a lot of work to be done
02:03before these systems can work without bias.
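To make that accuracy gap concrete, here is a minimal, hypothetical Python sketch of how researchers audit a face-matching system for demographic bias: error rates are computed separately for each group instead of as a single overall score. The threshold, field names and toy records are invented for illustration and are not code from any system mentioned in the video.

```python
# Illustrative sketch only (not DW's or any vendor's code): auditing a
# face-matching system for demographic bias by computing error rates per group.
from collections import defaultdict

THRESHOLD = 0.80  # assumed similarity score above which the system declares a "match"

def audit_by_group(eval_pairs):
    """eval_pairs: list of dicts with 'group' (demographic label),
    'score' (model similarity, 0..1) and 'same_person' (ground truth)."""
    stats = defaultdict(lambda: {"fm": 0, "fnm": 0, "impostor": 0, "genuine": 0})
    for p in eval_pairs:
        s = stats[p["group"]]
        if p["same_person"]:
            s["genuine"] += 1
            if p["score"] < THRESHOLD:      # missed a true match
                s["fnm"] += 1
        else:
            s["impostor"] += 1
            if p["score"] >= THRESHOLD:     # flagged the wrong person
                s["fm"] += 1
    for group, s in stats.items():
        fmr = s["fm"] / max(s["impostor"], 1)    # false-match rate
        fnmr = s["fnm"] / max(s["genuine"], 1)   # false-non-match rate
        print(f"{group}: false matches {fmr:.1%}, missed matches {fnmr:.1%}")

# Toy data: here both the wrong-person flag and the missed match fall on group_B
audit_by_group([
    {"group": "group_A", "score": 0.90, "same_person": True},
    {"group": "group_A", "score": 0.60, "same_person": False},
    {"group": "group_B", "score": 0.85, "same_person": False},
    {"group": "group_B", "score": 0.75, "same_person": True},
])
```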
02:07Predictive policing.
02:10But what if crimes could be prevented
02:12before they're even committed?
02:14That's the idea behind predictive policing.
02:16With AI, large data sets can be analysed
02:19to spot patterns and trends humans might miss.
02:21In theory, this could make police work more efficient
02:24and reduce human error in decision-making.
02:27But the accuracy and fairness of these models
02:29depend heavily on the quality and diversity
02:32of the data they are trained on.
02:34So the risk of reinforcing existing biases is high.
02:37When AI is trained on biased historical crime data,
02:41it can reinforce those biases.
02:43Over-policed minority neighbourhoods
02:45may appear to have higher crime rates.
02:47And as a result, predictive policing tools
02:49could then unfairly target these communities,
02:52only increasing inequality.
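That feedback loop can be shown in a few lines of code. The following is a minimal, hypothetical Python simulation with invented numbers, not a real predictive-policing product: two areas have identical true crime rates, but because one starts out over-policed, a naive model that allocates patrols by past recorded crime keeps sending more patrols there, which in turn generates more records.

```python
# Hypothetical simulation of the feedback loop described above, with invented
# numbers: two areas share the SAME true crime rate, but area B starts with
# more recorded crime because it was historically over-policed. A naive rule
# that sends patrols wherever past records are highest keeps the imbalance alive.
import random

random.seed(0)
TRUE_RATE = 0.10                 # identical underlying crime rate in both areas
recorded = {"A": 50, "B": 100}   # biased history: B was patrolled more, so more records
TOTAL_PATROLS = 20

for day in range(30):
    total_records = sum(recorded.values())
    # naive "predictive" allocation: patrols proportional to past recorded crime
    patrols = {area: round(TOTAL_PATROLS * recorded[area] / total_records)
               for area in recorded}
    for area, n_patrols in patrols.items():
        # a crime enters the database only if a patrol is there to record it,
        # so more patrols mean more recorded crime regardless of the true rate
        checks = n_patrols * 10   # each patrol makes 10 checks per day
        recorded[area] += sum(random.random() < TRUE_RATE for _ in range(checks))

print("Recorded crime after 30 days:", recorded)
print("Final patrol split:", patrols)
# Area B keeps getting roughly twice the patrols and roughly twice the new
# records, even though the real crime rates never differed.
```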
02:54That said, predictive models are already being used
02:57in certain fields.
02:58For instance, they help assess risks at large events
03:01such as football matches.
03:02This allows police to focus on areas
03:04where issues are most likely to occur.
03:07For example, fights.
03:09AI and police.
03:10Can it work?
03:11AI can save police officers time.
03:14For example, by creating case logs.
03:16In the future, it could ensure that they're in the right place
03:18at the right time.
03:20And it could even lead to fairer decisions
03:22by removing human prejudices from the equation.
03:25But to get there, some obstacles need to be overcome.
03:28First, databases must be truly representative
03:31and diverse, so the systems trained on them treat everyone fairly.
03:34And second, there needs to be a clear legal framework
03:37on what data authorities can access.
03:39Abuse of this tech could threaten our privacy
03:42and civil rights.
03:43What do you think of the police using AI?
03:46Let us know.