
"And I beheld, and heard the voice of one eagle flying through the midst of heaven,
saying with a loud voice: Woe, woe, woe to the inhabitants of the earth...."
[Apocalypse (Revelation) 8:13]

Friday, February 17, 2017

Police State Facial Recognition: AI-Powered Body Cams Give Cops The Power To Google Everything They See

The police body camera industry is the latest to jump on the artificial intelligence bandwagon, bringing new powers and privacy concerns to a controversial technology bolstered by the need to hold police accountable after numerous high-profile killings of unarmed black citizens. Now, that tech is about to get smarter.


Last week, Taser, the stun gun company that has recently become an industry leader in body-mounted cameras, announced the creation of its own in-house artificial intelligence division. The new unit builds on the company’s acquisition of two AI-focused firms: Dextro, a New York-based computer vision startup, and Misfit, another computer vision company previously owned by the watch manufacturer Fossil. Taser says the newly formed division will develop AI-powered tech specifically aimed at law enforcement, using automation and machine learning algorithms to let cops search for people and objects in video footage captured by on-body camera systems.
Moreover, the move suggests that body-worn cameras, which are already being used by police departments in many major cities, could soon become powerful surveillance tools capable of identifying different objects, events, and people encountered by officers on the street — both retroactively and in real time.
More: DOJ Study: Police-Worn Body Cameras Increasingly Recognize Your Face
The idea is to use machine learning algorithms to streamline the process of combing through and redacting hours of video footage captured by police body cameras. Dextro has trained algorithms to scan video footage for different types of objects, like guns or toilets, as well as to recognize events, like a foot chase or traffic stop. The result of all this tagging and classifying is that police will be able to use keywords to search through video footage just as they’d search for news articles on Google, allowing them to quickly redact footage and zoom in on the relevant elements. Taser predicts that within a year, its automation technology will cut the time needed to redact faces from one hour of video footage from eight hours to 1.5 hours.
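To make the search idea concrete: the following is not Taser’s or Dextro’s implementation, just a minimal sketch of what keyword search over machine-generated video tags could look like, with the Segment structure, tag labels, and clip names all assumed for the example.

```python
from dataclasses import dataclass

@dataclass
class Segment:
    """A span of body-camera footage with machine-generated labels (hypothetical)."""
    video_id: str
    start_sec: float
    end_sec: float
    tags: set[str]  # e.g. {"person", "gun", "traffic_stop"}

def search(segments: list[Segment], keywords: set[str]) -> list[Segment]:
    """Return segments whose tags contain every requested keyword."""
    return [s for s in segments if keywords <= s.tags]

# Hypothetical usage: find every clip tagged as both a foot chase and a firearm.
footage = [
    Segment("cam42_2017-02-10", 0.0, 14.5, {"person", "traffic_stop"}),
    Segment("cam42_2017-02-10", 14.5, 37.0, {"person", "gun", "foot_chase"}),
]
for s in search(footage, {"gun", "foot_chase"}):
    print(f"{s.video_id}: {s.start_sec:.1f}s-{s.end_sec:.1f}s")
```

In a scheme like this, redaction speeds up for the same reason search does: once faces or objects are tagged with timestamps, an editor can jump straight to the frames that need blurring instead of scrubbing through the whole recording.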



Searchable video will also have major implications for civilian privacy, especially since there are no federal laws preventing police from trawling through databases to track people en masse.
Taser has previously expressed interest in adding face recognition capabilities to its body camera systems. A Department of Justice study published last year also found that at least nine different body camera manufacturers either currently support face recognition in their products or have the ability to add it later. And according to a recent Georgetown University Law report, roughly half of all American adults have been entered into a law enforcement face recognition database, meaning there’s a decent chance that any random person walking down the street can be identified and tracked in secret by a camera-equipped cop.
A Taser representative told Vocativ that while Dextro’s computer vision technology will allow Taser’s law enforcement customers to detect faces for the purpose of redacting them from videos, it does not currently support face recognition.
More: Memo: New York Called For Face Recognition Cameras At Bridges, Tunnels
“To clarify, Dextro’s system offers computer vision techniques to identify faces and other objects for improving the efficiency of the redaction workflow. AI enables you to become more targeted when needed,” said Steve Tuttle, Taser’s vice president of communications.
That means, he explained, that “you can show where a face starts in a video” to speed up a search, but that the technology “doesn’t identify individual faces or people.”
The company claims that its use of AI will be focused on “efficient categorization, semantic understanding, and faster redaction” of video footage as a method of “reducing paperwork and enabling officers to focus on what matters.”
Using AI to optimize footage in this way is a logical next step for Taser, which has been positioning itself to become a one-stop shop for the capture, storage, and processing of video evidence by law enforcement agencies. The company’s Axon body camera platform currently handles more than 5 petabytes of footage, captured by officers and stored in a proprietary cloud locker called Evidence.com.
Taser has claimed that its platform prevents tampering with video evidence by logging every time a piece of footage is accessed. But critics have warned that such privately-owned systems are ripe for abuse because police and prosecutors still have exclusive control over the footage, as well as the system that processes it.
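Taser has not published how its access log works; the sketch below shows one generic way such a log can be made tamper-evident, by hash-chaining each record to the one before it, with all field names and user labels assumed for the example.

```python
import hashlib
import json
import time

def append_access_event(log: list[dict], evidence_id: str, user: str, action: str) -> dict:
    """Append a hash-chained access record; altering any earlier record breaks the chain."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    event = {
        "evidence_id": evidence_id,
        "user": user,
        "action": action,          # e.g. "view", "download", "redact"
        "timestamp": time.time(),
        "prev_hash": prev_hash,
    }
    payload = json.dumps(event, sort_keys=True).encode()
    event["hash"] = hashlib.sha256(payload).hexdigest()
    return event

def verify_chain(log: list[dict]) -> bool:
    """Recompute every hash to confirm no record was altered or removed."""
    prev = "0" * 64
    for event in log:
        body = {k: v for k, v in event.items() if k != "hash"}
        if body["prev_hash"] != prev:
            return False
        if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != event["hash"]:
            return False
        prev = event["hash"]
    return True

audit_log: list[dict] = []
audit_log.append(append_access_event(audit_log, "clip-001", "officer_17", "view"))
audit_log.append(append_access_event(audit_log, "clip-001", "prosecutor_3", "download"))
print(verify_chain(audit_log))  # True until any entry is modified
```

A chain like this only proves that recorded entries were not altered after the fact; it says nothing about what the party controlling the system chose to record in the first place, which is the crux of the critics’ objection.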
More: Police Body Camera Company Is Facing New Scrutiny
In the future, Taser CEO Rick Smith said in a live webcast Wednesday, the company wants to expand these capabilities into a kind of AI “personal secretary” for police officers. Such a system would fully automate the collection of data during police encounters, using live voice transcription and image analysis to feed relevant information back to the officer.
“Police officers are spending most of their time entering information into computers” about their interactions in the field, Smith said during the webcast. “We want to automate all of that.”
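As a toy illustration of the kind of data-entry automation Smith describes, and not anything Taser has announced, the sketch below pulls a few structured report fields out of a hypothetical transcript; every field name and phrase pattern is assumed, and a real system would rely on trained speech and language models rather than regular expressions.

```python
import re

# Hypothetical transcript snippet from a body camera's live audio feed.
transcript = (
    "Dispatch this is unit 12, conducting a traffic stop at 5th and Main, "
    "one occupant, plate 7ABC123, no weapons observed."
)

# Assumed patterns for a few report fields.
FIELD_PATTERNS = {
    "unit": r"unit (\d+)",
    "incident_type": r"(traffic stop|foot chase|welfare check)",
    "location": r"at ([\w\s]+?),",
    "plate": r"plate (\w+)",
}

def draft_report_fields(text: str) -> dict[str, str]:
    """Fill in whichever report fields can be recognized in the transcript."""
    fields = {}
    for name, pattern in FIELD_PATTERNS.items():
        match = re.search(pattern, text, flags=re.IGNORECASE)
        if match:
            fields[name] = match.group(1).strip()
    return fields

print(draft_report_fields(transcript))
# {'unit': '12', 'incident_type': 'traffic stop', 'location': '5th and Main', 'plate': '7ABC123'}
```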
But privacy and police accountability advocates are wary of letting a for-profit company like Taser dictate so much about how high-tech policing works, especially when no restrictions are in place limiting when or how often video archives and face recognition databases can be searched.
“We’re talking about a company making very far-reaching decisions about the use of emerging technologies in policing,” Clare Garvie, an associate at Georgetown Law’s Center on Privacy & Technology, told Vocativ. “It’s really an open question right now what controls will be put in place at the public agency level.”

