Wed. Oct 20th, 2021



A security camera at the Port Authority Trans-Hudson (PATH) station at the World Trade Center in New York in 2007, used here as a stock photo.
Photo: Mario Copper (Getty Images)

Sketchy facial recognition company Clearview AI has grown its stockpile of scraped images to more than 10 billion pictures, according to its co-founder and chief executive, Hoan Ton-That. What's more, he says, the company has new techniques in hand, such as using AI to fill in details of blurred or partial images of faces.

Clearview has reportedly signed deals with over 3,000 police and government customers, including 11 federal agencies, which use the technology to identify suspects in ways that might otherwise be impossible. In April, a BuzzFeed report citing a confidential source identified more than 1,800 public agencies that had tested or were actively using its products, everything from police and district attorneys' offices to Immigration and Customs Enforcement and the U.S. Air Force. The company is also known to have worked with dozens of private companies, including Walmart, Best Buy, Albertsons, Rite Aid, Macy's, Kohl's, AT&T, Verizon, T-Mobile, and the NBA.

Clearview has struck these deals despite considerable legal trouble over its unauthorized acquisition of those billions of photos: state and federal lawsuits claiming violations of biometric privacy laws, a consumer protection lawsuit brought by the state of Vermont, the company's forced departure from Canada, and complaints filed with privacy regulators in at least five other countries. There have also been detailed reports about Ton-That's historic relationships with right-wing extremists (which he denies), as well as a broader pushback against police use of facial recognition that has led more than a dozen U.S. cities to prohibit it.

In an interview with Wired on Monday, Ton-That claimed that Clearview has now scraped more than 10 billion images from the open web for use in its facial recognition database. According to the chief executive, the company is also introducing a number of machine learning features, one of which uses AI to reconstruct masked faces.

Specifically, Ton-That told Wired that Clearview is working on "deblur" and "mask removal" tools. The first feature should be familiar to anyone who has ever used AI-powered image upscaling tools: it takes low-quality pictures and uses machine learning to add extra detail. The mask removal feature would use statistical patterns found in other images to predict what a person might look like under a mask. In both cases, Clearview would essentially be working with informed guesses. I mean, what could go wrong?
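To make the "informed guess" point concrete, here is a minimal sketch of filling in an occluded region of a photo using OpenCV's classical inpainting. This is not Clearview's method, which has not been published and presumably relies on learned generative models, and the file names are hypothetical; the point is simply that every reconstructed pixel is an estimate extrapolated from surrounding data, not something the camera actually captured.

    # Toy illustration only: fill a hidden region of a photo with
    # OpenCV's classical inpainting. Not Clearview's (unpublished)
    # ML-based tool; this just shows that filled-in pixels are
    # guesses derived from the surrounding image.
    import cv2

    # Hypothetical file names, for illustration.
    photo = cv2.imread("masked_face.jpg")                          # input photo
    hidden = cv2.imread("mask_area.png", cv2.IMREAD_GRAYSCALE)     # white = occluded region

    # Propagate information from neighboring pixels into the hidden region.
    guess = cv2.inpaint(photo, hidden, 3, cv2.INPAINT_TELEA)

    cv2.imwrite("reconstructed_guess.jpg", guess)

A learned model would produce far more plausible-looking results than this classical method, but they would still be guesses; nothing recovers information that was never in the image.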

As Wired noted, quite a lot. There is a very real difference between using AI to upscale Mario's face in Super Mario 64 and using it to generate something that merely might resemble a suspect for the cops. Existing facial recognition tools, for example, have repeatedly been assessed as riddled with racial, gender, and other biases, and police have reported extremely high failure rates when using them in criminal investigations. It's not hard to imagine such a feature giving police an excuse to quickly steer an investigation toward someone, even though no one actually knows what the face looks like before the software adds made-up pieces to it.

“I expect the accuracy to be quite poor, and beyond accuracy, without careful control over the data set and training process, I would expect unintended bias,” MIT professor Aleksander Madry told Wired. Even if it works, Madry added, “think of people who wore masks to take part in peaceful demonstrations, or who were blurred to protect their privacy.”

Clearview's argument goes something like this: we're just here building tools, and it's up to the police to decide how to use them. Ton-That, for example, assured Wired that all of this is fine because the software can't actually go out and arrest someone on its own.

“Any enhanced image should be noted as such, and extra care taken when evaluating results that may come from an enhanced image,” Ton-That told the magazine. “… My intention with this technology is always to keep people in control. When AI makes a mistake, it is checked by a person.” After all, it's not as if the police have a long and storied history of using junk science to justify abuses, or of backing arrests with flimsy evidence and casework that is often later called into question in court.

Ton-That, of course, finds it easy to assume that police would never use such powers for profiling or for manufacturing evidence. Then again, Clearview's backstory is full of shaky ties to right-wing extremists, like reactionary troll and accused Holocaust denier Chuck C. Johnson, and Ton-That's track record is full of episodes in which the company appeared to exaggerate its capabilities or deliberately stoke controversy as a marketing tool. Clearview itself is fully aware of the potential for dubious use by police, which is why the company's marketing once boasted that cops could "run wild" with its tools, and why the company now claims to have built accountability and anti-abuse features into its product.

The co-founder added in his interview with Wired that he is "not a political person at all" and that Clearview is "not political." Ton-That added, "There is no left or right way to catch a criminal, and we engage people from all sides of the political aisle."


