Minneapolis City Council Clearview AI Ban (Hatmaker, TechCrunch, 2023)
The Minneapolis City Council has approved a ban on facial recognition software. The technology raises many concerns, including security flaws, privacy risks, invasive data-collection practices, and the potential for misuse of the information it gathers.
Facial Recognition Ban Approved in Minneapolis
The Minneapolis City Council recently joined a growing list of major US cities that have imposed local restrictions on facial recognition technology, responding to concerns raised by privacy rights and racial justice activists about the technology’s performance.
Facial recognition is an advanced form of surveillance that detects human faces in footage from surveillance cameras and in images from social media. Using complex algorithms, these systems try to match faces to identities. However, critics say that the software’s accuracy is low and that it tends to misidentify people of color and women.
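The matching step described above generally works by encoding each face image as a fixed-length vector (an "embedding") and comparing vectors for similarity. The toy sketch below illustrates that idea in general terms; the embeddings, names, and threshold are made-up illustrative values, not any vendor's actual pipeline.

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity: dot product of the vectors divided by the
    # product of their lengths. Values near 1.0 mean "very similar".
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def best_match(probe, gallery, threshold=0.8):
    """Return the gallery name most similar to the probe embedding,
    or None if no score clears the threshold."""
    name, score = max(
        ((n, cosine_similarity(probe, emb)) for n, emb in gallery.items()),
        key=lambda pair: pair[1],
    )
    return name if score >= threshold else None

# Hypothetical gallery of enrolled identities (toy 4-D embeddings;
# real systems use vectors with hundreds of dimensions produced by
# a neural network, not hand-written numbers).
gallery = {
    "alice": [0.9, 0.1, 0.3, 0.2],
    "bob":   [0.1, 0.8, 0.4, 0.6],
}

probe = [0.85, 0.15, 0.28, 0.22]  # embedding of an unknown face
print(best_match(probe, gallery))  # prints "alice" for these toy vectors
```

The threshold is where the misidentification debate lives: set it too low and the system confidently "matches" people who merely look alike, which is the failure mode critics say falls disproportionately on people of color and women.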
ACLU of Minnesota
The American Civil Liberties Union of Minnesota recently launched a petition calling for a ban on facial recognition technology, and the Minneapolis City Council responded by passing an ordinance that bans the use of the technology by the police department and by government agencies. In addition to the ban, the ordinance requires city departments to request permission to use facial recognition programs.
The ordinance also establishes an appeals process for exemptions: a department requesting permission to use facial recognition technology must follow a transparent public process.
The facial recognition technology employed by the Minneapolis Police Department is a topic of conversation in its own right. The company behind it, however, has drawn the ire of privacy advocates and technologists alike, because Clearview has been found to be less than transparent in its dealings with the public.
A History of Working Beyond Law Enforcement
The company also has a history of working with entities outside of law enforcement. For example, in the late aughts it reportedly helped a Georgia bank investigate a fraudulently cashed check. The company has also landed contracts with the Department of Homeland Security and US Immigration and Customs Enforcement (ICE). A recent Gizmodo investigation showed that the company’s facial recognition software is used by police agencies throughout the country, and that count does not include the private companies vying for access to the company’s data.
While the technology is unlikely ever to be relegated to the back of the department’s budget priorities, it does have a few perks, most notably its ability to identify suspects in the heat of the moment.
The Minneapolis City Council recently trimmed its police department budget by less than five percent. One of the many reasons is the rise of facial recognition technology in Hennepin County: Clearview alone claims a whopping 3 billion images at its disposal, which means photos of you, flattering or not, may already be in the database.
Clearview is not alone in drawing scrutiny over facial recognition. Last year, the city of Boston banned the use of the same technology, and Massachusetts law restricts the use of face surveillance by public agencies, including transit providers. Clearview’s tool, for its part, is said to deduce facial attributes from images without any obtrusive snooping.
In June, Minneapolis voted to ban the use of facial recognition technology by local law enforcement. Despite the vote, the Hennepin County Sheriff’s Office continues to use the technology, sparking fears among privacy advocates, who worry that AI-powered face recognition systems could be used disproportionately against communities of color.
The ACLU is suing Clearview AI, which sells access to a huge database of facial images. Its services include identity matching and biometric technology. Although the company says its use of the technology is for law enforcement, it has been selling access to billions of facial photos to private companies.
The company scrapes public images from many online sources, including major social networks, and aggregates them. According to the company’s legal team, the information it scrapes is not personal data. However, the Office of the Australian Information Commissioner (OAIC) found that Clearview’s practices violated the Australian Privacy Act.
The company has been ordered to stop selling its facial recognition database to third parties and to give individuals who request it an opt-out process. It must also stop providing free trials to police officers without department approval.