Macy’s, Bloomingdale’s and a dozen other major retailers are quietly using controversial facial recognition technology to combat a rising number of thefts, robberies and other organized attacks.
It is part of a larger push to turn retailers’ security cameras – those ubiquitous black domes producing thousands of hours of mostly unwatched footage – into sophisticated artificial intelligence systems that can automatically detect people and other identifying information, which can be used to warn store employees about potential threats and, ultimately, to prosecute offenders.
“There is a big push to use AI right now,” said Read Hayes, director of the Loss Prevention Research Council. “Retailers have cameras everywhere.”
The technology is appealing at a time when theft and violence are on the rise, driven especially by organized crime rings that steal millions of dollars’ worth of products and resell them online. Organized retail crime has increased 60 percent since 2015, according to the National Retail Federation, with about 70 percent of retailers reporting an increase in 2021. Approximately $69 billion worth of products are stolen from U.S. retailers each year, or about 1.5% of sales, according to estimates from the Retail Industry Leaders Association and the Buy Safe America Coalition. A total of 523 people were killed in robberies and other violent retail incidents in the US in 2020, including 256 customers and 139 employees.
This month, the federal government charged 29 people with stealing $10 million in over-the-counter medicines and other items from Walmart, Costco, CVS, GNC and others, and reselling the products on sites like Amazon and eBay.
“What retailers are really trying to do is provide the police with more evidence of who is doing this,” said Adrian Beck, a professor at the University of Leicester whose research focuses on ways to combat shoplifting.
Facial recognition is controversial because research has shown it is often less accurate at identifying people of color and women. The worst-performing algorithms have error rates of up to 35% when scanning darker-skinned women, but less than 1% for lighter-skinned men, according to one research report. The reason? Early versions of the algorithms were trained on skewed image sets dominated by white, male celebrities.
Companies have been working to address the bias, and between 2014 and 2018 facial recognition software became 20 times better at searching a database to find a matching photo, according to the National Institute of Standards and Technology. Still, in 2019 the agency found that some software misidentified African Americans and Asians 10 to 100 times more often than white people.
“People have been wrongly arrested because of this technology,” said Jay Stanley, senior policy analyst for the speech, privacy and technology project at the American Civil Liberties Union. “It’s not ready for prime time.”
There has been a push to curb the use of facial recognition by law enforcement, with cities such as San Francisco, Minneapolis and Boston barring police from using it. Companies like Amazon and Microsoft have stopped selling the technology to police.
Facial recognition companies, however, are marketing their technology to retailers, who use it to gather evidence on repeat offenders before eventually sharing it with local law enforcement.
“There has been a shift toward the retailer doing some of this legwork,” said Tony Sheppard, director of loss prevention solutions at ThinkLP. Law enforcement resources are sometimes stretched thin, he added.
FaceFirst, which says it works with a quarter of North America’s largest retailers, compiles time-stamped incidents and calculated past losses into packages for law enforcement. “When you send this to the police, it’s a strong storyboard,” FaceFirst president Dara Riordan said at a recent industry conference.
Retailers build their own watchlists, a manual process in which a person reviews video after an incident and tells the software to send an alert the next time that person enters one of their locations. It won’t help against a first-time thief, but it can catch repeat offenders.
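The watchlist workflow described above – enroll a face after an incident, then compare every new detection against the list – can be sketched in a few lines. This is a minimal illustration assuming faces have already been converted to numeric embedding vectors; the class, function names, threshold and 512-dimension embeddings are all hypothetical, not any vendor’s actual system.

```python
import math
import random

def cosine_similarity(a, b):
    # Standard cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

class Watchlist:
    """Hypothetical watchlist: enroll embeddings, match new detections."""

    def __init__(self, threshold=0.6):
        self.threshold = threshold
        self.entries = {}  # incident label -> enrolled embedding

    def enroll(self, label, embedding):
        # The manual step: staff review footage and add the person.
        self.entries[label] = embedding

    def check(self, embedding):
        # Called for each face detected at the door; returns the best
        # match above the threshold, or None if nobody matches.
        best, best_score = None, self.threshold
        for label, enrolled in self.entries.items():
            score = cosine_similarity(embedding, enrolled)
            if score >= best_score:
                best, best_score = label, score
        return best

# Usage: a noisy re-capture of an enrolled face matches; a stranger does not.
rng = random.Random(0)
enrolled = [rng.gauss(0, 1) for _ in range(512)]
wl = Watchlist(threshold=0.6)
wl.enroll("incident-042", enrolled)

recapture = [x + rng.gauss(0, 0.05) for x in enrolled]  # same person, new photo
stranger = [rng.gauss(0, 1) for _ in range(512)]        # unrelated person
```

In practice the threshold trades false alerts against missed matches – the same trade-off behind the demographic error-rate disparities researchers have documented.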
“The demand for our product is growing,” said Dan Merkle, CEO of FaceFirst, which now runs more than 12 trillion comparisons per day for its customers, up from 100 million in 2017. The company says its technology has a 99.7% accuracy rate without discriminating by gender or race, and can reduce external theft by 34% and in-store violence by 91%.
It helped one retailer catch “The Philly Fanatic,” a thief known for wearing Philadelphia Phillies gear – not to be confused with the baseball team’s similarly named mascot. He had been hitting the same retailer every couple of days with different accomplices, secretly stuffing a piece of furniture with $2,000 or more in merchandise, sealing it with duct tape and paying for only the furniture at checkout. The goods were then passed to a “fence” for resale. The software identified the suspect and sent a real-time alert to the retailer, which detained him and handed him over to police, FaceFirst said.
The UK-based Facewatch said its business has doubled during the pandemic, with especially strong demand from convenience stores. Its largest deployment is with a chain of more than 100 locations.
Retailers are also a growing part of the customer base of New York-based Oosto, according to chief marketing officer Dean Nicholls. Both regional and national retail chains use its technology, with some deploying it across hundreds of stores. The 150-employee company has raised $350 million.
Retailers are also working out when to alert employees that a known shoplifter has entered a store, and how to train them to respond safely.