Bombshell new reporting from 404 Media found that Flock, which has its cameras in thousands of US communities, has been outsourcing its AI to gig workers located in the Philippines.
After accessing a cache of exposed data, 404 found documents related to annotating Flock footage, a process sometimes called “AI training.” Workers were tasked with jobs including categorizing vehicles by color, make, and model; transcribing license plates; and labeling various audio clips from car wrecks.
In US towns and cities, Flock cameras maintained by local businesses and municipal agencies form centralized surveillance networks for local police. They constantly scan for car license plates as well as pedestrians, who are categorized by their clothing and possibly by factors like gender and race.
In a growing number of cases, local police are using Flock to help Immigration and Customs Enforcement (ICE) agents surveil minority communities.
It isn’t clear where all the Flock annotation footage came from, but screenshots included in the documents for data annotators showed license plates from New York, Florida, New Jersey, Michigan, and California.
Flock joins the ranks of other fast-moving AI companies that have resorted to low-paid international labor to bring their product to market. Amazon’s cashier-free “just walk out” stores, for example, were really just gig workers watching American shoppers from India. The AI startup Engineer.ai, which purported to make developing code for apps “as easy as ordering a pizza,” was found to be passing off human-written code as AI-generated.
The difference with those examples is that those services were voluntary — powered by the exploitation of workers in the global south, yes, but with a choice to opt out on the front end. That isn’t the case with Flock: you don’t have to consent to end up in the panopticon. In other words, for a growing number of Americans, a for-profit company is deciding who gets watched and who does the watching — a system built on exploitation at both ends.



The term “AI” has been misused and misrepresented so much that it’s more of an aesthetic than a technology now.
🌏👨🚀🔫👨🚀 (currently since 2023)
The main thing Flock is really supposed to do is capture and match pictures of license plates at different locations. It’s not even complex.
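To the commenter’s point about how simple the core task is: once plates have been read off the images, matching them across locations is basically a dictionary lookup. A minimal sketch (all names and data here are hypothetical, not Flock’s actual code):

```python
from collections import defaultdict

def normalize(plate: str) -> str:
    """Strip spaces/dashes and uppercase so 'abc 123' matches 'ABC-123'."""
    return "".join(ch for ch in plate.upper() if ch.isalnum())

def match_sightings(sightings):
    """Group (plate, location, timestamp) reads by normalized plate.

    Returns only plates seen at more than one distinct location.
    """
    seen = defaultdict(set)
    for plate, location, _timestamp in sightings:
        seen[normalize(plate)].add(location)
    return {p: locs for p, locs in seen.items() if len(locs) > 1}

# Hypothetical camera reads
reads = [
    ("ABC 123", "Main St", "2023-05-01T10:00"),
    ("abc-123", "Elm Ave", "2023-05-01T10:20"),
    ("XYZ 789", "Main St", "2023-05-01T10:05"),
]
print(match_sightings(reads))
```

The hard part is the optical character recognition on the plate images — which, per the 404 reporting, is exactly the step being propped up by human annotators.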
So how tf did they get the green light for the first government contract if they never even had that capability?
Either they conned the government org in charge of purchasing it, or that org just didn’t care enough to look deeper. They got a professional-looking demo that made it look like the tech worked, and signed the contract without a second thought.
The history of the organization seems very odd:
https://en.wikipedia.org/wiki/Flock_Safety
What?? How did a detective use it to solve a crime? Who was he? And based on this one dude, you all 3 just quit your jobs??? What??
Then we just jump ahead to 2022 and these cameras that didn’t even work had raised over $380 million in venture funding?
Then by the next year they were being used to sub for actual police due to a shortage of police officers?
So they just go from the Hardy Boys helping solve a mystery in Georgia in 2017 to Marc Andreessen (big surprise) suddenly funneling millions into their business by 2023.
Oh, good, this citation will probably help make clear what the fuck actually happened between 2017 and 2023: Flock Safety. “Media Kit: Our Founding Story”. Flock Safety. Retrieved April 8, 2022.
Yeah I don’t understand how a private company deploying cameras on the side of roads is even legal.
Does that mean I can build solar-powered Raspberry Pi units with cameras that do the same thing and pepper them around the country without question?