A tech company that boasts about its ability to use artificial intelligence to predict crime is facing a federal privacy lawsuit from Meta, formerly known as Facebook, which wants the company banned from its social media platforms.
The New York City and Los Angeles police departments, two of the largest police agencies in the U.S., are among a growing list of law enforcement agencies in the country and around the world to contract with Voyager Labs.
In 2018, the New York Police Department agreed to a nearly $9 million deal with Voyager Labs, which claims it can use AI to predict crimes, according to documents obtained by the Surveillance Technology Oversight Project (STOP), The Guardian reported.
The company bills itself as a “world leader” in AI-based analytics investigations that can comb through mounds of information from all corners of the internet – including social media and the dark web – to provide insight, uncover potential risks and predict future crimes.
But Meta alleges in a federal lawsuit that Voyager Labs created at least 55,000 fake accounts on Facebook and Instagram to collect personal data “to uncover … behavior patterns,” “infer human behavior” and “build a comprehensive presence” on its targets.
That figure includes 17,000 fake accounts created after Meta had already revoked Voyager Labs’ access and filed the federal lawsuit on Jan. 12.
Essentially, Voyager Labs can use a person’s social media history to retrace their steps and potentially predict their next movements, according to Meta.
An NYPD spokesperson told Fox News Digital in an email that it “uses social media analytics tools to aid personnel in uncovering information relevant to investigations and to address public safety concerns.”
That includes gun violence and “various other threats against people, places and events,” the spokesperson said, adding that the department “does not use features that would be described as predictive of future criminality.”