Image via City Life/Vida Urbana
Mass uses questionable tools to investigate eviction relief applicants
State officials are spending more than $750,000 of COVID-19 aid on corporate consultants who will run analytics on eviction relief applications to investigate applicants, a practice housing advocates say can prevent residents from getting the relief they need. And while officials said they do not plan to scrape applicants’ social media as part of those investigations, they redacted information in the contractor’s agreement showing that the firm does use comparable tools. One company listed in the contract previously tried to sell the LAPD on identifying potential terrorists through Twitter.
The state Department of Housing and Community Development has hired Deloitte to investigate potential fraud in the state’s distribution of federal Emergency Rental Assistance money, according to bid documents. Although the state is scheduled to stop taking new applications for those funds on April 15, it is paying Deloitte $128,000 a month for the next six months to investigate fraud in existing applications.
When DigBoston requested documents showing Deloitte’s scope of work for their investigations, DHCD responded with a redacted application that concealed some of the company’s tactics and partners, saying the disclosure “is likely to jeopardize public safety or cyber security.” We then obtained an unredacted copy, which shows Deloitte tracking metadata on applications and applying “statistical anomaly detection methodologies” to look for outliers in applicant demographics, occupations, and earnings, among other techniques.
The unredacted document also says Deloitte partners use “open-source intelligence tools and databases” to gather online information, including social media investigator Voyager Analytics, which collaborates with law enforcement and has boasted of using its AI to scrape social media to determine users’ “ties to or affinity for Islamic fundamentalism or extremism.”
State officials said the redacted list of partners was a sample only and that there were no plans to use vendors other than Deloitte to investigate applicants. But civil liberties advocates said potentially using companies like Voyager—and concealing that information—is a big concern.
“This is the creeping of government surveillance,” said Mukund Rathi, Stanton Fellow at the Electronic Frontier Foundation, adding that the EFF just filed a federal lawsuit to obtain records about the Department of Homeland Security’s program to spy on immigrants’ social media. “We have a lot of concerns with government surveillance of social media across the board. Especially when it comes to aid applicants—they’re already in vulnerable positions and don’t have much leverage against the government.”
Paying to probe
Mass has received millions of dollars in federal Emergency Rental Assistance money to prevent evictions during the COVID-19 pandemic. But the state is closing applications for that aid on April 15, which housing advocates said will hurt people seeking help before state Residential Assistance for Families in Transition (RAFT) money becomes available in July.
And while the application process will be closed, officials put out a bid two months ago to hire consultants to look for fraud. Deloitte, which says it has managed ERA fraud investigations in Arkansas, Nebraska, and Texas, was awarded a contract to review applications through October, and will be paid $768,000 from that ERA pile.
DHCD officials have not said how many fraudulent applications they’ve seen. In the bid, released two months ago, the agency said it has received between 3,000 and 3,500 ERA and RAFT applications a week for the past 10 weeks, but doesn’t say how many were suspected of fraud. Deloitte’s statement of work assumes it will investigate 90,000 applications and the “fraud rate” will not exceed 2% of applications—about 1,800 in total—but DHCD officials said those are assumptions.
DHCD officials said fraud investigations would ensure money went to those in need instead of people scamming the system, and that Deloitte is taking care to keep legitimate applicants from missing out. A Deloitte spokesperson said the company is committed to helping residents secure and maintain affordable housing and that aid was only paid to eligible applicants.
Andrea Park of the Mass Law Reform Institute said paying aid money to Deloitte diverts it from people in need, noting it is unclear how much fraud is actually taking place.
“It’s disappointing that the Commonwealth is spending such significant resources and time on private consultants for fraud detection. While we want to ensure these resources get to those who need it most, there’s been very little transparency about whether problems have actually arisen,” Park said in a statement. “On the contrary, we’ve heard anecdotally that urgent applications for assistance have been held up as potential fraud simply because people’s lives are complicated. Instead of taking steps to further simplify the process, we are concerned that more investigation will delay or prevent eligible people from getting the help they need.”
Deep dives, questionable information
The signed statement of work between Deloitte and DHCD includes Deloitte’s response to the bid, in which the company lays out its investigative plans. Deloitte says it will check metadata on online applications, such as IP addresses and the devices used to submit them, which can “be used to identify individuals attempting to submit multiple fraudulent claims.”
The company will cross-reference that information with known cybercriminal activity, but also use other analytics that focus “on identifying groupings of claims that, when viewed individually, do not appear to indicate any [integrity for ERA] concerns,” according to the bid. “However, using sophisticated aggregation and unsupervised modeling techniques, Deloitte can reveal trends, patterns, and lessons-learned over time related to emerging fraud schemes.” This includes looking for outliers in claim information, including “resident demographics, industry/occupation, wage amounts” for claims.
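For a rough sense of what this kind of statistical outlier detection can look like, here is a minimal sketch in Python. It is purely illustrative: the field names, the z-score method, and the threshold are assumptions made for the example, not details of Deloitte’s actual system, which the documents do not specify.

```python
# Illustrative sketch only: a simple z-score outlier check of the kind
# "statistical anomaly detection" could describe. All field names and
# thresholds here are hypothetical, not taken from the actual contract.
from statistics import mean, stdev

def flag_outliers(claims, field, z_threshold=3.0):
    """Return claims whose `field` value sits far from the population mean."""
    values = [c[field] for c in claims]
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []
    return [c for c in claims if abs(c[field] - mu) / sigma > z_threshold]

# 200 ordinary monthly wage figures, plus one extreme value
claims = [{"id": i, "wage": 3000 + (i % 7) * 50} for i in range(200)]
claims.append({"id": 999, "wage": 250000})
print([c["id"] for c in flag_outliers(claims, "wage")])  # → [999]
```

A check like this only says a value is unusual relative to the rest of the pool, which is exactly the worry advocates raise: an atypical but legitimate applicant looks the same as a fraudulent one.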
Rathi questioned that kind of analysis, saying those techniques can take in inaccurate data about people which can be used against them.
“Going to apply machine learning on unstructured data—unstructured data typically signals that you’re not really doing anything to verify this information or put it into a format and then audit how the algorithm came up with the answer,” the EFF fellow said. “Part of this investigation is identifying fraud trends. Assuming that there are fraud trends to identify and fitting the mass of data to that assumption, then people get caught in the net.”
When DigBoston requested documents regarding Deloitte’s scope of work, DHCD redacted those tactics, saying making them public could jeopardize public safety or cybersecurity. After we obtained the unredacted version on a government website, DHCD said publishing Deloitte’s tactics could alert “bad actors” and help them avoid getting caught.
Rathi said that didn’t hold up.
“If they reveal what type of information is being collected and matched, that might tip off the fraudster and they’ll try to evade—I don’t think that’s a good argument, it’s not describing a specific investigation being carried out. It’s not a secret government technique fraudsters aren’t aware of,” he said. “The whole point of records laws is to be aware of these things. We really don’t have specific info about what they’re doing.”
To scrape? Or not to scrape?
In describing its anticipated activities under the statement of work, Deloitte says it will “Perform investigative review procedures and document procedures performed including the incorporation of open-source intelligence, summary of findings, and final recommendation.” But DHCD redacted the names of those open-source intelligence partners.
The unredacted version says Deloitte uses databases like assessment and property tax records and public records aggregators like LexisNexis. It also uses Sprinklr, described as a “Social listening tool that aggregates hundreds of thousands of data sources from news, social media, forums, etc.,” and Voyager Analytics, which searches for data on Facebook, Instagram, Twitter, and other sites and “applies machine learning, natural language understanding and analytics to understand behaviors in near real-time from unstructured data.”
Voyager has aggressively marketed itself as a tool for law enforcement, according to a report released by the Brennan Center last year. The center used freedom of information requests to obtain Voyager’s email correspondence with the LAPD, which gave the company a four-month trial for some of its products; in those emails, Voyager boasted of its ability to identify potential terrorists through social media postings.
“VoyagerCheck allows our clients to gain immediate insights about a social media user’s ties to or affinity for Islamic fundamentalism or extremism. The results are color coded (green, orange, and red), based on Artificial Intelligence calibrations, to allow for a result within minutes that does not require any intervention or assessments by an analyst or investigator,” the company wrote, according to the emails. “This provides a flag or indication for further vetting or investigation, before an incident has occurred, as part of an effort to put in place a ‘trip wire’ to indicate emerging threats.”
Voyager did not respond to a request for comment.
Although the signed statement of work lists Voyager as a partner, DHCD officials said the list was a sample only and that there were no plans for Deloitte to use the company in Mass, adding that any subcontracting work would have to be approved by the state. Rathi questioned why a listed partner’s name was redacted.
“The further down the rabbit hole you go, the more disconnected it becomes from oversight and due diligence,” Rathi said. “It’s a big concern that they redacted the identity of the subcontractor; that prevents the public from knowing who the government is partnering with, and from investigating them.”
Kade Crockford, director of the Technology for Liberty program at ACLU Massachusetts, said there needs to be more oversight of how the government uses technology in general.
“Corporations and the government are increasingly collaborating to obtain and analyze sensitive information about people accessing government services, and use that information to make decisions about whether people should be granted access to those services. Far too often, these processes are shrouded in secrecy, and not subject to any kind of external review,” Crockford said. “Most people subjected to these surveillance and analysis programs probably don’t even know it’s happening.”
Crockford said the ACLU is backing legislation to create a commission to study how algorithms and automated decision systems are used in state government.
“Before deciding how to regulate these technologies, the legislature needs a comprehensive accounting of how and where these systems are in use in our state,” Crockford said.