7 ways government is using predictive tech

Published on StateScoop

Predictive technologies, in which organizations use data to identify trends and make judgements about the future, are growing in government.

While discussions of predictive policing, the most prominent example, are bound to devolve into screaming matches, there are many more government uses that go unnoticed. Whether it’s collecting trash, managing the power grid, providing safe water, responding to a pandemic or delivering services to children and families, agencies are finding new ways their data can help avert danger and save money.

Bill Eggers, executive director of the Deloitte Center for Government Insights, told StateScoop predictive technologies are guiding decision-making at all types of agencies.

“You can’t predict everything, but you can get a lot better at getting anticipatory in some of these areas, whether it’s preventing deaths, preventing child abuse, but also making sure your resources are deployed toward where they’re needed most, whether they’re in the form of police officers or inspectors or social workers,” Eggers said.

 

Predictive policing

Predictive policing can take several forms, but the most common application draws on historical data to make decisions about deploying resources — like sending more police officers to a particular neighborhood on a Friday night because the data shows that’s where and when violent crime is most frequent. Oakland, California, last year became the first major city to implement an outright ban on predictive policing tech (along with biometric surveillance), following concerns about bias and racism embedded in its algorithms. Joe DeVries, the city’s chief privacy officer, told StateScoop he wanted “a flat-out ban” so the city wouldn’t need to reboot the debate for every new tech acquisition.
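Stripped to its simplest form, that kind of deployment decision is a ranking exercise over historical incident data. Here is a minimal sketch in Python, using invented incident records and a hypothetical hotspot_ranking helper rather than any vendor’s actual model:

```python
from collections import Counter
from datetime import datetime

# Invented incident records for illustration: (neighborhood, timestamp).
incidents = [
    ("Downtown", datetime(2021, 6, 4, 23, 15)),
    ("Downtown", datetime(2021, 6, 11, 22, 40)),
    ("Riverside", datetime(2021, 6, 5, 2, 10)),
    ("Downtown", datetime(2021, 6, 18, 23, 55)),
]

def hotspot_ranking(records, weekday=4, start_hour=20):
    """Rank neighborhoods by how many past incidents fall on a given weekday
    (Monday=0, so 4 is Friday) at or after start_hour."""
    counts = Counter(
        place for place, ts in records
        if ts.weekday() == weekday and ts.hour >= start_hour
    )
    return counts.most_common()

# The highest-ranked neighborhood gets the extra Friday-night patrols.
print(hotspot_ranking(incidents))  # [('Downtown', 3)]
```

The concern behind bans like Oakland’s is visible even in a toy version like this: the ranking can only reflect whatever patterns, and whatever bias, already sit in the historical records it is fed.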

 

Reducing recidivism

A less controversial use of data can be found on the other end of the justice system, with more judges using software to help inform decisions on sentencing and early releases. The software is also being used to allocate services for returning citizens to help them avoid reoffending. Zack Goodman, a senior data scientist with the criminal-justice data platform Recidiviz, said that rather than making predictions, Recidiviz provides states with a “ground truth” about their populations, which in turn can be used to make decisions. “I think the idea there is when you have finite parole officer resources, knowing how much one person needs attention versus another person can help you allocate the finite resources a little better,” Goodman said.

 

Sensitive trucks

Cities are sticking sensors everywhere — garbage trucks, snow plows, waste and recycling bins. The data these devices collect is most often used to inform the public, like the snow-plow tracking maps many cities publish during winter storms. But this data, when paired with other sources, like social media, can also be used to optimize routes and make predictions about future capacity. Akron, Ohio, outfitted its trash trucks with radios that take energy grid readings. Researchers there said the city was in one case able to predict an outage and prevent nearly 300,000 customers from losing power. “Smart” and AI-powered trash cans, too, which help agencies plan pick-up schedules and predict need, have become a common fixture in major cities, including New York and Los Angeles, and on college campuses like MIT.
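The capacity-prediction piece can be as simple as extrapolating a sensor’s recent fill rate forward in time. A rough sketch, with made-up readings and a hypothetical hours_until_full helper rather than any city’s actual system:

```python
def hours_until_full(readings, capacity=100.0):
    """Estimate hours until a bin reaches capacity by extrapolating the fill
    rate between its first and latest sensor readings.

    readings: chronological (hours_elapsed, fill_percent) pairs from one bin.
    Returns None if the fill level is not rising.
    """
    (t0, level0), (t1, level1) = readings[0], readings[-1]
    rate = (level1 - level0) / (t1 - t0)  # percentage points per hour
    if rate <= 0:
        return None
    return (capacity - level1) / rate

# Made-up readings: 40% full at hour 0, 70% full a day later.
print(hours_until_full([(0, 40.0), (24, 70.0)]))  # 24.0 hours until a pickup is due
```

Scheduling software can then sort bins by that estimate and fold the most urgent ones into the next day’s routes.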

 

What a waste

Even before people show signs of being sick, the virus shows up in wastewater. During the COVID-19 pandemic, universities and cities have deployed tools to test their water systems to detect coronavirus particles, often predicting new waves of the outbreak. Ari Goldfarb, CEO of Kando, said his company is helping governments around the world analyze their water to guide public health policy, including an early warning system on airplane toilets that’s now being tested in Israel. He said his tech is also detecting upticks of the polio virus in Israel, which is helping the government direct resources to unvaccinated neighborhoods. In the U.S., Los Angeles and El Paso, Texas, are drawing on Kando’s tech and historical data to monitor wastewater and get early notice on pollutants that could damage treatment plants.
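Kando hasn’t published the details of its models, but the basic early-warning idea is to compare recent viral concentrations in sewage against an earlier baseline and flag a sustained jump. A simplified sketch with invented numbers and a hypothetical wastewater_alert function:

```python
from statistics import mean

def wastewater_alert(samples, recent=3, ratio=1.5):
    """Flag a possible new wave when the average of the most recent samples
    exceeds the earlier baseline average by the given ratio.

    samples: chronological viral-concentration measurements from one site.
    """
    if len(samples) <= recent:
        return False
    baseline = mean(samples[:-recent])
    latest = mean(samples[-recent:])
    return latest > ratio * baseline

# Invented daily measurements: a stable baseline, then a sharp rise.
readings = [120, 110, 130, 125, 118, 240, 310, 365]
print(wastewater_alert(readings))  # True — the recent average far exceeds the baseline
```

A real deployment would also need to account for factors like rainfall dilution, flow volume and sampling noise before trusting a jump.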

 

Watch out

When it comes to monitoring sex offenders, the public tends to be a little more permissive about predictive policing. That accounts partially for the success of OffenderWatch, a tool that serves as a public database of offenders, an app that detects when offenders are nearby and a predictive tool that helps law enforcement keep tabs on offenders. The company’s founder, Mike Cormaci, told StateScoop his tech is used by law enforcement at all levels of government, up to the U.S. Marshals Service. Several sheriffs’ offices in Ohio, New York and Wyoming have signed up to begin using the software within the last few months. Information on offenders — like employment status, family relationships and missed check-ins — is run through an algorithm that tips off officers when an offender is calculated to be at high risk of reoffending.
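The article doesn’t describe OffenderWatch’s algorithm in detail, but threshold-based flagging of the signals it mentions — employment status, missed check-ins, housing and family circumstances — can be sketched with hypothetical weights like this:

```python
# Hypothetical weights for the kinds of signals the article mentions; this is
# an illustration of threshold-based flagging, not OffenderWatch's actual model.
WEIGHTS = {
    "missed_check_in": 3.0,
    "unemployed": 1.5,
    "unstable_housing": 2.0,
}

def risk_score(record):
    """Sum the weights of the risk factors present in one offender's record."""
    return sum(weight for factor, weight in WEIGHTS.items() if record.get(factor))

def flag_high_risk(caseload, threshold=4.0):
    """Return the IDs whose score meets or exceeds the alert threshold."""
    return [r["id"] for r in caseload if risk_score(r) >= threshold]

caseload = [
    {"id": "A-101", "missed_check_in": True, "unemployed": True},  # score 4.5
    {"id": "B-202", "unstable_housing": True},                     # score 2.0
]
print(flag_high_risk(caseload))  # ['A-101'] — only this record trips an officer alert
```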

 

The hottest technology

Some fire departments, including New York’s, use early-warning systems to flag dangerous buildings before fires occur. FDNY’s FireCast tool in 2017 reportedly improved the accuracy of fire inspections by 20%. Oakland, California, began developing a system that could help the city find dangerous buildings after a fire killed 36 people in a warehouse in 2016. The State of California is trying to use tech to get a handle on its wildfires, too — Gov. Gavin Newsom has approved contracts to test predictive modeling software and drones equipped with sensors to be used during active fires. Palo Alto, California, is also developing an early-warning system that one official told StateScoop could use infrared imaging and machine learning to detect blazes in the foothills around Silicon Valley.

 

Won’t somebody think of the children

Machine learning algorithms are also used by child welfare and family services agencies to predict which children may be at greatest risk of abuse. In 2014, Los Angeles County started testing a system called Approach to Understanding Risk Assessment, or AURA, for such a purpose, but dropped it three years later after officials said it was returning thousands of false positives. But according to the American Civil Liberties Union, child-welfare agencies in at least 11 states last year were still using predictive analytics to help children suffering abuse. One is Allegheny County, Pennsylvania, whose screening tool has since 2016 generated “family screening scores” that help case workers predict the long-term likelihood that children will need to be removed from their homes.
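A back-of-the-envelope calculation, with purely hypothetical numbers, shows why a screening model can bury caseworkers in false alarms: when the outcome being predicted is rare, even a reasonably accurate model flags far more families incorrectly than correctly.

```python
# All numbers below are assumptions for illustration, not figures from AURA or
# Allegheny County.
referrals = 100_000          # families screened
base_rate = 0.02             # share that truly need intervention
sensitivity = 0.80           # true cases the model catches
false_positive_rate = 0.10   # non-cases the model wrongly flags

true_cases = referrals * base_rate
flagged_true = true_cases * sensitivity
flagged_false = (referrals - true_cases) * false_positive_rate

print(f"Correctly flagged: {flagged_true:.0f}")  # 1600
print(f"Falsely flagged: {flagged_false:.0f}")   # 9800
```

Under those assumptions, roughly six out of seven alerts would be false positives — the trap Los Angeles County ran into.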

