
Using critical data sources to assist one state's COVID-19 response
Google Cloud and Slalom
At a glance
What we did
- Google Cloud Platform BigQuery
- Geospatial analytics
- Looker
- Data science and predictive modeling
- Data engineering
- Data visualization and dashboarding
Facing new challenges
At the onset of the COVID-19 pandemic in 2020, the leaders of this state government leapt into action to slow the spread. They knew they needed relevant, valuable, and timely information to enable situational awareness, inform planning efforts, and broadly improve decision-making.
But much of the information they needed didn’t yet exist or wasn’t widely available, and the state officials didn’t have the data science and engineering bandwidth or expertise required to curate and analyze complex data.
All hands on deck
The state leaders’ challenge was to understand how their citizens were responding to the new COVID-19 guidelines in near real time, but without surveillance—or the appearance of surveillance. Our team was part of a task force brought in from all over the country to inform and plan the state’s emergency response.
We were tasked with finding open datasets that posed no privacy threat to individuals and that would give us a sense of the population’s adherence to social distancing and other guidelines.
For help finding a fast and creative solution, we turned to Google Cloud.
- Hours from the first call to project launch
- Data sources engaged
- Data visualization dashboards created
- Daily insights for the state
Pattern recognition
We found 10 trustworthy data sources, including those that counted cars on the highway, reported traffic incidents, estimated walking traffic, and even measured air quality. Together these metrics gave us an accurate, high-level picture of the population’s movement and transportation habits, without making it possible to identify any individual.
The goal was to find patterns and trends in population mobility (or lack thereof) that we could present to the governor’s office to help them:
- provide context for COVID-19 case rates,
- fact-check news reports,
- work to improve equity in people’s ability to follow social distancing guidelines,
- track reductions in greenhouse gas emissions,
- inform policy changes, and
- inform recommendations to employers.
In partnership with Google Cloud, we researched, ingested, curated, and analyzed the data, then presented it in easy-to-read reports using Looker, a data visualization product within Google Cloud Platform (GCP).
The state’s efforts weren’t secret during the eight-month engagement, but they weren’t publicized either. It was important to all parties that if the project drew attention, there would be no data breach or privacy implications, so all data cleared for use in the project had to be anonymized and aggregated.
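One common way to enforce that kind of aggregation requirement is to suppress any group too small to hide an individual. The project’s actual privacy rules aren’t public, so the threshold and field names below are illustrative assumptions, not the team’s real pipeline:

```python
from collections import Counter

def aggregate_counts(records, key, min_cell_size=10):
    """Aggregate raw records into counts per group, dropping any group
    smaller than min_cell_size so no individual can be singled out.
    The threshold of 10 is an illustrative choice, not the project's."""
    counts = Counter(r[key] for r in records)
    return {k: v for k, v in counts.items() if v >= min_cell_size}

# Example: per-county trip counts; small cells are suppressed, not reported.
trips = [{"county": "A"}] * 25 + [{"county": "B"}] * 3
print(aggregate_counts(trips, "county"))  # {'A': 25}
```

The key design choice is that under-threshold groups disappear entirely rather than being reported as small numbers, which is what keeps the published aggregates safe even under scrutiny.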
Slalom’s initial data curation and dashboarding efforts were so valuable to the state government that the governor’s office requested daily reports on an ongoing basis, lasting about eight months in 2020.
Government collaboration at scale
At the outset of the project, the state government didn’t have a scalable collaboration environment. In partnership with Google Cloud, we built one that supported file sharing, data gathering, analysis, and visualization.
Because of the emergency, government offices didn’t close on weekends, so neither did we. Our team worked in shifts around the clock to deliver the daily insights requested of us.
First, we built ingestion pipelines, one for each of our 10 sources. Because few of the sources used modern technology, we also built custom integrations to bring each pipeline’s data into our central database.
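A per-source integration of this kind typically parses each feed’s native format and coerces it into one shared record shape so downstream aggregation can treat all feeds alike. The field names and sample feeds below are hypothetical stand-ins, not the project’s real schema:

```python
import csv
import io
import json

def normalize(source_name, raw, parser):
    """Run a per-source parser and coerce its output into a common
    record shape. Field names here are illustrative assumptions."""
    return [
        {
            "source": source_name,
            "day": item["day"],
            "region": item["region"],
            "metric": item["metric"],
            "value": float(item["value"]),
        }
        for item in parser(raw)
    ]

# One parser per feed format: e.g. a CSV traffic-counter export ...
def parse_traffic_csv(text):
    for row in csv.DictReader(io.StringIO(text)):
        yield {"day": row["date"], "region": row["station"],
               "metric": "vehicle_count", "value": row["count"]}

# ... and a JSON air-quality feed.
def parse_air_json(text):
    for obs in json.loads(text)["observations"]:
        yield {"day": obs["d"], "region": obs["site"],
               "metric": "aqi", "value": obs["aqi"]}

csv_feed = "date,station,count\n2020-04-01,I-5 north,18200\n"
json_feed = '{"observations": [{"d": "2020-04-01", "site": "Downtown", "aqi": 41}]}'
rows = (normalize("traffic", csv_feed, parse_traffic_csv)
        + normalize("air", json_feed, parse_air_json))
print(len(rows))  # 2
```

Once every feed lands in the same shape, adding an eleventh source only means writing one more small parser.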
We then brought all the data into BigQuery, a powerful GCP database product, where we sliced and mixed it to get to insights at the aggregate level. After that, we pulled the insights into Looker, where we built visualizations and graphics for our reporting.
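The aggregate-level slicing described above often boils down to expressing each day’s counts as a percentage of a pre-lockdown baseline. The actual analysis ran as SQL in BigQuery and isn’t public; this is a simplified stand-in showing the shape of the calculation:

```python
from statistics import mean

def mobility_index(daily_values, baseline_days=7):
    """Express each day's aggregate count as a percentage of the
    baseline (the mean of the first baseline_days days). A simplified
    sketch, not the team's actual BigQuery query."""
    baseline = mean(daily_values[:baseline_days])
    return [round(100 * v / baseline, 1) for v in daily_values]

# Hypothetical highway vehicle counts: a baseline week, then distancing begins.
counts = [100, 102, 98, 101, 99, 100, 100, 60, 55, 52]
print(mobility_index(counts))
# [100.0, 102.0, 98.0, 101.0, 99.0, 100.0, 100.0, 60.0, 55.0, 52.0]
```

An index like this is what a Looker dashboard can chart directly: a drop from 100 toward 50 reads at a glance as "mobility roughly halved," without ever exposing an individual record.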
Every day brought a request for a new type of analysis, so we had to stay agile and flexible.
Finally, we worked with our team of partners and clients to decide what to highlight in the copy surrounding the data visualizations. This storytelling component was a key part of the project’s success.
We collaborated with the team to make sure the reporting represented a diverse range of perspectives within the government and constituency, and that all the content stayed on message. Each daily report went through a strict filter before being delivered to the governor’s office, and we left a large amount of content on the cutting-room floor to ensure only the most pressing and relevant insights made it through.
Data with real-life consequences
While the central purpose of the project was to track COVID-19 case numbers, there were many situations in which our job was to fact-check media reports about people’s behavior throughout the state.
How many people were actually showing up in public places? Which nightlife spots were opening in violation of state regulations? Did even necessary and important gatherings cause a spike in COVID-19 cases? We needed to answer all these questions with data, knowing that everything we reported to the governor’s office could affect the people living in the state.
Because of its highly sensitive nature and human consequences, the data needed to be accurate beyond any doubt. One of our core values at Slalom is “Do the right thing, always.” We never missed a single deliverable and never had to retract a number that we reported. We felt confident throughout the project that we were doing ethical reporting, and we were proud to be a credible data partner to the state.
Normal life is intense in this region. Things happen that are emergencies unto themselves, outside of COVID-19.
Ready for the next time
The state government is still using the Google Cloud–enabled digital collaboration environment we built, and they’re also growing the capability in different ways. As the pandemic winds down, the state’s leadership is now equipped with a multiuse tool that will help them respond to the next catastrophic event by understanding patterns in the behavior of people in their state.
From the start of the project, our plan was to hand off the whole cloud environment, piece by piece, to the Department of Technology and the Office of Digital Innovation, and train people within the department to take it on and run it by themselves. We wanted them to be able to keep the project going independently of us, and to be empowered to gather and analyze data in a potential future crisis.