If an agency is compiling tons of big data, it had better know how to use it, according to panelists at the Association of Government Accountants’ National Leadership Training discussion on unlocking the potential of agency data.
Government experts from three agencies shared their challenges and successes with big data at the event held Feb. 11 in Washington, D.C.
“If no one knows how to get to data, it ends up being no more valuable than the old hard copies that were two or three weeks late,” said Scott Davis, acting vice president and controller at the U.S. Postal Service.
USPS has experimented with big data to make its 30,000 offices more efficient, productive and frugal. The agency intended for individual offices to query the data and use it to improve their work. However, providing mounds of data for postmasters to sift through proved more of a barrier than an impetus.
“We thought it would be like the [movie] ‘Field of Dreams’ — if you build it, they will come,” Davis said. “When we tried to make everything available, it was not as helpful.”
The jumbled mass of data was too much for individual offices to handle; big data began to pay off only once USPS compiled targeted productivity reports for each office.
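The pattern Davis describes — replacing a raw data dump with a pre-aggregated, per-office report — is a familiar one in data work. As a purely illustrative sketch (the column names, office IDs and metrics here are hypothetical, not USPS’s actual schema), the targeted report might come from a simple central aggregation:

```python
import pandas as pd

# Hypothetical raw activity log -- one row per shift, per office.
# All names and values are illustrative, not USPS's actual data.
raw = pd.DataFrame({
    "office_id":  ["A100", "A100", "B200", "B200", "B200"],
    "work_hours": [8.0, 7.5, 8.0, 8.0, 6.5],
    "pieces":     [1200, 1100, 900, 950, 700],
})

# Instead of handing every office the full dataset, compile one
# focused productivity summary per office.
report = (
    raw.groupby("office_id")
       .agg(total_pieces=("pieces", "sum"),
            total_hours=("work_hours", "sum"))
)
report["pieces_per_hour"] = report["total_pieces"] / report["total_hours"]

# Each office receives only its own row -- a digestible summary
# rather than the "jumbled mass" of raw records.
print(report.loc["A100"])
```

The design point is that the aggregation happens centrally, so each postmaster sees a small, decision-ready slice of the data rather than the whole pile.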
Other agencies felt the same pressure. Ginnie Mae, which works to promote homeownership, guarantees $1.5 trillion in mortgage-backed securities and works with 8 million loans.
“When you have 8 million of anything, you can start to imagine how much data we’re looking at,” said Gregory Keith, senior vice president and chief risk officer at Ginnie Mae.
Ginnie Mae uses its data for risk assessment and for spotting past trends that can help predict future ones. But to compete with private-sector firms, it needs skilled interpreters to make sense of that data.
“We need the people to interpret big data and tell us what the data means to us,” Keith said.