Mining Big Data for What It’s Worth

By Chuck Brooks

When it comes to the collection, digitization and organization of data, federal agencies today face a twofold challenge. In one respect, as we’ve discussed previously, agencies have increasingly responded to the administration’s Digital Government Strategy by evolving from paper-based to electronic systems of document management.

The benefits are clear – improved accessibility to electronic records, reduced costs through faster document capture and recognition, and a more seamless customer experience, among others. We’ve seen the Department of Homeland Security succeed in automating paper-based processes, cutting processing time and costs while expediting the identification of dangerous foreign visitors. The significant challenge for all federal agencies, though, has become how to extract relevant information from decades’ worth of documents without overloading databases with unnecessary information.

We can now circle back to the second part of our twofold challenge in managing data: sifting through the mounds of complex information collected each day from a variety of sources, which together form the all-encompassing term big data. As noted in Government Executive, the public sector is in the early stages of capturing large datasets, with big data programs expected to emerge in the vast majority of government agencies in the next few years. Agencies are already collecting much more data than they used to, from a wide variety of sources including blogs, emails, videos, social media and photographs – many of them technologies that did not exist twenty years ago. Agencies have been told how important big data is, but according to Nextgov, its true value remains elusive for many because managers find it either unreliable, insufficient or overly complex.

The term big data can be (rightly or wrongly) applied to so many functions and processes that it often feels like an abstract concept. But utilizing big data – again, the preponderance of data that agencies have access to – has led to significant innovation in federal projects related to healthcare, national defense, energy, financial oversight and fraud detection, and back office projects. From the breadth of data available, agencies can arrive at powerful insights around supply and demand projections, behavioral trends, market risks and externalities, to cite a few.

Federal agencies must address the issue of how to make sense of the substantial business data they have accumulated and turn it into smart information; in other words, they need to employ data analytics. To maximize the value of data in their possession, agencies must extract relevant information and find patterns in order to optimize business processes and improve efficiency, all while reducing overhead and eliminating overlapping deliverables. Realizing this potential will require overcoming many challenges, including digitizing data from legacy systems, accounting for duplicate and unstructured data, and navigating the volume, variety and velocity of data in today’s digital world.
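To make the deduplication and pattern-extraction steps above concrete, here is a minimal sketch in Python. The records, field names and keyword are all hypothetical illustrations, not anything from a real agency system; the point is simply that exact duplicates can be dropped before analysis, and even crude keyword matching over free-text notes can surface a pattern.

```python
from collections import Counter

# Hypothetical sample records: note the exact duplicate and the
# free-text ("unstructured") notes field.
records = [
    {"id": "A-001", "agency": "DHS", "note": "visa overstay flagged"},
    {"id": "A-001", "agency": "DHS", "note": "visa overstay flagged"},  # duplicate
    {"id": "A-002", "agency": "DOE", "note": "grid demand spike reported"},
    {"id": "A-003", "agency": "DHS", "note": "routine entry, no flags"},
]

def deduplicate(rows):
    """Drop exact duplicate records while preserving first-seen order."""
    seen, unique = set(), []
    for row in rows:
        key = (row["id"], row["agency"], row["note"])
        if key not in seen:
            seen.add(key)
            unique.append(row)
    return unique

def keyword_counts(rows, keyword="flag"):
    """Crude pattern extraction: count records per agency whose
    free-text note mentions the keyword."""
    return Counter(
        row["agency"] for row in rows if keyword in row["note"].lower()
    )

clean = deduplicate(records)
print(len(clean))              # 3 records after removing the duplicate
print(keyword_counts(clean))   # flagged-record counts per agency
```

Real pipelines would of course use fuzzy matching and proper text analytics rather than exact-key deduplication and substring search, but the shape of the workflow – clean first, then look for patterns – is the same.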

For any monumental task like this, an agency needs the right personnel. The key is developing and promoting data-savvy executives and managers, who can then lead training programs for front-line and customer-facing employees so they know which data is relevant and how to drive value from it. For example, a potential customer’s purchasing habits and history of interaction with the company mean little to a call center associate unless that associate can access the information and glean actionable insights to stimulate the customer relationship. The combination of advanced analytics and proper education has made call volumes predictable and scalable, helping agencies staff call centers appropriately and cut down on inefficiencies.

The President’s Council of Advisors on Science and Technology has decreed that all federal agencies need a big data strategy. Implementing a data analytics solution that meets organizational needs can seem daunting, but by applying the right technical staff and methodologies, agencies can achieve their desired results.

Surekha Archer ASA

Benefits Actuarial Analyst at Microsoft

10y

I would love to have access to all that data to find meaningful nuggets of truth that could help our government serve its people more efficiently and effectively.

Bruno Codispoti

Electrical Engineer at NDI Engineering Co

10y

Not convinced that Big Data is the way to go. Relational databases have been around for a while, so they have been tested and work well. If necessary, how about distributed databases or data warehousing?

Charles Fiori, CFA

Based in SW FL, SME for data/elec trdg/sales & mktg. Seeking project work in writing or customer pipeline development.

10y

Very good to see that federal agencies have been directed to have big data strategies. Co-ordination of those strategies would be advisable and helpful, of course, but before our new national chief data officer tries to co-ordinate strategies, let's get the individual ones right and appropriate for the individual agency sponsors. I doubt they will be that different across the entire federal spectrum.
