CIO Steve Phillpott lays out his company’s approach to big data analytics and the need to weave it into existing corporate culture.
At a recent conference entitled “Designing and Deploying Data and Analytics-Enabled Business Capabilities” held at Stanford University, HGST CIO Steve Phillpott outlined his recent experiences in building out a big data environment. And it turns out that issues like communication and coordination are just as important as the technical challenges when it comes to a successful implementation.
Like most big data projects, HGST’s goal is to compile data feeds from a wide variety of sources, convert them into valuable information and then get that information into the hands of key decision makers. For a multinational company like HGST, this requires harnessing multiple data silos around the world while addressing a wide range of operational challenges, such as failure testing, enhanced data predictability and process automation.
Phillpott began his presentation by noting that big data is more than simple analytics and warehousing; it encompasses an entirely new way in which disparate data sets can be coordinated and manipulated to further business processes. It’s about marrying transaction data, customer data, public sources of information and a host of other sets to create a deep understanding of markets, business processes, risks and customers.
To get there, HGST decided to reverse-engineer its platform by identifying the key capabilities it needed to support, and then working backward from there. First, the company wanted a system that could be up and running quickly, but it did not have many in-house data scientists. So IT brought in big data specialist Think Big, and within 90 days HGST was pulling upwards of 20 million files per day from a half-dozen manufacturing plants. There was also a need to support multiple data access models: some users preferred a web interface, while others preferred to work with the data in their analytics tool of choice.
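The core of such an ingestion step is flattening many per-plant file feeds into one stream tagged with its site of origin, so any downstream access model (web portal or analytics tool) can filter by plant. The sketch below illustrates that idea only; the plant names, feed format and function names are hypothetical, not HGST's actual pipeline.

```python
# Minimal sketch of a multi-plant ingestion step: each incoming file is
# tagged with its plant of origin so downstream tools can filter by site.
# Plant names and the feed format are illustrative assumptions.
from dataclasses import dataclass
from typing import Iterable

@dataclass
class FileRecord:
    plant: str       # originating manufacturing site
    path: str        # file identifier within that plant's feed
    size_bytes: int

def ingest(feeds: dict[str, Iterable[tuple[str, int]]]) -> list[FileRecord]:
    """Flatten per-plant (path, size) feeds into one tagged record stream."""
    records = []
    for plant, files in feeds.items():
        for path, size in files:
            records.append(FileRecord(plant=plant, path=path, size_bytes=size))
    return records

# Example: two plants reporting a few test-result files each.
feeds = {
    "plant-a": [("test/run001.log", 1024), ("test/run002.log", 2048)],
    "plant-b": [("test/run101.log", 512)],
}
records = ingest(feeds)
print(len(records))  # 3 records, each tagged with its source plant
```

Tagging at ingest time, rather than at query time, is what lets multiple access models share one data store without re-deriving provenance.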
But beyond technology and architecture, there were some fundamental lessons about big data management to be learned. The first is that data management and governance are critical: even the most powerful solution will fail in the absence of good data stewardship. This can be accomplished by establishing a Center of Excellence that supports cross-functional data management spanning business, IT and other areas.
To foster the communication necessary for this broad collaborative capability, HGST set up an internal Jive site dedicated to big data and analytics. The system enables real-time interaction among stakeholders, with access for anyone, from anywhere, at any time. In addition, HGST established a robust self-service portal that pushes data to the end user. After all, information is useless unless it gets into the hands of business leaders.
But probably the most important aspect of the project was cultural. There will always be corporate naysayers who seek to halt projects before they become disruptive to vested interests. The key is not to fight these forces but to invite them into the change process, giving them both a voice and a stake in the changes to come. Without a doubt, this can be difficult, but it is necessary to maintain a cohesive, coordinated strategy for next-generation data functionality.
The overarching goal of HGST’s big data architecture is flexibility. With multiple analytics tools in use at every level, and new tools added to the mix on a regular basis, the ability to meet new challenges and adapt to changing conditions is crucial.
Above all, the data-driven enterprise requires innovative solutions to problem-solving, and that requires coordinated efforts that span data, platforms, infrastructure, business processes and, perhaps most importantly, people.