Following this, companies must search out the data relevant to solving their business problems.
This is exactly what lies behind Intel Corp's internal big data initiatives. Some of this work has focused on helping the Intel sales team find the right resellers and the products best suited to them. In 2012, the project generated an estimated $20 million in new revenue and opportunity value, with more expected in 2013.
From a technology standpoint, companies must be fluid, flexible and ready to move to a different solution if the need arises. For example, the database architecture built to collect smart grid energy data in Austin, Texas, with Pecan Street Inc, a nonprofit group of universities, technology companies and utility providers, is now in its third iteration.
As smart meters generate more and more detailed data, Pecan Street is finding new ways for people to use less energy as well as helping utilities better manage their grids. But Pecan Street also had to be flexible and keep changing its infrastructure to meet demand.
The bottom line is that companies should be ready to adapt to changing situations and needs. If you think you know what tools you need to build big data solutions today, a year from now it will be a different story altogether.
Having said that, companies should also take care to connect the dots, that is, to match the data to their core operations. At Intel, we realized there could be tremendous benefit in correlating design data with manufacturing data.
A big part of our development cycle is "test, reengineer, test, reengineer", and there is value in speeding that cycle up. The analytics team began looking at the manufacturing data, from the specific units coming out of manufacturing, and tying it back to the design process. In doing so, it became evident that standard testing processes could be streamlined without compromising quality.
We used predictive analytics to shorten the chip-design validation and debugging process by 25 percent and to compress processor test times. Making processor testing more efficient saved us $3 million in 2012 on the testing of one line of Intel Core processors alone. Extending this solution into 2014 is expected to reduce spending by $30 million.
We are only at the beginning of understanding how we can use big data for big gains.
The author is general manager of big data solutions for Intel's Data Center and Connected Systems Group. The views do not necessarily reflect those of China Daily.