Intel investing in Big Data to make Chip Testing faster


Big Data is clearly going to play a huge role in delivering working products to customers, where data from research and customer feedback must be structured quickly and correctly. Intel is one company that relies on processing this data to stay ahead of its competition. It also aims to reduce costs along the way, and Big Data will be an enabler: a system built around Apache Hadoop and a Big Data analytics platform is expected to cut Intel's processor test costs by $30 million.

Chip testing time is also expected to drop by 25%, further reducing production costs. The quality checks performed on a single chip are rigorous, and a series of tests must be carried out. The new platform has been used to combine historical manufacturing information with new data sources that were previously too unmanageable to use, enabling a five-person team to cut testing costs by $3 million for a single line of Intel Core processors.
Extracting insight from unstructured data can be quite a task, often requiring the assistance of a data scientist, and Intel is working with Apache to develop a hybrid cloud infrastructure around Hadoop. With this, Apache will learn how to make the most of Intel hardware features and gather feedback on Hadoop support requirements.

Source: Techworld