Intel Corp has launched its latest AI processors, the Nervana NNP-I and Nervana NNP-T, which are purpose-built to accelerate artificial intelligence (AI) workloads. The NNP-I, Intel's first dedicated AI chip, is based on a 10-nanometre Ice Lake processor. This means systems using the Intel AI chip will be able to handle heavy workloads while consuming little energy, the company has said. Companies will no longer have to rely on general-purpose Xeon CPUs for AI and machine learning (ML) tasks, as was the case in the past.
Enterprises are continually working to get more value from their data, and AI lets companies use technology to drive more reliable and intelligent results. In this sense, the new Intel AI chip launch will help companies handle AI workloads in data centre environments. Computers also need to keep pace with the large number of AI applications being developed, so this move from Intel was long overdue.
It was a long wait for the Intel AI Chip
Developed at Intel's facility in Haifa, Israel, the new chip is aimed at serving large computing centres. The company also noted that this AI chip, its first dedicated AI product, comes after it invested more than $120 million in three different AI startups in Israel.
Alongside the chip, Intel has designed two processors for large AI computing centres. The Nervana NNP-T, codenamed Spring Crest, is built for training and comes with 24 Tensor processing clusters designed to power neural networks.
Intel's new system on a chip (SoC) gives users everything they need to train an AI system on dedicated hardware. The Nervana NNP-I, codenamed Spring Hill, is an inference SoC built on a 10-nanometre process technology that lets users run trained AI models.
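The split between the two chips mirrors the two phases of any machine learning workload: the expensive training loop (which the NNP-T targets) and the cheap repeated forward pass on frozen weights (which the NNP-I targets). A minimal sketch of that split, using a hypothetical toy logistic-regression model in NumPy rather than anything Nervana-specific:

```python
# Illustrates the training vs. inference split the two Nervana parts
# target. The model and data below are hypothetical toy examples.
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: 200 points, 2 features, label = sign of their sum.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# --- Training phase (the costly part an NNP-T-style chip accelerates) ---
w, b, lr = np.zeros(2), 0.0, 0.1
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid forward pass
    grad_w = X.T @ (p - y) / len(y)          # gradient of log-loss
    grad_b = np.mean(p - y)
    w -= lr * grad_w
    b -= lr * grad_b

# --- Inference phase (the cheap part an NNP-I-style chip accelerates) ---
def predict(x):
    """Single forward pass with the frozen, trained weights."""
    return bool(1.0 / (1.0 + np.exp(-(x @ w + b))) > 0.5)

print(predict(np.array([1.0, 1.0])))    # expect True
print(predict(np.array([-1.0, -1.0])))  # expect False
```

Training runs the forward pass, gradient computation, and weight update thousands of times; inference only ever repeats the forward pass, which is why it can be served by smaller, lower-power silicon.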
Reports also suggest that the Nervana chips will support Intel Xeon processors in large companies, offloading the complicated computations that advances in AI will bring.
What Intel Thinks
Intel’s AI products general manager Naveen Rao said, “In order to reach a future situation of ‘AI everywhere’, we have to deal with huge amounts of data generated and make sure organisations are equipped with what they need to make effective use of the data and process them where they are collected.”
With the new technology, Intel aims to "address the crush of data being generated", Rao said. He added that the AI chip will help enterprises use the available data more efficiently, process it where it is collected, and make sense of it in a smarter manner.
These new processors from Intel will compete with Google's Tensor Processing Unit, Nvidia's NVDLA-based accelerators, and Amazon's AWS Inferentia chips.