Technology companies are seeking to move the capabilities of artificial intelligence onto smartphones and other portable and wearable devices. It would be convenient, for example, to carry in your pocket the ability to show a mechanic how to fix an engine, or to tell tourists in their native language what they are seeing and hearing. But there is a problem: the huge volumes of data that make these tasks possible cannot be processed without slowing the device down and draining its battery in a matter of minutes.
For many years, central processors from Intel, ARM, and others have powered devices and servers all over the world. But the rapid development of artificial intelligence over the past five years has confronted some traditional chip makers with real competition. The growing capabilities of AI rest largely on neural networks, which analyze patterns in data and learn from them. General-purpose processors used in PCs and servers handle this kind of work poorly, because it requires running enormous numbers of simple computations simultaneously.
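To make that contrast concrete, here is a minimal, hypothetical Python sketch (not Microsoft's code, and not how any real chip is programmed) of a single neural-network layer computed as a sequence of multiply-accumulate steps. An AI accelerator such as the HPU or an FPGA gains its speed by executing thousands of these identical operations at the same time, rather than one after another as in this loop.

```python
# Illustration only: a toy dense neural-network layer, evaluated
# one multiply-accumulate at a time. Every step in the inner loop
# is independent of the others, which is exactly why specialized
# AI chips can run them all in parallel instead of sequentially.
import random

def layer_sequential(inputs, weights):
    """Compute one dense layer's outputs neuron by neuron."""
    outputs = []
    for neuron_weights in weights:        # one output neuron at a time
        total = 0.0
        for x, w in zip(inputs, neuron_weights):
            total += x * w                # one multiply-accumulate per step
        outputs.append(total)
    return outputs

random.seed(0)
inputs = [random.random() for _ in range(4)]              # 4 input values
weights = [[random.random() for _ in range(4)]            # 3 output neurons
           for _ in range(3)]
print(layer_sequential(inputs, weights))
```

Even this tiny layer performs 12 multiply-accumulates; a real vision or speech network performs billions per second, which is the workload purpose-built AI silicon is designed for.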
On July 23, at the CVPR 2017 conference in Honolulu, Hawaii, Microsoft announced the second version of the Holographic Processing Unit (HPU), the chip in its HoloLens glasses. HPU 2.0 is an additional AI coprocessor that analyzes everything the user sees and hears directly on the device, rather than wasting precious microseconds sending data back to the cloud. HPU 2.0 is still in development and will be included in the next version of HoloLens. This is one of the few cases in which Microsoft is handling every stage of a processor's development except manufacturing. Company representatives say it is the first chip of its kind designed specifically for a mobile device.
Microsoft has been working on its own chips for several years. The company created a motion-tracking processor for the Xbox Kinect, and has recently been using customizable chips, known as field-programmable gate arrays (FPGAs), to apply AI to real-world tasks. Microsoft buys these chips from Altera, a subsidiary of Intel, and then adapts them for its own purposes with software.
In 2016, Microsoft used thousands of these chips to translate the entire English-language Wikipedia into Spanish, three billion words in five million articles, in less than a tenth of a second. In 2018, the company plans to let its cloud-computing customers use these chips to accelerate their own AI tasks, such as recognizing images in large data sets or applying machine-learning algorithms to economic and other predictive models.
Promotional animation of the HPU 2.0
Microsoft has plenty of competitors in this business: Amazon uses field-programmable gate arrays and plans to adopt Nvidia's newly developed chips based on the Volta AI microarchitecture, while Google has created its own AI semiconductor, the Tensor Processing Unit. Designing chips in-house is expensive, but Microsoft says it has no choice, because the technology is advancing so fast.