Intel's CES 2018 keynote focused on its 49-qubit quantum computing chip, VR content applications, its self-learning AI chip, and an autonomous vehicle platform.
Intel has announced a 49-qubit quantum chip at CES 2018, with CEO Brian Krzanich calling it a major breakthrough in quantum computing and the next step to "quantum supremacy".
During the Intel keynote, Krzanich said Intel's labs and researchers are "committed" to advancing quantum computing, with a Netherlands-based lab specifically testing and building quantum computing systems.
Intel did not disclose any timeline details for the quantum chip.
Other advanced computing systems being tested by Intel include neuromorphic computing in the form of its artificial intelligence (AI) test chip Loihi, which was announced in September.
According to Krzanich, Intel now has a fully functioning neuromorphic chip that, after only a few weeks in the labs, is already performing simple object recognition. In the coming years, Krzanich said, Intel will put Loihi in the hands of partners to explore use cases.
Krzanich had kicked off his CES 2018 keynote by addressing Meltdown and Spectre, saying it is "truly remarkable" how so many tech companies have come together to research and resolve these issues.
"As of now, we have not received any information that customer data has been breached," he added.
"We expect some workloads may have a larger impact than others, so we'll continue working with industry to minimise the impact on those workloads over time."
Krzanich also discussed Intel's role as technology partner for the 2018 Pyeongchang Olympic Winter Games, saying it would provide the largest-ever VR experience across a total of 30 events, both live and on-demand, using its Intel True VR solution.
The solution involves the placement of multiple 360-degree cameras along the perimeter and interior of playing fields and ski runs. When stitched together with software, the footage allows fans to look around the field and choose what camera position they want to view events from.
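As a rough, purely illustrative sketch of the viewpoint-selection idea (not Intel's actual True VR software), the Python snippet below picks whichever camera feed sits nearest to a requested viewing position; the camera names, coordinates, and the closest_camera helper are all invented for this example.

import math

# Purely hypothetical sketch: given cameras placed around a venue, pick the
# feed closest to where a viewer wants to watch from. This is not Intel's
# True VR pipeline; names and coordinates are invented for illustration.
def closest_camera(cameras, desired_position):
    """Return the id of the camera nearest to the requested viewpoint."""
    return min(cameras, key=lambda cam: math.dist(cam["position"], desired_position))["id"]

# Example: three cameras spaced around the perimeter of a rink (metres).
cameras = [
    {"id": "cam-north", "position": (0.0, 30.0)},
    {"id": "cam-east", "position": (60.0, 15.0)},
    {"id": "cam-south", "position": (0.0, 0.0)},
]
print(closest_camera(cameras, desired_position=(5.0, 2.0)))  # -> cam-south

In practice, footage from neighbouring cameras would be stitched together rather than simply switched between, but the selection step conveys the "choose your camera position" idea described above.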
This "immersive media" viewing experience is also being expanded by installing cameras in players' helmets in the NFL to provide viewers with their perspectives, Intel announced.
Intel is additionally extending this volumetric technology to content creation such as movies, where viewers can "be the actor". By using hundreds of cameras, a scene can be viewed from any viewpoint or angle after just one take.
Krzanich said this will allow audiences to choose which character's perspective they want to view the movie from, and that the technology can be extended to use cases such as TV, advertising, and gaming.
As part of this, Intel announced an "exploratory partnership" with Paramount Pictures, with the latter company's chair Jim Gianopulos saying that such technology is "the key to our future" in the creation of a new form of entertainment.
As audiences move from flat screens to immersive experiences involving VR, Gianopulos said Paramount will be able to create content that's closer to reality than has ever been possible before by placing audiences inside the movie itself.
Krzanich also briefly addressed Mobileye's new autonomous driving platform, which he said brings autonomous vehicles "closer to reality than anyone realises"; the Volocopter drone taxi service; and the use of its Shooting Star mini drones to create light shows without the use of GPS.
Intel also extended its promise to use only conflict-free minerals in its microprocessors, saying every Intel product will now be labelled conflict-free. Its promise to spend $300 million over five years to improve diversity in the workplace will also reach fruition by the end of 2018 as it reaches "full representation", two years earlier than its original commitment.
Intel also used its CES 2018 keynote to showcase how its Location Technologies SDK 1.0, Shooting Star quadcopter drones, RealSense Vision Processor D4 series cameras, 8th-gen Core processors, Movidius Myriad X VPU running an AI engine, and SoundVision software can be combined to create a theatrical performance.
During the performance, musicians "played data" via gesture control while wearing smart gloves; drones and AI musicians played music learned in real-time; and location technology was paired with sensors and cameras to present data collected from dancing and acrobatics.
Other technologies used for the show were the Unity3D game development platform for AI playback; Intel data-enabled StretchSense gloves and drumsticks; Intel processor-based servers for music generation and data visualisation; the Yamaha DC5Z Disklavier; the Derivative TouchDesigner visual development platform; Cycling '74 Max/MSP for data routing; Autodesk Maya for 3D character creation; Ableton for audio sample workflow and playback; and the Pixologic ZBrush digital sculpting tool for avatar creation.
Intel also used CES 2018 to announce its 8th-generation Core processors, which combine AMD's Radeon RX Vega M graphics with 4GB of second-generation high-bandwidth memory (HBM2), as well as its new mini-PC NUCs, which pack its 8th-gen Core i7 processors and are aimed at VR applications.