Apple has introduced a new sensor built on its “sensory platform” for virtual reality, designed to deliver “a more immersive, human-like experience”.
The company, which is also working on its own wearable, has partnered with “an incredible team” of “a diverse range of experts”, including LucidWorks, the “world’s leading VR and augmented reality startup”.
The sensor is the brainchild of the company’s head of technology, Craig Federighi, and the team’s chief technical officer, Paul Klee.
It was built from a “tremendous amount of research” and is part of the company’s “revolutionary” “sensor fusion” work, Federighi said.
“The idea for the sensor fusion came out of the passion of the community,” Federighi said.
“We thought that the sensor was a critical part of how we could build our future virtual reality solutions and we thought it could make a significant difference.”
Federighi told the BBC the sensor uses a “unique, advanced” “sensing platform” to deliver “a virtual reality experience in a natural and believable way”.
“When we have a virtual reality headset, the experience is very natural,” Federighi said, adding that “we’ve got sensors that do this all the time”.
“There’s an idea of the physical world, the physical experience of the world, but it’s a bit like playing the game.”
The sensor was originally designed for the Oculus Rift, but is now being used on other devices, including the Oculus Touch and Gear VR headsets.
“What we have in mind is not just for a VR headset, but also for VR headsets that are more natural,” he said.
The new sensor was first demonstrated at a developer conference earlier this year, but Federighi said it has been refined to include “a whole range of different types of sensors”.
“In a sense, it’s been designed from the ground up for VR,” Federighi said.
He said the sensor’s “instructions are simple to understand” and it “uses high quality optics to ensure that it can be used in VR”.
“You can see it in the virtual world, it can see you,” he added.
“It’s got a sensor, it detects light, it picks up vibrations in your body, and it’s able to interpret the environment around you in real time.”
Federighi explained the sensor “can see through walls and ceilings”.
“The real world can be really opaque to a virtual world,” he told the conference.
“We want to be able to see things that you can’t see through.”
“We’re really excited to be at the forefront of this technology and to work with these great companies.”
Apple is also building “samples” of the sensor and has made it available to developers to “test in their own virtual environments”.
“We believe the sensor will revolutionise the way we design and build products and services for the virtual and augmented worlds,” Federighi said.
The company has also launched a virtual version of its VR headset for the Touch ID sensor, which Federighi said “will be in production in the next couple of weeks”.
LucidWorks is also a partner in the project.
“For us to have access to the same sensors as the Oculus and HTC Vive, we were able to collaborate with these companies to provide these sensors to the world,” Federighi said on stage.
“So we are really excited about the sensor. I think that the sensors are really revolutionary.”
Federighi also told the audience that “there is a tremendous amount of information in VR” and that the “senses” used by the sensors could “open doors in other areas”.
“We have sensors that we can use in all of our virtual environments, and they can help us create these experiences that are truly human-inspired,” he continued.
“But we also have sensors, we can see through, we have the capability to track our movement, we are able to do the things that we’re always looking to do.”
“So, this is a very exciting time to be part of this amazing world.”