Apple is releasing more pieces of its future AR headset puzzle

This story is part of WWDC 2022, CNET’s full coverage from and around Apple’s annual developer conference.

What’s happening

Apple didn’t announce a virtual or augmented reality headset at its recent WWDC developer conference, but it did introduce new software tools that push AR and social connectivity further.

Why it matters

Apple’s expected headset is likely to work with the Mac, iPad and iPhone, and to build on ideas already developed in the company’s software. Meanwhile, many other companies, including Google and Qualcomm, are advancing their own AR ambitions.

What comes next

A VR headset may not arrive until 2023, although one could be announced earlier.

Everyone following Apple’s latest developer conference for news about the long-awaited AR/VR headset was surely disappointed. The event had news about Macs, iPads, iPhones, the Apple Watch, the smart home and even cars, but AR was barely mentioned.

Following reports of delays, Apple may not release this headset until 2023. That could mean another year of waiting … or it could mean an event later this year that previews what’s to come. Right now we’re left with a lot of unknowns, but that doesn’t mean there were no clues: several new software tools look very useful for a headset on the way.

Apple already has a well-developed set of AR tools for iPhones and iPads, plus depth-sensing lidar scanners that can do the kind of real-world occlusion VR and AR headsets need to make virtual objects convincing. There’s a set of tools that recognizes text and objects in camera feeds, much like Google Lens. The Apple Watch even has some accessibility-focused gesture recognition.

What WWDC 2022 showed was a handful of improvements that, the more I think about them, seem aimed at laying down additional groundwork before a headset arrives.


SharePlay coming to Messages makes it a connected social framework that could be useful in a VR headset.


FaceTime and messaging: tools for Apple’s metaverse?

As companies have pivoted to metaverse messaging over the past year, that’s generally been code for rethinking massive social interaction across platforms. “Social” is a weird thing for Apple, which isn’t a social networking company like Meta, Snap or Niantic.

Apple has FaceTime and Messages, which form a proprietary network of Apple devices that could be the glue for connecting with people on headsets. Apple’s SharePlay framework, introduced with iOS 15 in 2021, tries to make collaborating and watching or playing content with others feel more immediate. SharePlay’s reach has been expanded in iOS 16. Much of it looks like the kind of glue Apple’s metaverse ambitions need.

Apple already has Memoji avatars, but it has also increasingly added sharing tools that link apps and collaborative content via Messages and FaceTime. These features, added in iOS 16, iPadOS 16 and MacOS Ventura, can make it easier to share things, but on a headset they could be important shortcuts for quickly connecting with others. Meta’s VR headsets rely on friends and party-based connections via Messenger for social features; Apple may be heading in the same direction. SharePlay coming to Game Center, Apple’s overlooked social gaming hub, seems like an equally useful tool for future cross-play experiences.
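For developers, the connective tissue behind SharePlay is Apple’s GroupActivities framework. A minimal sketch of how an app opts in looks something like the following; the activity name and metadata here are illustrative, not taken from Apple’s samples.

```swift
import GroupActivities

// Hypothetical shared-watching activity. Conforming to GroupActivity
// is what makes an experience joinable over FaceTime (and, with iOS 16,
// shareable through Messages).
struct WatchTogetherActivity: GroupActivity {
    var metadata: GroupActivityMetadata {
        var meta = GroupActivityMetadata()
        meta.title = "Watch Together"
        meta.type = .watchTogether
        return meta
    }
}

func startSharedSession() async {
    let activity = WatchTogetherActivity()
    switch await activity.prepareForActivation() {
    case .activationPreferred:
        // The system surfaces the activity to everyone in the group call.
        _ = try? await activity.activate()
    case .activationDisabled, .cancelled:
        break // no group session available, or the user backed out
    @unknown default:
        break
    }
}
```

The same opt-in model would presumably carry over to a headset: an app declares a shared activity, and the system handles who is present and how they join.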

A room is scanned by telephone, with furniture and walls displayed in luminous lines.

RoomPlan scans a large room and its furniture and creates a 3D model.


Can RoomPlan be a springboard to mixed reality in the room?

Apple announced a sneaky new tool in its upcoming ARKit 6: a room-scanning technology called RoomPlan. At first glance, the feature looks like Apple’s own version of the lidar-based room scanning other companies have developed. The tool recognizes furniture, walls, windows and other room elements, and quickly creates a 3D scan of a space for things like home improvement or construction.

Or, perhaps, it may enable new forms of mixed reality. “Don’t forget to use the user’s surroundings as a canvas for your experiences,” says the WWDC developer video describing RoomPlan, adding that you can “even incorporate the user’s space into the game you build.” While a lidar depth map could already overlay AR objects, what this new technology could enable is bringing a room into VR and putting virtual things in that layout. I saw this idea a while ago in VR headsets that used cameras to scan environments and bring them into VR, creating a kind of mixed-reality feeling, and it may well be the kind of mixed reality that Apple’s expected camera-packed VR headset starts to enable.
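In code, RoomPlan is a small API surface: you run a capture session and receive a progressively refined parametric model of the room. A rough sketch (iOS 16 and later, lidar-equipped devices; the class name `RoomScanner` is my own):

```swift
import RoomPlan

// Minimal RoomPlan capture sketch.
final class RoomScanner: NSObject, RoomCaptureSessionDelegate {
    let session = RoomCaptureSession()

    func start() {
        session.delegate = self
        session.run(configuration: RoomCaptureSession.Configuration())
    }

    // Called repeatedly as the model of the room is refined during scanning.
    func captureSession(_ session: RoomCaptureSession, didUpdate room: CapturedRoom) {
        // Walls, doors, windows and recognized furniture arrive as
        // parametric 3D surfaces and objects — exactly the kind of layout
        // a mixed-reality scene could be built around.
        print("walls: \(room.walls.count), objects: \(room.objects.count)")
    }
}
```

The `CapturedRoom` result can also be exported as a 3D model, which is what makes it plausible groundwork for placing virtual content inside a real room’s layout.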

A virtual pirate ship is sitting on a real pond, seen on a phone screen.

The background video quality for AR will be better in ARKit 6.


4K video for AR sounds like a path to a headset

Another new ARKit 6 feature seems notable: AR effects will now work with 4K video input. At first glance that’s a strange feature for phones, whose screens are too small to appreciate 4K AR, though it could be useful for capturing video with AR effects superimposed. But boosting video quality for AR would be extremely useful in VR headsets that use built-in cameras to blend VR with video streamed from the outside world, a technique called passthrough mixed reality.
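Opting into the higher-resolution feed is a one-line change in ARKit 6. A minimal sketch, assuming a standard world-tracking session on a supported device:

```swift
import ARKit

// Request 4K camera capture in ARKit 6 (iOS 16, recent iPhone and iPad models).
func makeConfiguration() -> ARWorldTrackingConfiguration {
    let config = ARWorldTrackingConfiguration()
    // nil on devices that can't supply a 4K format; the session then
    // simply keeps its default video format.
    if let fourK = ARWorldTrackingConfiguration.recommendedVideoFormatFor4KResolution {
        config.videoFormat = fourK
    }
    return config
}

// Later: arView.session.run(makeConfiguration())
```

On a phone this mostly benefits recorded AR video; on a passthrough headset, that same high-resolution camera feed would be the world you see.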

Mixed-reality VR headsets, such as the hardware Apple is eventually expected to release, rely on a high-quality video stream to make superimposed virtual objects look realistic. On the Oculus Quest 2, that video stream is grainy and in black and white. On the professional-class Varjo XR-3, where I’ve had a chance to try more advanced passthrough mixed reality, it’s in color and much higher resolution.

The new ARKit features also appear to be faster at recognizing room details for fast AR overlays and motion-capture tracking. Both would also be useful and necessary in a headset with AR features.

A 3D map of an airport.

Apple is adding more cities to its upgraded, location-based AR-ready set of destinations in Maps.


Apple is still expanding the cities where site-specific AR could work

A number of companies have recently expanded their mapping initiatives to work with AR so that future glasses can recognize “persistent” AR objects in real places. Google is expanding AR to work on many of its Street View-enabled maps, and Niantic is building a crowdsourced map of playable areas for AR games. Snap has been scanning cities with lidar. Apple has scanned cities with lidar as well, but only a handful. More will be added this year, but that means there are only certain places where site-specific AR will work reliably with Apple’s AR toolkits. Apple isn’t expected to have everyday wearable AR glasses for a while, and that makes sense: beyond concerns about battery life, cost and safety, AR glasses will need a worldwide mapping framework that’s only half-built right now.

Still a lot we do not know

Despite many reports that a headset is imminent (or arriving in 2023), we still know nothing definite about what Apple has planned. These little speculations drawn from Apple’s new software tools are hardly proof of anything … but they show that many of the necessary parts are coming together in plain sight.
