Apple has some exciting things in store for augmented reality (AR) developers, with a new report claiming that an updated ARKit framework will be able to detect human poses. According to the report, WWDC 2019 will also bring a new Apple app for creating AR content visually, along with a dedicated Swift-only software framework for augmented reality.
Here’s the relevant excerpt from today’s 9to5Mac reporting by Guilherme Rambo:
AR on Apple’s platforms will gain significant improvements this year, including a brand new Swift-only framework for AR and a companion app that lets developers create AR experiences visually. ARKit gets the ability to detect human poses.
Rambo also added that iOS 13 and macOS 10.15 will bring support for game controllers with touchpads, such as Sony's DualShock, as well as for stereo AR headsets that deliver sharper images with better depth perception. Details are scant, but it seems Apple will give developers writing AR and VR apps more options in terms of supported headsets.
Currently, people creating AR and VR content on macOS must pair HTC's Vive headset with SteamVR and a development environment such as Unity. With that in mind, support for stereo AR headsets should tie in nicely with the rumored new app for visual AR content creation.
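For a sense of what human pose detection could look like in practice, here is a purely speculative Swift sketch that follows ARKit's existing session, configuration and anchor pattern. The ARBodyTrackingConfiguration and ARBodyAnchor names below are our own placeholders; nothing in the report confirms how, or whether, Apple will expose such an API.

```swift
import UIKit
import ARKit

// Speculative sketch only: ARBodyTrackingConfiguration and ARBodyAnchor are
// assumed names modeled on ARKit's existing configuration/anchor pattern,
// not API confirmed by the report.
class BodyTrackingViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        session.delegate = self

        // A hypothetical configuration dedicated to human pose detection,
        // analogous to ARWorldTrackingConfiguration or ARFaceTrackingConfiguration.
        let configuration = ARBodyTrackingConfiguration()
        session.run(configuration)
    }

    // Detected bodies would presumably arrive through the existing
    // ARSessionDelegate anchor callbacks.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let bodyAnchor as ARBodyAnchor in anchors {
            // A detected body might expose a skeleton of named joints;
            // here we read the head joint's transform relative to the body root.
            if let headTransform = bodyAnchor.skeleton.modelTransform(for: .head) {
                print("Head position (body space):", headTransform.columns.3)
            }
        }
    }
}
```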
We expect to learn more details during the WWDC 2019 keynote on June 3.
What do you make of this report?
Let us know in the comments down below.