Many people expected Apple to push augmented reality in its new devices. However, ARKit was relegated to a footnote at Tuesday's event, where Apple focused more on features like Animoji and Face ID.
The most notable ARKit announcement was that the company will bring face-tracking support to the augmented reality platform on the iPhone X. This will allow developers to access color and depth images from the front cameras while tracking face position and expression in real time.
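As a rough illustration, a developer opting into this face-tracking support would run an ARKit session with a face-tracking configuration and read pose and expression data off the resulting face anchors. The class and delegate names below are Apple's actual ARKit API; the view-controller wiring is a minimal sketch, not a complete app, and assumes a device with the TrueDepth camera.

```swift
import UIKit
import ARKit

// Minimal sketch of ARKit face tracking (requires TrueDepth hardware).
class FaceTrackingViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Face tracking is only supported on devices with a TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // Called as anchors update; a face anchor carries pose and expression data.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // Position comes from the anchor's transform; expression from blend shapes.
            let position = faceAnchor.transform.columns.3
            let smile = faceAnchor.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
            print("face position: \(position), smile: \(smile)")
        }
    }
}
```

The depth data itself is also exposed per frame (via the session's current frame), which is what lets filters fit the face far more precisely than RGB-only tracking.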
While Apple noted that the dual cameras on the back of the iPhone X and the iPhone 8 and 8 Plus were calibrated for AR, ARKit itself works with any camera that supports it. Apple demoed some Snapchat filters that tracked well, though the improvement over Snapchat's existing RGB face tracking may be barely noticeable. For companies that haven't invested in RGB face-tracking tech the way Snapchat has, the difference will be much more apparent.
The reason Apple did not bring this camera sensor to the rear of the phone, which could have enabled further AR capabilities, is most likely battery life, though it may also reflect Apple's effort to keep ARKit experiences uniform across devices. A rear-facing TrueDepth camera system would most likely have brought features like environment meshing to the platform, letting the device detect not only flat surfaces but also complex shapes. That, in turn, would help developers build ARKit experiences in which multiple users engage in the same multiplayer session.
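For comparison, what the rear camera does offer today is plane detection via world tracking: ARKit finds flat surfaces rather than full environment meshes. The configuration below uses Apple's actual ARKit API; the bare session setup is a sketch to show the shape of the capability, not a complete app.

```swift
import ARKit

// Rear-camera AR today: world tracking with flat-surface detection.
// A depth-sensing rear camera could extend this to meshes of complex shapes.
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = .horizontal  // detect flat, horizontal surfaces

let session = ARSession()
session.run(configuration)
// Detected surfaces arrive as ARPlaneAnchor objects through the session delegate.
```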