ARKit

It was just last year that the tech giant Apple unveiled ARKit at WWDC to bring AR to mainstream consumers, letting developers easily build impressive augmented reality apps directly for iPhones and iPads that deliver a breathtaking experience. Inevitably, this step helped fuel experimental AR projects and lock Apple fans into the ecosystem with engaging user experiences.

The results have proven that Apple’s bet on the game-changing technology has really paid off. To compete with Google’s ARCore and court developers, Apple updated its much-touted ARKit alongside the iOS 12 release at WWDC 2018.

The original ARKit release extended AR capabilities to millions of iOS users, but it came packed with some limitations. The new ARKit 2.0, with its additions and key improvements, vastly changes how iOS handles AR technology, and it has already created ripples in the AR app development market.

Let’s take a look at what’s new in the upgraded ARKit and how it will breathe new life into AR apps:

Improved face tracking

Apple has put a lot of time into enhancing its face tracking capabilities. Two splendid new features have been added: tongue detection and eye tracking.

A new blend shape coefficient, tongueOut, has been added for tongue detection. It detects the movement of the tongue and how far it is stuck out, returning a value between 0.0 and 1.0 based on how extended the tongue is perceived to be.
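
As a minimal sketch, assuming a face tracking session driven by an ARSCNView, the coefficient can be read in a delegate callback like this:

```swift
import ARKit

class FaceTracker: NSObject, ARSCNViewDelegate {
    // Called whenever the face anchor updates (requires a TrueDepth camera).
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor else { return }

        // 0.0 = tongue fully in, 1.0 = tongue fully stuck out.
        if let tongueOut = faceAnchor.blendShapes[.tongueOut] as? Float {
            print("Tongue extension: \(tongueOut)")
        }
    }
}
```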

ARKit 2 can also track eye movement through the device’s front-facing TrueDepth camera using transform matrices. Each matrix gives the position and orientation of an eye, making it easy to identify where the user’s eyes are pointed and at which part of an object they are looking. Eye-gaze tracking promises impressive future applications where interfaces become eye-controlled and hands-free, or where expressions and emotions are turned into information.
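
ARKit 2 exposes this data on ARFaceAnchor as per-eye transforms plus an estimated gaze point. A minimal sketch of reading them in a session callback:

```swift
import ARKit

class GazeTracker: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // 4x4 transforms giving each eye's position and orientation
            // relative to the face anchor.
            let leftEye = faceAnchor.leftEyeTransform
            let rightEye = faceAnchor.rightEyeTransform

            // Estimated point, in face-anchor space, the eyes converge on.
            let gaze = faceAnchor.lookAtPoint
            print("L: \(leftEye.columns.3) R: \(rightEye.columns.3) gaze: \(gaze)")
        }
    }
}
```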

This brilliant face tracking is already in use for Memoji, which are more expressive and immersive and have garnered huge user traction.

3D object detection

Another big thing that comes with ARKit 2 is improved detection power: apps can now detect 3D objects as well. The app’s camera senses an object of almost any shape, measures it precisely, and augments virtual content over it.

To build compelling AR experiences, ARKit 2 lets an app scan and record the spatial features of real-world objects and then trigger AR content when those objects are recognized. Reflections of the real environment can also be applied to AR objects, which makes the experiences more realistic.
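
In practice this is a two-step workflow: objects are scanned ahead of time (with ARObjectScanningConfiguration) into ARReferenceObject files, and the shipping app loads them for detection. A minimal sketch, where the asset group name “ScannedObjects” is a placeholder:

```swift
import ARKit

class ObjectDetector: NSObject, ARSessionDelegate {
    // Start world tracking with reference objects scanned earlier.
    func startDetection(in session: ARSession) {
        let configuration = ARWorldTrackingConfiguration()
        configuration.detectionObjects = ARReferenceObject.referenceObjects(
            inGroupNamed: "ScannedObjects", bundle: nil) ?? []
        session.run(configuration)
    }

    // ARKit adds an ARObjectAnchor when a known object is recognized;
    // its transform is where virtual content can be attached.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let objectAnchor as ARObjectAnchor in anchors {
            print("Detected: \(objectAnchor.referenceObject.name ?? "unnamed object")")
        }
    }
}
```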

For precise measurement, iOS 12 ships with the ARKit-powered Measure app, which works like a virtual measuring tape: it can take the dimensions of 3D objects in addition to 2D ones, and it can measure the distance between objects as users drag a finger across the screen.
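
Measure itself is a standalone app, but the core tape-measure idea can be sketched with ARKit hit testing: resolve two screen points against the world and take the distance between them. This is a simplified illustration, not Apple’s actual implementation:

```swift
import ARKit

// Simplified tape-measure logic: hit-test two screen points against
// detected feature points and compute the real-world distance in metres.
func distance(in sceneView: ARSCNView, from pointA: CGPoint, to pointB: CGPoint) -> Float? {
    guard let hitA = sceneView.hitTest(pointA, types: .featurePoint).first,
          let hitB = sceneView.hitTest(pointB, types: .featurePoint).first else {
        return nil
    }
    // The translation column of each world transform is the 3D position.
    let a = simd_make_float3(hitA.worldTransform.columns.3)
    let b = simd_make_float3(hitB.worldTransform.columns.3)
    return simd_distance(a, b)
}
```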

3D object detection is going to be a big deal, with myriad use cases. For one, it makes playing a digital Lego bricks game with friends possible.

Persistent experiences

Taking immersion up another level, Apple has introduced persistence of virtual content even after the app is closed. If the user leaves partway through, after the app has started augmenting virtual overlays over the real-world environment, then the next time they open the app they can experience the augmented world from the point where they left off.

This has sprung up new opportunities for developers to build amazing applications. For instance, a user can start building a house out of Lego bricks and come back to it later in the same state. It’s also great for AR puzzle games that users solve over the course of weeks, resuming each session from where the previous one ended.
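
Under the hood, persistence is built on ARKit 2’s ARWorldMap, which captures the session’s mapped space and anchors. A minimal sketch of saving and restoring one (the file location is a placeholder):

```swift
import ARKit

// Placeholder location for the saved map.
let mapURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    .appendingPathComponent("session.worldmap")

// Capture the session's current ARWorldMap (space + anchors) and save it.
func saveWorldMap(from session: ARSession) {
    session.getCurrentWorldMap { worldMap, _ in
        guard let map = worldMap,
              let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                           requiringSecureCoding: true)
        else { return }
        try? data.write(to: mapURL)
    }
}

// Relaunch tracking from the saved map so virtual content reappears
// exactly where the user left it.
func restoreWorldMap(into session: ARSession) throws {
    let data = try Data(contentsOf: mapURL)
    let map = try NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self, from: data)
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = map
    session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```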

Shared experiences

To enable multiplayer features in AR apps, Apple has brought a shared AR world, where different users playing the same AR game on different devices view the same virtual environment, each from their own perspective. For instance, in the Lego demo, up to four players can play together in the same virtual environment, combined with the physical space, on separate devices, and each can see the characters and elements the other players add. Certainly, playing a game in a room alone for a long time gets tiring, but shared experiences make it fascinating.

With shared experiences, the digital enhancements to the world (that is, the AR content) let users interact with one another and even collaborate on projects. Going further, shared experiences are also persistent, which means the experience exists even after the application is closed and can be saved as well. This works brilliantly for education, training, or home renovation apps, as it keeps everyone on the same page.

For instance, during a home renovation, multiple contractors and the homeowner can collaborate on the project in AR to view potential adjustments before they are made. Plus, the homeowner can review the renovation work the contractor completed a couple of days earlier.
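
In code, sharing uses the same ARWorldMap as persistence: one device serializes its map and sends it to peers, typically over MultipeerConnectivity. A minimal sketch, assuming an already connected MCSession:

```swift
import ARKit
import MultipeerConnectivity

// Sender: serialize the current world map and push it to connected peers.
func shareWorldMap(from session: ARSession, over mcSession: MCSession) {
    session.getCurrentWorldMap { worldMap, _ in
        guard let map = worldMap,
              let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                           requiringSecureCoding: true)
        else { return }
        try? mcSession.send(data, toPeers: mcSession.connectedPeers, with: .reliable)
    }
}

// Receiver: run a session from the peer's map; both devices now resolve
// the same anchors in the same physical space.
func receiveWorldMap(_ data: Data, into session: ARSession) throws {
    guard let map = try NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                           from: data) else { return }
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = map
    session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```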

USDZ format

Until now, there has been no good way to share the interactive model of an object to augment. To make sharing AR content across devices more feasible, Apple has teamed up with Pixar to provide native support for a new file format, USDZ. The format packages all the Universal Scene Description (USD) elements into a single uncompressed zip archive with all the information required to display AR content, ready for sharing.

Each USDZ file has all the elements wrapped inside it, which applications reference in real time to make the object appear in augmented-reality scenes on screen. With ARKit 2 integration, the USDZ format will be available in Creative Cloud apps like Photoshop, Adobe’s photo editing app. Adobe has also created a project named Project Aero that allows developers to create immersive AR content (digital experiences that go beyond the screen), and that content will remain consistent across the various editing applications.
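
On iOS 12 a USDZ model can be previewed in AR without any custom rendering code, via Quick Look. A minimal sketch, where “chair.usdz” is a placeholder asset bundled with the app:

```swift
import UIKit
import QuickLook

// Preview a bundled USDZ model in AR via Quick Look.
class ModelPreviewController: UIViewController, QLPreviewControllerDataSource {

    func showModel() {
        let preview = QLPreviewController()
        preview.dataSource = self
        present(preview, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
        return 1
    }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        // "chair.usdz" is a placeholder asset name.
        let url = Bundle.main.url(forResource: "chair", withExtension: "usdz")!
        return url as NSURL // NSURL conforms to QLPreviewItem
    }
}
```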

Wrap up

Apple’s much-rumored AR glasses have not yet made their way to market, but with the upgraded ARKit, Apple has baked the strength of AR glasses into iOS devices. It’s evidence that Apple is slowly but surely embracing augmented reality development as a technology of the future.

This iteration of ARKit takes things a step further by unlocking golden opportunities in the form of added capabilities that developers can tap into for crafting applications that render realistic experiences.

Enhanced face tracking, 3D object detection, realistic rendering, persistent and shared experiences, and USDZ format support in ARKit 2 have paved the way toward where Apple’s headsets are expected to go. We won’t take a deep dive into those future AR headsets here; for now, it’s a sneak peek at AR on the iPhone.

So what’s the takeaway for developers? Tighten your seat belt and get started with AR-powered iOS app development to enjoy the apps’ great feats along with mass-market adoption.

Author’s Bio: I’m currently working as a Content Manager at YTPals. I have a great passion for digital marketing, and I help small and medium-sized businesses improve their online presence and grow their revenue by formulating effective digital marketing strategies, including free guest posting. Apart from digital marketing, I have a keen interest in entrepreneurship, online brand management, tech consultancy, and more.