What’s New in iOS 11 Development Tools & API

Published: 6 June 2017 Content: PixelForce

WWDC 17, Apple’s annual conference for developers around the world, showcased a number of hardware updates and additions, from the new iMac Pro to the HomePod, but the real focus was on software. Many of the upcoming software updates officially integrate existing concepts into the iOS platform, most notably artificial intelligence and machine learning. With iOS 11, developers gain official, Swift-integrated APIs in ARKit and Core ML, putting core AR and machine learning technology within reach of everyday app developers.

ARKit brings a documented AR foundation to iOS

Apple is introducing a new set of tools to help developers bring high-quality native AR experiences to iOS using the device’s built-in camera and motion sensors. ARKit allows developers to tap into the latest computer vision technologies to build incredibly detailed and captivating virtual content on top of a real-world environment. The technology makes AR objects look like they’re actually placed in real space, as opposed to simply ‘hovering’ over it.

By blending digital objects and information with the environment captured by the camera, augmented reality takes apps beyond the screen, freeing them to interact with the real world in entirely new ways. Many social media platforms have already incorporated AR capabilities, but Apple’s hardware and software hadn’t been specifically built to enable it until now. ARKit will bring AR to the general population of apps.
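To give a sense of how little code a basic AR scene requires, here is a minimal sketch of a hypothetical view controller using ARKit with SceneKit rendering. The cube and its placement are illustrative assumptions, not from Apple’s sample code:

```swift
import UIKit
import SceneKit
import ARKit

class ARDemoViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)

        // Place a small virtual cube half a metre in front of the camera.
        // ARKit keeps it anchored in real space as the device moves.
        let cube = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1,
                                            length: 0.1, chamferRadius: 0))
        cube.position = SCNVector3(0, 0, -0.5)
        sceneView.scene.rootNode.addChildNode(cube)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking fuses camera frames with motion-sensor data
        // to track the device's position and detect flat surfaces.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```

Note that world tracking requires an A9 chip or later, so apps should check `ARWorldTrackingConfiguration.isSupported` before offering AR features.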

Core ML will intelligently process data on the fly

Core ML makes it easy for developers to create smarter apps with powerful machine learning capabilities that predict, learn and become more intelligent. Designed for iOS, this new machine learning framework runs all processing locally on-device, using Apple’s custom silicon and tight integration of hardware and software to deliver powerful performance while maintaining user privacy.

The key benefit of Core ML is speeding up how quickly AI tasks execute on the iPhone, iPad and Apple Watch. This covers everything from text analysis and face detection to motion tracking, and it also underpins frameworks such as Vision and the new ARKit.
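As a sketch of the workflow: when a trained `.mlmodel` file is dropped into an Xcode project, Xcode generates a Swift class for it, which can then be wrapped in a Vision request for image classification. `FlowerClassifier` below is a hypothetical model name used for illustration:

```swift
import UIKit
import CoreML
import Vision

// "FlowerClassifier" stands in for any .mlmodel added to the project;
// Xcode generates a Swift class with this shape automatically.
func classify(_ image: CGImage, completion: @escaping (String?) -> Void) {
    // Wrap the Core ML model so Vision can feed it camera or photo input.
    guard let visionModel = try? VNCoreMLModel(for: FlowerClassifier().model) else {
        completion(nil)
        return
    }
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // Inference runs entirely on-device; no data leaves the phone.
        let top = (request.results as? [VNClassificationObservation])?.first
        completion(top?.identifier)
    }
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```

Because the model ships inside the app bundle and predictions run locally, this approach works offline and keeps user photos private, at the cost of a larger app download.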

iOS 11 has a lot of new technology implemented behind the scenes. We don’t expect every app to suddenly incorporate these new technologies, but we anticipate a steady adoption of AR in general consumer apps, beyond the major social media platforms. To find out how AR can fit into your business model, give us a call or email and we’ll discuss the best strategy to make the most of iOS 11.