The latest demo comes courtesy of Osama Abdel-Karim, who uses ARKit to virtually paint on a notepad using his fingers.
According to Abdel-Karim, the virtual drawing feature was built with Vision, a new framework in iOS 11. Vision includes an object tracking feature that can detect a user's thumbnail and follow its movement to enable the drawing.
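Vision's object tracking works by seeding a tracker with an initial region of interest and then updating it frame by frame. A minimal sketch of that loop is below — this is not Abdel-Karim's actual code, and the `FingerTracker` class, its initial bounding box, and the per-frame call site are all illustrative assumptions; only the Vision API calls themselves are real.

```swift
import UIKit
import Vision

/// A minimal sketch of fingertip tracking with Vision's object tracker.
/// Assumes you already have an initial bounding box around the thumbnail
/// (e.g. from a tap) and a stream of camera frames as CVPixelBuffers.
final class FingerTracker {
    private let sequenceHandler = VNSequenceRequestHandler()
    private var lastObservation: VNDetectedObjectObservation

    init(initialBoundingBox: CGRect) {
        // Seed the tracker with the region containing the thumbnail.
        lastObservation = VNDetectedObjectObservation(boundingBox: initialBoundingBox)
    }

    /// Call once per camera frame; returns the tracked region
    /// in Vision's normalized coordinate space, or nil if tracking failed.
    func track(in pixelBuffer: CVPixelBuffer) -> CGRect? {
        let request = VNTrackObjectRequest(detectedObjectObservation: lastObservation)
        request.trackingLevel = .accurate

        try? sequenceHandler.perform([request], on: pixelBuffer)

        guard let observation = request.results?.first as? VNDetectedObjectObservation else {
            return nil
        }
        // Feed this observation back in as the seed for the next frame.
        lastObservation = observation
        return observation.boundingBox
    }
}
```

Each frame's observation becomes the seed for the next request, which is what lets the tracker follow the finger as it moves across the notepad.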
Abdel-Karim has outlined the steps he used to create his ARKit demo and provided the full source code for the project.
When it launches this fall, ARKit is positioned to become the largest AR platform in the world, using the camera, processors, and motion sensors in the iPhone and iPad to create some incredibly impressive augmented reality interactions.
As outlined in our video covering ARKit, the feature uses technology called Visual Inertial Odometry to track the world around an iPad or iPhone, allowing a device to sense how it moves in a room. ARKit automatically analyzes a room's layout, detecting horizontal planes like tables and floors, which then allows virtual objects to be placed upon those surfaces.
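In code, enabling that plane detection is a one-line configuration option on an ARKit session. The sketch below shows the shape of it, assuming an `ARSCNView`-based setup; the class name and print statement are illustrative, while the configuration and delegate callback are the iOS 11 APIs.

```swift
import UIKit
import ARKit

// A minimal sketch of horizontal plane detection with ARKit (iOS 11).
class PlaneViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Ask ARKit to detect horizontal surfaces such as tables and floors.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    // Called when ARKit anchors a newly detected surface in the scene.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
        print("Detected horizontal plane with extent \(planeAnchor.extent)")
        // Virtual objects can now be positioned relative to this anchor.
    }
}
```

Once a plane anchor exists, virtual content placed relative to it stays fixed to the real surface as the device moves.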
The first ARKit apps and games won't arrive until iOS 11 is released to the public, but developers have already demonstrated what the framework can do, from live filters in a recreation of A-ha's "Take On Me" video to precise measurements of furniture and room spaces.
Check out all of our previous ARKit coverage below to see what else developers can do with it:
- ARKit Roundup: Turn-by-Turn Directions, Precise Room Measurements, and Pac-Man
- Apple's ARKit Used to Recreate Classic A-ha 'Take On Me' Video
- Apple Users' Mixed Reality Future Teased in Latest ARKit Demo
- Latest Apps to Showcase Apple's ARKit Include Simple Measuring Tape and Minecraft
- Developers Share First Augmented Reality Creations Using Apple's ARKit