InstaSaber: Transform a piece of paper into a lightsaber with mobile machine learning and AR

Machine learning is often viewed through the lens of transformation, whether it’s reshaping an industry, automating tedious tasks, or inspiring immersive experiences.

The latter transformation is front and center in Hart Woolery’s InstaSaber. It’s an ML-powered iOS app that lets users take a rolled-up piece of paper and, by holding it in front of the phone’s camera, generate a realistic augmented reality (AR) lightsaber that’s highly responsive to real-time hand movement.

To everyone’s delight, shenanigans have ensued — Hart set up a Pinterest page to showcase how people are using InstaSaber in their everyday lives. Office pranks, Jedi cats, and plenty of procrastination.

But as is often the case with experimentation and innovation, InstaSaber was largely a byproduct of a bigger problem-solving effort. Hart was trying to solve the difficult computer vision task of tracking hand movements on-device when he stumbled upon the fun, unique user experience.

“The original intention [was] to use it for YoPuppet, an app I’m working on now,” Hart said. “Since a piece of printer paper is universally easy to get ahold of, and is uniform in shape and color when rolled up, I thought that would be a good place to start. The idea to add a lightsaber popped into my head when I was waving it around.”
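Hart’s actual pipeline is a custom ML model (more on that in the blog post linked at the end), but the quote hints at why a rolled sheet of paper is such a forgiving target: it shows up as one bright, uniform, elongated shape. Purely as an illustration, here’s a minimal Swift sketch that uses Apple’s Vision framework to hunt for a tall, thin quad in a camera frame; the function name and thresholds are my own assumptions, not anything from InstaSaber.

```swift
import Vision
import CoreVideo

/// Minimal sketch: look for a tall, thin quad (a rolled sheet of paper seen
/// from the side) in a camera frame using Vision's built-in rectangle
/// detector. This only illustrates why a uniform shape is easy to pick out;
/// InstaSaber itself relies on a custom feature-point model.
func findPaperTube(in pixelBuffer: CVPixelBuffer,
                   completion: @escaping (VNRectangleObservation?) -> Void) {
    let request = VNDetectRectanglesRequest { request, _ in
        // Take the most confident candidate, if any.
        let tube = (request.results as? [VNRectangleObservation])?
            .max { $0.confidence < $1.confidence }
        completion(tube)
    }
    // Aspect ratio here is shorter side over longer side, so small values
    // mean very elongated, which is exactly what a rolled tube looks like.
    request.minimumAspectRatio = 0.05
    request.maximumAspectRatio = 0.4
    request.minimumConfidence = 0.6
    request.maximumObservations = 5

    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try? handler.perform([request])
}
```

A detector this naive would lose the tube under occlusion or odd lighting, which is exactly why a learned model ends up doing the real work.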

What was originally meant to be a tech demo morphed into a full-fledged experience that anyone could pick up and enjoy, no technical expertise required.

Tracking Real-World Performance to Ensure a Consistent User Experience

As it turned out, learning how to track hand movements was only part of the InstaSaber equation. Achieving consistency was difficult, and Hart needed a way to determine whether his hand tracking model was performing well. And with more than 15,000 app installs in the first couple of weeks, that meant monitoring performance across a wide range of devices and conditions. Without a team of developers at his disposal, Hart needed some help.

That help came from Fritz, whose tooling gave him visibility into how the hand tracking model was actually performing on users’ devices. The result is a high-quality user experience that showcases what’s possible at the intersection of ML and AR, two technologies that, when working together, engage users in ways that are at once unique and intuitive.

We’re in the early days of this intersection of ML and AR on mobile, and there’s more work to be done to realize its full potential. But with solutions like Fritz, developers are able to focus on building the next generation of immersive mobile experiences without worrying about managing the more complex aspects of how ML models perform on-device.

“I think the process of improving is more art than science,” Hart said. “But it’s nice to have tools like Fritz to validate your work.”
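To make that concrete, here’s a bare-bones Swift sketch of the kind of signal such tooling aggregates: per-frame inference latency for a Core ML model, kept as a running average. This is not Fritz’s API; the class name is mine, and a real monitoring setup would also break the numbers down by device model, OS version, and the conditions the app is running in.

```swift
import CoreML
import QuartzCore

/// Bare-bones illustration of the kind of signal a model-monitoring tool
/// collects: per-frame inference latency, tracked as a running average.
/// `InferenceTimer` is a made-up name; this is not Fritz's API, just the
/// underlying idea.
final class InferenceTimer {
    private var totalSeconds: Double = 0
    private var frames: Int = 0

    /// Runs one prediction and records how long it took.
    func timedPrediction(model: MLModel, input: MLFeatureProvider) throws -> MLFeatureProvider {
        let start = CACurrentMediaTime()
        let output = try model.prediction(from: input)
        totalSeconds += CACurrentMediaTime() - start
        frames += 1
        return output
    }

    /// Average latency in milliseconds: the kind of number you'd want broken
    /// down per device once thousands of installs roll in.
    var averageLatencyMs: Double {
        frames == 0 ? 0 : (totalSeconds / Double(frames)) * 1000
    }
}
```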

Interested in more cool mobile experiences that combine ML and AR? Check out Hart’s newest project, YoPuppet, which is now available on the App Store.

And for a deeper dive into the tech behind InstaSaber, check out Hart’s blog post that details real-time 2D/3D feature point extraction.
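As a small taste of what that post covers, here’s a hedged Swift sketch of the 2D-to-3D step: it lifts a normalized keypoint (say, the base or tip of the paper tube, as a model might predict it) into the SceneKit scene by unprojecting a ray through the camera and choosing an assumed distance, then stretches a glowing cylinder along the base-to-tip axis to stand in for the blade. Every name, distance, and material choice below is an assumption for illustration, not InstaSaber’s actual implementation.

```swift
import ARKit
import SceneKit
import UIKit

/// Rough sketch of lifting a 2D keypoint (normalized image coordinates, as a
/// model might output) into the 3D scene: unproject the pixel at the near and
/// far planes to get a ray, then step along that ray by an assumed distance.
/// The distance and all names here are illustrative.
func worldPoint(for normalizedPoint: CGPoint,
                in sceneView: ARSCNView,
                assumedDistance: Float = 0.4) -> SCNVector3 {
    // Convert normalized (0 to 1) coordinates into view coordinates.
    let viewPoint = CGPoint(x: normalizedPoint.x * sceneView.bounds.width,
                            y: normalizedPoint.y * sceneView.bounds.height)
    // z = 0 maps to the near plane, z = 1 to the far plane.
    let near = sceneView.unprojectPoint(SCNVector3(x: Float(viewPoint.x), y: Float(viewPoint.y), z: 0))
    let far  = sceneView.unprojectPoint(SCNVector3(x: Float(viewPoint.x), y: Float(viewPoint.y), z: 1))
    let dir = SCNVector3(x: far.x - near.x, y: far.y - near.y, z: far.z - near.z)
    let length = sqrt(dir.x * dir.x + dir.y * dir.y + dir.z * dir.z)
    // Walk along the ray by the assumed camera-to-tube distance.
    return SCNVector3(x: near.x + dir.x / length * assumedDistance,
                      y: near.y + dir.y / length * assumedDistance,
                      z: near.z + dir.z / length * assumedDistance)
}

/// Builds a glowing cylinder oriented along the base-to-tip axis of the tube.
func makeBlade(from base: SCNVector3, to tip: SCNVector3, bladeLength: CGFloat = 1.0) -> SCNNode {
    let cylinder = SCNCylinder(radius: 0.015, height: bladeLength)
    cylinder.firstMaterial?.emission.contents = UIColor.cyan  // glow color is an arbitrary choice
    let node = SCNNode(geometry: cylinder)
    // The geometry is centered on its node, so a real version would offset it
    // by half the blade length and smooth the pose across frames.
    node.position = tip
    // Point the cylinder's local Y axis (its height axis) along base -> tip.
    node.look(at: SCNVector3(x: 2 * tip.x - base.x, y: 2 * tip.y - base.y, z: 2 * tip.z - base.z),
              up: node.worldUp,
              localFront: SCNVector3(x: 0, y: 1, z: 0))
    return node
}
```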

