Niantic and 8th Wall add hand tracking to AR toolkit


Niantic and 8th Wall announced today that they’re adding hand tracking to the latter’s AR developer toolkit. According to 8th Wall, developers can now build hand- and wrist-based AR applications, and because the toolkit runs on the open web, they can pull in real-time data from web APIs to render textures or videos unique to each user. The feature is available to 8th Wall developers now.
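Since 8th Wall experiences run in the browser, wiring a web API response into an AR texture is ordinary web code. The sketch below fetches a per-user image from a hypothetical endpoint (the URL and response shape are made up for illustration) and turns it into a standard three.js texture.

```typescript
// Sketch: fetching a per-user image from a web API and turning it into a
// three.js texture for use in an AR scene. The endpoint URL and response
// shape are hypothetical; THREE.TextureLoader is standard three.js.
import * as THREE from 'three';

async function loadUserTexture(userId: string): Promise<THREE.Texture> {
  // Hypothetical endpoint returning { imageUrl: string } for this user.
  const res = await fetch(`https://example.com/api/users/${userId}/avatar`);
  const { imageUrl } = await res.json();
  return new THREE.TextureLoader().loadAsync(imageUrl);
}

// Usage: apply the fetched texture to a material on hand-attached content.
loadUserTexture('demo-user').then((texture) => {
  const material = new THREE.MeshBasicMaterial({ map: texture });
  // ...assign `material` to a mesh pinned to the hand.
});
```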

The new tools include a proprietary hand model with 36 attachment points across the palm, knuckles, wrist and fingers, giving developers fine-grained control over the position and point of contact of AR items on the hand. The Hand Tracking tool also has an adaptive hand mesh that conforms to the shape of the user’s real hand.
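For a sense of how working against such a model could look, here is a minimal sketch built on 8th Wall's documented camera-pipeline and three.js modules (XR8.addCameraPipelineModules, XR8.Threejs, XR8.GlTextureRenderer, XR8.run); the hand event name ('handcontroller.handupdated') and the attachment-point key (indexTip) are assumptions standing in for whatever the Hand Tracking API actually exposes.

```typescript
// Sketch: pinning a small three.js marker to one hand attachment point.
// XR8.addCameraPipelineModules, XR8.Threejs, XR8.GlTextureRenderer and
// XR8.run are real 8th Wall APIs; the hand event name and attachment-point
// key below are placeholders for whatever the Hand Tracking module exposes.
import * as THREE from 'three';

declare const XR8: any;  // injected by the 8th Wall SDK script tag

const marker = new THREE.Mesh(
  new THREE.SphereGeometry(0.01),
  new THREE.MeshBasicMaterial({ color: 0x00ffcc }),
);

const handMarkerModule = () => ({
  name: 'hand-marker',
  // Add the marker to the scene once three.js is set up.
  onStart: () => XR8.Threejs.xrScene().scene.add(marker),
  listeners: [{
    // Hypothetical event emitted each frame the hand model updates.
    event: 'handcontroller.handupdated',
    process: ({ detail }: any) => {
      const point = detail?.attachmentPoints?.indexTip;  // hypothetical key
      if (point) marker.position.copy(point.position);
    },
  }],
});

XR8.addCameraPipelineModules([
  XR8.GlTextureRenderer.pipelineModule(),  // draws the camera feed
  XR8.Threejs.pipelineModule(),            // provides the three.js scene
  handMarkerModule(),
]);
XR8.run({ canvas: document.getElementById('camerafeed') as HTMLCanvasElement });
```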

According to 8th Wall, Hand Tracking lets users interact with, move and alter objects in AR apps as if they were real, and they can also puppeteer hand-shaped objects that follow their movements. Developers can attach special effects to users’ hands or let them “try on” real-world items within the bounds of the app. Developers can access Hand Tracking by cloning a sample project in the library.
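As one example of that kind of interaction, a basic “grab” could be approximated by checking whether two tracked fingertip points are close together. The helper below is a sketch only; the point names (thumbTip, indexTip) and the pinch threshold are illustrative assumptions, not part of 8th Wall's published API.

```typescript
// Sketch: approximating a "grab" interaction by checking whether two tracked
// fingertip points are close together. Point names (thumbTip, indexTip) and
// the 3 cm threshold are illustrative assumptions.
import * as THREE from 'three';

const PINCH_DISTANCE_M = 0.03;  // assumed threshold in metres

interface TrackedPoint { position: { x: number; y: number; z: number } }

function isPinching(points: Record<string, TrackedPoint | undefined>): boolean {
  const thumb = points.thumbTip;
  const index = points.indexTip;
  if (!thumb || !index) return false;
  const a = new THREE.Vector3(thumb.position.x, thumb.position.y, thumb.position.z);
  const b = new THREE.Vector3(index.position.x, index.position.y, index.position.z);
  return a.distanceTo(b) < PINCH_DISTANCE_M;
}

// Inside a hand-update callback, a virtual object could follow the hand
// only while the user is pinching:
//   if (isPinching(detail.attachmentPoints)) grabbedObject.position.copy(palm.position);
```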