Unity allows developers to animate with Augmented Reality

You can now animate with your own face.

This post might contain affiliate links. If you buy something through this post, the publisher may get a share of the sale.

The release of ARKit and the iPhone X has allowed Unity to take the next step in augmented reality technology. The company now feels it can use the iPhone X’s face tracking to capture facial performances and drive character animation through its software.

Unity Labs partnered with the animation team behind the short film Windup to see how far they could push their real-time animation technology. The result of that effort is a new tool called the Facial AR Remote, which has been described as “a low overhead way to capture performance using a connected device directly into the Unity Editor”.

The remote proved useful not just for animation but also for shaping character models and rigging them for the puppeteer. Unity states that the remote can capture a recording of an actor’s facial performance and quickly apply it to a specific character model. Furthermore, that same recording can be used to update the model or retarget the animation to another character without extra takes from the actor. This serves Unity’s goal of lowering the overhead of the animation workflow, letting creators be more flexible and experimental with how shots and animation are handled.
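
To make the retargeting idea concrete, here is a minimal, hypothetical sketch: if a take is recorded as per-frame weights keyed by blendshape name rather than baked onto one mesh, the same recording can drive any character whose face mesh exposes matching shapes. The component name, fields and recording format below are illustrative assumptions, not the actual Facial AR Remote API.

    using UnityEngine;

    // Hypothetical sketch: replay a recorded take on a different character by
    // matching blendshape names on the new face mesh. The recording format and
    // component name here are illustrative assumptions.
    public class TakeRetargeter : MonoBehaviour
    {
        [SerializeField] SkinnedMeshRenderer targetFace; // the new character's face
        public string[] shapeNames;      // e.g. "jawOpen", "mouthSmileLeft"
        public float[][] recordedFrames; // recordedFrames[frame][shape], values 0-1

        int[] indices; // maps each recorded shape name to an index on the target mesh
        int frame;

        void Start()
        {
            indices = new int[shapeNames.Length];
            for (int i = 0; i < shapeNames.Length; i++)
                indices[i] = targetFace.sharedMesh.GetBlendShapeIndex(shapeNames[i]);
        }

        void Update()
        {
            if (recordedFrames == null || recordedFrames.Length == 0) return;
            float[] weights = recordedFrames[frame % recordedFrames.Length];
            for (int i = 0; i < indices.Length; i++)
                if (indices[i] >= 0) // skip shapes this character's mesh does not have
                    targetFace.SetBlendShapeWeight(indices[i], weights[i] * 100f);
            frame++;
        }
    }

Because shapes are matched by name and missing ones are simply skipped, the same take can move between models without re-recording the actor, which is the flexibility described above.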

According to Unity, the remote is “made up of a client phone app, with a stream reader acting as the server in Unity’s editor. The client is a light app that’s able to make use of the latest additions to ARKit and send that data over the network to the Network Stream Source on the Stream Reader GameObject. Using a simple TCP/IP socket and fixed-size byte stream, we send every frame of blendshape, camera and head pose data from the device to the editor. The editor then decodes the stream and updates the rigged character in real time.”
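
Taken at face value, that description maps to a small amount of editor-side code. Below is a minimal sketch of such a receiver, assuming a hypothetical fixed-size frame of one float per ARKit blendshape; the real Facial AR Remote defines its own wire format, component names and error handling.

    using System.Net;
    using System.Net.Sockets;
    using UnityEngine;

    // Sketch of an editor-side receiver for a fixed-size blendshape stream.
    // The frame layout (one float per ARKit blendshape) is an assumption made
    // for illustration; it is not the Facial AR Remote's actual wire format.
    public class BlendShapeStreamReader : MonoBehaviour
    {
        const int BlendShapeCount = 51;            // ARKit exposes roughly 51 face blendshapes
        const int FrameSize = BlendShapeCount * 4; // 4 bytes per float, fixed-size frames

        [SerializeField] SkinnedMeshRenderer faceMesh; // the rigged character's face
        TcpListener listener;
        NetworkStream stream;
        readonly byte[] frame = new byte[FrameSize];
        readonly float[] weights = new float[BlendShapeCount];

        void Start()
        {
            // The editor acts as the server; the phone app connects and streams frames.
            listener = new TcpListener(IPAddress.Any, 9000);
            listener.Start();
        }

        void Update()
        {
            if (stream == null)
            {
                if (!listener.Pending()) return;
                stream = listener.AcceptTcpClient().GetStream();
            }

            // Drain every complete frame waiting on the socket, keeping only the latest.
            while (stream.DataAvailable)
            {
                int read = 0;
                while (read < FrameSize)
                    read += stream.Read(frame, read, FrameSize - read);
                for (int i = 0; i < BlendShapeCount; i++)
                    weights[i] = System.BitConverter.ToSingle(frame, i * 4);
            }

            // Drive the rigged character in real time (Unity weights run 0-100,
            // ARKit coefficients run 0-1, hence the scale factor).
            for (int i = 0; i < BlendShapeCount; i++)
                faceMesh.SetBlendShapeWeight(i, weights[i] * 100f);
        }

        void OnDestroy() => listener?.Stop();
    }

Sending camera and head pose alongside the blendshapes would simply widen the fixed frame; the design point Unity calls out is that a fixed-size byte stream keeps decoding on the editor side trivial.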

By partnering with Windup, the Unity team had access to the film’s previously made tech demo and was able to use its high-quality assets. That meant more time to experiment with building the tools needed to best blend the models. The main focus of the trials was working out problems with “jitter, smoothing and shape tuning” on the models. One of the hardest tasks for the team was simply figuring out how to make a young girl smile.
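
For illustration, one generic way to attack jitter (not necessarily the filter Unity shipped) is to exponentially smooth the incoming weights before they reach the rig, trading a little latency for stability:

    using UnityEngine;

    // Generic exponential smoothing of incoming blendshape weights to damp
    // frame-to-frame jitter. Illustrative only; not Unity's actual filter.
    public static class WeightSmoother
    {
        // alpha in (0, 1]: higher tracks the input more closely, lower smooths harder.
        public static void Smooth(float[] smoothed, float[] incoming, float alpha)
        {
            for (int i = 0; i < smoothed.Length; i++)
                smoothed[i] = Mathf.Lerp(smoothed[i], incoming[i], alpha);
        }
    }

Choosing alpha per shape is one plausible form the “shape tuning” work could take, since a blink tolerates far less lag than a smile.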

The Unity team hopes the AR technology it has developed can push the limits of what developers can do with animation and make it accessible even to people who are not developers at all, such as a child puppeteering a model of their favourite cartoon character.

If you would like to learn more about the intricacies of Unity’s AR technology, check out the original blog post by Jonathan Newberry, which goes into even more detail about the inner workings of the Facial AR Remote.

Alex Briggs is an intern working at IGN Southeast Asia with a passion for JRPGs and platformers.
