Google is Exploring Ways to Let You Animate Your ‘Blocks’ Creations
Google’s team at Daydream Labs has been prototyping ways to animate objects and characters in Blocks, the company’s recently released VR modelling tool. In a recent entry on the Google Blog, senior UX engineer Logan Olson describes how this could give users the power to “create expressive animations without needing to learn complex animation software.”
With its low-poly aesthetic and simple menu systems, Blocks is perhaps the least intimidating 3D modelling tool currently available for VR, and the Daydream Labs team looked to retain that approachability as they prototyped animation systems during their ‘one-week hackathon’. Olson explains that this boils down to three steps: preparing the model, controlling it, and finally recording sequences for playback.
First, the static models created in Blocks require some ‘prep’: adding control points and joints for inverse kinematics (suited to models with a rigid skeleton), or setting up a ‘shape matching’ technique that works better for ‘sentient blobs’ and other loosely defined shapes that are good for ‘wiggling’. Olson notes that shape matching involves a short setup process, but one that “could eventually be automated”.
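The article describes these techniques only at a high level, but the inverse-kinematics idea can be illustrated with a standard two-bone analytic solve: given a limb of two segment lengths, find the joint angles that place the end of the limb at a target point. This is a minimal sketch of the general technique, not necessarily the method Daydream Labs implemented:

```python
import math

def two_bone_ik(l1, l2, tx, ty):
    """Return (shoulder, elbow) angles so a 2-segment limb of lengths
    l1, l2 reaches the target (tx, ty), using the law of cosines."""
    # Clamp the target distance so unreachable targets fall on the limb's
    # maximum extent instead of producing a math domain error.
    d = min(math.hypot(tx, ty), l1 + l2 - 1e-9)
    cos_elbow = (d * d - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    shoulder = math.atan2(ty, tx) - math.atan2(
        l2 * math.sin(elbow), l1 + l2 * math.cos(elbow))
    return shoulder, elbow

def forward(l1, l2, shoulder, elbow):
    """Forward kinematics: position of the limb's end effector."""
    ex = l1 * math.cos(shoulder)
    ey = l1 * math.sin(shoulder)
    return (ex + l2 * math.cos(shoulder + elbow),
            ey + l2 * math.sin(shoulder + elbow))
```

In an animation tool the user drags only the end point (a hand or foot), and the solver fills in the intermediate joint rotations automatically, which is what makes a rigged model feel direct to manipulate.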
Once a model is prepared, controlling its movement is where VR is at its most intuitive: motion-tracked hardware makes a simple form of motion capture readily available, though it isn’t always appropriate, depending on what’s being animated. Olson references Mindshow, a creative app due to launch into open beta soon, which embraces this ‘puppeteering’ technique. “People loved ‘becoming’ the object when in direct control,” writes Olson. “Many would role-play as the character when using this interface.”
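The puppeteering and playback steps amount to sampling a tracked controller’s pose each frame, storing the sequence, and later replaying it onto the object. A minimal sketch of that record/playback loop, where `sample_pose` and `apply_pose` are hypothetical callbacks standing in for whatever tracking and scene APIs the tool actually exposes:

```python
def record_clip(sample_pose, num_frames):
    """Capture one pose per frame from a tracked controller.

    sample_pose(frame) is a hypothetical callback returning the
    controller's pose for that frame, e.g. a (position, rotation) tuple.
    """
    return [sample_pose(f) for f in range(num_frames)]

def play_clip(clip, apply_pose):
    """Replay a recorded clip onto the animated object.

    apply_pose(pose) is a hypothetical setter that drives the object;
    stepping through the stored list reproduces the performance.
    """
    for pose in clip:
        apply_pose(pose)
```

Because the recording is just a list of poses, clips can be trimmed, looped, or layered, which is how a simple capture step can grow into the “recording sequences for playback” stage Olson describes.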
Source Article: https://www.roadtovr.com/google-exploring-ways-let-animate-blocks-creations/