2014 Conference Summaries

Emerging Tools | Outils émergents, Neil MARSDEN, Alexandros GOUVATSOS, Akouvi AHOOMEY, Andrew TOUCH, Tarik ABDEL-GAWAD, Tom BOX, Aurélien CHARRIER, Grant GILBERT, (moderator Cédric GUIARD) © D. Bouchet/CITIA

Emerging Tools

  1. Speakers
  2. Moderator
  3. Optimising TV series pre-production: Redboard
  4. Akeytsu: 3D software 'totally attuned' to artists
  5. Unity Technologies: the new animation 'engine'
  6. DBLG and Blue-Zoo build their own 3D-printed bears
  7. Bot & Dolly: the staged robot


The conference on emerging tools has two obvious purposes: to assess new tools, and to examine everything covered by the notion and definition of animation. The British studio Hibbert Ralph Animation developed Redboard, an animation previz tool that can be used from the storyboarding stage onwards. Nukeygara is a young French company that markets Akeytsu, a 3D software package designed to streamline animation by moving away from traditional rigging. Stop-motion animation is also innovating, thanks to the DBLG and Blue-Zoo Animation companies and 3D printing. Video gaming is likewise a source of inspiration, as recent advances on the Unity platform prove: visual prototyping, interactive storytelling and extended storytelling are all new horizons opening up for the game engine. Last of all, the Bot & Dolly robots, which magnificently served the image-making of Gravity, subtly combine virtuosity and fine-tuned precision.



Key words

Bot & Dolly, Box, robotics, Gravity, Unity, visual prototyping, interactive storytelling, extended storytelling, Nukeygara, Akeytsu, 3D, rigging, Blue-Zoo, DBLG, Bears, stairs, 3D printing, Hibbert Ralph, Redboard, previz, layout, storyboard 

Optimising TV series pre-production: Redboard

The British studio Hibbert Ralph Animation presented Redboard, a storyboard and production management tool that goes beyond 3D storyboarding, built on video game technology. Applied in the pre-production phase, it makes it possible to integrate all the elements of a 3D scene (backgrounds, characters) within a single interface. "Next, the storyboarder can interactively fine-tune the layout by using imported models, placing them where wanted and locating the cameras as needed," explains Neil Marsden, who developed this software along with Alex Gouvatsos. Once the elements are in place, an additional layer is positioned above the scene to draw in any character details, expressions and poses needed for the action. Besides this "hand-drawn art" option, Redboard also supports Photoshop editing tools so that levels of detail can be added according to production requirements. Redboard is compatible with most editing software (Apple Final Cut Pro, Avid Media Composer, Adobe Premiere, Sony Vegas). From any of these packages, Redboard can exchange a standard EDL (Edit Decision List) to do basic editing on the animation timing or to automatically generate dope sheets.
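To illustrate what such an EDL exchange carries, here is a minimal sketch of reading one event line in the common CMX3600-style layout (event number, reel, track, transition, then source and record timecodes). This is an illustrative assumption about the interchange format, not a description of Redboard's internals.

```python
# Minimal sketch: parsing one event line of a CMX3600-style EDL,
# the plain-text interchange format exchanged with editing software.
def parse_edl_event(line):
    """Split an EDL event line into named fields (simple cut events only)."""
    parts = line.split()
    return {
        "event": parts[0],        # event number
        "reel": parts[1],         # source reel / clip name
        "track": parts[2],        # V = video, A = audio
        "transition": parts[3],   # C = cut, D = dissolve, ...
        "src_in": parts[4], "src_out": parts[5],   # source timecodes
        "rec_in": parts[6], "rec_out": parts[7],   # record (timeline) timecodes
    }

ev = parse_edl_event(
    "001  SHOT_01  V  C  00:00:00:00 00:00:02:00 01:00:00:00 01:00:02:00"
)
print(ev["reel"], ev["transition"], ev["rec_in"])
```

The record in/out pairs are what a tool like Redboard would compare against its own shot timings to re-time the animatic or regenerate dope sheets.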

Once the storyboard is green-lit, Redboard exports the entire scene and its constituent elements to Autodesk tools (Maya, 3ds Max, Softimage XSI): all the data is carried over during export, including the camera positions and – of course – the additional layer of drawings.

Hibbert Ralph Animation has also taken part in research at Bournemouth University's Centre for Digital Entertainment, which is developing tools for optimising production flows, especially in the pre-production phase and for TV series. By using a Kinect or PlayStation Eye, "we can do markerless motion capture easily and inexpensively, to create low-resolution rigs," Neil Marsden explains. "If an actor acts out a storyboard as it streams past, the tool can capture the motion, edit key poses, and then play them back directly in Maya."

Akeytsu: 3D software 'totally attuned' to artists

Nukeygara was founded by Aurélien Charrier and Yannick Rousseau. The company develops CG software, and its flagship tool, now in development, is called Akeytsu. Aurélien Charrier says: "I began by analysing the complexity behind today's market software, especially concerning rigging, and then designed a more artist-friendly alternative." He believes that the tools animators now use were not originally designed for them, turning artists into de facto technicians.

Akeytsu won the 2013 Ganuta prize awarded by Imaginove, the Rhône-Alpes competitive cluster. This software makes it easier to "quickly create and change characters, free of complex rigging systems. You start with a single skeleton or rig, a single skinning and several solvers to begin the animation work. Thus within about two days a model can be readied for animation."

One of the key points of the software is that it starts with the mesh: click on the left leg's mesh, for example, and the entire zone is highlighted and opens up myriad options. "The advantage of working from the mesh is that it gives a more fluid approach. Beyond that, the software has IK (inverse kinematics) solvers which integrate options for advanced animation functionality, like any other software" (stretch, auto-reverse-foot, etc.).

Akeytsu has a clean, uncluttered interface with a viewport filling the whole window and, at the left edge, an animation bank for storing dailies. There is also a spinner, a wheel-shaped widget developed as "a hub for handling 2D transformations. You just position it next to your character and apply the desired effects with a simple click, whereas other software packages require incessant rotation around the axes."

The stacker on the right side of the screen is a keyframe editor that lets users view all the data associated with each animation key: keyframe number, timing, etc. Changes are entered on the numeric keypad rather than through a time bar, so as to offer more precision. "We began with the old dope sheets as we knew them, to offer an intuitive tool… even if there's still the option of working through a time bar, which doesn't appear automatically but can be enabled," Aurélien Charrier explains. The software's input and output format is FBX, which is compatible with Maya.
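The dope-sheet idea described above can be sketched as a simple data structure: a stack of key records, re-timed by typing an exact frame number rather than dragging on a time bar. Akeytsu's real internals are not public; the names and fields below are invented for illustration only.

```python
# Illustrative sketch (hypothetical names): a stack of keyframe records
# like those a dope-sheet-style editor displays, with numeric re-timing.
from dataclasses import dataclass

@dataclass
class Keyframe:
    frame: int    # keyframe number on the timeline
    value: float  # animated value at this key

def retime(keys, old_frame, new_frame):
    """Move a key by typing an exact frame number (numeric entry,
    not time-bar dragging), then keep the stack in timeline order."""
    for k in keys:
        if k.frame == old_frame:
            k.frame = new_frame
    return sorted(keys, key=lambda k: k.frame)

keys = [Keyframe(1, 0.0), Keyframe(12, 1.0), Keyframe(24, 0.0)]
keys = retime(keys, 12, 18)
print([k.frame for k in keys])  # the middle key now sits at frame 18
```

Typing exact frame numbers avoids the rounding errors of dragging a key along a pixel-resolution time bar, which is the precision argument quoted above.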

Unity Technologies: the new animation 'engine'

Unity is real-time 3D software most often found in video gaming, but now appearing more and more frequently in animation. The sector tends to use the tool for what Andrew Touch, in charge of product promotion, calls "non-play". Andrew hopes to encourage productions to build their work on this platform and, to that end, presented several recent applications in fields generally unrelated to gameplay. The first example is Nvyve, based in Ontario, which creates immersive 3D content mainly for the architecture sector: "While they may use Maya and 3ds Max to build their models, Unity is their real-time engine, allowing free movement through a given environment," Andrew Touch notes.

In 2013 the Royal Philharmonic Concert Orchestra played a re-orchestrated Four Seasons with a giant screen as a backdrop. "The Play Nicely company prepared 3D animation for this – 3ds Max for the models, Unity for the animation – based on the sonnets and adapted to the music. The animation interacted with the musicians' volume and intensity variations, as if responding in real time." The medical sector, likewise, is adopting more real-time visualisation, especially in operating theatres, to better assess organs during surgery.

For Andrew Touch, the animation sector could particularly benefit from Unity's potential in three areas: visual prototyping, interactive storytelling, and augmented or extended storytelling. On the first: "Integrating Unity into a production pipeline means that you can navigate through a 3D scene built with any off-the-shelf software," while modifying whatever is needed directly in Unity. "We move in FPS mode or, in other words, a real-time first-person view."

Another interesting angle is interactive storytelling. As a case study, Andrew Touch mentioned World of Violet, a cartoon-like children's app (Android, Windows). Developed by the Brotherhood of Skills collective, the app is built on the ability to interact directly with the stories, always in real time, through episodes, colouring, or modifying plot elements.

Unity is available in free and paid versions (depending on company turnover), or via a perpetual licence.

DBLG and Blue-Zoo build their own 3D-printed bears

In 2012, the graphic design agency DBLG tendered for a revamp of the Animal Planet brand, a Discovery subsidiary featuring animal and nature videos. DBLG's founder, Grant Gilbert, remembers: "We'd worked on a number of concepts but we needed a partner to create short 3D animations to help dust off the brand." The agency called on Blue-Zoo Animation, who used the 2D drawings to design matching 3D models. "Grant was after an overall origami look, meaning that the models had to be low in polygons," continues Tom Box from Blue-Zoo Animation. "The animals, especially the bears and penguins, were then integrated into equally low-polygon environments before being animated in loops, like an animated GIF."

Having decided to keep a physical record of these animals, DBLG purchased a 3D printer (a MakerBot Replicator 2), and soon the idea of a story built from printed models sprang to life: Bears on Stairs. "We first created a section of moving staircase, for the various stages of climbing," Grant Gilbert tells us. He also contacted Blue-Zoo to rework the bear character modelled in Maya. "When Grant briefed us on his idea of doing stop-motion using 3D-printed elements as a starting point, having a bear climb a stairway, we had to entirely redesign the character, from rigging to animation, so that its upward lumbering would appear lifelike," Tom Box says.

The Bears on Stairs film runs to 50 frames (about 2 seconds of animation), or 50 3D print-outs at a rate of 2½ hours per sculpture. The finished work can be seen at dblg.co.uk. It has been viewed more than 500,000 times on Vimeo, and according to Grant Gilbert: "Some people first thought these were 3D models, but given the unusual surface finish, with all the grooves from the 3D printing, it would have been virtually impossible to reach that result without countless hours of work. We took our time to experiment with 3D printing and stop-motion, and it worked."
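The figures quoted above imply both the playback frame rate and the total print time, as a quick back-of-the-envelope check shows:

```python
# Back-of-the-envelope check of the Bears on Stairs figures quoted above.
frames = 50            # 50 frames, i.e. 50 printed sculptures
duration_s = 2         # "about 2 seconds of animation"
hours_per_print = 2.5  # "2.5 hours for one sculpture"

fps = frames / duration_s                  # implied playback rate
total_print_hours = frames * hours_per_print  # cumulative printer time
print(fps, total_print_hours)  # 25.0 fps and 125.0 hours of printing
```

In other words, two seconds of finished film cost roughly five days of continuous printing, which puts the "countless hours of work" remark into perspective.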

Bot & Dolly: the staged robot

Bot & Dolly is a subsidiary of Autofuss, founded by Jeff Linnell and Randy Stowell in 2010. For applications in cinema, architecture and elsewhere, the company develops robotic arms and similar solutions based on its IRIS system, controlled by specialised software. Although the company's first commission was an advertising clip for Louis Vuitton, it was Google that contacted Bot & Dolly in 2012, asking them to develop an interactive experience for the Nexus Q, its spherical media centre. Even though the Nexus Q has since been discontinued, Google was sufficiently interested in Bot & Dolly's work to acquire the company in 2013, promising it complete freedom of action.

Bot & Dolly truly came into its own in 2011 with its work on Alfonso Cuarón's film Gravity. For the project they were asked to choreograph numerous tasks: moves for the motion-control cameras, mounted on robotic arms, had to be perfectly synchronised with the acting, the directing and the digital sets. "Rather than have an actor move through a given environment, the robots were able to control the camera moves, the sets and background elements, and the lighting around them," explains Tarik Abdel-Gawad, Creative Technologist at Bot & Dolly.

He then presented Box, a promotional short film featuring an actor (himself) surrounded by rectangular panels anchored to robotic arms: "We tested projection mapping on moving objects by projecting highly graphic elements, which gave a feeling of depth and displacement."

To obtain this result, combining pure illusion and technological innovation, Bot & Dolly began by installing two robotic arms and their tracks inside a warehouse. Meanwhile, using their in-house modelling and animation software, BDMove, they completely re-created the set, the two robots and the panels onto which the digital images produced in Maya were to be projected. "We also used Houdini and the Alembic open interchange framework to match the virtual camera moves, the arm movements and the projected 3D elements." The result is extraordinary, and can be viewed at botndolly.com/box

Drafted by Stéphane Malagnac, Prop'Ose, France

Translated by Sheila Adrian

The Annecy 2014 Conferences Summaries are produced with the support of:

DGCIS      Ministère de l'économie, du redressement productif et du numérique      Région Rhône-Alpes

Conferences organised by CITIA

under the editorial direction of René Broca and Christian Jacquemart

Contact: christellerony@citia.org