Categories
Maya VFX Fundamentals

Week 10: Face Animation Render in Maya and compositing in After Effects

In this class, we set up the lighting and colour corrected our model to get it ready for rendering and for later compositing in After Effects.

To create the illumination of my scene, I wanted to recreate the real lighting of the scene I took as reference. I added 3 spotlights to my scene: one is in front of the model to illuminate the face, and the other two are behind as backlights.

Since the lip sync looked a bit odd without a tongue, I modelled one, added a standard surface material with some shiny highlights (to give a wet look) and animated it in the parts with ‘L’, ‘D’, ‘S’, and ‘T’ sounds.

Using the ‘Hypershade’ editor, I added two colour correct nodes: one linked to the base colour and coat of the skin material, and the second linked to the specular colour of the skin material. I added a soft yellowish base colour to the skin, as shown in the reference clip, but then added a blue/purple highlight with the specular colour to make it a bit more interesting. I also added subsurface scattering to make the skin a bit more translucent, so it looks more realistic.

After I was happy with my lighting, I set my project’s render settings to half resolution to get a relatively quick render to review the final look. Thankfully, I did not have to change anything, as I liked the result, so I set my project to HD 1920×1080 resolution and rendered the final lip sync animation with an alpha channel.

Final render

The final render turned out well and is ready for compositing with a background and some nice VFX. So I imported it into After Effects and searched for a suitable background for the scene. I found an already blurred background with a futuristic look, so I added it to the comp and colour corrected it to make it darker and more saturated and to bring out more blue hues (I added ‘Colour Balance’ and ‘Brightness & Contrast’ effects).

Then I also colour corrected the face, as it seemed a little flat: using the ‘Levels’ effect, I increased the contrast and added a bit more shadow and saturation.
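The kind of adjustment a ‘Levels’ effect makes can be sketched as a small function. This is a minimal illustration of the general input/gamma/output remap, not After Effects’ actual implementation; the parameter names are my own.

```python
def levels(value, in_black=0.0, in_white=1.0, gamma=1.0,
           out_black=0.0, out_white=1.0):
    """Apply a Levels-style adjustment to one normalised channel value."""
    # Remap the input range to 0-1 and clamp
    x = (value - in_black) / (in_white - in_black)
    x = max(0.0, min(1.0, x))
    # Gamma correction (gamma > 1 brightens midtones, < 1 darkens them)
    x = x ** (1.0 / gamma)
    # Remap to the output range
    return out_black + x * (out_white - out_black)

# Raising the input black point crushes the shadows and raises contrast,
# which is roughly what I did to the flat-looking face
darker = levels(0.3, in_black=0.1)
```

Raising `in_black` pushes low values to zero (deeper shadows) while stretching what remains, which is why it reads as added contrast.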

Finally, I scaled up the face and scaled down the background towards the end to make the scene more dynamic and simulate a camera movement.

Final Comp

I think that the final scene turned out well, as I like the contrast between the blue/purple background and the yellowish skin colour of the face, with the subtle blue/purple highlights on the head and forehead. Overall, I think the scene looks realistic, and the lip sync and head movements look pretty natural.

References

topntp26, freepik. Blurred abstract background interior view looking out toward to empty office lobby and entrance doors and glass curtain wall with frame. [Online] Available at: https://www.freepik.com/free-photo/blurred-abstract-background-interior-view-looking-out-toward-empty-office-lobby-entrance-doors-glass-curtain-wall-with-frame_1254627.htm#query=building%20indoor&position=36&from_view=search&track=sph [Accessed 12 December 2022]

Categories
Nuke VFX Fundamentals

Week 10: Real Scenarios in Production and Balloon Festival Comp Review

In this lesson we analysed the different scenarios we can face in production as a VFX compositor and then we reviewed the final composition of our Balloon Festival project.

In production for film, the stages followed are:

  1. Temps/Postviz. Temps are low-quality previews of how the movie is going to look, and postviz is the same kind of preview at a higher quality (there are even companies that specialize in this).
  2. Trailers. These show the shots of the movie that are already finished at a good level of quality.
  3. Finals. The final product of the film. It is usually exported to EXR, plus two different QuickTimes with specific settings, ready for review.
  4. QC. Quality control of the final product is done by the VFX Supervisor, who decides which version is the best to send to the client.

There is specific project management software that improves organisation and communication within the team, such as Google Docs and Sheets, Ftrack, and Shotgun. These tools are used to publish final scenes that are ready for review, to request tasks, to schedule meetings, etc.

The production roles existing in a film are:

  • Line Producer. The person below the Producer who keeps in touch with the VFX supervisor, director, editor, internal producers, and artists. They manage the client, the timing, and the budget.
  • VFX Producer. This person makes sure that the studio completes the project, meets the deadline agreed with the client, and stays within the budget set.

A way to share and review a project’s development is to set up VFX dailies. This is an important meeting to check that everyone is moving in the same direction and to receive feedback from the film director, the client, the producer, and/or the supervisor. It is usually written up and recorded, and what is agreed there cannot be changed later outside that meeting.

Once we have finished the scene we were assigned, we publish it so the lead or VFX supervisor can review it. A good habit to develop is to make sure that what we are publishing is final and does not have any errors that make the review difficult, as schedules and deadlines in film tend to be tight. Before publishing a scene, it is good practice to follow this tech check process:

  • Check notes for the shot
  • Compare new version with old one
  • Check editorial (compare the shot the editor sent as a reference with the original)
  • Check if there is any retime in the shot
  • Check that our shot has the latest ‘Match move’
  • Write in the comments if we have any personal notes
  • If we have any alternatives for one shot, inform the line producer before adding this to our published scene.

Balloon Festival Comp

Once we had analysed the different scenarios in VFX production, we proceeded to review the final composite of the balloon festival project.

For this project I modelled my air balloon in Maya (as shown in Week 3: Maya Modelling Tools Overview Part 2 – Air Balloon and Week 4: UV Set Up & Texturing in Maya).

During the Maya lectures, we learnt how to model a 3D hot air balloon and animate it with a simple 360° spin. Then, using this and a mountain shot provided by the professor, we were asked to composite a short sequence for a ‘Balloon Festival’. There were no rules; the brief was simply to put into practice everything we had learnt and to have fun with the compositing.

Since I really enjoy designs with an 80s neon style, where a dark background contrasts strongly with neon colours, I decided to focus on this theme. I started by colour correcting the scene, as I wanted a night-time ambience and the scene was shot in plain daylight.

I colour corrected the scene following a tip the professor gave us in class about separating the colour correction process into primaries, secondaries, and shadows. I rotoscoped some sections of the mountains and colour corrected them separately to create a bit more depth and to avoid a ‘flat’ look. Then I checked how it looked against a grey background and refined the roto.

I reformatted the sky video to fit the size of the main comp, and then I retimed it because, being a time lapse originally, it was playing way too fast. Then I linked it to one of the trackers previously created so it follows the movement of the main plate. I also colour corrected it slightly to make it a bit darker, and created a roto so it did not overlap with the mountains (I created an alpha from the roto with a ‘Shuffle’ node and copied it into the sky’s node trail).
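A constant-speed retime like the one applied to the too-fast time lapse boils down to remapping output frames to source frames. This is a hedged sketch of that mapping, not what Nuke’s retime nodes literally compute internally; the function name and defaults are my own.

```python
def retime_frame(out_frame, speed=0.25, first_frame=1):
    """Map an output frame to a source frame for a constant-speed retime.

    speed < 1 slows the clip down (0.25 plays at quarter speed),
    which is what a time lapse that plays way too fast needs.
    """
    return first_frame + (out_frame - first_frame) * speed

# At quarter speed, frame 101 of the comp samples frame 26 of the source
src = retime_frame(101, speed=0.25)
```

In practice the resulting source frame is usually fractional, so the retime tool also blends or picks neighbouring frames; this sketch only shows the time mapping itself.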

I did not like the look of the grass in the foreground, as it kept some golden colour from the original lighting, so I decided to add some coloured fog in front to disguise it. I found a fog video online and added it to the comp. I also colour corrected it and made it purple so it matched the colour palette I wanted to achieve (black, blue/green, and purple).

Following on, I added my 3D air balloon model to the comp. I added four air balloons with different scales, positions, and movements, and colour corrected them, adding some purple and blue highlights and making them a bit darker. To make the comp more interesting, I also added magical, colourful trails to two of the air balloons, again in purple and blue tones.

Then I wanted to add the text ‘Balloon Festival’, as if this were the promotional video for an actual festival. I created a neon effect by adding a ‘Glow’ node so the middle of the type is white and the borders have a blue glow. I also used the ‘Neon 80’ font to make it look more realistic. Then I added a roto mask to the text to create the transition of the air balloon passing and the text appearing behind it.

Moreover, I added a frame of blue and purple animated neon lights with a foggy texture that I found online. As I did with the fog and the colour trails, I merged them onto the main plate using the ‘screen’ operation in the ‘Merge’ node so the black background is not visible and only the neon lights show.
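The ‘screen’ merge operation is what makes a black background drop out: it is the per-channel formula 1 − (1 − A) × (1 − B), so a value of 0 in the foreground leaves the background untouched. A minimal sketch of it on single channel values:

```python
def screen(a, b):
    """'Screen' merge of foreground a over background b: 1 - (1-A)(1-B).

    Black (0) in the foreground leaves the background unchanged, so a
    clip on a black background composites over the plate showing only
    its bright parts (neon lights, fog, colour trails).
    """
    return 1.0 - (1.0 - a) * (1.0 - b)

# A pure black foreground pixel leaves the background as it was
assert screen(0.0, 0.6) == 0.6
# A bright neon pixel lifts the result towards white: 1 - 0.2*0.4 = 0.92
assert abs(screen(0.8, 0.6) - 0.92) < 1e-9
```

Because the result can only stay the same or get brighter, screen never darkens the plate, which is why it suits glows and light effects.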

Since Nuke is not very good at working with sound, I exported the final sequence with the ‘Write’ node and imported it into After Effects to add the sound. I could also have done this with Premiere Pro, but I was having some problems with my version of the programme, so I used After Effects as a quicker solution. I found an 80s-style royalty-free track ironically called ‘Stranger Things’ (Music Unlimited, 2022), so I imported it into After Effects and just added a fade-out at the end.

Final result

The final result has a fun, eye-catching look, and the 80s music sets an ambiance suitable for the style. It has been a long, hard process for me, as I struggled a bit with the order of the nodes and when to add certain nodes like ‘Premult’, ‘Shuffle’, and ‘Copy’, and when to link nodes using the roto mask input rather than a regular connection. In the end, with practice everything started to make sense, and now I can say that I feel comfortable with Nuke’s compositing process and structure.

References

Apisit Suwannaka. Drifting Smoke Motion Design on Black Background Free Video [online]. Available at https://www.vecteezy.com/video/2973097-drifting-smoke-motion-design-on-black-background [Accessed 19 November 2022]

Distill, 2016. Time Lapse Video Of Aurora Borealis [online]. Available at https://www.pexels.com/video/time-lapse-video-of-aurora-borealis-852435/ [Accessed 19 November 2022]

John Studio. Beautiful colorful particles or smoke abstract background Free Video [online]. Available at https://www.vecteezy.com/video/3052087-beautiful-colorful-particles-or-smoke-abstract-background [Accessed 19 November 2022]

Mim Boon. Neon frame background animation Free Video [online]. Available at https://www.vecteezy.com/video/12276978-neon-frame-background-animation [Accessed 19 November 2022]

Music Unlimited, 2022. Stranger Things [online]. Available at https://pixabay.com/music/synthwave-stranger-things-124008/ [Accessed 27 November 2022]

Categories
Nuke VFX Fundamentals

Week 9: Blur, Defocus, and 2D Clean-up in Nuke

In this session, we learnt how to use ‘Blur’ and ‘Defocus’ in a scene and how to do a 2D clean-up using ‘Roto Paint’, ‘Difference’, ‘Regrain’, and ‘Grain’ tools.

In order to add realism to a scene, it is a good technique to add some ‘Blur’ or ‘Defocus’ to it. However, depending on the desired effect, we use one or the other. We use ‘Defocus’ to emulate what happens with a real lens when it is out of focus; since this is a more realistic and natural effect than ‘Blur’, it is more commonly used for a cinematic, clearly visible effect. On the other hand, ‘Blur’ is used when we need to soften a colour or something minimal that will be barely visible (more for corrective purposes than for a visible effect).

‘ZDefocus’ and ‘ZBlur’ are used to defocus or blur specific areas of the plate and can also take into consideration the depth of the shot when the alpha is converted to depth. With these nodes we can also defocus or blur following a shape such as a disc, a bladed iris, or a roto we made. They can be used together with the ‘Convolve’ node in order to defocus or blur with a roto shape in different forms.
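The depth-driven part of a z-defocus can be pictured as a blur radius that grows with distance from the focal plane. This is a simplified sketch under my own assumptions (linear falloff, a clamp at a maximum radius); a real ZDefocus derives the radius from the lens model and then convolves with the chosen filter shape.

```python
def defocus_radius(depth, focal_plane, max_radius=20.0, depth_range=10.0):
    """Blur radius for a pixel, driven by its depth value.

    Pixels on the focal plane stay sharp; the radius grows linearly
    with distance from it and is clamped at max_radius. (A hypothetical
    falloff, standing in for the real lens-based calculation.)
    """
    distance = abs(depth - focal_plane)
    return min(max_radius, max_radius * distance / depth_range)

# A pixel on the focal plane gets no blur; a far-away one gets the cap
in_focus = defocus_radius(5.0, focal_plane=5.0)
far_away = defocus_radius(30.0, focal_plane=5.0)
```

The same radius map is what the disc or bladed filter shape is then applied with, which is how the bokeh look is produced.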

Nuke is also used for cleaning up a scene. This can be done using the ‘RotoPaint’ node, with which we can paint, clone, blur, dodge, and burn specific areas of the shot. After this, we can add a ‘Difference’ node to extract the alpha of the ‘RotoPaint’ area, followed by ‘Copy’ and ‘Premult’ nodes. We can also add a ‘FrameHold’ node to freeze the reference frame on which we are going to do the roto painting.

Once we have added the patch or correction to our shot, it is good practice to add a grain effect to match the grainy texture of the video so the patch blends in. We can use the ‘Grain’ node, which is applied through the alpha so it affects only the alpha area rather than the whole plate, or the ‘ReGrain’ node, which affects the whole plate and would double the grain (so it cannot be applied multiple times).

This week’s task was to practise what we learnt by doing a clean-up of the school shot provided: removing some papers that are on the wall, adding some roto paint on the side of the lockers, adding something to the background door (in my case, some animated text), adding something in perspective on the left-side doors (I added a video of what looks like a circular magic portal), and adding something interesting on the floor (another magic portal).

Original Plate

To start, I wanted to remove some papers from the pin board on the right. To do so, I added a ‘RotoPaint’ node and used the ‘clone’ tool to paint over the papers using the texture of the board. Then, with a regular ‘Roto’ node, I created the alpha of the painted area, followed by a ‘FilterErode’ to soften the edges and a ‘Premult’ to premultiply it. All of this was done with a ‘FrameHold’ node so it was easier to build up the roto. Then I tracked the area with four tracker points and created a ‘Transform (Match Move)’ node to match the movement of the scene. Finally, I added the ‘Grain’ node to match the grain of the painted patch with the scene’s grain and merged it with the main comp.

Secondly, I added an animated ‘RotoPaint’ to the side of the lockers. I reused the existing tracker node that had been used to remove the poster in the same place where I wanted to add the new paint. I created a ‘Transform (Match Move)’ node and attached it to the ‘RotoPaint’ node with the animation. To animate the painting, I played with the colour and opacity, adding keyframes on these attributes. Then I linked this to the main comp.

Thirdly, I added some text to the back doors, tracking the area first and then adding one ‘CornerPin2D’ baked to fix the frame and another to match-move the scene’s movement. I also animated the text by keyframing its colour, and then merged it into the main comp.

For both magic portals, I used the same technique as last week, with the ‘PlanarTracker’ node and a ‘CornerPin2D (relative)’ to pin the image to the selected area. I reformatted both clips and corrected their saturation and grades. Then I merged them onto the main comp using the ‘screen’ operation so the black background disappears and the colours gain a transparency effect.

‘Merge (screen)’ node
Categories
Maya VFX Fundamentals

Week 9: Speech Lip Sync in Maya

In this class, we learnt to synchronise the mouth and facial expressions of our face model with a short speech that we selected.

First, to capture a video from the internet (YouTube in this case) to use as a reference for building up our speech lip sync, we used ‘OBS Studio’. With this programme we can record the screen and sound of our computer’s desktop and then trim the clip and export it in the desired format with Adobe After Effects or Adobe Premiere Pro. The sequence I chose shows Charles Xavier talking to Magneto in X-Men: Apocalypse (Movie Scenes, 2021).

Magneto’s Final Talk With Charles Xavier | X-Men Apocalypse (2016), (Movie Scenes, 2021)

Then we imported it into Maya as an ‘Image Plane’. It is important to set the timeline to the same fps (frames per second) as our clip; otherwise, it will not be in sync with the audio. In my case, I exported my clip at 30 fps, so I set Maya to the same value. To import the audio, we right-click on the timeline and import it from there. If we want to preview the clip with the audio to double-check that they are in sync, we can use the ‘Playblast’ feature, which shows a low-resolution preview.
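It is easy to underestimate how badly a mismatched frame rate throws off a lip sync. A quick back-of-the-envelope calculation (my own sketch, not a Maya feature) shows the drift:

```python
def audio_video_drift(n_frames, clip_fps, timeline_fps):
    """Seconds of audio/video drift at the end of a clip when the
    timeline fps does not match the fps the clip was exported at."""
    return n_frames / timeline_fps - n_frames / clip_fps

# A 300-frame clip exported at 30 fps but played on a 24 fps timeline
# ends up 2.5 seconds out of sync by its last frame
drift = audio_video_drift(300, clip_fps=30, timeline_fps=24)
```

Even a 10-second clip drifts by seconds, which is why matching the timeline fps to the exported clip is the very first thing to check.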

Once we had our reference clip set, we started to animate the mouth, creating new targets in the previously created blend shape to fit the mouth shape to each sound of the speech. Then, using keyframing, we set the exact movements we wanted and smoothed them by editing and adjusting the keyframes in the ‘Graph Editor’. As a reference for building the mouth shape for each sound, I used an online image from the Preston Blair phoneme series (Martin, 2018).
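Keyframing a blend-shape target weight amounts to interpolating between the values set at each key. A minimal sketch with linear interpolation (real Maya curves default to smoother spline tangents, which is exactly what the Graph Editor is for adjusting); the key values here are hypothetical:

```python
def blend_weight(frame, keys):
    """Interpolate a blend-shape target weight between keyframes.

    `keys` is a sorted list of (frame, weight) pairs, like keys set on
    a target in the Shape Editor. Linear interpolation only; Maya's
    Graph Editor lets you reshape the curve between keys.
    """
    if frame <= keys[0][0]:
        return keys[0][1]
    if frame >= keys[-1][0]:
        return keys[-1][1]
    for (f0, w0), (f1, w1) in zip(keys, keys[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / (f1 - f0)
            return w0 + t * (w1 - w0)

# Open a phoneme target over 4 frames, hold it, then close it
keys = [(10, 0.0), (14, 1.0), (20, 1.0), (24, 0.0)]
w = blend_weight(12, keys)  # halfway through the opening
```

Blending two or three phoneme targets at once, as described above, is just evaluating several of these weight curves on the same frame and letting the blend shape sum their deformations.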

I did not create all the phonemes, as not all of them were used, and I actually blended two or three phonemes at the same time to create new ones. I also animated the tongue, the eyebrows, the jaw, the head rotation, and the neck (Adam’s apple). Below is a sequence of screen captures of all the movements together and the targets that were used.

I really enjoyed modelling and animating the facial expressions. I definitely need to improve and refine the animation, as some of the phonemes are not as polished as I would have liked, but for my first time animating a face and lip-syncing a speech, I think it looks really good. I feel like this is an area I would like to explore further.

References

Martin, G. C., 2018. Preston Blair phoneme series. [Online] Available at: http://www.garycmartin.com/mouth_shapes.html [Accessed 12 December 2022].

Movie Scenes, 2021. Magneto’s Final Talk With Charles Xavier | X-Men Apocalypse (2016). [Online] Available at: https://www.youtube.com/watch?v=1gZqgfiWDh4 [Accessed 28 November 2022].

Categories
Nuke VFX Fundamentals

Week 8: Planar Tracking in Nuke

In this lesson, we checked further nodes in Nuke and we learnt how to use a planar track to add a flat image to a sequence.

We reviewed nodes such as ‘Reformat’ (to change a sequence’s format to match the main plate), ‘Crop’ (to crop an image or video as required), ‘Merge’ (we saw how to use it to fit the size of a sequence’s bounding box to the alpha layer or the background layer), and ‘Shuffle’ (to add or remove channels: R, G, B, alpha, and depth).

We also learnt how important concatenation is in a Nuke comp. Concatenation is the way Nuke chains consecutive transform operations into a single calculation, so the image is filtered only once. Nuke’s calculations need to follow this logic, and if the chain is broken, the result degrades. Following on from this, we analysed several ways to organise the nodes in Nuke so they keep a proper order and we achieve the desired result without errors.
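The idea behind concatenation can be shown with homogeneous 2D transform matrices: two chained transforms multiply into one matrix, so the image only needs to be resampled once. A small self-contained sketch (the matrices are illustrative, not Nuke internals):

```python
def matmul(a, b):
    """Multiply two 3x3 matrices (homogeneous 2D transforms)."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def apply(m, x, y):
    """Apply a homogeneous 2D transform matrix to a point."""
    return (m[0][0] * x + m[0][1] * y + m[0][2],
            m[1][0] * x + m[1][1] * y + m[1][2])

translate = [[1, 0, 10], [0, 1, 5], [0, 0, 1]]   # move by (10, 5)
scale     = [[2, 0, 0],  [0, 2, 0], [0, 0, 1]]   # scale by 2

# Two chained Transform nodes concatenate into one matrix, so the image
# is filtered once instead of being softened at every step
combined = matmul(translate, scale)
point = apply(combined, 3, 4)
```

Putting a non-transform node (for example a colour node) between two transforms breaks this chain, forcing two separate resamples and a softer image, which is one reason node order matters.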

Finally, we also studied how to use the ‘PlanarTracker’ to add a 2D image to a 3D space and make it follow the movement of the sequence. First, we add the ‘PlanarTracker’ node, select the area we want with tracking points, and track as we would with a regular ‘Tracker’ node. Then we turn on and align the grid to the tracking points to create the desired perspective, and finally we create a ‘CornerPin2D (absolute)’ to generate the tracker node that we link to the image we want to add. We can track translation, scale, and rotation together or separately. When there is an object in front of the area we want to track, we can mark it with another bezier in the same ‘PlanarTracker’ node so Nuke treats that object as an exclusion zone (it is not taken into consideration when tracking the main area).

As a homework, this week we were asked to add an image to the following sequence using what we learnt today in class.

First poster planar tracker showing bezier and grid lines adjustment

I added both posters using a ‘PlanarTracker’ node to track the plane where I wanted each poster to go. For the left poster, I just tracked it, adjusted the grid lines to the perspective plane I wanted, and then created a ‘CornerPin2D (relative)’ to link to the poster. This node lets the added poster or image follow the movement of the tracked shot.
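The corner pin idea is mapping the poster’s flat rectangle onto the four tracked corners of the wall plane. As a simplified stand-in, here is a bilinear version of that mapping; Nuke’s CornerPin2D actually uses a true projective (homography) transform, and the corner coordinates below are made up for illustration.

```python
def corner_pin(u, v, corners):
    """Map a point (u, v) in the unit square onto a quad, bilinearly.

    `corners` are (x, y) pairs in order: bottom-left, bottom-right,
    top-right, top-left. A simplified stand-in for CornerPin2D, which
    uses a projective transform rather than bilinear interpolation.
    """
    bl, br, tr, tl = corners
    bottom = (bl[0] + u * (br[0] - bl[0]), bl[1] + u * (br[1] - bl[1]))
    top    = (tl[0] + u * (tr[0] - tl[0]), tl[1] + u * (tr[1] - tl[1]))
    return (bottom[0] + v * (top[0] - bottom[0]),
            bottom[1] + v * (top[1] - bottom[1]))

# Hypothetical tracked wall corners; the poster's centre lands at the
# average of the four pinned corners
quad = [(100, 200), (300, 210), (290, 400), (110, 390)]
centre = corner_pin(0.5, 0.5, quad)
```

Animating the four corner positions per frame, as the planar track does, is what makes the poster stick to the wall through the shot.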

For the second poster, it was necessary to add a second bezier tracking the pole that passes in front of the poster, so the programme understands that the area of the second bezier must not be taken into consideration when tracking the first bezier’s area (it is excluded). The roto of the pole had already been added to the comp by the professor, so I just had to ‘Merge’ the second poster’s ‘CornerPin2D’ into the main comp. I also adjusted the ‘Grade’ and ‘Saturation’ of the posters, skewed them slightly with a ‘Transform’ node to fit the perspective exactly, and added some ‘Blur’ to soften the posters’ sharp edges and blend them into the comp.

My Nuke comp with both posters added
Posters in the street, added using the ‘Planar tracking’ technique

This practice seemed pretty easy to me compared with other assignments, as ‘Planar tracking’ is a straightforward tool. However, at the beginning I had a problem with the middle poster, which has the pole obstructing part of the view in front of it. The ‘PlanarTracker’ was not reading the area properly: the tracking points were jumping from the selected area to a completely different one and were not keeping the perspective I wanted. I solved this by making the tracking area bigger so the programme had more information to build the track across the frames. I also colour corrected the posters to blend them with the scene and make it more realistic. Overall, I am very happy with the result.

Categories
Maya VFX Fundamentals

Week 8: Facial Expressions with Blend Shapes in Maya

In this week’s lecture, we learnt how to edit blend shapes to create facial expressions and how to make those actions interact with each other so we achieve a natural movement when animating.

By creating blend shapes and targets inside them, we can animate our face’s expressions. One by one, I created expressions such as full smile, half smile, frown, half frown, blinking eyes, and open mouth. I built up the expressions using the brush and relax tools. I also added a correction for when the mouth is opened, to relax its sides and make it look more realistic. Since this correction is only needed when the mouth opens, I set the mouth correction target as a driven key with the jaw joint as the driver, and keyframed the animation.
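A set-driven key is essentially a remap: the driver attribute (the jaw rotation) is mapped onto the driven attribute (the mouth-corner correction weight). A minimal sketch of that linear mapping, with hypothetical angle values; Maya also lets the driver/driven curve be shaped non-linearly in the Graph Editor.

```python
def driven_key(driver, driver_min, driver_max, value_min=0.0, value_max=1.0):
    """Linearly remap a driver attribute onto a driven attribute, the
    way a set-driven key links the jaw joint to the mouth correction.

    The output is clamped, so the correction stays off below
    driver_min and fully on above driver_max.
    """
    t = (driver - driver_min) / (driver_max - driver_min)
    t = max(0.0, min(1.0, t))
    return value_min + t * (value_max - value_min)

# Jaw at rest (0 deg) keeps the correction off; a hypothetical full
# open of 30 deg drives it to 1.0
correction = driven_key(15.0, 0.0, 30.0)  # halfway open
```

This is why the correction only ever appears when the mouth opens: the driven value follows the jaw automatically instead of needing its own keyframes.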

Mouth opening with final frown animation experimentation

Initially, I struggled a bit creating the facial expressions, as I sometimes forgot to switch the ‘Edit’ button of the target I was manipulating at the time, which caused errors when animating that affected previous targets. I also seemed to have a duplicated head node in the background and some other duplicated ‘set’ nodes that kept giving me error messages. After I asked Nick for help, he found these duplicated nodes in the ‘Node Editor’, and after deleting them, my face model started to work better. I had to delete all the targets and the blend shape I had and start from scratch. I did this numerous times before finding the real error, and it was really time-consuming, but at the end of the day it was really good practice and I learnt how to solve the error myself for future projects.

Categories
Nuke VFX Fundamentals

Week 7: Match moving – point tracking in Nuke

In this lesson, we learnt how to stabilise a shot using 2D tracks.

With a ‘2D Track’ node, we can track the camera movement of a scene frame by frame to match it with another element. Then, with a ‘Transform’ node, we can change the translation, rotation, and scale of the frame to stabilise it. We can also create several transform nodes from the main tracker to automatically stabilise the scene, to match-move it, and to remove or add jitter.
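For the translation part, stabilisation is just cancelling the tracked drift: each frame is shifted back by however far the tracked feature has moved from its position on a reference frame. A hedged sketch with made-up track data (a real Tracker also handles rotation and scale from multiple points):

```python
def stabilise(track, reference_frame=0):
    """Per-frame translation offsets that cancel tracked camera drift.

    `track` is a list of (x, y) positions of one tracked feature, one
    entry per frame. Applying each offset locks the feature to where
    it sat on the reference frame, which is the translation part of
    what a Transform (stabilise) generated from a Tracker does.
    """
    ref_x, ref_y = track[reference_frame]
    return [(ref_x - x, ref_y - y) for x, y in track]

# Hypothetical tracked positions over three frames
track = [(100, 50), (103, 48), (98, 55)]
offsets = stabilise(track)
# Frame 0 needs no correction; frame 1 shifts back by (-3, +2)
```

The match-move version is the inverse: instead of cancelling the motion, the same per-frame data is applied to a new element so it inherits the plate’s movement.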

Sometimes the scene has too much noise or grain and the 2D tracker is not able to track it properly. In this case, we can use a ‘Denoise’ node to reduce the image noise or grain so the tracker does not struggle to read the pixel sequence between frames. We can also use ‘Laplacian’, ‘Median’, or a contrast grade to correct for the grain.

As usual, it is important to set a Quality Control (QC) backdrop so we can check that the tracking or any rotoscoping added is properly done.

This week’s assignment is to stabilise the iPhone shot and add the phone’s screen animation with ‘Rotoscoping’ and ‘CornerPin’ nodes.

Full comp
Final result

iPhone comp improved: I tried to improve the fingers’ roto using the green despill setup the professor sent us, and also improved the screen animation using the ‘Curve Editor’ to soften the start and end points of the movement.

Improved comp with green despill
Improved comp

I struggled a bit with the finger rotoscoping: when the fingers move faster, it is hard to roto the motion blur. The green despill setup we got from the professor helped a bit, but I still do not fully understand how it works, so I am sure I could improve this comp once I learn how the green despill technique works.

Categories
Maya VFX Fundamentals

Week 7: Facial Animation Set Up, Hierarchies and Rigging

In this session, we revisited how to create blend shapes to animate facial expressions and how to create a ‘rig’ or ‘skeleton’ to animate the head and the mouth of our model.

Using the ‘Shape Editor’ tool, we can create a blend shape, or shape variation, in order to set our model’s facial expressions. In each blend shape, we need to add ‘targets’ with which we create our movements or reshapes, for example the eyes opening and closing, a smile, or a frown of the eyebrows.

In order to animate the head and the opening of the mouth, we created a ‘rig’ or ‘skeleton’ that determines the joints of the neck and jaw. After setting up the rig, we bound the model’s skin to it and painted the skin weights to define the influence areas of our model (the parts that will be most influenced by the rig’s movement).

Lastly, we created the model’s set of teeth and added them to the rig influence.

Rigging
Teeth wireframe

I struggled a bit with painting the weights to open the model’s mouth. I had to adjust my mesh with the ‘soft brush’ and ‘relax’ tools before it started to respond appropriately.

Categories
Nuke VFX Fundamentals

Week 6: Merging and Colour Matching in Nuke

In this lecture, we learnt how to colour correct a sequence, the different colour spaces of a file, and how to import and export it.

We saw how to use the ‘Grade’, ‘ColourCorrect’, ‘Toe’, and ‘Blackmatch’ nodes to correct the colour of a sequence. These nodes can be used to correct specific parts of a sequence using rotos, or to colour grade an alpha. Alphas need to be premultiplied to be added to the background plate; however, some alphas already come premultiplied, so in that case we add an ‘Unpremult’ node, then the ‘Grade’ and/or ‘ColourCorrect’ nodes, and then a ‘Premult’ node again.
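The unpremult → grade → premult order matters because a premultiplied pixel’s RGB has already been scaled by its alpha, so corrections that are not a pure multiply (offsets, gamma) would hit the wrong values. A minimal sketch of the round trip on one pixel, using a simple offset grade as the example operation:

```python
def unpremult(rgb, alpha):
    """Divide RGB by alpha to recover the un-premultiplied colour."""
    return tuple(c / alpha for c in rgb) if alpha > 0 else rgb

def premult(rgb, alpha):
    """Multiply RGB by alpha, ready for a standard 'over' merge."""
    return tuple(c * alpha for c in rgb)

def grade_offset(rgb, offset):
    """A minimal stand-in for a Grade: add an offset to each channel
    (exactly the kind of operation where premultiplication order
    matters, since a plain multiply would commute anyway)."""
    return tuple(c + offset for c in rgb)

# A semi-transparent premultiplied pixel: unpremult, grade, premult
pixel, alpha = (0.25, 0.1, 0.05), 0.5
graded = premult(grade_offset(unpremult(pixel, alpha), 0.1), alpha)
```

Grading the premultiplied values directly would add the full offset even where alpha fades out, spilling colour into transparent areas; the round trip keeps the correction weighted by the alpha.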

It is also important to take into consideration the codec, colour space, and linearization of the imported file: depending on what we are going to use the file for, we will need either more information preserved in the file or a smaller file size. In a film production, files can be shared with compositors as LUTs, CDLs, or graded footage. We also discovered the ‘OCIOColorSpace’ node, which is used when the footage provided has already been graded.

And lastly, we saw proper ways to build up a node map for grading and colour correcting footage, separating the primary and secondary colour corrections and then correcting the shadows as the last step. This way, if more amendments are requested, we can make the changes quicker.

This week’s assignments were to colour correct an airplane alpha to match its background and to carry on making some colour corrections in the previous mountains video using the roto created last week.

We were also asked to plan our air balloon sequence, which we will keep building up until the end of term 1. My main idea for my air balloon video is a dark style, with neons and glowing lights, plus mist and lightning around the mountains.

Categories
Maya VFX Fundamentals

Week 6: Facial Detailing, Texturing and Animation Setup in Maya, and Texture Correction in Mudbox

This week, we learnt how to add a UV texture to an organic model, a human face in this case, using both Maya and Mudbox.

In Maya, we imported the skin texture into the project. Then we created a UV map from the model in the ‘UV Editor’ and, using the ‘grab’ tool, started to adjust the UV map to the imported texture. Since the texture was designed for models with open eyes (ours had closed eyes), we exported it to Mudbox and, using the stamp tool (similar to Photoshop’s stamp tool), edited the texture to match our model’s closed eyes. Once finished, we imported the edited texture back into Maya and re-adjusted it. Since the texture looked completely flat, we added a bump map using the ‘Hypershade’ to give the skin its pores, marks, and facial lines. Finally, we also added a UV texture to the eyes and, using the animation editor, opened the model’s eyelids so the eye texture could be seen.

I had some issues with the UV map, as my model’s mesh needed to be adjusted in the middle part of the nose (I had some triangulated mesh there, so I needed to make it quad-based and follow the rows and columns of squares to make it more symmetric). Once adjusted, the UV map started to respond better and I could fit the skin texture more accurately.