In this first lecture with Dom, we learnt how to use 3DEqualizer to track a scene's camera movement and lens distortion, and then export the result to recreate the shot in 3D space. We also organised our group's assets and tasks and started thinking about the components our scene needed.
3DEqualizer Lecture
A camera track recreates the camera movement and lens distortion of a shot, so that CG elements added to it follow the same movement as the original footage. There are different types of camera track, such as:
- Facial tracking. To track facial movements and expressions.
- Object tracking. To track the movement of objects in the scene.
- Rotomation. To match the movement of objects and actors, and then add CG to the live action.
It is also important to track the lens distortion of a scene so that any CG elements added to it look like part of the shot, as if they had been filmed with the same camera lens.
In order to track our scene, we will use 3DEqualizer, the industry-standard camera tracking software. The 3D tracking process follows these steps:
- Set up the camera and lens.
- Track the scene.
- Solve the lens distortion.
- Set the 3D orientation.
- Check the scene.
- Export.
This programme feels more accurate than Nuke for camera tracking: it produces fewer errors, and we can also add the exact points we want to track in the scene. First, we import our footage into the programme, and then adjust the lighting, gamma, and contrast to increase tracking accuracy depending on the point we want to track. Normally, when there is more contrast and a clear pattern to track, the programme has fewer problems tracking. After we have tracked our points, we click 'Calculate All From Scratch' (or press 'Alt+C') so the programme can calculate the tracking points in 3D space.
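The gamma and contrast tweaks described above can be sketched in a few lines of Python. This is my own illustration of the idea, not 3DEqualizer's actual controls: the function name and default values are assumptions, but the principle is the same, since spreading out a flat patch's pixel values gives the pattern more contrast for the tracker to lock onto.

```python
import numpy as np

def preprocess_for_tracking(image, gamma=0.8, contrast=1.4):
    """Adjust gamma and boost contrast on a float image in [0, 1],
    mimicking the viewer tweaks used to make a pattern easier to track.
    (Hypothetical helper; parameter values are for illustration only.)"""
    img = np.clip(image, 0.0, 1.0)
    img = img ** gamma                    # gamma < 1 lifts the midtones
    img = (img - 0.5) * contrast + 0.5    # expand values around mid-grey
    return np.clip(img, 0.0, 1.0)

# A flat, low-contrast patch gains separation after the adjustment.
patch = np.array([[0.45, 0.55], [0.50, 0.60]])
adjusted = preprocess_for_tracking(patch)
```

After the adjustment, the spread between the darkest and brightest pixels in the patch is wider than in the original, which is exactly what makes a marker easier to track.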
Once we have our tracking points, we start the clean-up process. This means getting rid of trackers that do not work well, smoothing the tracking curves in the 'Deviation Browser', and double-checking that there are no tracking points outside the shot.
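The clean-up criterion can be illustrated with a short sketch: the deviation shown per tracker is essentially the RMS pixel distance between where the point was tracked in 2D and where the solved 3D point reprojects, and trackers whose deviation stays too high get discarded. The function names, the 1-pixel threshold, and the sample values below are all my own illustration, not values from the lecture:

```python
import math

def rms_deviation(tracked, reprojected):
    """RMS pixel distance between 2D tracked positions and the
    reprojections of the solved 3D point across all frames."""
    errors = [math.dist(t, r) for t, r in zip(tracked, reprojected)]
    return math.sqrt(sum(e * e for e in errors) / len(errors))

def cull_trackers(deviations, threshold=1.0):
    """Keep only trackers whose RMS deviation is below `threshold` pixels.
    (Hypothetical threshold, for illustration.)"""
    return {name: d for name, d in deviations.items() if d < threshold}

# Hypothetical per-tracker deviations, in pixels:
deviations = {"wall_corner": 0.3, "lamp_edge": 0.6, "reflection": 2.4}
kept = cull_trackers(deviations)  # the sliding reflection is dropped
```

A point tracked on a reflection or a moving object never agrees with the rigid 3D solve, so its deviation curve stays high and it is the first candidate to remove.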
Then we can proceed to set the lens distortion by entering the camera and lens details (which can be found online at https://vfxcamdb.com), so the programme can recalculate the trackers in 3D space using the distortion parameters.
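To see why those lens details matter, here is a minimal sketch of radial (Brown-Conrady-style) distortion applied to a normalised image point. Real lens models in tracking software use more parameters, and the coefficients below are made up for illustration, but the core idea holds: displacement grows with distance from the optical centre, which is why undistorted CG dropped straight onto distorted footage drifts towards the frame edges.

```python
def distort(x, y, k1=-0.05, k2=0.01):
    """Apply simple radial distortion to a normalised image point.
    (k1, k2 are illustrative coefficients, not real lens data.)"""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# A point near the centre barely moves; one near the edge moves noticeably.
centre = distort(0.05, 0.05)
edge = distort(0.9, 0.9)
```

With a negative k1 (barrel distortion), the edge point is pulled visibly towards the centre while the central point is almost untouched, matching the bowed lines typical of wider lenses.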
Following on, we can switch to the 3D view (F6) and convert the tracking points into 3D geometry. This can then be exported to Maya, Nuke, and other 3D packages, ready to be used as a reference for our 3D scene.
I exported mine to Maya and added simple shapes to recreate the scene in 3D space. This will then be used as a reference for the 3D artists to set their models' positions and perspective in the scene.
Steampunk project – Organising tasks and assets list
This week we decided on the assets we needed for our scene and how many assets each look-dev artist would take care of. Since I love sci-fi and everything related to the cosmos and spaceships, I decided to model the spaceship that will be seen through the window of the scene. Also, taking advantage of the fact that in last term's collaborative unit I got to test some hologram effects in Maya using MASH, I decided to try modelling the planet hologram too. Lastly, I will also be designing a radio, which I find really interesting as I love classic 1800s radio models, perfect for the steampunk look we are pursuing.
I then put together a moodboard for each model to establish its style: