Hallway Digital.png

This page documents my compositing project based on gaming in Augmented Reality; the client is 'Hallway Digital', a VFX YouTube channel.

"YouTube VFX team Hallway Digital have asked for a sample VFX short film as part of their recruitment process for a junior role to join the team on a permanent basis. The specification asks for entrants to merge videogames and real-life for a short film of no longer than 3 minutes. The film must use 3D compositing techniques and 2D motion graphics. The short can be inspired by, but must not directly infringe on, existing intellectual properties. The scenario involves a protagonist testing a new augmented reality headset technology that appears believable, but only momentarily. Glitches begin to occur to the environment around them. The player sees an objective marker in the distance to end the test session, but an opponent stands in your way. The tone of this piece should be fun and humorous."

This brief requires me to create a composite from raw footage shot mostly in first-person view (through a VR/AR headset). The tone of the film needs to be relatively light-hearted and humorous, so I wanted to choose a fun theme as opposed to the horror/sci-fi genre I'd normally gravitate towards. I will need to create several assets similar to Nintendo characters/objects without completely copying them; most of the changes will be around colour or logo design, as these can be easily manipulated and distorted without changing the visual style too much. The glitches will be difficult to achieve, so I will need to research different effects that I can apply to animations or through colour correction/manipulation of footage in After Effects or possibly Nuke. I will be building my assets in Blender or 3ds Max; I have researched some 3D model downloads but plan to create my own for the actual composite. I will also need to source a VR headset, as mine has a lot of wires and doesn't scream 'new technology', and plan where to shoot and the kind of on-set lighting I need.

VFX Development Blog

Theme Board

Mood Board AR Game.png

Project Development

I first looked at what already exists in the world of Augmented Reality gaming. I created a mood board from these ideas and started to think of assets I could source or create from scratch...

I looked online for inspiration on digital-looking/gaming visual effects that could be achieved in Adobe After Effects and Nuke; many of these included 3D tracking techniques and masking, which would be interesting to experiment with. I also wanted to incorporate 3D models/2D assets, some of which I'll attempt to make from scratch in Blender or 3ds Max.

I was able to track a piece of test footage and add in an Augmented Reality menu. I also downloaded a 'Mario coin' from a free 3D asset site, which I edited in Blender (scale and shape adjustment) and animated. I exported the PNGs and then loaded them into AE as a full sequence.

Mario.png

From these tests I was able to evaluate how realistic my camera track and enhancing effects looked. The track worked well with the position data I was able to extract, but when filming the actual sequence I need to be more aware of unintentional camera movement/shake and lighting choices.

I had added a drop shadow filter and a glow effect to the text, which I want to develop further, as well as changing the font/layout.

The animation of the coin was fairly similar to the Nintendo/Mario style (which was intentional), but the spin wasn't totally consistent in speed, which I'd want to fix. This could be done by copying and pasting keyframe sets rather than individually keyframing every spin. I also want to render in a more powerful engine than Eevee (Cycles will be much more detailed). I am going to research better lighting set-ups, as I couldn't seem to get enough light onto the subject, which was a shame as it was a gold, shiny coin. I also tried to include a gold-coloured lamp under the coin for a glow, but this didn't seem to affect it much.
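The keyframe-set idea above can be sketched in code. This is a minimal, tool-agnostic Python illustration (not Blender's actual API): one full rotation is defined once and repeated, so every spin covers the same number of frames and the speed stays constant. The frame counts and function name are my own for illustration.

```python
# Sketch: generating evenly spaced rotation keyframes for a looping coin spin.
# Instead of hand-placing each keyframe (which caused my uneven spin speed),
# one full-rotation "set" is repeated, so every spin takes exactly the same
# number of frames. Each entry is (frame, z_rotation_degrees).

def spin_keyframes(start_frame, frames_per_spin, num_spins):
    """Return keyframes for num_spins full 360-degree rotations at constant speed."""
    keys = []
    for spin in range(num_spins):
        base = start_frame + spin * frames_per_spin
        # One keyframe at the start of each rotation; linear interpolation
        # between consecutive keys keeps the rotation speed constant.
        keys.append((base, 360.0 * spin))
    # Final keyframe closing the last rotation.
    keys.append((start_frame + num_spins * frames_per_spin,
                 360.0 * num_spins))
    return keys

keys = spin_keyframes(start_frame=1, frames_per_spin=24, num_spins=3)
# Every spin covers exactly 24 frames and 360 degrees.
```

In Blender the equivalent would be duplicating a keyframe block in the Dope Sheet with a constant frame offset, which guarantees the timing rather than eyeballing it.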

Footage Tracking Tests

nuketracknew1.png
nuketrack2.png
nuketracknew2.png
nuketrack4.png
booking form.png

I was able to film a couple of test clips to track in 3D through Nuke. I used several similar tutorials to help me set up the nodes correctly, but due to differences in hotkeys and software versions my results weren't as exact as theirs. I learned how to perform a 3D track of footage, solve that track and refine the results to exclude bad tracking points. I experimented with the number of markers, as some tutorials suggested 500 and others 800-1000 (more markers slow down the tracking process but allow the user to extract more potentially useful data). I found that both tracks had an on-screen error over the footage node after the initial track, but even after searching online I couldn't find an explanation for it. I decided to look through the camera track settings and play around with the number of tracking dots as well as the settings for the number of frames tracked. I searched online for any other advice but couldn't get a specific enough answer, so I continued through the basic tutorial for 3D tracking to see if it made much difference to the outcome.

booking form2.png
location risk1.png
location risk2.png

As part of the pre-production process, I filled out equipment booking forms as well as a risk assessment sheet for the location shooting; as the shoot would be in a relatively quiet housing estate, I didn't need to include too many potential risks. The main issues would be moving vehicles in and around the car park, members of the public walking through set, weather conditions (a cold evening) and general trip hazards. I filmed my footage with another member of the VFX class, which allowed me to be the main character in my film and have help setting up the equipment (tripod, props and camera). We needed to ensure all footage was filmed in the 30-minute window at dusk to get the correct lighting. I later decided to re-film some first-person footage due to a lack of usable tracking points, although some footage was still usable. I also assisted with his filming and that of several other members of the VFX class; this helped me understand camera/tripod set-ups much better and let me contribute ideas, help direct and handle general production tasks.

nodes_edited.jpg

I also created a shot list/storyboard as a guide for filming; ultimately I adapted this list and added/changed some shots to better fit my brief. I wanted mainly first-person perspective shots throughout the sequence, as the focus needed to be on the VFX and the integration of digital elements within the scene. The fact that it's an AR/VR experience adds to this argument.

Initial Footage Comp

This sequence was put together in Premiere Pro using shots taken on the initial filming day. I was able to see which elements didn't work as well and re-evaluate my shot list to include more opportunities to show off VFX elements. I made sure to white-balance the camera before use on set and shot several takes of each sequence to get a variety of movements. On review I found that the footage was quite tricky to 3D track, especially in AE, which didn't recognise any usable tracking points.

3D Asset Development and Animation

I decided to download a free Blender file for a Super Mario-style coin, which I could experiment with to understand more about the reflective, shiny shaders/UV maps involved in creating a realistic end product. I used the animation function within Blender to create a short PNG sequence of how I remember Mario coins appearing in the game. I have uploaded this as a guide for my own model.

COINBLENDER TUTORIAL1.png

CGTrader, 2021. Super Mario style golden coin | 3D model. [online] Available at: <https://www.cgtrader.com/free-3d-models/character/fantasy/super-mario-style-golden-coin> [Accessed 26 October 2021].

(Making and animating 3D Mario World Items in Blender 2.8, 2021) - Tutorial used for my own model

asset list screenshot.png

I made an asset list to help organise what I needed to source or make for the composition. I ended up making every asset myself in Blender except the Fire Flower power-up, as it was quite complex and I wanted to spend more time on my animation and main composition. I developed each asset in Blender, either with the help of a tutorial or from what I had already learnt, and made sure to use reference images, which helped me scale/position the models correctly. I animated all the models in Blender and exported them to Nuke and AE to practise my 3D tracking and asset placement.

COINBLENDER TUTORIAL2.png
COINBLENDER TUTORIAL4.png
COINBLENDER TUTORIAL3.png
COINBLENDER TUTORIAL5.png

Once I'd made my own coin model I was able to use the animation functionality to create a PNG sequence to export to After Effects; I repeated this process with my other assets and began to track my footage. I experimented with Blender, Nuke and AE for tracking to find the best way to incorporate my models convincingly, although I found that AE worked most consistently for me in the end. I had to try and match the frame rate of the animations with my footage, but compensated by selecting the optical flow option in AE when exporting a full comp - this works differently to the standard frame sampling/blending options and accounts for mismatched frames, creating a smoother video. I exported my compositions in HD 1080p, as this is currently the standard output for most digital viewing on television/YouTube.
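The frame-rate mismatch above comes down to mapping output frames onto source frames. This hedged Python sketch (my own illustration, not AE's internals) shows the two simpler retiming strategies; optical flow goes further by synthesising new in-between frames from estimated pixel motion, which is why it played back more smoothly for me.

```python
# Sketch: remapping a 24 fps animation onto a 30 fps timeline - the mismatch
# that AE's frame sampling / frame blending / optical-flow options handle.

def remap_frame(out_frame, out_fps, src_fps):
    """Map an output frame index to a (possibly fractional) source frame."""
    return out_frame * src_fps / out_fps

def nearest_sample(out_frame, out_fps, src_fps):
    """Frame sampling: snap to the closest source frame (can stutter)."""
    return round(remap_frame(out_frame, out_fps, src_fps))

def blend_weights(out_frame, out_fps, src_fps):
    """Frame blending: return the two neighbouring source frames to mix
    and the mix weight for the later one (can look soft/ghosted)."""
    t = remap_frame(out_frame, out_fps, src_fps)
    lo = int(t)
    return lo, lo + 1, t - lo

# Output frame 5 at 30 fps lands exactly on source frame 4.0 of a 24 fps clip;
# output frame 7 lands at 5.6, i.e. between source frames 5 and 6.
```

Rendering the Blender sequences at the comp's frame rate in the first place avoids the remap entirely, which is what I tried to do where possible.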

When exporting my Blender model animations I made sure to check the transparent background box; this meant I didn't have to key out the background of each clip or make more time-consuming amends after export.

FLAGBLENDERTUTORIAL4.png
plant finished.png

I was able to add a basic armature to my plant model, which allows it to move for animating. I had a few issues understanding how meshes needed to be linked/separated to allow for the movement, but was able to get the important parts moving. I used pose mode to move the model around and keyframe it. I then started importing the finished animations into the main AE comp and tracking them into specific locations.

cube2.png

A viewer of the initial composite advised me to ensure my animations were timed correctly with the footage, which I have tried hard to correct. I also adapted my lighting setup in Blender as I went, to try and match the light in the real scene as best I could.

I was able to roughly time my Blender animations to the frame rate of the AE composition, but I had to edit them several times to get a good export. I used the 3D camera track function to plot targets that could be turned into a plane displaying the PNG sequence, though this relied on me being able to judge the angle of the camera/model in Blender. When importing it and scaling to fit the composition, I turned the flat plane into a 3D object to allow me to position it better. To find the best position for the models in AE, I had to 3D track each clip and assess the dot formations/potential flat target planes to find the most stable or suitable area to add the PNG sequence to. I pre-composed the dot formation as a solid layer, could then add the model in before adjusting it in the main comp as a 3D shape, and subtracted the block-colour stand-in plane (effectively a greenscreen).
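The reason a solved 3D track lets you pin a plane into live footage is that the solver recovers a camera for which each tracked 3D point reprojects to the same screen position every frame. This is a minimal pinhole-projection sketch of that idea (my own simplified illustration - camera at the origin looking along +Z, with +Z as depth - not AE's or Nuke's actual solver maths).

```python
# Sketch: reprojecting a 3D point through a pinhole camera to pixel space.
# If the solve is good, a tracked feature projected this way lands on the
# same image feature in every frame, so a plane placed at that point
# "sticks" to the footage.

def project(point3d, focal_px, centre):
    """Project a camera-space 3D point (x, y, z) to pixel coordinates."""
    x, y, z = point3d
    cx, cy = centre                    # principal point (image centre)
    return (cx + focal_px * x / z,     # perspective divide by depth z
            cy + focal_px * y / z)

# A stand-in plane corner 2 units right, 1 up and 4 deep in camera space,
# with a 1000 px focal length on a 1920x1080 frame:
u, v = project((2.0, 1.0, 4.0), focal_px=1000.0, centre=(960.0, 540.0))
```

The perspective divide by z is also why distant tracking dots drift less on screen than near ones, which is worth remembering when choosing a target plane.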

The export from Nuke (above) shows a mismatch in output quality compared to the composition settings. I managed to track a single coin into the shot, but the animation isn't long enough for the footage; I was able to correct this and time it more accurately for the next export, which I did in AE (it worked out a little easier for adding PNG sequences). I also needed to adjust the frame rate in order to get smoother, more fluid playback. I added an output node to the lower-quality footage which upscaled it, but in the end decided to re-film the outside AR elements with higher-quality camera settings. This drastically improved the tracking of any 3D elements as well as adding more consistency to the grain, frame rate and colour balance of the remaining shots, which I edited in AE. I was unable to re-film the first intro clips as the VR headset prop wasn't available for hire at short notice. I would also have altered the note shot (the piece of paper left with the headset) to make it easier to 3D track - I believe higher-quality footage would have helped here too. As I couldn't 3D track it, I had to manually track the Photoshop file on top of the paper; this came out quite shaky despite easing keyframes, adding guide lines and motion blur/opacity blending.
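One common way to steady a shaky manual track like the note shot is to average each keyframed position with its neighbours. Inside AE I relied on eased keyframes and motion blur instead, so this pure-Python moving-average sketch (function name and data are my own, purely illustrative) just shows the underlying idea.

```python
# Sketch: smoothing a jittery hand-keyframed 2D track with a centred
# moving average. Each (x, y) sample is replaced by the mean of itself
# and its neighbours within `radius` frames, damping frame-to-frame jitter
# at the cost of slightly lagging fast movements.

def smooth_track(points, radius=1):
    """Return a smoothed copy of a list of (x, y) track positions."""
    out = []
    for i in range(len(points)):
        lo = max(0, i - radius)
        hi = min(len(points), i + radius + 1)
        window = points[lo:hi]                       # neighbours in range
        xs = sum(p[0] for p in window) / len(window)
        ys = sum(p[1] for p in window) / len(window)
        out.append((xs, ys))
    return out

# A single-frame spike at (3, 0) is pulled back towards its neighbours.
smoothed = smooth_track([(0, 0), (3, 0), (0, 0)])
```

A larger radius gives a steadier but "floatier" track, which is the same trade-off as heavier easing on the keyframes.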

trello planning.png

I used Trello to organise my tasks and keep up to date with my plan; this functioned as my own personal production pipeline and could be easily referred to and updated as I went along. I also made various rough lists to remind myself of tasks relating to editing/exporting files, but I found the permanent site easier to view and update as needed.

I created the mosaic effect on several planes in the sequence to add a 'glitchiness' to the scene. This was done through the effects plug-ins in AE; I followed this tutorial to help me:

YouTube, 2021. After Effects Tutorial - How To Create An Animated Pixelated Background. [online] Available at: <https://www.youtube.com/watch?v=HSajZVrsFoU> [Accessed 12 November 2021].

I was careful to pick colours already in the composition so the effect would seem more realistic/effective. I also adjusted the number of 'pixels' and their motion intensity/speed. To keep my footage organised, I edited/tracked my letter in a separate composition with the rest of the intro before exporting it and adding it to the main comp in AE. This meant I could easily change which manual track I wanted to use and make amendments without affecting the rest of the video.
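At its core, a mosaic/pixelate effect just divides the image into square blocks and fills each block with its average colour. This is a tiny greyscale sketch of that idea in plain Python (my own illustration of the principle, not AE's Mosaic plug-in code).

```python
# Sketch: block-average pixelation on a small greyscale image, represented
# as a list of rows of brightness values. Every pixel in each block x block
# cell is replaced by the cell's mean, producing the chunky "glitch" look.

def mosaic(img, block):
    """Return a copy of img with each block x block cell averaged."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for by in range(0, h, block):
        for bx in range(0, w, block):
            # Gather all pixels in this cell (clipped at the image edges).
            cell = [img[y][x]
                    for y in range(by, min(by + block, h))
                    for x in range(bx, min(bx + block, w))]
            avg = sum(cell) / len(cell)
            for y in range(by, min(by + block, h)):
                for x in range(bx, min(bx + block, w)):
                    out[y][x] = avg
    return out
```

Animating the block size over time - which is roughly what the tutorial's keyframed settings do per channel - is what turns this static effect into a moving glitch.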

pixel effect1.png
trackingtext1.png

Colour Grading, Final Adjustments and Rendering

These screenshots show the final week of colour grading and adding in details like fireballs (created and animated in Blender, exported as PNGs and manually tracked/keyframed into the composition). Tutorial: YouTube, 2021. ROBLOX FIREBALL VFX. [online] Available at: <https://www.youtube.com/watch?v=ZwwWatjJ2HM> [Accessed 14 November 2021].

Once I had added a Lumetri colour effect to my clips, I re-white-balanced the original introduction footage to take away some of the red/pinkish tint and added a highlight tint of green/cyan. I adjusted the exposure and saturation/tint to achieve a cold, 'normal'-seeming atmosphere; I wanted the AR-experience POV shots to contrast with this, to give the viewer the impression that the virtual world was bright, interesting and ultimately false. The footage I used for the AR sequence was filmed at a different time of day, which helped later on when adjusting the colours; I was able to saturate the footage slightly so it appeared enhanced.

I did attempt to rotoscope my hand out of a shot where it crossed a 3D model at the wrong time, but ended up being able to adjust the 3D model's position instead, which worked out a little easier and less time-consuming given the other adjustments I knew I had to make before screening day.

I have included screenshots of my AE composition timelines to show how they were arranged/organised to suit my workspace. I made sure to pre-compose elements in order to work more efficiently, and ended up adding my intro in separately at the end of editing. I also brought in some edited Super Mario music which I put through Premiere and added a distortion effect to suit the 'glitchy' theme of the film; it plays from the moment the user turns on the headset to the ending scene where the plant enemy is defeated.

I also added in a counting function to add up points, this was keyframed in manually as each coin approached the camera. 
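The hand-keyframed coin counter can be described as data: one hold keyframe per pickup, stepping the score up by the coin's value at exactly the frame each coin reaches the camera. This sketch is purely illustrative - the frame numbers, coin value and function name are made up, and in AE this was done manually rather than scripted.

```python
# Sketch: generating hold keyframes for an on-screen score counter.
# Each collected coin adds COIN_VALUE at its pickup frame; sorting by
# frame keeps the score monotonically increasing even if the pickups
# were noted out of order.

COIN_VALUE = 100  # made-up value for illustration

def score_keyframes(pickup_frames, value=COIN_VALUE):
    """Return (frame, score) hold keyframes, one per coin collected."""
    return [(frame, value * (i + 1))
            for i, frame in enumerate(sorted(pickup_frames))]

keys = score_keyframes([48, 120, 96])
# The counter steps 100 -> 200 -> 300 at frames 48, 96 and 120.
```

Using hold keyframes (no interpolation) is what makes the counter snap between values like a game HUD instead of smoothly counting up.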

First Export for Peer Review

I exported my first final edit, which was screened to both years of our VFX class; I was able to point out things I wanted to improve as well as receive other feedback and ideas to work on. Watching the first export, I noticed:

-The sound at the end of the clip isn't synced properly and needs adjustment.

-The models (PNG sequences) need shadows to look more embedded into the scene.

-A title screen tracked in could be added.

-The letter my character picks up needs to be manually tracked in better; I have different versions of the track I can work on for the next export.

-Possibly re-grading the intro sequence and adding some diegetic sound (real-life ambient outside noise). This will help distinguish the two scenes from each other.

Evaluation Section
