
Your film, built on a foundation of Unreal Engine 5, has the potential to be on par with the best Hollywood has to offer. Your team’s skill set will determine whether your movie can be sold in the same marketplaces as the Hollywood elite. Maybe you’re aiming for streaming services, or even a theatrical release – nothing should stop you if you’ve made a compelling story. As more filmmakers succeed in making films this way, the more acceptable, and even expected, this format will become.

The ‘deepfake’ has provided frighteningly real face-swapping for several years, using technology well within the reach of independent filmmakers. Experiments have seen great success in layering a real, trained dataset on top of the MetaHuman performance. This final step, applied to the final or near-final output (it may need to sit ‘under’ some FX work), can radically affect how real this animated film looks.

Many FX can happen within the engine itself, but you may find more control adding things like lightning, rain, lasers, and so on after the fact. Programs like After Effects specialize in fine-tuned control over all manner of effects. Easy-to-follow tutorials for almost anything After Effects-related can be found online, allowing you to lift the production quality once again.

While some coloring may be done in the engine pre-render, additional color adjustments may be made on a shot-by-shot basis in the edit suite. This is a not-to-be-underestimated stage that turns your ‘render’ into a ‘movie’.

Starting with the soundtrack created for motion capture, it is here that the soundtrack is expanded, refined, and sweetened. From adding environmental sounds to match the now-available visuals, to providing sounds for post-added visual effects, this audio step is vital for bringing your movie to life.

Some or most editing will be done in Unreal, assuming a multi-camera output hasn’t been decided on – but a cleanup edit (removing bad frames and time jumps) is done in DaVinci Resolve or other post-editing software. Your scenes will now play back in real time at their full, final resolution, as opposed to Unreal’s close-to-real-time proxy. This is when you really see your movie come together!

The project moves from Unreal to a traditional edit suite. This allows more audio work, effects, coloring and music work to be done. Our preference is DaVinci Resolve, but Premiere Pro and other non-linear editors are at the user’s discretion. Because exporting is usually done as PNG sequences, the first task here is to import each scene and match it to its audio track.
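Matching an image sequence to its audio track is ultimately frame-rate arithmetic. A minimal sketch (the file names, frame rate and tolerance below are invented for illustration) of sanity-checking that a scene’s PNG sequence is the right length for its audio before conforming it in the edit suite:

```python
# Hypothetical example: verify a PNG sequence matches its scene audio length.

def expected_frames(audio_seconds: float, fps: float) -> int:
    """Number of frames the image sequence should contain for this audio."""
    return round(audio_seconds * fps)

def sequence_matches_audio(frame_files, audio_seconds, fps, tolerance=2):
    """True if the PNG count is within `tolerance` frames of the audio length."""
    return abs(len(frame_files) - expected_frames(audio_seconds, fps)) <= tolerance

# A 10-second scene at 24 fps should have ~240 frames.
frames = [f"scene01.{i:04d}.png" for i in range(240)]
print(sequence_matches_audio(frames, audio_seconds=10.0, fps=24.0))  # True
```

A check like this catches dropped or duplicated frames early, before a misaligned conform turns into a lip-sync hunt later in the edit.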

Path tracing is an extremely processor-intensive rendering routine that produces very realistic results due to the way light is bounced. If this method is desired, a render farm and network rendering solution must be put in place. Updates to the engine are improving the speed of each rendered frame, but at present it can still be prohibitive.
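The case for a render farm can be seen with back-of-the-envelope arithmetic (the per-frame time below is an illustrative assumption, not a benchmark): total wall-clock time is frames × seconds-per-frame, divided across render nodes.

```python
# Illustrative estimate of path-traced render time; numbers are assumptions.

def render_hours(runtime_min: float, fps: float,
                 sec_per_frame: float, nodes: int = 1) -> float:
    """Wall-clock hours to render a film, split evenly across render nodes."""
    total_frames = runtime_min * 60 * fps
    return total_frames * sec_per_frame / nodes / 3600

# A 90-minute film at 24 fps, assuming 30 s per path-traced frame:
print(render_hours(90, 24, 30))            # 1080.0 hours on one machine
print(render_hours(90, 24, 30, nodes=20))  # 54.0 hours on a 20-node farm
```

Even rough numbers like these make the planning decision concrete: at weeks of single-machine render time, either the farm or a faster rendering method becomes a scheduling necessity.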

This is the last render – where a scene, based on its edit list, will be exported. Over the course of the film, several exports will have been made, scene by scene, possibly at half resolution (to save time) so adjustments could be made as required. The final renders will be full resolution and can be extremely time-consuming, especially if path tracing has been chosen as the delivery look.

This is ‘linear’ editing of the camera tracks, per scene, within the scene’s Sequencer. Just like a non-linear edit track, shots are chosen to best visually present the scene. In this case, though, all the cameras are in play and the edits jump between them. It is akin to editing a live concert or sporting event.
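Conceptually, this camera-cuts edit is just a timeline of switch points: each cut names which camera is live until the next cut. A toy model (camera names and frame numbers invented for illustration) of the idea:

```python
# Hypothetical camera-cut list: each entry holds until the next cut's start frame.
cuts = [
    (0,   "CamA_wide"),
    (96,  "CamB_closeup"),
    (240, "CamC_over_shoulder"),
    (312, "CamA_wide"),
]

def camera_at(frame: int) -> str:
    """Return which camera is 'live' at a given frame of the scene."""
    live = cuts[0][1]
    for start, cam in cuts:
        if frame >= start:
            live = cam
    return live

print(camera_at(100))  # CamB_closeup
print(camera_at(400))  # CamA_wide
```

This is the same mental model as a broadcast switcher: one continuous performance, with the edit deciding only which angle the audience sees at each moment.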