
FMP ‘Red Dawn’ Process 03

This chapter will document the creation process from 12.11 to 19.11.2024, focusing on: the creation of the bullet time scene and its compositing, the post color grading of Part 3 and its compositing (if necessary), fixing the bugs that appeared in the previous videos, the compositing ideas for the final video, and the post-dubbing process.

The topics above are covered below in no particular order.

First, let's start with the basement scene. In the video the Russian soldiers kick down the blue steel door to enter the room, so I created a basement 3D scene and built the gun firing bullets and the bullets hitting the magical walls inside it; the final result is much better than the one in Part 1. In the previous scene the bullets were fired as whole rounds, not split into warheads and casings, and I animated the firing by hand, which was honestly stupid. This time I made the distinction, and I learned how to use C4D's particle emitter workflow together with a velocity field in space, so creating the gun-fire effect is now very easy. I also created three different animations for the NATO soldiers.

Moving on to the bullets, I manually modelled a simple 7.62x39mm round in C4D, then created its textures in Substance. The round is of course divided into two parts, the casing and the warhead, so there are also two separate texture sets.

In the end I found a default fabric material and transferred its normal map to the bullet, which makes the scratches look more visible.

Then I adjusted the intensity of the lights and added GI to the scene so the lighting quality would be better, and rendered a single frame as a test.

For this part of the node tree I used the Blendspill gizmo, which combines the BG and FG and handles spill automatically (the spill itself is obtained by subtracting the result of a simple Keylight from the denoised original RGBA plate). Its real job is to remove the grey or black fringing that will inevitably appear around the character's edges, which generally comes from three causes:
1. Alpha channel problem: Keylight generates an alpha channel for transparency, but if the alpha in the background area is not exactly 0 (i.e. the key is not completely clean), the edge area can keep a residual mixed color, usually grey.
2. Premult problem: during rendering the image may already have been premultiplied, and if the green screen is keyed without unpremultiplying and re-premultiplying correctly, the alpha values at the edges no longer match.
3. Spill: when Keylight processes the green-screen edges, the green spill may not be removed completely but instead converted to grey to avoid visible jaggies.
This node handles all of that with a single click; it also incorporates Lightwrap functionality and can control the wrap strength based on the RGB of the green-screen footage.
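To make this concrete, here is a minimal Nuke Python sketch of the kind of manual chain that a gizmo like Blendspill automates: a despill, unpremult/premult around the edge grading, a light wrap, and the final over. This is not the internals of the actual gizmo; the file paths, the Expression row mapping and the LightWrap input order are assumptions.

```python
import nuke

# Keyed green-screen FG and the CG background (placeholder paths).
fg = nuke.nodes.Read(file="fg_keyed.%04d.exr")
bg = nuke.nodes.Read(file="basement_bg.%04d.exr")

# Classic average despill: clamp green down to the mean of red and blue.
# (Assumes the Expression node's default row mapping, where expr1 drives green.)
despill = nuke.nodes.Expression(
    inputs=[fg],
    expr1="g > (r+b)/2 ? (r+b)/2 : g",
    label="green despill",
)

# Unpremult before grading the edges, premult again before the merge,
# so the edge alpha stays consistent (the "Premult problem" above).
unpre = nuke.nodes.Unpremult(inputs=[despill])
edge_grade = nuke.nodes.Grade(inputs=[unpre], label="match FG edges to BG")
pre = nuke.nodes.Premult(inputs=[edge_grade])

# Light wrap bleeds a little BG light around the FG edge (FG-first input order
# assumed; check the node's input labels), then a normal over onto the BG.
wrap = nuke.nodes.LightWrap(inputs=[pre, bg])
comp = nuke.nodes.Merge2(inputs=[bg, wrap], operation="over")
```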

Of course, while actually using this node I ran into a serious problem that bothered me for a long time: at first it filled the whole background with noise, even though I was already using Denoise, which was strange. In the end the noise turned out to come from the IBK_Master node when I grabbed the alpha of the core part of the character, and I solved it by adjusting the size of the green value.

When sampling the background green, the keyer also treats areas that are too dark, together with the green reflections on the glasses, as green screen, so a mask has to be added manually. The main problems with this footage are: the clothes are too green, which creates some issues, and some shots were filmed in a very dark environment, which makes the character edges hard to handle in post.

I spent a lot of time on the next step: getting the Z (depth) channel out of C4D's Redshift renderer so I could isolate it in Nuke and use it to drive a realistic depth-of-field effect. In practice, even if you set up the render settings and AOVs correctly, the Z channel does not show up in the resulting multi-layer EXR file. This problem had been bothering me for about half a year; I couldn't find the answer on YouTube or in the official C4D tutorials, and even asking ChatGPT repeatedly never gave me an exact solution. In the end I found that the fix is to render an extra Depth pass directly to disk from the AOV manager while the multi-pass Z-channel output is enabled, so the render produces two results: the normal EXR data and a separate floating-point Depth file.
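As an illustration of how that extra Depth file then gets used on the Nuke side, here is a minimal sketch; the file names are placeholders, and which channel of the Depth EXR actually carries the values depends on how Redshift wrote it.

```python
import nuke

# Beauty pass and the separately rendered floating-point Depth pass (placeholder paths).
beauty = nuke.nodes.Read(file="basement_beauty.%04d.exr")
depth  = nuke.nodes.Read(file="basement_depth.%04d.exr")

# Copy the depth values into the beauty stream as depth.Z.
# (The extra file may carry them in rgba.red or already in depth.Z; adjust from0.)
copy_z = nuke.nodes.Copy(inputs=[beauty, depth], from0="rgba.red", to0="depth.Z")

# ZDefocus picks up depth.Z by default and turns it into a depth-of-field blur.
zdef = nuke.nodes.ZDefocus2(inputs=[copy_z])
zdef["max_size"].setValue(40)  # cap the maximum blur size
```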

Later I used Mixamo to generate the shooting motion, then used C4D's motion editing system to adjust the timing of the shooting animation and loop it up to frame 200.

In the actual animation the character moving at the back will appear in the opening shot; this is just a preview of everything, and of course the soldiers don't shoot in the opening shot, they just aim.

The animation of the bullet casings being ejected appears in the GIF above, so let me explain how it was created. At first I wanted to animate the ejected shells by hand, but then I thought: I'm a graduate student, why not find a more convenient way to do it? I remembered that C4D has a particle emitter system (which I had never used before) and started experimenting with it, and the effect turned out exactly as I expected. You can adjust its emission speed, angle and frequency, and most importantly it works perfectly with C4D's dynamics system. I learned a lot in this series of steps, and because I already have plenty of experience with C4D, I worked out all of the steps on my own without looking at a single tutorial.
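For reference, here is a rough C4D Python sketch of the same setup using the classic Emitter object plus a dynamics tag. I built mine in the UI (and with the newer particle system), so treat the object name, the parameter IDs and the legacy Dynamics Body tag here as assumptions, not a copy of my scene.

```python
import c4d

def main():
    doc = c4d.documents.GetActiveDocument()

    # Classic particle Emitter that sprays the shell casings.
    emitter = c4d.BaseObject(c4d.Oparticle)
    emitter.SetName("Shell Ejector")
    emitter[c4d.PARTICLEOBJECT_BIRTHEDITOR] = 10      # casings per second in the viewport
    emitter[c4d.PARTICLEOBJECT_BIRTHRAYTRACER] = 10   # casings per second at render time
    emitter[c4d.PARTICLEOBJECT_SPEED] = 300           # ejection speed
    emitter[c4d.PARTICLEOBJECT_ANGLEH] = c4d.utils.DegToRad(15)  # horizontal spread
    emitter[c4d.PARTICLEOBJECT_ANGLEV] = c4d.utils.DegToRad(15)  # vertical spread
    emitter[c4d.PARTICLEOBJECT_SHOWOBJECTS] = True    # emit the child object instead of dots
    doc.InsertObject(emitter)

    # Parent the casing mesh under the emitter and give it a dynamics tag so the
    # shells collide and bounce once they are emitted (hypothetical object name).
    casing = doc.SearchObject("BulletCasing")
    if casing is not None:
        casing.InsertUnder(emitter)
        casing.MakeTag(c4d.Tdynamicsbody)  # legacy tag; newer versions use the simulation Rigid Body tag

    c4d.EventAdd()

if __name__ == "__main__":
    main()
```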

Of course, this is a bullet time effect, so I also created an automatic-fire effect. I still used the particle emitter system, but went on to add a velocity field and an invisible wall to stop the bullets' flight. The difficulty in this step is that the velocity field isn't there by default, so you have to test the effect with another field first, such as a gravity field. And by default a field is applied to every object it can affect, so you need to restrict it to a specific region of space.

The exact steps for this are shown in the Breakdown.

Next comes the rendering of the bullets. To save time I rendered the scene and the bullets separately, so that 1) the process is faster and 2) I can control them independently in NUKE. The difficulty is rendering the bullets on their own while still having them affected by the environment and casting shadows on an invisible ground, so I added Redshift tags to the bullets and the ground and adjusted the settings in those tags.
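A quick sketch of how the two renders come back together in Nuke, assuming the bullet pass was rendered over transparency with its ground shadows baked into the premultiplied RGBA (paths are placeholders):

```python
import nuke

# Scene pass and the separately rendered bullet pass (placeholder paths).
scene   = nuke.nodes.Read(file="basement_scene.%04d.exr")
bullets = nuke.nodes.Read(file="bullets_with_shadow.%04d.exr")

# Rendering them separately means the bullets can be graded on their own.
bullet_grade = nuke.nodes.Grade(inputs=[bullets], label="bullet-only tweaks")

# A straight over works because the shadow on the invisible ground is already
# part of the bullet pass's premultiplied RGBA.
comp = nuke.nodes.Merge2(inputs=[scene, bullet_grade], operation="over")
```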

Since Alwin is in charge of the magic barrier, that's all I have for this section; I've also made a generic NUKE compositing workflow to make it easier for him to add the magic barriers later.

It's a very clean set of processes: not a single node in it is redundant, and the results it produces are high quality.

Next, because Alwin's part of the project was progressing a little slowly (Houdini is difficult), I took over that part and created a VFX scene of a missile flying over London, which I will explain below.

First of all, I'd like to note that I had planned to do this part myself from the beginning. I started looking for footage of the sky over London around October, and in the end I took screen captures of the 3D scenes in the web version of Apple Maps, then used PR and NUKE to stabilize, speed up and motion-blur the footage.

Moving on to the missile, I found an AIM-120 model on Sketchfab, then added a rotation to it in C4D and removed the Bezier interpolation from the animation curve so that it rotates seamlessly.
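Removing the easing can also be done with a few lines of C4D Python by forcing every key on the object's tracks to linear interpolation; a rough sketch (the object name is a placeholder):

```python
import c4d

def main():
    doc = c4d.documents.GetActiveDocument()
    missile = doc.SearchObject("AIM-120")  # placeholder object name
    if missile is None:
        return

    # Walk every animation track (position, rotation, ...) on the object and
    # set each key to linear interpolation so the spin has no Bezier easing.
    for track in missile.GetCTracks():
        curve = track.GetCurve()
        for i in range(curve.GetKeyCount()):
            curve.GetKey(i).SetInterpolation(curve, c4d.CINTERPOLATION_LINEAR)

    c4d.EventAdd()

if __name__ == "__main__":
    main()
```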

For the missile flame, I downloaded a royalty-free meteorite-fall clip from MotionFX and brought it into NUKE, then applied the high-temperature vignette preset that I talked about in Part 1 to the background and added a Glow effect to the clip. Of course, this step needs separate control over the luminance of the flame, so I fed a luminance-based alpha into the Glow separately so that the smoke doesn't glow.
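The luminance-to-alpha trick looks roughly like this as a Nuke Python sketch; the Keyer settings are left at defaults to tune by eye, and the Glow mask input index is an assumption.

```python
import nuke

flame = nuke.nodes.Read(file="meteor_flame.mov")  # placeholder path

# Luminance key: the bright flame gets alpha, the darker smoke stays at zero.
lum_key = nuke.nodes.Keyer(inputs=[flame], operation="luminance key")

# Glow only where that alpha is solid, so the smoke does not bloom.
glow = nuke.nodes.Glow2(inputs=[flame])
glow.setInput(1, lum_key)  # mask input (index assumed); masks default to rgba.alpha
```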

Above you can see that there is noise over the background in channel A; let me explain what it does. It is a fake fog effect on the background: I used a Noise node, did a simple 2D track of the scene, then tweaked the noise parameters to make it look more realistic.
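The fake fog is essentially just this, as a Nuke Python sketch; the values are placeholders to eyeball, and in the real script the Transform's translate would be driven by the 2D track of the plate.

```python
import nuke

bg = nuke.nodes.Read(file="london_sky.%04d.exr")  # placeholder path

# Animated procedural noise used as the fog texture.
fog = nuke.nodes.Noise(size=400, gain=0.4)
fog["zoffset"].setExpression("frame * 0.05")      # slow drift over time

# Translate is linked to the 2D track so the fog sticks to the scene, not the screen.
fog_move = nuke.nodes.Transform(inputs=[fog])

# Screen the fog over the plate at a low mix so it reads as thin haze.
comp = nuke.nodes.Merge2(inputs=[bg, fog_move], operation="screen", mix=0.25)
```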

This is the final result; it would be better if we used Houdini for the whole process, but we don't have the time.

Next is the production process for the HUD: I used AE to draw a simple animated HUD, then added a glow effect to it in PR so that it feels more sci-fi.

Then I finished the AI dubbing for the project. I found a service that suits it perfectly, ElevenLabs: it has almost no daily usage cap, the voice quality is high, and many of the paid engines out there are probably built on the models from this site. It's decent at producing Russian dubs, with some minor glitches, but overall it's good. In PR I treated the voiceovers the same way as before; for example, I used a multichannel processor on each voice to shape it into the sound of, say, a walkie-talkie or a radio broadcast.
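For context, ElevenLabs lines can also be generated through their REST API instead of the web UI; this is a minimal sketch, and the endpoint, voice ID and model name should be checked against the current ElevenLabs documentation.

```python
import requests

API_KEY = "YOUR_ELEVENLABS_API_KEY"  # placeholder
VOICE_ID = "YOUR_VOICE_ID"           # placeholder voice

url = f"https://api.elevenlabs.io/v1/text-to-speech/{VOICE_ID}"
payload = {
    "text": "Внимание! Всем подразделениям занять позиции.",  # example Russian line
    "model_id": "eleven_multilingual_v2",  # multilingual model handles Russian
}
headers = {"xi-api-key": API_KEY, "Content-Type": "application/json"}

resp = requests.post(url, json=payload, headers=headers)
resp.raise_for_status()

# The response body is the rendered audio (MP3 by default); save it for PR.
with open("radio_line.mp3", "wb") as f:
    f.write(resp.content)
```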

Here's a video of this stage of the project. Basically only Alwin's part is left now; what I still need to produce is the color grading of the earlier scenes and the subtitles for the video. There may be more to come, and you can see my production process in Process 04.
