This page explains how to set up the components required to create post-processing effects in your Scene.

To enable post-processing in your Scene, add the Rendering > Post Process Layer component to the Main Camera GameObject. This component allows you to configure anti-aliasing for this post-process layer, choose which layer it applies the post-processing to, and select the GameObject that triggers this post-process layer.

You can use a Volume framework to manage and blend between post-processing effects in Unity. Each Volume can either be global or have local boundaries, and each contains scene setting property values that Unity blends between, depending on the position of the Camera, in order to calculate a final value. Volumes can contain different combinations of Volume overrides that you can blend between. For example, one Volume can hold a Physically Based Sky Volume override, while another Volume holds an Exponential Fog Volume override. You can use local Volumes to change environment settings, such as fog color and density, to alter the mood of different areas of your Scene.

Volume blending assigns a trigger for a Post-processing Layer and controls which layer affects the Camera. This Transform controls the volume blending feature. Unity assigns the Camera to the Trigger by default, but you can use other GameObjects to control the blending feature instead. For example, in a top-down game you might want to assign the player character GameObject to drive the blending instead of the Camera Transform.
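The blending behavior described above can be sketched conceptually outside Unity. The following is a minimal illustration, not Unity's actual API: the function names, the one-dimensional distance, and the linear falloff across an assumed "blend distance" are all simplifying assumptions made for this example.

```python
def blend_weight(camera_pos, volume_center, radius, blend_distance):
    """Illustrative local-volume weight: 1.0 while the camera (trigger)
    is inside the volume, fading linearly to 0.0 across the blend
    distance. Distances are 1D here purely for simplicity."""
    dist = abs(camera_pos - volume_center)
    if dist <= radius:
        return 1.0
    if dist >= radius + blend_distance:
        return 0.0
    return 1.0 - (dist - radius) / blend_distance

def blend_settings(global_value, local_value, weight):
    """Final value: interpolate between the global setting and the
    local Volume's override by the computed weight."""
    return global_value + (local_value - global_value) * weight

# Hypothetical numbers: global fog density 0.1, a local Volume
# overriding it to 0.5, camera halfway through the blend region.
w = blend_weight(camera_pos=15.0, volume_center=0.0,
                 radius=10.0, blend_distance=10.0)
print(round(blend_settings(0.1, 0.5, w), 3))  # halfway between the two values
```

The key idea this sketch captures is that each property has both a global value and zero or more local overrides, and the trigger Transform's position determines how strongly each local override contributes to the final blended result.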