The Definitive Guide to Creating Advanced Visual Effects in Unity (Unity 6 Edition)
Introduction
    Contributors
        Main author
        Unity contributors
        External contributors
Visual workflow
Graph logic
    The Blackboard
    Subgraphs
    Attributes
    Blackboard Attributes
    Events
        Event Attributes
UI improvements in Unity 6
    Node search
    Activation ports
    Keyboard shortcuts
    More resources
    Additional references
    Visualizing gizmos
    Graph fundamentals
        Spawn Context
        Capacity Count
        Multiple Outputs
    Bounds
    Particle pivots
    Decal particles
    Particle Strips
    Smoke Portal
    GooBall
    Physics-based effects
    The Ribbon Pack
    Meteorite sample
    Decals
Timeline
VFXToolbox
    Blender
Bounds
Whether you plan on shooting fireballs from your fingertips or traveling through a wormhole,
visual effects (VFX) in a game make the impossible, well, possible. Not only do they enhance
the atmosphere and help tell the story of your game, but they also bring imagined worlds to
life with details that can truly captivate your players.
Unity is pushing the boundaries of real-time graphics with tools such as the VFX Graph. This
node-based editor enables technical and VFX artists to design dynamic visual effects – from
simple common particle behaviors to complex simulations involving particles, lines, ribbons,
trails, meshes, and more.
Our comprehensive guide is intended for artists and creators looking to incorporate the VFX
Graph into their game applications. It provides specific instructions on how to use the VFX
Graph and its related tools to build real-time visual effects in Unity.
Taking into account experiences of solo developers and those on hundred-person teams, this
guide is filled with many examples to make our AAA-quality tools more accessible. This way,
everyone can find themselves at the fun part of game design.
Important note: This revised edition includes new features and quality of life improvements
available with Unity 6 and above. Please ensure that you install Unity 6 from the Unity Hub
to follow along with this guide. Also note that the naming convention for Unity releases has
changed starting with Unity 6 and will apply to future releases. Read more about the new
naming standard in this Discussions post.
Contributors
Main author
Wilmer Lin is a 3D and visual effects artist with over 15 years of industry experience in film
and television, now working as an independent game developer and educator. Wilmer’s feature
credits include X-Men: Days of Future Past, The Incredible Hulk, and The Chronicles of Narnia:
The Lion, the Witch, and the Wardrobe.
Unity contributors
Mathieu Muller, lead product manager for graphics
Vlad Neykov, director, software engineering, quality
Orson Favrel, technical artist (creator of many of the new samples used in the book)
Julien Fryer, graphics engineer
Fred Moreau, technical product manager
External contributors
Marie Guffroy, technical artist
Thomas Iché, VFX artist and specialist
Today’s gamers crave deeply immersive experiences. As hardware advancements push the
limits of what mobile and console platforms can do, what used to be available only for creating
Hollywood blockbusters can now be achieved in real time.
Visual effects in games continue to have their moment as both interest and investment in
advanced graphics trend upward. After all, gameplay effects transport your players into the
action.
It’s difficult to imagine a fantasy role-playing game (RPG) without characters casting magic, or
a hack-and-slash brawler without glowing weapon contrails. When race cars plow the asphalt,
we expect that they kick up a cloud of dust in their wake.
Not even your environments are the same without visual effects. If you’re telling a film noir
detective story, you’ll likely cloak your cityscape in rain and fog. But if your characters go on a
quest through the wilderness, you might make your foliage and vegetation sway in the wind,
reacting to their every move.
V Rising by Stunlock Studios is a game made with the High Definition Render Pipeline and VFX Graph.
Visual effects uniquely enhance the gaming experience. However, creating them requires you
to don the mantle of a multidisciplinary artist who can manipulate shape, color, and timing.
That’s where the VFX Graph comes in. This sophisticated tool is equipped with workflows that
reflect those of motion picture VFX – except working at 30, 60, or more frames per second (fps).
Image from a project in development by Sakura Rabbit, made with Unity’s VFX tools
Visual workflow
For complex, AAA-level visual effects on high-end platforms, use the VFX Graph to create
GPU-accelerated particles in an intuitive, node-based interface.
You can also create events via C# or Timeline to turn parts of your effects on and off.
The VFX Graph works with the Universal Render Pipeline (URP)* and the High Definition
Render Pipeline (HDRP). It also adds support for the Lit outputs and 2D Renderer available
with URP. Check the VFX Graph feature comparison for all render pipelines here, and read
more about the VFX Graph’s compatibility in the documentation.
The VFX Graph requires compute shader support on the target device.
Supported devices include:
— Android, for a subset of high-end, compute-capable devices (only with URP)
The new Learning Templates sample showcases different VFX Graph features.
— The Visual Effect Graph Additions: This includes example prefabs of fire, smoke, sparks,
and electricity. Each sample shows a stripped-down effect to illustrate fundamental
graph logic and construction. Just drag and drop one of the sample Prefabs into the
Hierarchy to see it in action.
Like the VFX Graph, the Built-In Particle System allows you to create a variety of effects
such as fire, explosions, smoke, and magic spells. It remains a valuable tool for real-time
effects, even though it renders fewer particles than the VFX Graph.
The primary distinction between the VFX Graph and the Built-In Particle System lies in the
hardware they run on. The Built-In Particle System is simulated on the CPU, whereas the VFX
Graph moves many of its calculations to compute shaders, which run on the GPU.
The VFX Graph has the advantage of simulating millions of particles, but there's a caveat:
because the simulation runs on the GPU, it's computationally nontrivial to read data back to
the CPU and to interact with other systems that live on the CPU.
If you’re using a mobile platform, you’ll need to verify that it supports compute shaders in
order to use the VFX Graph. Otherwise, you might need to use the Built-In Particle System
for CPU-based effects.
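If you want to guard against this at runtime, you can query Unity's SystemInfo API before enabling VFX Graph content. The following is a minimal sketch; the class name and the fallback strategy are placeholder assumptions.

using UnityEngine;

// Minimal sketch: checks for compute shader support before enabling VFX Graph content.
public class ComputeShaderCheck : MonoBehaviour
{
    void Awake()
    {
        if (SystemInfo.supportsComputeShaders)
        {
            Debug.Log("Compute shaders supported: VFX Graph effects can run on this device.");
        }
        else
        {
            // Fall back to CPU-based Built-In Particle System effects on unsupported devices.
            Debug.LogWarning("No compute shader support: use Built-In Particle System effects instead.");
        }
    }
}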
The Built-In Particle System can use the underlying physics system and interact with
gameplay more directly, but its particle count is limited and its simulations must stay
relatively straightforward.
Download the Particle Pack from the Unity Asset Store to get a set of examples with the
Built-In Particle System. This sample asset demonstrates a variety of in-game effects (fire,
explosions, ice, and dissolves, among others). You can also check out this Dev Takeover for
more information on using Shader Graph with the Built-In Particle System.
URP offers standard shaders (Lit, Unlit, Simple Lit) for the Built-In Particle System, whereas
HDRP provides Shader Graph-based shader samples from the HDRP package sample. You
can review the particle system feature comparison for render pipelines here.
Note: Experimental features are not fully validated and are thus subject to change. The full
release version of Unity is recommended for production work. Please see this Discussions
post about how naming conventions will change with the release of Unity 6.
You can enable experimental features from this guide via Preferences > Visual Effects >
Experimental Operators/Blocks, as shown here:
Any visual effect in the VFX Graph is made up of these two parts:
— A Visual Effect (VFX) Graph Asset that lives at the project level
— A Visual Effect component on a GameObject in the scene that references that asset
As Unity stores each VFX Graph in the Assets folder, you must connect each asset to a Visual
Effect component in your scene. Keep in mind that different GameObjects can refer to the
same graph at runtime.
Creating a new VFX Graph Asset (Assets > Create > Visual Effects > Visual Effect Graph) opens
a creation wizard that allows you to select a starting template. You can begin with
one of the default VFX Graphs or choose one of the Learning Templates from the Samples.
Note that the Learning Templates won’t appear in the window unless the additional packages
are installed in the Package Manager.
To add the effect to the scene, attach a Visual Effect component to a GameObject and then
connect the VFX Graph Asset. There are a few ways to do this:
— Drag the resulting asset into the Scene view or Hierarchy. A new default GameObject will
appear in the Hierarchy window.
— Assign the asset to an existing Visual Effect component in the Inspector. You can create
an empty GameObject by right-clicking in the Hierarchy (Visual Effects > Visual Effect)
or create a GameObject and then manually add the Visual Effect component.
— With a GameObject selected, drag and drop the asset into the Inspector window. This
creates the Visual Effect component and assigns the asset in one quick action.
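You can also assign the asset from a script at runtime. Here is a minimal, hedged sketch: the class and field names are illustrative, and it assumes you reference a VisualEffectAsset in the Inspector.

using UnityEngine;
using UnityEngine.VFX;

// Illustrative sketch: spawns a GameObject at runtime and hooks up a VFX Graph Asset.
public class VFXSpawner : MonoBehaviour
{
    [SerializeField] private VisualEffectAsset vfxAsset; // assign a .vfx asset in the Inspector

    void Start()
    {
        var effectObject = new GameObject("RuntimeEffect");
        var visualEffect = effectObject.AddComponent<VisualEffect>();
        visualEffect.visualEffectAsset = vfxAsset; // connect the graph asset to the component
    }
}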
The VFX Graph Asset contains all the logic. Select one of the following ways to edit its
behavior:
— Select the VFX Graph Asset in the Project window and click the Open button in the header.
— Click the Edit button next to the Asset Template property in the Visual Effect
component.
The asset opens in the VFX Graph window, available under Window > Visual Effects > Visual
Effect Graph.
— Blackboard: To manage attributes and properties that are reusable throughout the graph
Make sure you leave some space in the Editor layout for the Inspector. Selecting part of the
graph can expose certain parameters, such as partition options and render states.
Graph logic
You must build your visual effect from a network of nodes inside the window’s workspace. The
VFX Graph uses an interface similar to other node-based tools, such as Shader Graph.
Press the spacebar or right-click to create a new graph element. With the mouse over the
empty workspace, select Create Node to create a graph’s Context, Operator, or Property. If
you hover the mouse above an existing Context, use Create Block.
Opening up a complex VFX Graph can be daunting at first. Fortunately though, while a
production-level graph can include hundreds of nodes, every graph follows the same set of
rules – no matter its size.
Let’s examine each part of the VFX Graph to learn how they work together.
Each Context is composed of individual Blocks, which can set Attributes (size, color, velocity,
etc.) for its particles and meshes. Multiple Systems can work together within one graph to
create the final visual effect.
Select Insert template from the menu dropdown to add a sample System from the existing
templates to the current VFX Graph. This can help you get started with some pre-built graph
logic. Select one of the Default VFX Graph Templates for a simple System, or choose one of
the Learning Templates if it’s similar to your intended effect (see Exploring VFX sample content
below).
If you select the Minimal System template from the Default VFX Graph Templates, you’ll see a
barebones System, which includes four parts like this:
The flow between the Contexts determines how particles spawn and simulate. Each Context
defines one stage of computation:
— Spawn: Determines how many particles you should create and when to spawn them
(e.g., in one burst, looping, with a delay, etc.)
— Initialize: Determines the starting Attributes for the particles, as well as the Capacity
(maximum particle count) and Bounds (volume where the effect renders)
— Update: Changes the particle properties each frame; here you can apply Forces, add
animation, create Collisions, or set up some interaction, such as with Signed Distance
Fields (SDF)
— Output: Renders the particles and determines their final look (color, texture, orientation);
each System can have multiple outputs for maximum flexibility
Systems and Contexts form the backbone of the graph’s “vertical logic,” or processing
workflow. Data in a System flows downward, from top to bottom, and each Context
encountered along the way modifies the data according to the simulation.
Systems are flexible, so you can omit a Context as needed or link multiple outputs together.
This example shows more than one Output Context rendering within the same System.
Contexts themselves behave differently depending on their individual Blocks, which similarly
calculate data from top to bottom. You can add and manipulate more Blocks to process that data.
Click the button at the top-right corner of a Context to toggle the System’s simulation space
between Local and World.
Blocks can do just about anything, from simple value storage for Color, to complex operations
such as Noises, Forces, and Collisions. They often have slots on the left, where they can
receive input from Operators and Properties.
See the Node Library for a complete list of Contexts and Blocks.
Horizontal logic
Operators flow left to right, akin to Shader Graph nodes. You can use them for handling values
or performing a range of calculations.
These Operators from the Bonfire sample, for instance, compute a random wind direction.
Properties are editable fields that connect to graph elements using the property workflow.
A Property's value reflects whatever is set or connected to it in the graph. You can connect
the input ports (to the left of the Property) to other graph nodes.
Property Nodes are Operators that allow you to reuse the same value at various points in the
graph. They have corresponding global Properties that appear in the Blackboard.
Property Nodes
The Blackboard
The Blackboard utility panel manages Properties and Attributes. To open it, click the
Blackboard button in the window Toolbar or use the default Shift-1 shortcut.
To view Properties and Attributes together, select the All tab at the top of the Blackboard. To
filter by type, select the respective Properties or Attributes tab.
Properties you define in the Blackboard act as global variables that you can reuse throughout
the graph as Property Nodes. For example, you can define a bounding box property once and
then apply it across multiple particle systems within the same graph.
— Exposed: The green dot to the left of an Exposed Property indicates that you can see
and edit it outside of the graph. You can access an Exposed Property from the Inspector
or via script using the ExposedProperty class.
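As a hedged example, the sketch below drives a hypothetical exposed float named GlowIntensity from a script. ExposedProperty (from the VFX Graph package's UnityEngine.VFX.Utility namespace) caches the property ID so the string isn't re-hashed every frame.

using UnityEngine;
using UnityEngine.VFX;
using UnityEngine.VFX.Utility;

public class GlowIntensityDriver : MonoBehaviour
{
    [SerializeField] private VisualEffect visualEffect;

    // "GlowIntensity" is a hypothetical exposed float property defined on the Blackboard.
    private static readonly ExposedProperty GlowIntensity = "GlowIntensity";

    void Update()
    {
        // HasFloat guards against a missing or renamed property.
        if (visualEffect.HasFloat(GlowIntensity))
        {
            visualEffect.SetFloat(GlowIntensity, Mathf.PingPong(Time.time, 1f));
        }
    }
}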
New properties are set to Exposed by default and, as such, appear in the Inspector. Uncheck
the Exposed option if you want to hide your Property outside of the graph. You can also
create Categories to keep your properties organized.
The Blackboard also manages both built-in and custom Attributes, which you can drag and
drop into the graph or create directly from the interface. Each Attribute includes a short
description. Hover over an attribute to highlight where it appears in the graph.
To create Group Nodes, select a group of nodes, right-click over them, then choose Group
Selection. You can also use the new default shortcut, Shift + G.
You can also drag and drop a node into an existing Group Node. Hover the node over the
Group and release it once the Group highlights. To remove a node from a Group, hold the Shift
key while dragging it out.
By deleting a Group Node, either with the Delete key or from the right-click menu, you do not
delete its included nodes.
Meanwhile, you can use Sticky Notes to describe how a section of the graph works, plus leave
comments for yourself or your teammates. Add as many Sticky Notes as you need and freely
move or resize them.
Each Sticky Note has a title and a body. Right-click in the graph view to create a Sticky Note.
Double-click on a text area to edit its content. Set the Theme color (dark/light) and Text Size
from the right-click menu to organize your notes.
Subgraphs
A Subgraph appears as a single node, which can help declutter your graph logic. Use it to
save part of your VFX Graph as a separate asset that you can drop into another VFX Graph
for reorganization and reuse. You can package Systems, Blocks, and Operators into different
types of Subgraphs.
Subgraphs can be created directly from the Project window. Navigate to Create > Visual
Effect > Subgraph Operator or Subgraph Block to start a new Subgraph from scratch. This
method allows you to design Subgraphs without first building them within an existing VFX
Graph.
Alternatively, you can create a Subgraph by selecting a set of nodes when editing a VFX
Graph, then choosing the appropriate Subgraph type (Block or Operator) from the right-click
menu. For example, if you want to convert a set of Operators, select Convert To Subgraph
Operator. Save the asset to disk, and the selected nodes will be replaced with a single
Subgraph node.
To create Input properties for the Subgraph, add new properties to the Blackboard and enable
their Exposed flag.
To create Output properties for the Subgraph, add new properties, and move them to the
Output Category in the Blackboard.
The Blackboard also allows you to define the menu Category where the Subgraph Block
appears. Use this to sort or search for Subgraphs.
Creating a Subgraph is analogous to refactoring code. Just as you would organize logic into
reusable methods or functions, a Subgraph makes elements of your VFX Graph more modular.
You can work with the VFX Graph at two levels:
— Asset instance configuration: Use this to modify any existing VFX Graph. Designers
and programmers alike can adjust exposed parameters in the Inspector to tweak
an effect’s look, timing, or setup. Artists can also use external scripting or events to
change preauthored content. At this level, you’re treating each graph as a black box.
— VFX asset authoring: This is where your creativity can truly take charge. Build a
network of Operator Nodes to start making your own VFX Graph, and set up custom
behaviors and parameters to create custom simulations. Whether you’re riffing off
existing samples or starting from scratch, you can take ownership of a specific effect.
Custom HLSL in Unity 6 allows you to implement complex or unique particle behaviors
that aren’t easily achievable using the standard VFX Graph nodes. For example, you
could create custom physics simulations, particle interactions, or flocking behaviors.
Regardless of your experience level, you can start creating effects with the VFX Graph. Begin
with a premade effect to get familiar with the workflow, and then gradually assemble your own
graphs.
Attributes
An Attribute is a piece of data you might use within a System, such as the color of a particle,
its position and size, or how many of them you should spawn. Attributes can be read or
modified during the simulation to create dynamic effects.
Attributes you'll frequently use in the VFX Graph include position, velocity, size, color, alpha, and lifetime.
Attributes are essential for managing the fundamental aspects of your VFX Graph particles.
See the Standard Attributes documentation page for a complete list.
You can also use the Attributes tab in the Blackboard (see below) to explore many of the built-
in Attributes or to define a custom Attribute. To read and write Attributes in the graph, use the:
— Get Attribute Operator to read from Attributes in the Particle or ParticleStrip System
— Set Attribute Block to write values to an Attribute; either set the value of the Attribute
directly or use a random mode (for example, set a Color Attribute with a Random
Gradient or Random Per-component Block)
Get the Attribute with an Operator and set the Attribute with a Block.
Most Attributes are stored per particle, which can increase the memory footprint as the
number of particles and Attributes grows. For instance, if you have 10,000 particles and each
particle stores multiple Attributes like position, velocity, color, and size, the memory required
to maintain this data can become significant.
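As a rough back-of-the-envelope example: position, velocity, and color each take three floats (12 bytes) and size one float (4 bytes), so those four Attributes alone cost about 40 bytes per particle, or roughly 400 KB for 10,000 particles, before counting anything else the system stores.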
Monitor and optimize your Attributes by using the System Attribute Summary and Current
Attribute Layout displayed in the Inspector when you select a Context:
— Current Attribute Layout: This shows the Attributes used in the selected Context.
— Source Attribute Layout: This shows Attributes used in the source Context (the Context
that provides the initial data or input), e.g. Attributes initialized in the Initialize
Context that are then used in the Update Context.
For example, in the Trigger Event on Collide template, if you select a Context within the
Dart_Spawn System, the Inspector shows these attribute summaries.
To optimize memory usage, a System only stores Attributes that are actively needed. If an
Attribute’s simulation data hasn’t been stored, VFX Graph will use its default constant value
instead of storing unnecessary data. See the Optimization section for more details about the
profiling tools in VFX Graph.
Blackboard Attributes
The Blackboard panel now has a new section dedicated to Attributes, making it easier
to create the corresponding Operators or Blocks with a context-sensitive drag and drop.
These new features are available in Unity 6:
— Drag and drop from the Blackboard to the node workspace to create a Get Attribute
Operator.
— Drag and drop from the Blackboard to a System Context to create a Set Attribute
Block.
— Create custom Attribute operators from the Blackboard and change their Type.
Events
The various parts of a VFX Graph communicate with each other (and the rest of your scene)
through Events. For example, each Spawn Context contains Start and Stop flow ports, which
receive Events to control particle spawning.
When something needs to happen, external GameObjects can notify parts of your graph with
the SendEvent method of the C# API, passing the Event to the Visual Effect component as a
string name or a property ID.
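Here's a minimal sketch of that pattern; the class name and the spacebar input are illustrative. OnPlay and OnStop are the default Start and Stop events of a Spawn Context, and caching their IDs with Shader.PropertyToID avoids hashing the strings on every call.

using UnityEngine;
using UnityEngine.VFX;

public class FireballTrigger : MonoBehaviour
{
    [SerializeField] private VisualEffect visualEffect;

    // Default Start/Stop event names of a Spawn Context, cached as property IDs.
    private static readonly int OnPlayID = Shader.PropertyToID("OnPlay");
    private static readonly int OnStopID = Shader.PropertyToID("OnStop");

    void Update()
    {
        if (Input.GetKeyDown(KeyCode.Space))
            visualEffect.SendEvent(OnPlayID);  // start spawning particles
        if (Input.GetKeyUp(KeyCode.Space))
            visualEffect.SendEvent(OnStopID);  // stop spawning particles
    }
}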
An Event Context identifies an Event by its Event string name or ID inside a graph. In the above
example, external objects in your scene can raise an OnPlay Event to start a Spawn system or
an OnStop Event to stop it.
You can combine an Output Event with an Output Event Handler. Output Events are useful
if the initial spawning of the particles needs to drive something else in your scene. This is
common for synchronizing lighting or gameplay with your visual effects.
At the same time, you can use GPU Events to spawn particles based on other particle
behavior. This way, when a particle dies in one system, you can notify another system, which
creates a useful chain reaction of effects, such as a projectile particle that spawns a dust
effect upon death.
A GPU Event Context receives an Event from the Trigger Event Rate Block.
The following Update Blocks can send GPU Event data:
— Trigger Event On Die: Spawns particles on another system when a particle dies
— Trigger Event Rate: Spawns particles per second (or based on their velocity)
The Blocks’ outputs connect to a GPU Event Context, which can then notify an Initialize
Context of a dependent system. Chaining different systems together in this fashion helps you
create richly detailed and complex particle effects.
The Initialize Context of the GPU Event system can also inherit Attributes available in the
parent system prior to the Trigger Event. So, for instance, by inheriting its position, a new
particle will appear in the same place as the original particle that spawned it.
Event Attributes
Use Event Attribute Payloads to pass data like 3D position or color along with the Event.
These Payloads carry Attributes that implicitly travel through the graph where you can
“catch” the data in an Operator or Block.
You can also read Attributes passed with Spawn Events or Timeline Events. The Set
SpawnEvent Attribute Block modifies the Event Attribute in a Spawn Context.
To catch a Payload in an Initialize Context, use Get Source Attribute Operators or Inherit
Attribute Blocks.
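A hedged sketch of sending such a Payload from C# could look like this; the class name and method are illustrative, "OnPlay" is the default Start event of a Spawn Context, and the graph is expected to catch "position" and "color" with Get Source Attribute or Inherit Attribute in its Initialize Context.

using UnityEngine;
using UnityEngine.VFX;

public class ImpactBurst : MonoBehaviour
{
    [SerializeField] private VisualEffect visualEffect;

    public void PlayAt(Vector3 worldPosition, Color tint)
    {
        // The Payload carries Attributes alongside the Event.
        VFXEventAttribute payload = visualEffect.CreateVFXEventAttribute();
        payload.SetVector3("position", worldPosition);
        payload.SetVector3("color", new Vector3(tint.r, tint.g, tint.b));

        // Raise the Spawn Context's default Start event with the Payload attached.
        visualEffect.SendEvent("OnPlay", payload);
    }
}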
However, it’s important to keep these caveats in mind when using Event Attributes:
— Regular Event Attributes can only be read in the Initialize Context. You cannot inherit
them in Update or Output. To use the Attribute in a later Context, you must inherit and
set it in Initialize.
— Output Event Attributes only carry the initial values set in the Spawn Context. They
do not catch any changes that occur later in the graph.
See Sending Events in the Visual Effect component API for more details.
UI improvements in Unity 6
Unity 6 includes several quality of life improvements and updates to the VFX Graph UI.
Node search
Creating nodes or blocks now uses a hierarchical tree view, making it easier to browse the
node library. Enhancements include custom colors and a favorites folder for a more efficient
and personalized search experience. Use the advanced search filtering to select from the
available nodes.
The new side detail panels also display any node sub-variants (e.g., Output Particle Unlit
Octagon and Output Particle Unlit Triangle are sub-variants of Output Particle Unlit Quad). You
can toggle the button to show sub-variants to control their visibility. Disable it to see only the
most common nodes, or enable it to access all available variants.
Activation ports
A Block has a special activation port, located on the top left next to its name, which is linked to
a boolean property. This port allows you to control whether a Block is active.
You can manually toggle the Block on or off, or connect graph logic to the port to control when
the Block should be active. This allows you to implement different behaviors or states per
particle within the same system.
Note that statically inactive Blocks are grayed out and automatically removed during
compilation, resulting in zero runtime cost.
Keyboard shortcuts
The Shortcut Manager now has a VFX Graph category that lets you modify the shortcut
commands available in the Visual Effect Graph window. New shortcuts have been added to
speed up the VFX artist's workflow.
Two samples, available in the Package Manager, can help show these features in context: The
VFX Graph Learning Templates and the VFX Graph Additions.
The VFX Graph Learning Templates showcase a number of techniques. This collection of
educational samples can help you explore a specific aspect or feature set of the VFX Graph.
The sample content is compatible with both URP and HDRP projects, for VFX Graph versions
17.0 (Unity 6) and later.
Use the Scene view to move around freely or the Game view to focus on each effect. The
Sample Showcase Window in the Inspector displays the corresponding information, with
quick-access links to the documentation or to navigate between effects. Each VFX asset
includes embedded notes and explanations to guide you.
We will explore these samples in more detail under Visual effects by example.
Meanwhile, the VFX Graph Additions in the Package Manager demonstrate several simple
graphs, making them a starting point for learning how to manage particles. In the example
below, you can see how the Smoke, Flames, and Sparks build up to form the Bonfire effect:
You’ll encounter some common Blocks and Operators as you explore the samples provided:
— Noise and Random Operators: Procedural Noise helps reduce the “machine-like” look of
your rendered imagery. The VFX Graph provides several Operators that you can use for
one-, two-, and three-dimensional Noise and Randomness.
— Attribute Blocks: These similarly include the option of applying Randomness in various
modes. They can vary slightly per Attribute, so experiment with them to familiarize
yourself with their behavior.
Randomness Blocks
For more information on creating your own flipbooks within Unity, check out the Image
Sequencer in the VFXToolbox section.
Flipbook Nodes
— Physics: Forces, Collisions, and Drag are essential to making particles simulate natural
phenomena. But don’t be afraid to push the boundaries of what’s real. As the artist, you
get to decide what looks just right.
Physics Blocks
— Subgraphs: The Smoke, Flames, and Sparks are Subgraphs. They are the parts of a VFX
Graph that can be saved as an asset for later reuse.
Splitting the main elements into smaller parts makes the Bonfire graph more readable.
So if you need to make a new explosion effect somewhere else in your application, for
example, you can now deploy it by dragging and dropping it into another graph. This
works because the Subgraph is an asset.
For a breakdown of how to construct the Bonfire graph, among other effects, watch these
community videos from Thomas Iché, a senior VFX and technical artist involved in creating the
samples and the Unity Spaceship Demo.
More resources
Once you’re familiar with the basic workings of a VFX Graph, try building a few effects from
scratch. Start with a simple system for falling snow, then play around with fire, smoke, and
mist.
The following videos offer an introduction to several effects; however, please note that they
use older versions of the VFX Graph:
— Creating fire, smoke, and mist effects with the VFX Graph in Unity
Additional references
As you get more comfortable with the VFX Graph, you can dive deeper to discover its
nuances. Keep these pages handy when you need to reference specific Node or Operator
functionalities:
— The Node Library describes every Context, Block, and Operator in the VFX Graph.
— The Standard Attribute Reference offers a comprehensive list of all common Attributes.
— The VFX Type Reference lists Data types used in the VFX Graph.
Once you understand the fundamentals of the VFX Graph, challenge yourself to craft more
complex graphs. A number of example projects are available to help you better prepare for
problems you might encounter during production.
These samples run the gamut of what’s possible for your visuals. From ambient smoke and fire
to fully scripted, AAA cinematic gameplay, take your time to explore.
The graphs are small and focused, making them ideal learning samples. Dive into each
template to master a new technique or use it as a starting point for your own effect. Each
graph comes with detailed notes to help you understand its construction.
The Learning Templates are available from the wizard when creating a new VFX Graph, or you
can import them via the Package Manager as a complete sample scene for either URP or HDRP.
Use the VFX Samples Showcase window to navigate the samples. Let’s take a quick tour of the
Learning Templates.
Visualizing gizmos
Many Blocks have adjustable parameters that may correspond to visual gizmos in the Scene
view. You can change the values directly within the Blocks themselves or manipulate the
gizmos in the Scene view.
To manipulate the gizmo, you need to “attach” the VFX Graph that you’re editing to a
corresponding VFX instance in your scene.
Use the small “link” icon near the top of the window to attach the VFX Graph to the selected
GameObject with a Visual Effect component.
Once attached, selecting the corresponding Block in the VFX Graph editor syncs and displays
its gizmo. Use the handles of the gizmo to change the settings interactively.
For example, this shows the Bounds settings in the Initialize Particle Context and its
corresponding gizmo in the Scene view.
The Bounds gizmo represents the Bounds settings in the Initialize Particle Context.
Graph fundamentals
These samples show the basics of graph logic and how to build and optimize your VFX Graphs.
Spawn Context
This VFX Graph demonstrates how to use the options in the Spawn Context to control how
particles first appear. By utilizing the Spawn State, you can access valuable information such
as the loop index, spawn count, loop state, and loop duration.
In the example effect, a stack of numbers appears to count up by manipulating the texture index.
This is achieved by generating a single burst of particles with a small delay between each burst.
Additionally, this example shows how to set attributes like lifetime, color, and alpha in the
Spawn Context so they can later be inherited in the Initialize Particle Context.
Capacity Count
This VFX Graph generates randomly sized particles within a volume to demonstrate how to
use the Capacity attribute.
Capacity Count is used for the particle memory allocation of a system. Increasing this number
will increase the memory allocated. This capacity serves as the maximum for the current
number of active particles.
Multiple Outputs
This VFX Graph demonstrates how the Initialize or Update context can be wired to one or
several output contexts. This allows you to create one particle simulation and generate several
types of renderer on each particle.
In this demo, each particle renders as an unlit particle, a mesh, and a quad. For example,
imagine using a Mesh Output to make the core of a missile and then a second Quad Output to
add an emissive glow on top of it.
Note that some Outputs are only compatible with a particular SRP (e.g. the HDRP Volumetric
Fog output is only compatible with HDRP).
Bounds
This shows how the Bounds settings can cull an effect to improve performance. In this
example, the Bounds are deliberately set outside the camera frustum by default. As a result,
the particles are culled (in this example, they may still cast shadows if enabled for the Output
Context).
Adjust the Bounds settings so the Bounds gizmo is within the camera frustum and the
particles reappear.
Bounds can be set in one of three modes:
— Manual: You set the Bounds values yourself in the Initialize Context.
— Recorded: This allows users to record the Bounds using the VFX Control and then apply
those settings to the system.
— Automatic: Bounds are computed each frame. This can be needed for dynamic VFX or
when iterating on an effect, but it is resource intensive. Use Manual or Recorded when possible.
The Orient Face Camera, Orient Fixed Axis, and Orient Advanced VFX Graphs demonstrate
different ways of using particle orientation.
You can also rotate particles using the Angle attribute, as seen in the Rotation & Angle VFX
Graph.
For more realistic results, use angular velocity. The Rotation & Angular Velocity sample shows
how to set this in the Initialize Context and then use the Update Context to update the rotation.
While the Flipbook Block helps animate your sprite sheet, you can control this manually by
manipulating the texIndex attribute, a float that determines which part of a sprite sheet to
display.
In the Flipbook Mode VFX Graph, compare how enabling Flipbook Blend Frames can create
smoother, interpolated animation beyond the basic animated flipbook texture.
The Flipbook Blending VFX Graph illustrates the differences between traditional frame
blending and frame blending using motion vectors. Motion vector blending uses a texture to
describe the pixel displacement between frames, which can be useful for reducing the number
of frames in the flipbook or for showing the effect in slow-motion.
The TexIndex Advanced VFX Graph is composed of several systems that use the texIndex
attribute creatively. Time, noise, and even particle position are used to drive the
attribute values. By animating several particles together, this setup produces a multilayered
motion graphic effect.
Particle pivots
By default, the pivot is centered on the particle’s position, but you can offset it on any axis (XYZ).
In the Pivot Attribute VFX Graph, each particle has a different pivot offset and angular velocity.
The Pivot Attribute VFX Graph shows three particles with different pivots.
Controlling the Pivot attribute of a particle can unlock interesting motion. The Pivot Advanced
VFX Graph gives an example of pivot manipulation. Here, the petals, leaves and spikes of the
flower are all particles.
Initially, the petal’s pivot is set at its root to allow for proper bending while the petal is
attached. As the particle’s lifetime progresses, the pivot is animated to shift to the center,
allowing the petal to break away and float off naturally.
This sample randomly spawns particles on the surface of a mesh. The Position and Color
outputs pass into an Initialize Particle Context that stores the particle properties for
later animation. In this example, an external wind force and turbulence disperse the
lion statue into a cloud of dust.
You can achieve a similar effect using a 2D image. The Sample Texture 2D VFX Graph shows
how to use the Texture2D Sample operator to determine the color of particles and perform
rejection sampling. The graph spawns particles in a 2D grid and remaps their XY coordinates
to sample the texture. Then, particles die based on a threshold of the sampled texture values.
You can also sample a signed distance field (SDF), a technique to represent a shape and
contours of 2D or 3D objects. An SDF calculates the distance from any point in space to the
nearest point on the object’s surface. The SampleSDF VFX Graph demonstrates how to make
particles crawl along the surface of the mesh using an SDF.
If you need to apply effects to a rigged character or prop, sample a skinned mesh to get
information about its surface position, vertex colors, UVs, normals, etc. The Sample Skinned
Mesh VFX Graph retrieves the surface UVs in order to spawn feathers on the creature’s back.
Open the Collision Simple VFX Graph to see how to set up basic collision graph logic.
The Collision Properties VFX Graph demonstrates how properties like bounce, friction,
lifetime loss, and roughness can influence the collision response of particles.
In Unity 6, individual Collider Blocks have been combined into the Collision Shape Block
so that you can easily switch between shapes. Several options have been added, with
improvements to both stability and accuracy. Chain several Collision Shape Blocks within a
Context for the desired effect.
New Collision Attributes allow for more precise control over particle collisions.
If simple collision shapes can't provide a precise enough collision with the environment,
a signed distance field can be a good solution to approximate complex geometry. The
Collision Advanced VFX Graph shows how to use the Collide with Signed Distance Field Block
for particle collisions with the sculpture of a hand.
Decal particles
Use decals to project textures onto the environment and even onto dynamic objects or skinned
meshes. These can add visual complexity to your scene without significantly impacting
performance. Common decals include bullet impacts, footprints, and surface damage.
Output Decals allow you to render particles as decals and to project their properties onto a
surface using a base color map (albedo), a normal map, or a mask map. This example shows
how to project decals onto an animated Skinned Mesh Renderer component.
Particle Strips
A Particle Strip System is a linked group of particles that can create an effect like a ribbon
or trail by drawing quads between them. Take these two simple Particle Strips from the Strip
Properties template: a line on the left and a wider ribbon on the right.
To generate Strips, you use the Initialize Particle Strip Context, where you set the Strip
Capacity and Particles Per Strip Count. These settings determine the maximum number of
strips the system will handle and how many particles will compose each strip. Both settings
influence the strip’s appearance and behavior.
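As a rough example, a Strip Capacity of 8 with a Particles Per Strip Count of 64 lets the system manage up to 8 strips of 64 particles each, or 512 particles in total.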
Inside the graph, note that a few Attributes are specific to Particle Strips:
— The Strip Index defines which strip each particle belongs to. In a simple setup like this
one, all the particles fall along one Particle Strip, so they share a Strip Index of 0.
— The ParticleIndexInStrip Attribute is a uint that represents the particle's position within
the Particle Strip Buffer. This is a number between 0 and the total number of particles
within the strip (in this example, between 0 and 8).
— The SpawnIndexInStrip Attribute is a uint that represents the Spawn Index in the Strip.
Unlike the ParticleIndexInStrip Attribute, the SpawnIndexInStrip isn't a unique ID. Two
particles born on different frames on the same strip could have the same index value.
There are several ways to spawn particles, and when dealing with Particle Strips, this can
have some implications for how you need to set up your VFX Graph. These examples show a
variety of different setups.
The Strip SpawnRate template shows how to make a single trail out of a continuous spawn
rate of particles.
The Multi-Strip SpawnRate template shows how to make multiple trails out of a continuous
spawn rate.
This example generates several robot arms out of a single burst. See the Multi-Strip Single
Burst template to see how to set up the Strip Index (divide the Spawn Index by the number of
particles per Strip).
The robot tentacle is a Particle Strip while the claw arm at the end is a Lit Mesh Particle.
Meanwhile, the Multi-Strip Periodic Burst example shows how to create new trails for each
periodic burst in a system. The Loop Index is stored as the texIndex for use outside the Spawn
Context. This value is then used as the Strip Index during initialization and then normalized
into a “strip ratio.”
Calculating this strip ratio can be useful when dealing with Particle Strips. In this example, it
influences their behavior and appearance, such as how they stretch, change size, and respond
to forces like gravity and turbulence.
The Strip GPU Event example shows a growing mushroom VFX. The mushroom's cap (the
Mushroom Hat System) renders particle meshes, while the mushroom's stems (the Mushroom
Foot System) are made from Particle Strips.
One system acts like a “parent” to the other, using GPU Events to trigger the next system.
This mechanism also generates clouds of dust and smoke as the mushrooms reach full
growth, adding to the effect.
The Multi-Strips GPU Event template demonstrates how to create a more complex effect with
Particle Strips. In this example, each headphone jack is a particle mesh that spawns particles
along its path, bypassing the usual one-strip-per-parent particle.
As each jack follows a Bezier path from the floor to the plugs beneath the speakers, it leaves
a trail of snaking cables that move organically. When a jack successfully connects, additional
green light particles add a dynamic touch to the effect.
Note: The samples only support HDRP and are therefore incompatible with URP.
Go to the Release tab to find snapshots of these samples, as well as links to prebuilt binaries.
Alternatively, you can clone the entire repository.
Each sample appears in a subdirectory within the project’s Assets/Samples folder. The main
VisualEffectsSample scene lives at the root.
If you need to build a player, ensure this main scene is set to index zero within the Build
Settings. Then add all other scenes you plan to cycle afterward.
Each scene in the VFX Graph Samples project showcases a unique effect. Let's take a closer look
at some of them.
Smoke Portal
The Smoke Portal is a swirling vortex of dense, volumetric smoke that forms a mystical portal.
The effect combines a Houdini-simulated smoke animation with real-time lighting using the
new six-way smoke lighting material (see below). This technique allows VFX Graph to use
baked light maps from six axes.
Surrounding the main portal are additional elements that enhance its magical appearance.
These include flickering flames, sparks, floating rocks, and a subtle distortion effect that
warps the space around the portal.
This making-of video walks you through the complete workflow for this sample. Here are some
important steps in how it’s made:
— Houdini simulation: The smoke portal VFX is composed of layers including a smoke
simulation exported from Houdini. The original vortex starts from a torus with noise in
Houdini. Density, temperature, and velocity attributes drive the smoke simulation, which
is then exported as an 8-by-8 flipbook texture.
— Flames and additional effects: Flame effects are developed using flipbook textures
and blended with the smoke ring for realism. Rocks and sparks are added to the scene,
creating a sense of gravitational pull and integrating the effect with the surroundings.
— Six-way lighting: The six-way lighting feature uses baked light maps from six different
axes. This technique avoids the high computational cost of volumetric rendering by
instead using sprites. Real-time lights can interact with the smoke, creating realistic
lighting and shadows without a true volumetric render.
— Flipbook blending: Motion vectors help blend the frames of the flipbook. TFlow (from the
Asset Store) helps to generate motion vectors from the exported flipbook textures;
when reducing the texture resolution, using the TFlow motion vector map could help
compensate for lost quality. Flipbook textures are applied to a parabola-shaped mesh for
better volume and integration with the environment.
— Procedural crystals: VFX Graph helps to create varied crystal formations in the
surrounding environment. Instancing reduces the computational load by grouping the
VFX crystals into single batches.
— HDRP lighting: Spotlights provide focused illumination around the flames, while area lights
complete the ambient lighting along the path. Dynamic lighting effects add animation to
the light positions with a light flicker script to make the scene more life-like. The radius of
point lights is kept small to optimize the setup without compromising visual quality.
— Optimization: Several strategies improve performance in the VFX Graph. These include
reducing the portal’s flipbook texture from 8K to 2K resolution, using alpha clipping to
reduce transparent overdraw, and utilizing VFX Graph’s instancing feature for the crystal
elements.
A common question arises: How do you light the smoke based on flat, 2D geometry? Adding
more textures for variety quickly eats up the memory budget, reducing both texture quality
and the variety of explosions you can have. Other methods, like normal mapping or fully
baked color maps, often aren’t realistic looking or flexible enough.
That’s where six-way lighting comes in. This method allows for smoke rendering from baked
simulations and works well across different lighting conditions. It can approximate the
volumetric feel of smoke with a cost-effective process.
The secret of six-way lighting lies in using a set of lightmaps that capture how the smoke
looks when lit from six different directions (top, bottom, left, right, front, back).
These lightmaps are then baked into two RGBA textures. The first texture’s RGB channels
store the top, left, and right maps, while the alpha channel stores the bottom map.
The second texture’s RGB channels store the front, back, and an additional map, with the
alpha channel available for an optional emissive mask.
When rendering, the shader blends between the six lightmaps based on the direction of
the light relative to each particle. This means the smoke can be dynamically shaded for
different lighting conditions, using direct lighting from all light types and indirect lighting
from light probes and other global illumination techniques.
— Improved visual integration with the environment: The dynamic shading helps the
smoke blend with its surroundings, adapting to changing lighting conditions.
Six-way lighting enables varied smoke rendering under different lighting conditions.
— Memory efficiency: By reusing the same baked lightmap textures under any lighting
conditions, this method conserves memory compared to authoring separate textures for
each lighting scenario.
Six-way lighting can be a useful technique in your effects toolkit, balancing visual quality,
performance, and memory usage for rendering real-time smoke effects.
Watch VFX Graph: Six-way lighting workflow for a complete walkthrough of the technique.
You can also read this blog post for more information.
GooBall
Have you ever wanted to splatter paint in Unity? This playful sci-fi demo incorporates decals
that simulate gooey substances interacting with their surrounding environment. Like other
production examples, this multilayered effect leverages several Systems to achieve its final look.
The center blob starts with a Vertex Animation Texture (VAT) Shader Graph that creates the
impression of fluid movement. A scrolling texture sheet in the shader constantly undulates the
mesh’s 3D points, which is ideal for sci-fi goo.
Because the Shader Graph connects directly to the Output Particle Lit Mesh Context, there
are some input ports that influence the shader. Here’s the VAT Shader Graph at a glance:
In addition to the vertex motion, the Shader Graph also creates the appearance of a
transparent, green glass-like material. This makes up the goo’s look.
Shader Graph is closely integrated with the VFX Graph. In this example, you can edit some of
the GooBall’s shading parameters from the VFX Graph’s Blackboard. Go to the Inspector to
access a unified set of sliders and fields for tweaking the goo’s appearance.
The main blob triggers a GPU Event to spawn some particles on the surface of a sphere.
Downward force is applied with the proper maps and some droplets of goo drip periodically.
Even though the VFX Graph doesn't directly interact with a Collider on the floor, you can
approximate the environment with the Camera's depth buffer. This works if precision isn't a concern.
As drops hit the depth buffer, they trigger a GPU Event. This Event passes the position, color,
and size Attributes to a separate System for handling the decals.
To orient the splats correctly against the geometry, the buffer’s depth normals are calculated
using something like this:
Depth normals
They then pass into the Z axis Attribute so that all the decals face the right way. With the new
puddle texture created, each decal splats convincingly across the uneven surface of the floor.
The scene gets more chaotic with extra projectiles that fire off randomly from the spherical
surface. This reiterates the power of VFX.
Physics-based effects
The VFX Graph can compute complex simulations and read frame buffers. However, it does
not support bringing particle data into C# or connecting to the underlying physics system.
That's why you'll need to use some workarounds to create physics-based effects, such as
colliding particles with the camera's depth buffer or with signed distance fields.
As the projectile collides with the room geometry, a similar technique calculates the splats.
This time, though, you should also check if a second Trace decal falls within the height range
of the walls. If it does, the Y scale will animate slightly, completing the illusion of the slime
slipping down the smooth, metallic surfaces.
Even if you're not aiming to reproduce this exact effect, the GooBall scene shows you how to
approximate collisions with the depth buffer and drive decals from GPU Events.
Particle Strips can often simulate animated trails. Guide them with other particles using GPU
Events, and notice how every point of a trail evolves independently, allowing you to apply wind,
force, and turbulence.
The Ribbon Pack graph spawns multicolored particles on the surface of a spherical arc. Use
Noise to add some organic motion, or modify the available Blocks to customize each Particle
Strip’s texture mapping, spawning, and orientation.
If you were to render this effect conventionally with mesh particles, it would resemble
something like this:
Instead, the first System is hidden and doesn’t appear onscreen. A Trigger Event Rate Block is
used to invoke a GPU Event.
This sends a message to a series of Contexts that initialize, update, and render the Particle
Strips. The result is a prismatic tangle of cables or fibers.
Particle Strips have numerous applications; think of magical streaks, weapon trails, and wires, to
name a few. The Magic Book sample uses Particle Strips for the trails swirling around each beam.
They also stand in for blades of grass in the Meteorite sample, discussed in the section below.
Meteorite sample
The Meteorite sample combines several effects to accentuate the impact of a meteor crashing
into the earth. A MeteoriteControl script on the Timeline object listens for your keypresses or
mouse clicks and then activates the meteorite effect.
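The sample's own script isn't reproduced here, but a control script along these lines, purely as an illustrative sketch, could listen for input and restart a Playable Director that sequences the effect:

using UnityEngine;
using UnityEngine.Playables;

// Illustrative sketch only: the actual MeteoriteControl script in the sample may differ.
public class MeteoriteControlSketch : MonoBehaviour
{
    [SerializeField] private PlayableDirector director; // the Timeline that drives the effect

    void Update()
    {
        // A keypress or mouse click restarts the Timeline, which activates the meteorite VFX.
        if (Input.GetKeyDown(KeyCode.Space) || Input.GetMouseButtonDown(0))
        {
            director.time = 0;
            director.Play();
        }
    }
}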
In this scene, the ground and surrounding trees react to the blast. But don’t worry, no VFX
critters were harmed in the making of this effect.
Here, a single graph called MeteoriteMain drives several others. The graph itself is neatly
organized:
A chain reaction of effects plays every time a meteor drops from the sky. It consists of a
central Spawn Context that triggers many other effects.
Note: The MeteoriteMain graph looks relatively clean because much of it is broken into
Subgraphs. The Spawn Event plugs directly into many of their Start ports. Drill down into
each individual Subgraph to see its specific implementation.
This structure breaks a number of the details into smaller, more manageable parts. It makes
the graphs easier to navigate, so you won't have to wade through a confusing web of nodes.
The Spawn Event plugs into the Subgraphs and Output Events.
Output Events are used to communicate with other components outside of the graph. In
particular, the light animation, camera shake, and plank debris have their own Output Events.
If you select the GameObject called VFX_MeteoriteMain, you’ll see various Output Event
Handler scripts that receive these events and respond accordingly.
Meanwhile, the graph’s Buffer Output Event spawns a Prefab called ImpactBuffer, which
has its own VFX Graph. It animates ground decals, separated into red, blue, and green color
channels. The ImpactBuffer effect looks something like this:
You only need a 2D recording of it. A separate camera called Buffer Recorder looks straight
down and generates a render texture called Meteorite_BufferRender. This texture buffer then
sends data to a separate graph called GrassStrip.
The BufferRecorder
The texture plugs into the GrassStrip graph, driving the movement and rendering of the
Particle Strips. Here you can see how the texture buffer drives the effect:
Beyond the main meteorite, there are some secondary effects.
These secondary effects rely on the Meteorite Timeline, which contains several Visual Effect
Activation Tracks. When the MeteoriteControl script kicks off the Playable Director, each
secondary effect plays back according to its prescribed timing.
See the Interactivity chapter for more information on how to set up Timeline and Output
Events.
The Placement Mode can be set to Vertex, Edge, or Surface. Here’s the Position (Mesh)
Block at work on some simple meshes:
This variation modifies the original Magic Lamp scene from the samples. With some additional
work, you can use the Sample Mesh Operator to initialize particles on the lamp’s surface
and change their orientation as they float away. Grab the Magic Lamp’s Sample Texture to
smoothly integrate the particle colors with the mesh surface.
The sample effect called EllenSkinnedMeshEffects illustrates a variety of effects involving its
title heroine. However, the general process for sampling skinned mesh data is similar in each
case:
— Expose a Skinned Mesh Renderer property on the Blackboard, and connect it to the
Skinned Mesh flow port.
— Use a Skinned Mesh Operator if you need additional access to Surface, Edge, or Vertex
data (see the Hologram and Disintegration graphs for example usage).
— In the Inspector, set the Property with a Skinned Mesh Renderer from your scene (you can also assign it from a script, as sketched after this list).
— Use a Property Binder (see Interactivity chapter) to transform the effect’s position and
orientation in your character’s skeleton. Set this Transform and its corresponding Set
Position Block to World space for optimal results.
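Here's a minimal sketch of the scripted assignment, assuming the Blackboard property is an exposed Skinned Mesh Renderer named "SkinnedMesh" (the property name is an assumption; use whatever you exposed):

    using UnityEngine;
    using UnityEngine.VFX;

    // Minimal sketch: assign a scene Skinned Mesh Renderer to an exposed
    // VFX Graph property at runtime. "SkinnedMesh" is an assumed property name.
    public class AssignSkinnedMesh : MonoBehaviour
    {
        public VisualEffect vfx;
        public SkinnedMeshRenderer characterRenderer;

        void Start()
        {
            if (vfx.HasSkinnedMeshRenderer("SkinnedMesh"))
                vfx.SetSkinnedMeshRenderer("SkinnedMesh", characterRenderer);
        }
    }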
You’ll also need to add a Mesh Transform Property Binder that accounts for the Base
Transform of the skeleton itself.1 This varies from character to character, but in this case, it
connects the character’s hip joint. Now the particles can follow the Skinned Mesh Renderer.
1
Unity 2021 LTS is shown here. This step is not necessary in Unity 2022.2 or newer.
Leverage Skinned Mesh sampling’s versatility when creating effects for characters and objects
alike. In the samples, the Ellen character is shown:
— Jolted by electricity
— Rendered as a hologram
— Disintegrating into particles
Each of these is a separate graph that samples a Skinned Mesh Renderer. You can examine
their implementation details to see how the particles interact with the character mesh.
More examples
There are many more possibilities you can explore with the VFX Graph Samples. Be sure
to check out the other scenes in the project; each one demonstrates a different set of
techniques for creating a specific visual effect.
Your choice of render pipeline affects the available output Contexts in VFX Graph. Your design
needs and target platform will ultimately determine which render pipeline is most suitable for
your application.
The Universal Render Pipeline (URP) is optimized for performance across a wide range of
devices, from low-end mobile to high-end consoles and PCs. It provides a streamlined feature
set with simplified lighting and single-pass forward rendering. URP supports VFX Graph,
though some advanced features may be limited or unavailable.
The High Definition Render Pipeline (HDRP) is optimized for high-end PCs and consoles, using
a deferred rendering path to handle complex lighting and shading. HDRP includes advanced
features like ray tracing, volumetric lighting, subsurface scattering, and screen-space
reflections. However, HDRP’s higher performance overhead and limited platform support make
it less suitable for low-end devices and mobile platforms.
You can visit the full render pipeline feature comparison here.
Device support
The VFX Graph requires compute shader support on the target device. Compute support
on mobile devices varies widely across brands, mobile GPU architectures, and operating
systems. Unity's Built-In Particle System is recommended if your platform does not support
compute shaders.
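You can check for compute support at runtime and fall back accordingly. Here's a minimal sketch; the two-GameObject setup is an assumption about how your project organizes its effects, not a Unity requirement.

    using UnityEngine;

    // Minimal sketch: enable the VFX Graph version of an effect only when the
    // device supports compute shaders; otherwise use a Built-In Particle System fallback.
    public class VFXComputeFallback : MonoBehaviour
    {
        public GameObject vfxGraphEffect;        // GameObject with a Visual Effect component
        public GameObject builtInParticleEffect; // fallback using the Built-In Particle System

        void Start()
        {
            bool supported = SystemInfo.supportsComputeShaders;
            vfxGraphEffect.SetActive(supported);
            builtInParticleEffect.SetActive(!supported);
        }
    }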
For detailed render pipeline compatibility, please refer to the Unity VFX Graph System
Requirements.
Unity 6 has added support for several features in the Universal Render Pipeline (URP) that
were previously available only in HDRP.
Lit output
VFX Graph now supports Lit outputs in URP. Use them to create effects that respond directly
to the scene’s lighting.
Decals
Spawn URP decals with VFX Graph and customize them with Shader Graph for both HDRP
and URP. Use them to add bullet impacts, footprints, surface damage, or any other dynamic
effects to the environment.
Motion vectors
VFX Graph particles can generate motion vectors in URP, useful for effects like Temporal
Anti-Aliasing (TAA) or Motion Blur. Note that URP only supports opaque particles, while both
transparent and opaque are available in HDRP.
Camera buffer
You can now sample URP camera buffers to obtain the scene’s depth and color. This feature
allows you to perform fast collision on the GPU or to spawn particles against the depth buffer
and inherit the scene color. For instance, you can create VFX impacts and splats, a character
dissolving into particles, or effects that change color based on the underlying objects.
Ray tracing
HDRP also provides hardware ray tracing features that visual effects can take part in:
— Ray traced reflections: This can use offscreen data for more accurate reflections, as an
alternative to Screen Space Reflection.
— Ray traced shadows: Ray tracing can replace traditional shadow maps for more
accurate and realistic shadows.
— Ray traced ambient occlusion: This is an alternative to HDRP’s screen space ambient
occlusion, with a more accurate ray traced solution that can use off-screen data.
— Ray traced global illumination: This is a more accurate representation of indirect light,
accounting for light bouncing off surfaces.
RTX support adds the ability to render VFX in ray tracing passes, so your visual effects can be
taken into account in ray traced reflections.
RTX support adds the ability to render VFX Graphs in ray tracing passes.
You can enable ray tracing with VFX Graphs that use quads, triangles, and octagons. VFX
Graph does not support ray tracing with meshes or strips.
To use ray tracing in an HDRP scene, refer to Getting started with ray tracing.
For more details on render pipeline compatibility with VFX Graph, visit the Unity Graphics
product roadmap.
Visual effects often defy the rules of the real world, requiring unique shading and rendering
beyond what standard URP and HDRP shaders offer. Sci-fi force fields or magical auras lack
real-world counterparts; thus, the pre-built shaders might be inadequate to describe them. For
these scenarios, you can customize shaders using Shader Graph.
One of the key advantages of Shader Graph integration is the ability to drive shader behavior
on a per-particle level. This allows for creating variations, color randomization, and other
dynamic effects with different per-particle values, enabling highly complex visuals.
Built-in Outputs
Before diving into custom shaders, be aware that Unity provides a variety of built-in Output
Contexts optimized for different render pipelines, including:
— HDRP/URP Lit Output: This output is optimized for rendering particles with realistic
lighting and shading. Lit Outputs are useful in scenarios where the visual effect needs to
react to the lighting in the scene.
— Unlit Output: Designed for simple, non-lit effects, this output does not interact with
scene lighting. Use this for stylized or 2D-like effects where lighting calculations are
unnecessary.
— Six-Way Smoke Lit: This is a Material Type option for HDRP Lit and URP Lit Outputs.
Using this output supports the six-way lighting technique to achieve realistic,
volumetric-like shading.
— HDRP Distortion Output: This HDRP output creates effects that warp or distort the
background, simulating phenomena like heat haze, water ripples, or glass refraction.
— Decals Output: Used for projecting textures onto surfaces in your scene, this output for
URP and HDRP allows you to add details like bullet holes, graffiti, or surface wear directly
onto existing geometry.
These outputs come with integrated shader functionality, including features like frame
blending, flipbook UVs, emissive properties, and more. This allows you to create sophisticated
effects without needing to build custom shaders from scratch.
Shader effects
Shader Graph enables technical artists to build custom shading with a graph network.
Though Shader Graph and shader authoring are entire subtopics unto themselves, a working
knowledge of shaders can complement your usage of VFX Graph. Shaders allow you to
manipulate light and color to give your effects an added boost.
With Shader Graph, you can warp and animate UVs or procedurally alter a surface’s
appearance. Shaders can act like fullscreen image filters or be useful for changing an object’s
surface based on world location, normals, distance from the camera, etc. The visual interface
of Shader Graph helps you iterate more quickly with real-time feedback.
Shaders can be used to create dynamic effects like fire that flickers or water that reacts to
objects moving through it. For natural phenomena like clouds, smoke, and fluids, shaders can
help create intricate and non-repetitive patterns, adding depth and variation.
Here are some other ways shaders can complement your VFX Graphs:
— Distortion effects: Shaders can manipulate pixels to simulate the refraction of light. Use
a distortion shader to create a wavy effect over a heat source like a bonfire or jet engine.
Distortion can also imitate the appearance of ripples in water or another liquid.
— For example, the Portal effect from the Visual Effect Graph Samples uses a Shader
Graph to distort the center of the portal interior.
In the Magic Book sample, for instance, a specialized shader makes the flying pages
appear to dissolve into embers.
— Lighting and shading: Add realism or stylization through advanced lighting effects. VFX
Graphs often benefit from glowing shaders or complex lighting interactions, such as
reflections, refractions, and shadows.
For instance, if you have a magic effect that imitates glass, water, or metal, your shader
will need to simulate accurate reflections. Making a toon shaded game? Shader Graph
can help you render stylized anime-like effects.
In this example, the darts are particles that use a Shader Graph to render the lit meshes
like springy projectiles.
— Color transitions: Shaders can interpolate colors based on time, position, or user input.
Use a ramp or gradient to transition colors smoothly across a surface or through an
animation sequence. Make your effects more dynamic, such as a flame shifting from a
hot blue core to red and yellow outer layers.
— Blending masks and transparency: Effects requiring varying levels of transparency can
benefit from custom shading. Use a Shader Graph to control the alpha value of pixels.
Blending masks can fade based on height, angle, or distance to camera. This technique
is also useful for glass, ice, or anything that needs to transmit light.
— Particle shading: Shaders can add complexity to particle effects by adding small visual
details or manipulating their appearance based on various parameters. For example, use
a Shader Graph to control the color, size, and brightness of particles in fireworks. Or
apply shaders to make magic spells change color or animate their light emission.
VFX Graph now includes integration with Shader Graph keywords. This allows you to create
one Shader Graph for use in multiple VFX Graphs. Then, enable features based on those
keywords in the VFX Graph Out Particle Mesh node.
The VFX Graph can enable behavior based on Shader Graph keywords.
You can explore examples in the VFX Graph Learning Templates, or look for more general
examples in the Shader Graph Feature Examples available in the Package Manager.
Visual effects often involve many moving pieces. Connecting them to the correct points in your
application is essential to integrating them at runtime.
Whether you need a projectile to explode on contact or bolts of electricity to jump from the
mouse pointer, one of these available tools can help them interact with the rest of your Unity
scene.
— Event Binders: These listen for things that happen in your scene, such as mouse actions,
collisions, triggers, and visibility changes, and react to them at runtime.
— Timeline: You can sequence visual effects with Activation Tracks to send events to your
graph at select moments. Gain precise control with pre-scripted timing (e.g., playing
effects during a cutscene).
— Property Binders: These link scene or gameplay values to the Exposed properties on
your Blackboard so that your effects react to changes in the scene, in real-time.
— Output Events: Use these for sending messages from the VFX Graph to scripts or other
scene components.
Let’s explore each of these tools in more detail as they are crucial techniques for bridging your
GameObjects with VFX Graphs.
Event Binders
Event Binders are MonoBehaviour scripts that can invoke Events from within the VFX Graph.
They ensure that your effects react to mouse actions, collisions, triggers, and visibility events
in the scene.
If you’re familiar with the GameObject playback controls in the Scene view, the Play() and
Stop() buttons at the bottom send the OnPlay and OnStop Events, respectively.
Use the Play and Stop buttons to send the OnPlay and OnStop Events.
Events facilitate sending messages between objects. In the VFX Graph, Events are passed as
strings. Unlike the playback icons at the top of the panel, pressing Play or Stop doesn't change
the effect immediately; the OnPlay and OnStop Events simply provide signals to the Spawn
system.
If you open the dialog window Visual Effect Event Tester, you can use the Play and Stop
buttons for the same effect. Take advantage of this flexibility to specify a Custom Event and
invoke it with the Custom button.
Add your own Event to the graph using Node > Context > Event. Press the Send button to
raise the Event manually when testing.
Creating Custom Events is a matter of changing the Event Name string and invoking the Event
with an Event Binder or Timeline Activation Clip at runtime.
In this example, two Events called CustomPlay and CustomStop have been added:
In the example, two Event Binders connect the CustomPlay Event and CustomStop Event to
the Bonfire effect.
If the mouse pointer enters the Collider onscreen, send the CustomPlay Event to the graph.
This Event begins spawning the flames, smoke, and sparks. If the mouse pointer exits, the
CustomStop Event notifies the Spawn Context to stop.
The mouse pointer raises Events when entering and exiting the Collider.
You can bind Events to any standard mouse actions (Up, Down, Enter, Exit, Over, or Drag).
The Raycast Mouse Position option passes the pointer’s 3D location as an Event Attribute.
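Besides Event Binders and Timeline clips, you can raise the same Events from your own C# scripts through the Visual Effect component API. A minimal sketch, reusing the CustomPlay and CustomStop Events from this example:

    using UnityEngine;
    using UnityEngine.VFX;

    // Minimal sketch: raise custom VFX Graph Events from a script.
    public class BonfireToggle : MonoBehaviour
    {
        public VisualEffect bonfire;

        public void Ignite() => bonfire.SendEvent("CustomPlay");

        public void Extinguish() => bonfire.SendEvent("CustomStop");
    }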
Rigidbody Collision
Any Collider making contact raises an Event (with the specified Event Name) and sends it as a
message to the Visual Effect Target.
In this example, the Mesh Collider with a Rigidbody acts as a button. The effect only begins
spawning once the player makes contact.
Seeing as this Event Binder responds to Collisions with a Rigidbody, you can create different
forms of interactivity with it. Imagine a sphere that emits particles every time it bounces, or a
force field that becomes distorted when hit with a projectile.
Trigger
Use this to trigger an effect when the player, or some other GameObject, reaches a particular
part of the level.
Visibility
Pass the specific Event Name to the Target graph based on its Activation, either
OnBecameVisible or OnBecameInvisible. This notifies the graph when the Renderer enters or
leaves the Camera frustum, or when the Renderer is toggled on and off.
The visual effect only activates once the target becomes visible.
Timeline
Timeline offers another way to communicate with your graphs, should you need to turn your
visual effects on and off with precise timing. You can coordinate multiple layers of effects with
Visual Effect Activation Tracks.
Assign an effect to an Activation Track and then create one or more Activation Clips in the
Timeline track. Each clip sends two Events; one at the beginning and one at the end.
Timeline helps organize the pieces that collectively create the sum total of the effect. Here,
a separate Activation Track is used for each Subgraph and passes in Events through the
Activation Clips.
By sliding the clips within Timeline, you can adjust the timing interactively. The custom
MeteoriteControl script then invokes the Playable Director component.
You can also use Timeline to mute specific Visual Effect Activation Tracks, which temporarily
stops them from receiving Events. This can be useful when troubleshooting.
Event Attributes
Both Event Binders and Timeline Activation Clips can attach Event Attribute Payloads to
Events. In doing so, they pass along extra information with an Event when it’s invoked.
For instance, you might create an Event Attribute with an exposed Vector3 property that
notifies the graph where to instantiate the effect.
To set these Attributes in a VFX Graph, use the Set Attribute Blocks in the Spawn
Contexts. You can also attach them to Events sent from C# scripts. See the Visual Effect
component API for more information.
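Here's a minimal sketch of the scripting route: create an Event Attribute payload, write the standard position attribute, and send it with the Event (the Event name OnImpact is illustrative):

    using UnityEngine;
    using UnityEngine.VFX;

    // Minimal sketch: send an Event with an Event Attribute payload from C#.
    public class ImpactEventSender : MonoBehaviour
    {
        public VisualEffect vfx;

        public void SendImpact(Vector3 worldPosition)
        {
            // Create a payload, write the standard "position" attribute,
            // then send it with the Event ("OnImpact" is an illustrative name).
            VFXEventAttribute payload = vfx.CreateVFXEventAttribute();
            payload.SetVector3("position", worldPosition);
            vfx.SendEvent("OnImpact", payload);
        }
    }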
Property Binder
Property Binders are C# behaviors that enable you to connect scene or gameplay values to
the Exposed properties of the VFX Graph. You can add Property Binders through a common
MonoBehaviour called the VFX Property Binder.
For example, a Sphere Collider Binder can automatically set the position and the radius of a
Sphere Exposed Property using the values of a Sphere Collider in the scene.
Do you need a light or camera in the scene to influence your effect at runtime? Does the effect
follow a Transform or a Vector3? A Property Binder can sync a number of Exposed Property
types with values in your scene. Go to Add Component > VFX > Property Binders for the
complete selection of what’s available.
There are instances in the Visual Effect Samples that show how Property Binders can connect
the Scene Hierarchy to the graph:
— In the Magic Lamp sample, the Property Binder ties the position of several Scene objects
(P1, P2, and P3) to the graph’s Blackboard properties (Pos1, Pos2, and Pos3). These
Vector3 Nodes then form a Bézier spline defining the genie’s overall shape. Move the P1,
P2, and P3 Transforms in your scene, and the genie’s smoke trail will respond in real-time.
— In ARRadar, a PointLight Transform determines where the player’s ship appears on the
3D radar screen. It syncs the glowing blip with real-time light.
— In the Grass Wind sample, Property Binders capture the Position and Velocity of the
Transform called ThirdPersonController to push the grass.
The SpaceshipHoloTable
These are just a few instances where Property Binders can solidify the relationship between a
graph and your scene. Find built-in Property Binders for audio, input, physics, and UI, among
other components.
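Conceptually, a Property Binder just copies a scene value into an Exposed property every frame. Here's a minimal hand-rolled sketch of the same idea, reusing the Pos1 property name from the Magic Lamp example above:

    using UnityEngine;
    using UnityEngine.VFX;

    // Minimal sketch of what a Property Binder does: push a scene value
    // into an Exposed property every frame. "Pos1" matches the Magic Lamp
    // example above; swap in your own property name.
    public class SimplePositionBinder : MonoBehaviour
    {
        public VisualEffect vfx;
        public Transform source;

        void LateUpdate()
        {
            if (vfx.HasVector3("Pos1"))
                vfx.SetVector3("Pos1", source.position);
        }
    }

In practice, prefer the built-in binders; they cover the same ground without custom code and update consistently through the VFX Property Binder component.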
Output Events
Just as you can leverage Events to send messages into the VFX Graph, you can similarly use
them to send messages out. With Output Events, you have the ability to obtain the Attributes
of new particles from a Spawner Context. Use them with Output Event Handlers to notify
your C# scripts in the scene.
Create any number of behaviors that respond to your effect; shake the Camera, play back a
sound, spawn a Prefab, or anything else your gameplay logic dictates.
Unity calls OnVFXOutputEvent whenever an Event triggers, passing the Event Attributes as
parameters. Look for provided implementations in the Output Event Helpers, included with the
VFX Graph. Install them from the Package Manager to review the example scripts.
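For a sense of their shape, here's a minimal sketch of a custom handler. It assumes the VFXOutputEventAbstractHandler base class that ships with the Output Event Helpers sample, and the sound-playing behavior is purely illustrative.

    using UnityEngine;
    using UnityEngine.VFX;
    using UnityEngine.VFX.Utility;

    // Minimal sketch of a custom Output Event Handler, assuming the
    // VFXOutputEventAbstractHandler base class from the Output Event Helpers sample.
    public class VFXOutputEventPlaySound : VFXOutputEventAbstractHandler
    {
        public AudioSource impactSound;

        // The sample's base class exposes this flag to allow execution outside Play mode.
        public override bool canExecuteInEditor => false;

        public override void OnVFXOutputEvent(VFXEventAttribute eventAttribute)
        {
            // Read the payload written by the Spawn Context, then react to it.
            Vector3 position = eventAttribute.GetVector3("position");
            impactSound.transform.position = position;
            impactSound.Play();
        }
    }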
You can also revisit the Meteorite sample to see how they work within an actual graph. Attach
some of these Output Event Handlers to the VFX_MeteoriteMain GameObject:
— Camera Shake: The VFX Output Event Cinemachine Camera Shake component rattles
the Camera upon the Camera Shake Event.
— Secondary effects: The VFX Output Event Prefab Spawn components raise the Buffer
and PlankImpulse Events. A resulting shock wave passes through the grass and sends
wood planks flying, courtesy of Output Event Handlers.
— Light animation: Another VFX Output Event Prefab Spawn creates the light animation
upon the meteor’s impact. A custom Output Event Handler Light Update script modifies
the volumetric scale, brightness, and color to add more dramatic flair to the collision.
Begin by exploring a few of these ideas. You can use the sample scripts directly without
writing any code or treat them as a starting point for your own scripts. Give it some time,
and soon you’ll be able to roll your own Output Event Handlers for whatever your application
requires.
Effects aren’t isolated in a vacuum. Often you’ll need to supply them with external data to
achieve your intended look.
What if you want the genie to emerge from a magic lamp? Or you’d like to integrate a hologram
with the sci-fi spaceship? Though you can accomplish much of this with math functions and
Operators, you might need the effect to interact with more complex shapes and forms.
For this reason, Unity provides support for a number of Data types:
— Point Caches: Store attributes of points in space, such as Transforms, normals, colors,
and UVs.
— Signed Distance Fields: Attract and collide with particles using a volumetric
representation.
— Vector Fields: Push particles in 3D space after sampling the particle’s position.
Unity also offers some support utilities to facilitate the generation of these file formats.
Point Caches
A Point Cache is an asset that stores a fixed list of Particle Attribute data, including points and
their positions, normals, and colors.
Point Cache assets follow the open-source Point Cache specification and use the .pCache file
extension. Internally, Point Caches are nested ScriptableObjects containing various textures
that represent the maps of Particle Attributes. They are less resource intensive than Signed
Distance Fields.
You can generate Point Cache assets in a couple of ways:
— Use the built-in Point Cache Bake Tool via Window > Visual Effects > Utilities > Point
Cache Bake Tool.
— Build your own custom exporter. See the pCache README for more information on the
asset format and specification.
Point Caches store lists of points generated from 3D meshes or 2D textures – but not their
actual geometry. During baking, additional filtering relaxes the points in order to separate them
more evenly and reduce the number of overlaps. Choose a Mesh or Texture, set an adequate
Point Count, and select Save to pCache file.
Point Caches are similar to Stanford PLY files, but the .pCache file format removes the
polygons and adds support for vectors. As such, they are more easily readable and writable in
Python or C#.
Looking at the Morphing Face sample, the Operator creates one output slot for the Point
Count and separate texture slots for Attribute Maps. You can connect the outputs to other
nodes, such as the Set Attribute from Map Block.
Signed Distance Fields
A Signed Distance Field (SDF) stores, for each point in a volume, the distance to the nearest
surface. By convention, this distance is negative inside the mesh and positive outside of it. You
can thereby place a particle at any point on the surface, inside the bounds of the geometry, or at
any given distance from it.
While it’s more resource intensive to calculate SDFs than Point Caches, they can provide
additional functionality. Very detailed meshes require a high texture resolution, which typically
takes up more memory.
Using SDFs
A visual effect can use SDFs to position particles, conform particles to a shape, or collide with
particles. In the Magic Lamp sample, an SDF was used for the genie’s body. A preview of the
SDF asset looks like this:
Unity imports the SDF asset as a 3D texture Volume File (.vf). Compatible VFX Graph Blocks
and Operators then make the particle system interact with the sampled points. In this
example, the Conform to Signed Distance Field Block attracts the particles to the area where
the genie’s torso appears.
You can also bake SDFs at runtime and in the Unity Editor with the SDF Bake Tool API. Just be
aware that runtime baking is resource intensive. Best practice is to use a low-resolution SDF
and only process every nth frame.
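Here's a minimal sketch of that best practice: rebake only every nth frame and hand the result to the graph. The exposed property name "SDF" and the RebakeSdf helper are placeholders; the actual baking call comes from the SDF Bake Tool API (MeshToSDFBaker).

    using UnityEngine;
    using UnityEngine.VFX;

    // Minimal sketch: throttle runtime SDF rebaking to every nth frame.
    // "SDF" is an assumed exposed Texture property name, and RebakeSdf()
    // stands in for a call to the SDF Bake Tool API (MeshToSDFBaker).
    public class ThrottledSdfBake : MonoBehaviour
    {
        public VisualEffect vfx;
        [Range(1, 16)] public int frameInterval = 4; // rebake every nth frame

        void Update()
        {
            if (Time.frameCount % frameInterval != 0)
                return;

            Texture sdf = RebakeSdf();
            if (sdf != null)
                vfx.SetTexture("SDF", sdf);
        }

        Texture RebakeSdf()
        {
            // Placeholder: bake a low-resolution SDF here with the SDF Bake Tool API
            // and return the resulting 3D texture.
            return null;
        }
    }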
See Signed Distance Fields in the VFX Graph for more details on how to generate and use
SDFs, or find additional samples in this repository.
Vector Fields
A Vector Field is a uniform grid of vectors that controls the velocity or acceleration of a
particle. An arrow represents each vector. The larger the size of the vector, the faster the
particles will move through it.
As with Signed Distance Fields, you can represent Vector Fields using the open-source
Volume File (.vf) format or generate vector fields from the Houdini VF Exporter bundled
with VFXToolbox. You can even write your own VF File Exporter that follows the Volume File
specification.
In the UnityLogo scene, the particles flow as if pushed by the unseen currents of the Vector
Field.
VFXToolbox
The VFXToolbox features additional tools for Unity visual effects artists. It enables the export
of .pCache and .vf files from SideFX’s Houdini Point Cache Exporter and Volume Exporter.
Download this repository on GitHub and install it with the Package Manager.
Image Sequencer
Use Flipbook Texture Sheets to bake animated effects into a sprite. If you don’t have the
frame budget to simulate effects like smoke, fire, or explosions, saving the images as a
Flipbook Texture Sheet can produce a comparable “baked” effect without the high cost.
First, use Unity or another DCC package to render an image sequence of effects into a
project folder. Next, convert the individual images into a single texture sheet using the Image
Sequencer. Retime and loop the images to your liking before playing them back with the
Flipbook Player Block.
Be sure to check out some of the sample flipbooks created with this tool.
SideFX Houdini
Houdini has long been an industry-standard tool for simulation and visual effects. Its
procedural workflows and node-based interface facilitate the production of textures,
shaders, and particles in comparatively fewer steps. Its Operator-centric structure
encourages nonlinear development and covers all the major areas of 3D production.
Autodesk Maya
Maya fortifies the foundation of many game development studios. Its relatively new Bifrost
system makes it possible to create physically accurate and incredibly detailed simulations in
a visual programming environment.
Blender
Blender is a free and open-source 3D creation suite. Its features cover all aspects of 3D
production, from modeling and rigging to animation, simulation, and rendering. Blender
continues to receive widespread community support, as it is cross-platform and runs
equally well on Linux, Windows, and macOS.
Adobe Photoshop
Along with the other 3D tools discussed, you'll benefit from image-editing software such
as Adobe Photoshop. Use Photoshop to edit and create raster images in multiple layers, with
support for masks, alpha compositing, and several color models. Photoshop uses its own PSD
and PSB file formats to preserve these features.
Of course, these are just some of the DCC tools available. As you start building your graphs,
they’ll help you fill in flipbook textures, meshes, or anything else to achieve your vision.
After working closely with VFX Graphs, you’ll likely want to reorganize and optimize them,
much like how a programmer profiles code and checks its performance. Once the effect
looks right, make sure it’s not using excess resources before deploying to your final game or
application.
Click the debug icon in the top right of the Visual Effect Graph window. Note that these panels
only work if the VFX Graph window is attached to a GameObject with a Visual Effect component.
Use the Profiling and Debug panels to optimize your VFX Graph.
— CPU Information: The CPU Information panel displays crucial performance metrics like
the full update time for the entire graph in milliseconds, the time spent evaluating graph
parameters computed on the CPU, and the update time for specific systems within the
graph in milliseconds.
— GPU Information: The GPU Information panel shows the execution time of the VFX
Graph on the GPU in milliseconds and its GPU memory usage.
— Texture Usage: This lists textures used, their dimensions, and memory size.
— Heatmap Parameters: Adjust the GPU time threshold to highlight expensive parts of the
graph.
You can also access the Rendering Debugger (URP/HDRP) and Unity Profiler from the top-
right vertical ellipsis (⁝) menu.
For each particle System, the debug panels also display the following:
— Particle state: This shows whether the System is playing/paused, awake/asleep, and
visible/culled.
— Alive/capacity: This indicates the number of particles alive and the capacity set by the
user in the Initialize context. Optimizing this capacity to match the maximum number of
particles alive helps save memory allocation space.
— Initialize, Update, Output Contexts: These panels display implicitly updated attributes
in the Update Context, break down GPU execution time by tasks, and list texture usage
along with their dimensions and memory sizes.
Keep in mind that the profiling timings in this panel are for comparison only, not precise
measurements, as they’re recorded in the Editor. Use them to identify bottlenecks and
optimize performance.
Note: GPU execution timings are unavailable on Apple Silicon, and profiling panels disable
instancing for the attached visual effect. To ensure accurate performance assessment,
always profile on target devices using the Unity Profiler.
Use the Profiler Standalone Process option or create a separate build when you need to
measure real-world performance. Consider the fundamentals of graphics performance to
maintain high frame rates, and in turn, deliver the best possible experience to your players.
When examining rendering statistics, take note of the time cost per frame rather than frames
per second. The fps can be misleading as a benchmark because it’s nonlinear (see the graph
below). If you’re aiming for 60 fps, use 16 ms per frame as your frame budget (or 33 ms per
frame for 30 fps).
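As a quick worked example: 1000 ms ÷ 60 fps ≈ 16.7 ms per frame, while 1000 ms ÷ 30 fps ≈ 33.3 ms. The nonlinearity is easy to see in these terms: dropping from 60 to 50 fps adds only about 3.3 ms of frame time, but dropping from 30 to 20 fps adds about 16.7 ms.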
The Frame Debugger shows draw call information, so you can see exactly how the frame is
constructed.
In the image above, the left panel shows the sequence of draw calls and other rendering
events arranged hierarchically. Meanwhile, the right panel displays the details of a selected
draw call, including shader passes and textures. This helps you play “frame detective” and find
out where Unity is spending its resources.
Here are some key areas to review when optimizing a graph:
— Texture size: If the asset doesn't get close to the Camera, reduce its resolution.
— Capacity: Fewer particles use fewer resources. Set the Capacity in the Initialize Block to
cap the System's maximum number of particles.
— Visibility and lifetime: In general, if you can't see something onscreen, turn it off. The
debug panel's alive and capacity plots show how many particles are alive and how that
count compares to the System's set capacity. Adjust your count and capacity settings to
improve your VFX Graph's efficiency.
— Operators and memory: Simplify or remove unnecessary Operators. If the result isn't
visibly different, use fewer iterations.
— Flipbooks: Rather than simulate every particle, consider pre-rendering certain effects
into texture flipbooks. Then, play back the animated texture wherever the full simulation
isn’t necessary.
— Mesh size: If your particles are Output Meshes, be sure to reduce your polygon counts.
— Excessive overdraw: If you have a number of transparent surfaces, they will consume
your rendering resources. Use the Rendering Debugger (Window > Analysis >
Rendering Debugger) to check for excess overdraw and tweak your graph accordingly.
Also, switch to octagon particles when possible.
When troubleshooting performance, enable or disable each Block with the checkbox at the
top-right corner. This lets you do quick A/B testing to measure performance (before and after)
so you can isolate part of your graph. Don’t forget to restore your Blocks to their Active state
once complete.
Bounds
The Bounds of a visual effect provide a built-in, visibility-based optimization. You've
probably noticed a few settings that appear in every Initialize Context:
Use the Bounds settings to define where your effect will render.
If the Camera can’t see the Bounds, Unity culls the effect, meaning that it doesn’t render.
Follow these guidelines to set up each System’s Bounds:
— If the Bounds are too large, cameras will process the visual effects even if individual
particles go offscreen. This wastes resources.
— If the Bounds are too small, Unity might cull the visual effects even if some particles are
still onscreen. This can produce visible popping.
By default, Unity calculates the Bounds of each System automatically, but you can change the
Bounds Setting Mode to:
— Automatic: Unity expands the Bounds to keep the effect visible. If this option is not the
most efficient, use one of the other options below to optimize your Bounds.
— Manual: Use the Bounds and Bounds Padding to define a volume in the Initialize
Context. This is simple yet time-consuming to set up for all of your Systems.
— Recorded: This option allows you to record the Bounds from the VFX Control panel. The
Bounds, shown in red when recording, expand as you play back the effect. Press Apply
Bounds to save the dimensions.
You can use Operators at runtime to calculate the Bounds for each System in Manual or
Recorded mode. The Initialize Context contains a Bounds Padding input; use this Vector3 to
enlarge the Bounds’ values.
Mesh LOD
Take advantage of level of detail (LOD) if your particles are outputting meshes. Here, you can
manually specify simpler meshes for distant particles.
Particle Mesh Outputs have a Mesh Count parameter visible in the Inspector, which lets you
specify up to four meshes per output. When you combine this with the LOD checkbox, you can
automatically switch between meshes based on how large they appear onscreen.
Higher resolution models can hand off to lower resolution models, depending on the screen
space percentage in the LOD values of the Output context.
LOD resolutions
In this example, the SpaceRock_LOD0 model swaps with the smaller SpaceRock_LOD1 model
when the mesh occupies less than 15% of the screen.
LOD values
When creating a massive number of mesh particles, LODs mean you won't need to render
millions of polygons, which significantly cuts down the frame time.
Check out the PlanetaryRing example in this project to see the Mesh LOD firsthand.
Mesh Count
You can similarly leverage the Mesh Count without LOD. In this case, we use multiple
meshes to add randomness: The four different meshes for the Output Particle Lit Mesh
create a variety of props scattered on the floor.
Particle rendering
To hit your target frame rate and frame budget, consider these optimization tips when
rendering particles or meshes:
— Triangle particles: With half the geometry of quad particles, these are effective for fast-
moving effects and rendering large quantities of particles.
— Simplified lighting: If you don’t need the full Lit HDRP shader, switch to a less resource-
intensive one. Customize outputs in Shader Graph to drop features you don’t need for
certain effects. For example, the Bonfire sample scene uses a stylized Shader Graph,
which greatly simplifies the output.
— Low resolution transparency: In your HDRP Rendering properties, enable Low Res
Transparency to render your transparent particles at a lower resolution. This will
boost performance by a factor of four at the expense of a little blurriness. When used
judiciously, it can be nearly indistinguishable from rendering at full resolution.
— Octagon particles: Octagon particles crop the corners of quad particles. If your particle
textures are transparent in the corners, this technique can reduce or prevent overdraw.
Overlapping transparent areas still require some calculation, so using octagons can
save unnecessary work computing where the corners of quads intersect.
If you are tracking multiple GameObject positions in your graph, you can make that information
accessible to your simulation via scripting. Graphics Buffers can similarly replace storing data
within a texture, as seen in some of the previous samples.
In this example, we pass the GameObjects’ Position and Color data from built-in Types, custom
structs, and compute shaders to a VFX Graph.
We use a script to define a Graphics Buffer and fill it with data from our GameObjects. We then
pass it into the VFX Graph through its Blackboard properties.
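As a minimal sketch of that setup, the script below fills a structured buffer with positions each frame and binds it to an exposed Graphics Buffer property (the property name "PositionsBuffer" is an assumption; match it to your Blackboard):

    using UnityEngine;
    using UnityEngine.VFX;

    // Minimal sketch: feed GameObject positions to a VFX Graph through a Graphics Buffer.
    public class PositionBufferFeeder : MonoBehaviour
    {
        public VisualEffect vfx;
        public Transform[] targets;

        GraphicsBuffer buffer;
        Vector3[] positions;

        void OnEnable()
        {
            // One float3 (12 bytes) per tracked GameObject.
            buffer = new GraphicsBuffer(GraphicsBuffer.Target.Structured, targets.Length, sizeof(float) * 3);
            positions = new Vector3[targets.Length];
            vfx.SetGraphicsBuffer("PositionsBuffer", buffer); // assumed exposed property name
        }

        void Update()
        {
            for (int i = 0; i < targets.Length; i++)
                positions[i] = targets[i].position;
            buffer.SetData(positions);
        }

        void OnDisable()
        {
            buffer?.Release();
        }
    }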
In this more complex example, an electrifying tower can access the approaching sphere
positions via Graphics Buffers. With many spheres, it becomes impractical to expose a
Property for every GameObject.
The Blackboard has just one custom struct for use within a Graphics Buffer. The tower can
potentially hit hundreds of targets, accessing their data with just a few Operators.
This demonstrates how your VFX Graphs can interact within your scene; think of complex
simulations like boids, fluids, hair simulation, or crowds. While using Graphics Buffers requires
knowledge of the C# API, they make trading data with your GameObjects more convenient
than ever.
Take a look at this project for other examples of how to use Graphics Buffers with the VFX
Graph in Unity.
Custom HLSL
Seasoned VFX artists and developers can now take advantage of the Custom HLSL Block. This
feature allows you to create unique effects that may not yet be natively supported in Unity.
With Custom HLSL, you could create advanced physics simulations, flocking behaviors, or
real-time data visualizations.
Custom HLSL nodes allow you to execute custom HLSL code during particle simulation. You
can use an Operator for horizontal flow or a Block for vertical flow within Contexts.
To be compatible with VFX Graph, a Custom HLSL Block must meet the following
requirements:
— There must be one parameter of type VFXAttributes with access modifier inout
Create the Custom HLSL node and then either embed the HLSL code or source an HLSL text
file. See this documentation page for complete requirements.
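Here's a minimal sketch of such a function, written in HLSL since that is what the Block consumes. The function name, the extra strength parameter, and the attribute math are all illustrative; only the inout VFXAttributes parameter is required.

    // Minimal sketch of a Custom HLSL Block function. Only the inout
    // VFXAttributes parameter is required; everything else here is illustrative.
    void ApplyUpwardDrift(inout VFXAttributes attributes, in float strength)
    {
        // Nudge each particle upward, scaled by how long it has been alive.
        attributes.velocity += float3(0.0f, 1.0f, 0.0f) * strength * attributes.age;
    }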
Writing low-level HLSL code can be more efficient than using a complex network of VFX Graph
nodes. By implementing certain calculations directly in HLSL, you can potentially improve
performance, especially for computationally intensive effects.
Custom HLSL can be used to procedurally generate particle attributes or behaviors based
on mathematical functions, noise algorithms, or other procedural techniques. This can create
more organic or varied effects.
This project demonstrates how to use a neighbor search to simulate a 2D flock with a Visual
Effect Graph, using Custom HLSL Blocks and Graphics Buffers (see the GridManager script).
We hope that this guide has inspired you to dive deeper into the VFX Graph and Unity’s real-time
visual effects toolsets. After all, our mission is to help every creator achieve their artistic vision.
With the VFX Graph, you’re fully equipped to captivate your players with hyperrealistic
simulations and stunning graphics. We can’t wait to see what you create with it.
Here is a collection of additional learning resources for taking on the VFX Graph:
Video tutorials
— Create amazing VFX with the VFX Graph: This covers many of the fundamentals for
setting up your own VFX Graph.
— The power for artists to create: This video highlights recent updates to VFX Graph
features, such as Mesh LODs, Graphics Buffers, and Shader Graph integration.
— VFX Graph: Building visual elements in the Spaceship Demo: This session unpacks a
number of techniques used in the Spaceship Demo.
— Build a portal effect with VFX Graph: Generate a portal effect by using the VFX Graph
to transform a ring of particles into a more dynamic effect.