Shader Editor Doc
Section 1: Basics
Installation → With Unity running, either drag the UnityPackage into the Unity window, or select “Import
Package” from the Assets menu inside Unity. Place the UnityPackage in your Standard
Packages folder (Applications/Unity/Standard Packages on OS X) to have Unity automatically
offer to include the package when creating a new project.
As of this writing, the package uses Pro-only features to draw the preview, and it requires
Unity 3, as it creates surface shaders.
What does it do, exactly? → The Strumpy Shader Editor (SSE) provides a visual interface for the
creation of Shader Graphs (.sgraphs), aimed at simplifying the process of shader design and making
it more “artist accessible”. It is based on Unity 3's new “Surface Shader” concept, which
allows the creation of shaders that properly interact with Unity's lighting paths (Forward,
Deferred, and Vertex/Fixed Function).
However, it should not be expected to create every conceivable effect, and it is unsuited to tasks
requiring advanced transformations or particularly advanced effects (primarily those that
require loops, such as relief mapping, ray marching, crepuscular rays, etc.). It is better suited
to defining how various texture maps should interact, a task at which it excels. It
should also not be expected to produce elegant output code, nor should its code output be
judged on those terms; rather, the generated code is stripped down by the Cg compiler, producing
largely optimal code.
Getting Started → After the package is imported into Unity, open the Shader Editor window via
Window/Shader Editor. You can dock the window either into an existing set of tabs or into the
main Unity window; whichever you choose, you can hit the Space key to maximize it, which
makes observing the graph layout much easier.
The default display (Fig 1-1) shows the basic layout of the editor. When launched, a new graph
is created, and the output node is placed in the upper corner. This entire upper left area is the
work area, where the graph is built. At the bottom of the work area you will find the buttons for
file operations, such as New, Export, Save, and Load, as well as the “Update Shader” button. The
Update Shader button updates the shader used in the preview to reflect changes in the graph.
Please be patient; it can take several seconds to generate all the permutations for the various
lighting conditions. When a node is selected, an additional row of buttons appears along the top
left, allowing you to delete the node or break its connections.
Located at the bottom of the window is the preview. The two vertical bars on the far
left adjust the rotation of the preview model, while clicking on the preview display
shows the material settings for the preview, in the same format as when you create a material
normally.
Finally, on the right of the display, you will see the collapsible list of node blocks you can add.
I heartily recommend keeping these elements collapsed, so you can quickly find the node you're
looking for without having to thumb through a long list.
Figure 1-1: Overview
Designing your first Shader → The function list can be mighty intimidating, so rather than dive
into that, let's build up a simple shader. The first thing we need to do is assign a shader name. To
do this, click on the Master node in the node view; below, next to the preview, you will see a
prompt for the variables of the node, in this case the name of the shader. This is the name the shader
will be listed under when you create a material in your scene; for example, if you fill in Custom/MyShader,
you will find it in the material shader drop-down under that name. As soon as you start to type a valid
name, the Master node recolors itself grey, showing that it has no error.
Once this is done, expand the Input section in the node list, and add a new input of the Color
type. New nodes appear in the upper left corner of your node window, so you may have
to move the Master node (click and drag it) to see your newly created node. Unwired nodes
appear blue, indicating they are not yet connected to the graph and will thus be excluded from
compilation. To remedy this, let's wire the Color value to the material: click the box on the left of the
Color node (its output) to start drawing a connection, and drop it on the Albedo input of the Master
node. With this, you can now hit “Update Shader” and have the preview display reflect your new
shader. If you did things right, you will end up with a solid black preview. This is because the Color
input you created defaults to black; to adjust the value it uses for the display, click the preview
and adjust the color value (Fig 2-1). Rather than requiring the user to define a color before
being able to see a result, you can adjust the default color by clicking on the Color node you
created and setting the value to, say, a bright white. You should also rename the input to
something that better expresses what you're doing, such as “Color”, which is what the rest of
Unity's built-in shaders use to describe color (as well as being what color-changing scripts look for!). By
using names consistent with other shaders, the artist is free to change the shader on a material without
having to reassign all of his or her references. If you're going to release the shaders you create with
this tool, be sure to properly name your inputs!
Figure 2-1: An adjustable color shader we just created
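For reference, a hand-written surface shader doing the same thing as this graph would look roughly like the sketch below. This is not SSE's literal output, just an equivalent; the _Color property name assumes you named the input “Color” as suggested above.

    Shader "Custom/MyShader" {
        Properties {
            _Color ("Color", Color) = (1,1,1,1)
        }
        SubShader {
            Tags { "RenderType"="Opaque" }
            CGPROGRAM
            #pragma surface surf Lambert
            fixed4 _Color;
            struct Input {
                float4 color : COLOR; // the Input struct needs at least one member
            };
            void surf (Input IN, inout SurfaceOutput o) {
                o.Albedo = _Color.rgb; // the Color input wired to Albedo
            }
            ENDCG
        }
        Fallback "Diffuse"
    }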
When naming your inputs, Unity has some specific naming conventions used in the built-in
shaders, which are as follows:
Colors: Color (main color), SpecColor (specular color), Emission (emissive color)
Textures: MainTex (main diffuse texture), BumpMap (normal map), Illum (illumination map), Cube (reflection cubemap), DetailTex (detail texture)
By keeping names consistent with the built-ins whenever possible, you make it easy to
quickly flip through shaders in the editor.
Texturing →
As fun as having a single-colored object is, most people would agree that to liven up a game
you're going to need texture maps. The node for a plain 2D texture is Sampler2D, while for a cube
map it is Sampler Cube; both are found under Inputs in the node list. With SSE, most of the operators
pass a four-component vector to each other; samplers, however, require a special function to read
from them: Tex2D, or in the case of cubemaps, TexCube.
Let's create a simple diffuse shader. Start by hitting the “New Graph” button, and again give
the Master node a name, such as MyShaders/Diffuse. With that out of the way, add your Sampler2D
node, and drag it a fair distance to the left of your Master node, so we have room to build between
them. In accordance with the naming conventions, give your sampler the proper name, in this case
MainTex. Notice that the Sampler2D has two outputs: the actual sampler, as well as a UV parameter.
The UV is the modified texture coordinates, after the user's Scale X/Y and Offset X/Y have been
applied, and is not necessarily the same for different samplers. I'll get back to this shortly, but for
now, add a new function node, Tex2D. Tex2D samples a pixel from the provided sampler, filtered
according to the settings of the assigned texture, as follows:
For filter mode Point, Tex2D returns the single pixel it is over.
For filter mode Bilinear, you only get a pixel's exact value if you sample directly on it; otherwise
the value is interpolated between adjacent pixels.
For filter mode Trilinear (which only works with power-of-two textures), the result factors in
adjacent pixels as well as adjacent mipmap levels, removing hard transitions when dealing with high
anisotropy levels (best used for floor textures, or particles with broken aspect ratios).
To use Tex2D, connect the desired sampler and the position to sample; be sure to wire
in the UV-set you want to use for that sample. Once you have connected them, wire the Tex2D node to
the Albedo input of the Master node (Fig 2-2). A very large proportion of the cost of a shader is determined
by how many Tex2D nodes you have, since each one directly corresponds to the graphics card reading a
value from the texture. There is also a major hit when you have dependent texture reads, that is,
when one texture read is required to evaluate another texture. Graphics cards are optimized to perform
all the texture reads together, which isn't as feasible when you layer your texture reads, such as in a
distortion shader.
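A hand-written surface shader equivalent to this diffuse graph would look something like the following sketch (assuming the sampler was named MainTex, exposed as the _MainTex property; the uv_MainTex member is how surface shaders receive that sampler's scaled and offset UV-set):

    Shader "MyShaders/Diffuse" {
        Properties {
            _MainTex ("MainTex", 2D) = "white" {}
        }
        SubShader {
            Tags { "RenderType"="Opaque" }
            CGPROGRAM
            #pragma surface surf Lambert
            sampler2D _MainTex;
            struct Input {
                float2 uv_MainTex; // UV-set with the material's Scale/Offset applied
            };
            void surf (Input IN, inout SurfaceOutput o) {
                // one Tex2D node = one texture read
                o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb;
            }
            ENDCG
        }
        Fallback "Diffuse"
    }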
Using Operators →
Let's create another effect, known as Double Vision (Fig 2-3), which is accomplished by reading
the same texture twice, with different offsets, then recombining the results. This effect is mostly used on organic
surfaces, to add variance to the surface appearance without using additional memory on a detail map. To start,
again create a new shader, name it properly as before, add your sampler (being the main
texture, name it MainTex), and arrange things so you have a good bit of space to work in. Add two Tex2D nodes,
and wire the sampler into both of them, but leave the UV input empty for the moment. Unlike before,
where we linked the texture's UV-set directly, this time we are going to modify it: under Operation in the
node list, add both an Add and a Subtract operator, as well as a Float4 type input. (Fig 2-4)
Float4 (known in ShaderLab as Vector, and in UnityEngine as Vector4) is a four-component
input type, which we will use to specify the desired offset between our textures. Give it an appropriate
name, such as Offset, and assign a default value. UV coordinates are in the [0,1] range for each texture
repeat, so try to keep the default values in the [0,1] range. Connect the Sampler's UV output to the first
argument of both the Add and Subtract nodes, and the Float4 input's value to Arg2 of both. With this,
we have now created two new UV-sets, offset by our input values.
The next step is to actually sample the textures, so wire the results of the Add and Subtract
nodes into the Tex2D functions you added earlier. At this point we have two texture reads, but the
results remain unused. If you were now to wire one texture output to the Albedo input, you
would be able to offset the texture using your new Offset parameter in the material settings; however,
we need to actually combine the results. There are operators available such as Min or Max, but
since we want the average, the logical approach is to add the results together and divide by two. To
accomplish this, add new Add and Divide operations, as well as a constant Float4. Add the two
Tex2D outputs, then divide by the constant, whose value you should set to 2,2,2,2. Finally, wire
the result to the Master node.
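In hand-written surface shader terms, the body of this Double Vision graph comes out roughly as below (a sketch reusing the Properties and Input boilerplate of the diffuse example above, plus an _Offset property standing in for the Float4 input named Offset, exposed in ShaderLab as a Vector):

    // assumed properties: _MainTex ("MainTex", 2D) and _Offset ("Offset", Vector)
    void surf (Input IN, inout SurfaceOutput o) {
        float2 uvA = IN.uv_MainTex + _Offset.xy;   // Add node: UV + Offset
        float2 uvB = IN.uv_MainTex - _Offset.xy;   // Subtract node: UV - Offset
        fixed4 a = tex2D(_MainTex, uvA);           // first Tex2D
        fixed4 b = tex2D(_MainTex, uvB);           // second Tex2D
        o.Albedo = ((a + b) / 2.0).rgb;            // Add, then Divide by the (2,2,2,2) constant
    }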
At this point, it's worth noting alternatives, to help you build better graphs. SSE always swizzles
all arguments out to float4 (using a repeating scheme: float becomes .xxxx, float2 becomes .xyxy, and
float3 is simply a float4 internally), relying on the compiler to strip it back down to the proper sizes (which
the compiler does quite well). Because of this, you can use a Float input and constant instead of Float4
and still wire everything correctly, unlike other node-based editors such as ShaderFX. Further, division is a
more expensive node than multiplication, so you can optimize further by multiplying by 0.5 instead of
dividing by 2. There is also the Lerp function, which combines two inputs with a given blend factor.
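Continuing the sketch above, the cheaper alternatives mentioned here would replace the final line with something like:

    o.Albedo = ((a + b) * 0.5).rgb;   // multiply by 0.5 instead of dividing by 2
    o.Albedo = lerp(a, b, 0.5).rgb;   // or use Lerp with a blend factor of 0.5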
Section 3: Using Secondary Maps
To vary surfaces further, we turn to Secondary Maps. These
include Normal, Additive, Detail, Illumination, Specular, Gloss, and countless others, all aimed at
improving the visual appearance of the surface. Normal maps are particularly common, being
maps that encode a value by which to offset the surface normal, and are widely used in modern lighting
schemes.
Using normal maps requires a bit more effort than one might expect, as you have to integrate
the UnpackNormal function. To set up your graph, create two samplers (MainTex and BumpMap, as per
the naming conventions). For the BumpMap sampler, set the default texture to Normal, which prevents
the shader from looking 'wonky' when no texture is set in the material parameters.
Similarly, you should default diffuse maps to white, additive maps to black, and illumination maps to
black; the rule of thumb is to make an unassigned texture have the least possible impact on the final
result. The UnpackNormal function formats the raw Tex2D result so it can be fed into the Normal
input of the Master node. (Fig 3-1/2)
(Figure 3-1) Output of the normal mapped shader
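A rough hand-written equivalent of this graph (a sketch, not SSE's literal output) shows where UnpackNormal sits between the BumpMap read and the Normal output:

    Shader "MyShaders/Bumped" {
        Properties {
            _MainTex ("MainTex", 2D) = "white" {}   // diffuse defaults to white
            _BumpMap ("BumpMap", 2D) = "bump" {}    // normal map defaults to the flat "bump" texture
        }
        SubShader {
            Tags { "RenderType"="Opaque" }
            CGPROGRAM
            #pragma surface surf Lambert
            sampler2D _MainTex;
            sampler2D _BumpMap;
            struct Input {
                float2 uv_MainTex;
                float2 uv_BumpMap;
            };
            void surf (Input IN, inout SurfaceOutput o) {
                o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb;
                // UnpackNormal turns the raw Tex2D result into a usable surface normal
                o.Normal = UnpackNormal(tex2D(_BumpMap, IN.uv_BumpMap));
            }
            ENDCG
        }
        Fallback "Diffuse"
    }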
Next on the list of outputs is “Emission”. Emission describes how self-illuminating the surface is,
and is combined with the texture in a way that ignores the current lighting. Nothing special has to be done for
illumination maps; the simplest graphs can wire a texture output to it directly. (Fig 3-3)
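In surface shader terms this is a single line; the _Illum sampler and uv_Illum member below are example names rather than anything the graph requires (_Illum happens to be what Unity's built-in self-illuminated shaders use):

    // an illumination map wired straight to Emission
    o.Emission = tex2D(_Illum, IN.uv_Illum).rgb;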
Specular maps indicate how reflective the surface is at any given point, and this should be
expected to vary across the surface of the material, so you will usually see specular maps supported in
most engines. Glossiness, on the other hand, is how “sharp” the reflections should appear: a high
glossiness indicates that the surface has extremely sharp reflections, while a low glossiness appears
slightly rough and plastic. While this can vary across a single surface, it is usually constant (use a
Range type input for these), but when combining wet/dry surfaces in a single material you may find
yourself wanting to use a full surface map.
Figure 3-4 (left): The extended illumination graph. Figure 3-5 (above): Result
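A rough sketch of how this might look in a hand-written specular surface shader follows; note that Unity's SurfaceOutput names are somewhat reversed from the wording above (Gloss is the per-pixel specular intensity, Specular is the highlight sharpness), and _SpecMap and _Shininess are example names rather than required ones:

    // requires #pragma surface surf BlinnPhong, a _SpecMap sampler,
    // and a _Shininess ("Shininess", Range(0.01, 1)) property
    fixed4 spec = tex2D(_SpecMap, IN.uv_MainTex);
    o.Gloss    = spec.r;        // specular map: how reflective this texel is
    o.Specular = _Shininess;    // glossiness: how sharp the highlight is, usually a constant Range input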
Alpha mapping, unlike the other modes, is generally stored in the fourth channel of another
texture. If we directly map the Tex2D RGBA result to the Alpha component of the Master node, we would end up
mapping the red channel (RGBA components map to XYZW, so when used as a single float it
takes the X value, i.e. the R channel) instead of the alpha channel like we want. To get around this, we
can use the Splat function, which takes a specific channel from the input and copies it out over the
other channels (the compiler will then strip any unnecessary computation), allowing you to
use one specific channel. To use it, simply create the node, select the proper splat channel,
in this case W, then wire the result to the Alpha component of the Master node. (Fig 3-6)
(Figure 3-6) Alpha Mapping
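In code, the Splat(W) node simply corresponds to taking the texture's fourth (.w / .a) channel. A rough surface shader equivalent of this graph, assuming an alpha-blended shader (the “alpha” option on the #pragma surface line), would be:

    // #pragma surface surf Lambert alpha
    fixed4 c = tex2D(_MainTex, IN.uv_MainTex);
    o.Albedo = c.rgb;
    o.Alpha  = c.a;   // Splat(W): the alpha stored in the texture's fourth channel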