If you are starting to get your feet wet with Blender and you are now looking towards materials and how to use Blender nodes, you have come to the right place. For the seasoned Blender user there can also be gems here. After all, as we progress and become better artists we understand that the basics are what everything rests upon.
How do we work with Blender’s shader node system?
I aim to combine important details with a basic workflow using image textures. The goal is that when you have read and understood this article, you will have a pretty deep understanding of how Blender nodes work under the hood, and also be able to create a large number of materials with just a handful of nodes. In other words, a solid foundation.
Let’s start by taking a bird’s eye view of nodes, to raise your awareness of what nodes are and how widely used they are, especially in 3D art, visual effects and the game industry.
When we use nodes, we are programming. It helps to understand basic logic and math when you dive in, but that shouldn’t scare you; you can still do a lot even if you don’t feel confident in those subjects.
Node systems are used extensively in computer graphics and game development. In Blender, we mostly use them for shading, in other words, creating materials.
Node systems are gaining popularity at a rapid pace. The reason for this is that they are easy to grasp and understand compared to programming. There are node systems for creating game logic, making textures and animating just to name a few uses.
Nodes are tools that enable a person to do more in less time by abstracting away some details. The trade-off is that we lose a bit of control compared to programming, but with nodes the trade-off isn’t that huge. We can still make pretty much anything within the bounds of the node system, and rarely do we run into problems that can’t be solved with a well-developed node system.
Let’s look at a few common node-based systems.
For creating game logic, the most well-known node system is probably Unreal Engine’s Blueprints. With Blueprints you can pretty much create any kind of game without having to write a single line of code.
Another node system for another purpose is Substance Designer. Substance Designer is a PBR (physically based rendering) material authoring tool that is completely built on a node-based system. It is considered the industry standard for material authoring by many people.
Another well-known node-based application is Houdini. Houdini is primarily known for its outstanding simulation and effects capabilities. We can use it to create tools that can generate different geometry. For instance, you can create building generation tools or weather effects.
Those are just a handful of the node systems available, and these few paragraphs can’t possibly describe how powerful they can be.
But what about Blender?
Blender has a few node systems. The first and obvious one is Blender’s shading system for Cycles and Eevee. This is the node system that we will focus on in this article.
But we also have nodes for compositing, lighting and textures, even if the use case and future for texture nodes are uncertain at this point.
We can also extend Blender with other node systems through add-ons. The most well-known is probably Animation Nodes.
There are also other node systems available for Blender. AMD ProRender for example, a third-party render engine that has its own shader node system. Another example is Luxrender. There is also mTree for generating trees with nodes and Sverchok that can manipulate all kinds of data with nodes.
I think you get the point. Nodes are a big part of the future in 3D art and game making.
There are lots of nodes to learn. Learning one node system will feed into other systems. Starting with Blender nodes and shading with Eevee and Cycles is a great start, and in this guide, we will take it from the beginning.
I will assume some basic understanding of Blender, but I will assume no prior knowledge of shading with Eevee or Cycles.
I will not cover UV mapping here. For UV Mapping you can find my guide to it here:
Related content: The definitive tutorial to UV mapping in Blender
In Blender we have two render engines, the real-time render engine Eevee and a ray-traced render engine, Cycles.
The shader system for these two render engines is mostly compatible, apart from a handful of nodes that only work in one engine. We won't use any of those nodes today. Everything we do here can be used in both engines.
I will use Cycles as a base for my explanations since it is much easier to understand shading from a ray-traced engine perspective than from a raster-based engine like Eevee, which approximates a lot to gain speed.
Now we can begin by looking at the interface. For shading we need a 3D viewport and a node editor. Occasionally, we also need an extra window for an image editor or a UV editor.
You can rearrange your interface however you like; after all, I’m not there watching you. But for me, I will just switch over to the Shading workspace since it is already pretty much set up.
In the properties panel, I will also change over to the material properties tab.
If we select the default cube, we should have a default material slot with a material called “Material” added to it.
That is a mouthful, but let me explain. A material slot is just an empty placeholder where we can add materials. If we look at the list at the top of the material settings, you can see that we can hit the plus. This will add a material slot.
The material slot can then contain a material, and this material holds all the nodes. We add the material with the “New” button that appears when an empty material slot is selected in the list.
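By the way, the slot-and-material relationship is easy to see in Blender’s Python API. Here is a minimal sketch, assuming the default cube is still in the scene (the material name is just an example):

```python
import bpy

# Grab the default cube (assumes it is still in the scene).
obj = bpy.data.objects["Cube"]

# Create a new material datablock and turn on its node tree.
mat = bpy.data.materials.new(name="MyMaterial")
mat.use_nodes = True

# Appending to the object's material list both adds a slot
# and puts the material into it.
obj.data.materials.append(mat)
```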
If we select a slot that has a material attached, we will see the corresponding nodes in the node editor.
We will also see the corresponding nodes in the properties panel.
The only difference is that in the properties we get a list of the node tree while in the node editor we get a full interface with nodes and connections.
Most people find the node editor much easier to navigate and understand.
Just know that in the properties panel, the surface and volume sections are compressed versions of the node editor. Mostly, we won’t go into these sections since the interface is redundant to the node editor.
Let’s turn our attention to the node editor. In the header section, we have a checkbox called “Use nodes”. We can think of this as the ON/OFF toggle for the material. If you don’t see any nodes, make sure you check it and that you have a material selected from the list in the properties panel.
By default, we see two nodes. One called “Principled BSDF” and then a line connecting it to a “Material Output” node.
Let’s come back to these soon.
Just like in any editor, we can press “N” to toggle the sidebar and “T” to toggle the tool panel inside the editor.
To move a node, we click and drag it, or select it and press “G”.
We add new nodes through the “Add” menu or by pressing “Shift+A” with the mouse hovering over the node editor.
To connect one node to another, we click and hold on an output socket, drag out a connection, and drop it on an input socket of another node.
To break a connection, we can click and drag the input side of the connection and drop it in an open space. We can also hold “Ctrl”, then right-click and drag across a connection to cut it.
All slots on the left side of a node are input sockets and all slots on the right side are output sockets.
Every output socket can connect to many input sockets, but each input socket accepts only one connection.
If we direct our attention to the two nodes we have, we can see that the “BSDF” output is sent to the “Surface” input of the Material Output node.
The material output node is special. It is the only node without an output slot. Instead, it is the actual output.
You can imagine the node tree as a factory: each node is a station where something is made, changed or added, and at the end we send the result out the door as a finished product. In this case, as a finished material we see on our object in the 3D viewport.
Factories, or node trees, can range from simple to complex. But they all end at the “Material output” node.
We can make three different connections to the output node. One for surface, one for volume and one for displacement. We will only focus on the surface in this article, though.
If at some point your material turns black, make sure you didn’t accidentally connect the last node to the “Volume” input instead of the “Surface” input. It is a common beginner problem that can be hard to spot.
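If you like to script, here is a minimal sketch of that surface connection, assuming the default node names:

```python
import bpy

mat = bpy.data.objects["Cube"].active_material  # assumes the default setup
nodes = mat.node_tree.nodes
links = mat.node_tree.links

bsdf = nodes["Principled BSDF"]
out = nodes["Material Output"]

# Connect the shader to "Surface", not "Volume". Plugging a surface
# shader into the volume input is the classic "my material turned black".
links.new(bsdf.outputs["BSDF"], out.inputs["Surface"])
```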
If you go to the add menu in the shader editor or press “Shift+A” you see that there are a handful of categories with nodes. At first it may seem like a lot, but Blender’s nodes are actually few compared to other node systems.
I will help you filter the list of nodes so you can get a better idea of what nodes are the most important when starting out. I will give you a kind of starter pack of nodes.
We need two shaders: the Principled BSDF and a Mix shader. We can find these in the Shader submenu. The Principled shader is an über shader, meaning it replaces a lot of other shader nodes and puts everything into one. If we understand the Principled shader, we are well on our way to creating a very large variety of materials.
Related article: Physically based rendering and Blender materials
The mix shader allows us to mix two shaders together.
In the color category, the first node is the “Hue Saturation Value” node. This node helps us shift colors: if we want to make a color slightly darker, a bit more green, or desaturated, this node is great.
The second node in this category is the “Brightness Contrast” node. Just like it says, it will allow us to increase or decrease brightness and the contrast of its input.
We will use two kinds of texture nodes. The image texture node is a cornerstone in our node trees. It allows us to add a texture instead of a flat value.
The second node will be the noise texture node. We will use this to create masks.
In the converter category, we will look at the Separate RGB and Color Ramp nodes. The Color Ramp node is one of the most important nodes in the entire shader system. The Separate RGB will take a color input and separate the red, green, and blue into individual values.
We will use a single input node, the texture coordinate. I will explain this node more in the “Introduction to vectors” section.
The next node is the mapping node. We will use this together with the texture coordinate node. It is much like the “hue saturation value” node but for vectors instead of color. It helps us shift and change positions.
The normal map and bump nodes are two special nodes that we will use to add some micro scale surface imperfections to our material surface. We will not cover normal maps extensively here, but we will look at how we can use them.
Those are all the nodes we will use, twelve nodes plus the material output node.
Now of course there are other important nodes, remember that this is just our starter pack and you can soon expand beyond this subset of nodes.
For a full guide on all Blender nodes, I can recommend the Cycles Encyclopedia. Read more about it in the resource pages.
I will now take you on a tour of a single node and dive deeper into what it does.
We already know about inputs on the left and outputs on the right from the nodes’ perspective. A lot of nodes also have a bunch of settings and different colored sockets.
The color of a socket determines what data comes in and goes out of that socket.
Grey sockets take or send a single value. It can be 1.0, 0.0 or 3.141592 or any other number. 5000.0 perhaps, or why not 600153.12?
Sometimes, a grey input expects a value in a certain range. Often this range is from 0.0 to 1.0, representing a percentage value. Sometimes it is from 0 to 16. Or some other range.
If you look at all the grey inputs in the principled BSDF, you can see that all of them have a corresponding slider with a single value. We can either set the slider or input something that comes from another node. This is true for all gray single value inputs.
Sometimes when we are dealing with values, the slider lets us slide between two values, say between 0 and 1. The specular slider on the Principled BSDF is a good example. Even though this slider is limited to values between 0 and 1 when we drag it, we can type in a higher value instead. This will give us more specular reflection.
However, this is not the case for the roughness slider, for instance. There is also no way for us to see directly in the interface what values it accepts. This is something we have to learn and get a “feel” for.
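For the curious, here is how the difference looks in Python. The input names and the exact clamping behavior are assumptions based on the Blender versions current when writing this, so treat it as a sketch:

```python
import bpy

principled = (bpy.data.objects["Cube"].active_material
              .node_tree.nodes["Principled BSDF"])

# Specular is soft-limited: dragging the slider stops at 1.0, but typing
# (or scripting) can push past it for a stronger reflection.
principled.inputs["Specular"].default_value = 2.0

# Roughness only accepts the 0..1 range; larger values are clipped.
principled.inputs["Roughness"].default_value = 0.5
```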
The yellow inputs are color data. Color is represented by three values. Red, green and blue. Next to each yellow socket is a color picker that we can use to set a color or we can use the input socket.
Color values are within a fixed range. For web, we often use hex codes like #FFFFFF for white and #000000 for black, #FF0000 for a bright red and so on. In 2D image applications, it is common to use three values from 0 to 255. It can look like this, (127,255,0) for some lime green color.
In Blender, we have three values from 0.0 to 1.0. These systems work the same, the ranges are just mapped differently. The lime green color above would be something like 0.5 for red, 1.0 for green and 0.0 for blue in Blender.
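As a quick sanity check, the conversion is just a division by 255 (color management aside):

```python
# Converting an 8-bit RGB color to Blender's 0.0-1.0 range is,
# roughly, a division by 255:
lime_8bit = (127, 255, 0)
lime_blender = tuple(round(c / 255, 3) for c in lime_8bit)
print(lime_blender)  # (0.498, 1.0, 0.0) -- about the 0.5, 1.0, 0.0 above
```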
Those are the color sockets, but what about the purple ones? They are vector sockets, but we will introduce them a little later. For now, just know that they are like color, three values. But instead of representing a color, they represent a position in space and map to the X, Y and Z coordinates instead of R, G and B.
The green sockets are the most complex, but the complexity is hidden from us, making them very simple to work with. We can’t access the complexity, anyway.
There is a specific set of shader nodes. The shader nodes are mathematical calculations that translate the simpler single values, colors and vectors into parameters for the render engine to understand.
Apart from the material output node, there are only two nodes that take a green socket as input: the mix shader and the add shader.
The add shader does not have any parameters and the mix shader has only one parameter that we can change, a gray input slot ranging from 0 to 1.
The add and mix shader nodes are the only way we can combine shaders.
Many times, we may only have a single shader connected to the material output and no other green sockets.
We now understand the interface. We looked at a good set of nodes to start with and we dissected what makes up a node. In this part of the article I want to continue to give you more background information. This time in terms of the method we use to create materials. Instead of zooming into a single node, I now want to zoom out and look at a broader picture.
We could go ahead and start connecting nodes left and right, but it is much better if we have some structure to how we create materials. It turns out that there are two well established standards for creating materials. The glossiness and metallic workflows. Blender has support for the metallic workflow.
So, what is a PBR metallic workflow?
It is a standard for how we take real world material properties and simplify them to something we can work with and have the render engine understand.
Blender supports the metallic workflow through the use of the Principled BSDF shader node.
You can read more about the metalness workflow and the Principled BSDF through the link below. In this article, we will just make an overview.
Related content: Physically based rendering and Blender materials
The backbone of the metalness workflow is the diffuse/albedo, roughness and normal map. To this we also add a metalness map. A map is just an image.
The diffuse map gives us the color of the material.
The roughness will tell us how rough the surface is. A mirror would have little or no roughness, a value close to zero, while a rock would have plenty of roughness, with most values close to 1.
The normal map is angle data. It tells incoming light what direction it should bounce off in. This allows us to add, or fake, detail that would otherwise require a huge amount of geometry.
The metalness map is special. Like roughness it is a black and white mask, but metalness tells the shader what kind of material this is: a metal or a non-metal.
A non-metal is often referred to as a dielectric material. The reason for us to make this distinction between metals and non-metals is that metals interact with light in one way and all other materials interact with light in another.
With this map, we can then tell the Principled BSDF if it should calculate our material as a metal or as a non-metal.
This allows us to focus on the color, roughness and normal maps. The shader will take care of all the other calculations and properties a material has depending on these inputs.
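In Python terms, the dielectric case boils down to a handful of Principled BSDF inputs. A minimal sketch with placeholder values:

```python
import bpy

principled = (bpy.data.objects["Cube"].active_material
              .node_tree.nodes["Principled BSDF"])

# A dielectric (non-metal) material comes down to these inputs.
# The values are placeholders, not measured data.
principled.inputs["Metallic"].default_value = 0.0             # non-metal
principled.inputs["Roughness"].default_value = 0.9            # very rough
principled.inputs["Base Color"].default_value = (0.2, 0.15, 0.1, 1.0)
```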
We will be using image texture maps to map not only one value across the whole surface, but many values for individual pixels. In this part and the examples coming later, I will use a set of textures from texturehaven.com called “forest ground 03”.
You can find many free textures on texturehaven.com and in the resource section you can find a whole list of additional sites that offer free textures and a few paid ones.
I opted for the 2k size for this example. We get no less than seven downloads for this texture. Let’s look at what we got.
Each of these maps is prepared with one piece of information about how this texture interacts with light. The material’s way of interacting with light is the only thing we really care about. That is all there is to shading.
Notice how there are more maps here than we have talked about so far. There is also one missing, the metalness map. These are all maps that we could potentially use, but the core is still the diffuse, normal and roughness maps. Since there isn't any metal in this texture, we can just set the metalness slider in the Principled BSDF to 0 and leave out the metalness map.
We will ignore the other maps for now. If you want to dive deeper into those maps, I would suggest starting with displacement since it has the greatest impact.
We will start by creating a full material, and I will explain concepts as we progress, going more into depth about the nodes we use.
We need to make a handful of changes to set up our scene.
I will start by going to the top left drop-down menu in the node editor and change from object to world. We are now in the node system for the world material. Here I will add an environment texture node and connect the yellow color output to the color input of the “background” shader.
Next, I will press the open button and browse for an HDRI image. For this example, I downloaded the 2k variation of an HDRI called “quarry_01” from hdrihaven.com.
HDRI image: Quarry 01
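The same world setup can be sketched in Python. The file path is an example; point it at wherever you saved the HDRI:

```python
import bpy

world = bpy.context.scene.world
world.use_nodes = True
nodes = world.node_tree.nodes
links = world.node_tree.links

# Add an environment texture and load the HDRI. The path is an
# example; point it at wherever you saved the file.
env = nodes.new("ShaderNodeTexEnvironment")
env.image = bpy.data.images.load("//textures/quarry_01_2k.hdr")

# Wire it into the Background shader that the world tree starts with.
links.new(env.outputs["Color"], nodes["Background"].inputs["Color"])
```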
Then I will jump back to object nodes in the drop-down menu.
Look at that, our first material ended up being a world material as opposed to an object material.
In the properties panel under the render tab, change the render engine to Cycles and switch to GPU rendering. In the sampling section, I set the samples for the viewport to 200.
In the film section, I will check the transparent checkbox to have the background render transparent instead of having the hdri in the background.
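If you prefer to script the scene setup, these settings map to a few properties. A minimal sketch:

```python
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'
scene.cycles.device = 'GPU'           # needs a GPU set up in preferences
scene.cycles.preview_samples = 200    # viewport samples
scene.render.film_transparent = True  # transparent instead of the HDRI
```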
Then in the 3D viewport I will change to rendered view so we get a Cycles preview.
I don’t want to render the full 3D view for performance reasons. To speed up the preview I will press “Ctrl+B” and draw a box in the 3D viewport. Now only the inside of the box will be rendered as a preview. To clear the border, press “Ctrl+Alt+B”.
I will show you two examples. We will start by looking at how we can use a full set of texture maps to create our material. Then we will take a look at the same example but using only one texture map and try to generate the other maps from this one texture.
I have browsed for the folder where I stored the texture maps in the file browser window. From here I can drag and drop each texture that I need into the node editor.
To see more clearly, I set the view to vertical list and deselect date and size. This way I can see the full filename so that I know which is which.
Then I drag in the base color, labeled diff for diffuse, the normal map and the roughness map. Make sure that you click and drag the icon. If you click and drag the text, Blender will assume box selection instead of drag-and-drop.
In order for our images to be used properly by the shader, we need to set the color space. For the base color input we need image textures set to the “sRGB” color space. For all other inputs we should set “Non-Color”.
For more information about why we need to set these, read the Physically based rendering and Blender materials article.
Connect the color output from the diffuse texture to the base color of the Principled BSDF.
Connect the color output of the roughness texture to the grey roughness input of the Principled BSDF. The color will be converted to grayscale when we plug color data into a value input.
The normal map texture needs special care. For this texture we need to tell Blender that this map should be converted from color data, meaning RGB values to vector data, meaning XYZ coordinates.
We do this by adding the normal map node between the color output of the normal map texture and the principled normal input.
This is the basic setup when using a set of image textures.
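Here is the whole setup as a Python sketch. The file paths follow the Texture Haven naming convention, so adjust them to match your download:

```python
import bpy

mat = bpy.data.objects["Cube"].active_material
nodes = mat.node_tree.nodes
links = mat.node_tree.links
principled = nodes["Principled BSDF"]

def image_node(path, colorspace):
    """Add an image texture node with the right color space."""
    node = nodes.new("ShaderNodeTexImage")
    node.image = bpy.data.images.load(path)
    node.image.colorspace_settings.name = colorspace
    return node

# Paths follow the Texture Haven naming; adjust to your download.
diff = image_node("//textures/forest_ground_03_diff_2k.jpg", "sRGB")
rough = image_node("//textures/forest_ground_03_rough_2k.jpg", "Non-Color")
nor = image_node("//textures/forest_ground_03_nor_2k.jpg", "Non-Color")

links.new(diff.outputs["Color"], principled.inputs["Base Color"])
links.new(rough.outputs["Color"], principled.inputs["Roughness"])

# The normal map goes through a Normal Map node, converting RGB to XYZ.
normal_map = nodes.new("ShaderNodeNormalMap")
links.new(nor.outputs["Color"], normal_map.inputs["Color"])
links.new(normal_map.outputs["Normal"], principled.inputs["Normal"])
```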
I will add a subdivision surface modifier of level 4 to our cube just to make the preview render a bit better.
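Scripted, that is just:

```python
import bpy

# A level-4 subdivision surface, purely to smooth the preview.
mod = bpy.data.objects["Cube"].modifiers.new("Subdivision", 'SUBSURF')
mod.levels = 4         # viewport subdivisions
mod.render_levels = 4  # render subdivisions
```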
Now we will add a few nodes to make slight adjustments. First I will add a “Brightness/Contrast” node and a “Hue Saturation Value” node between the diffuse texture and the base color input.
If I drop a node onto an existing connection, the connection will highlight and Blender will connect the node’s inputs and outputs automatically.
Sometimes Blender may connect them wrong when there are multiple viable input/output pairs. In those cases, just reconnect by dragging and dropping the connection to the correct slot.
I use these values for the Brightness/Contrast node
For the hue saturation node I use these values.
These values just slightly change the color of the diffuse map. But we can make more drastic changes as well, just play with these nodes.
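Here is that adjustment chain as a Python sketch. The node names assume Blender’s defaults, and the values are placeholders rather than the exact ones from my screenshots:

```python
import bpy

mat = bpy.data.objects["Cube"].active_material
nodes = mat.node_tree.nodes
links = mat.node_tree.links

diff = nodes["Image Texture"]  # assumes the diffuse was added first
principled = nodes["Principled BSDF"]

bright = nodes.new("ShaderNodeBrightContrast")
hsv = nodes.new("ShaderNodeHueSaturation")

# Chain: diffuse -> Brightness/Contrast -> Hue Saturation -> Base Color.
links.new(diff.outputs["Color"], bright.inputs["Color"])
links.new(bright.outputs["Color"], hsv.inputs["Color"])
links.new(hsv.outputs["Color"], principled.inputs["Base Color"])

# Placeholder tweaks; use whatever suits your texture.
bright.inputs["Bright"].default_value = -0.05
bright.inputs["Contrast"].default_value = 0.1
hsv.inputs["Saturation"].default_value = 0.9
```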
I also add a plane just below the sphere to have some light bounce to the back of the sphere.
We will now take a look at how we can easily adjust the roughness map. Just like with the diffuse, we will add a node in between. This time we will use the Color Ramp node.
The values in our roughness map are very close to 1 in most places of the map. This means that the surface is very rough. To introduce more shine and glossiness we can use the color ramp and push the dark values closer to the white ones. In this way we increase the contrast, introducing more variety, and end up with a more wet-looking material.
To make the color reflect the wetness I also change the brightness/contrast node and the hue saturation node.
For the normal map we only have one value to play with. That is the strength in the normal map node. I increased this to 10 in this case to create more contrast in the reflections.
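A Python sketch of both tweaks, again assuming default node names (“Image Texture.001” stands in for the roughness texture here):

```python
import bpy

mat = bpy.data.objects["Cube"].active_material
nodes = mat.node_tree.nodes
links = mat.node_tree.links
principled = nodes["Principled BSDF"]

rough = nodes["Image Texture.001"]      # assumed roughness texture node
ramp = nodes.new("ShaderNodeValToRGB")  # the Color Ramp node

links.new(rough.outputs["Color"], ramp.inputs["Fac"])
links.new(ramp.outputs["Color"], principled.inputs["Roughness"])

# The map sits close to 1.0 everywhere; dragging the black stop up
# stretches that narrow band over the full range, adding darker,
# glossier (wetter-looking) patches.
ramp.color_ramp.elements[0].position = 0.8

# Exaggerate the normal map to break up the reflections more.
nodes["Normal Map"].inputs["Strength"].default_value = 10.0
```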
By the way, an interesting follow up here is to check out how node groups work in Blender.
This is the node tree
And this is the rendered result.
In some cases we only have one texture to work with. Normally this is a diffuse map of some kind. While having a full set of texture maps is often preferred, a single texture can also do the job pretty well.
The major downside of only having one texture is that we have to tweak settings a bit more, and it is easy to end up with a result that looks flatter than when we have all the maps available. On the other hand, using fewer texture maps is a major plus because we don’t need as much RAM to store textures.
Let’s take a look at the node setup and render result.
This time I chose to use the Separate RGB node to separate the individual red, green and blue channels. Each channel can then be accessed as a black and white image.
To quickly scroll through and see how each output looks, I use the Node Wrangler add-on. Node Wrangler comes with Blender by default and can be enabled by going to the Edit menu and Preferences. Find the “Add-ons” section and search for “wrangler”. Check the checkbox to enable the add-on and close preferences.
Now you can “Ctrl+Shift+left click” any node to view its output. Use the same shortcut and click the Principled BSDF to get your full material view back.
Click the same node multiple times to cycle through the different outputs.
Since we are now looking at a wet surface we want a roughness map that has some darker values and modest contrast. The pine needles in the textures are probably going to reflect the most since they break up the light pattern and stick out from the surface. Therefore we want our roughness map to have darker needles and lighter background. I choose an output from the “separate RGB” node based on these thoughts. Then I adjust it with a color ramp.
Same deal with the normal map. I want to separate the needles from the background. In this case though I want a different dark to light ratio and I want the needles to be lighter since the lighter parts will stick out from the surface.
Based on these thoughts I look at the different outputs and use the output I get the best results from. In this case I went with the red channel.
Notice here that we use the bump node instead of the normal map node since we need to convert a black and white height map to normal data. If we have both a height map and a normal map, we should plug the normal map output into the normal input of the bump node before plugging the bump node into the Principled BSDF.
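The single-texture setup as a Python sketch. I use the red channel for both masks, as above. Note that “ShaderNodeSeparateRGB” is the identifier in the Blender versions this article targets; newer versions call the node Separate Color:

```python
import bpy

mat = bpy.data.objects["Cube"].active_material
nodes = mat.node_tree.nodes
links = mat.node_tree.links
principled = nodes["Principled BSDF"]

diff = nodes.new("ShaderNodeTexImage")
diff.image = bpy.data.images.load("//textures/forest_ground_03_diff_2k.jpg")
links.new(diff.outputs["Color"], principled.inputs["Base Color"])

# Split the diffuse into R, G and B, each a usable grayscale mask.
separate = nodes.new("ShaderNodeSeparateRGB")
links.new(diff.outputs["Color"], separate.inputs["Image"])

# Red channel -> Color Ramp -> Roughness, adjusted to taste.
ramp = nodes.new("ShaderNodeValToRGB")
links.new(separate.outputs["R"], ramp.inputs["Fac"])
links.new(ramp.outputs["Color"], principled.inputs["Roughness"])

# Red channel as a height map through a Bump node -> Normal.
bump = nodes.new("ShaderNodeBump")
links.new(separate.outputs["R"], bump.inputs["Height"])
links.new(bump.outputs["Normal"], principled.inputs["Normal"])
```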
So far we have taken a look at the nodes and their inputs and outputs. We also made a ground material with a full set of texture maps. We then looked at how we can set up a material from a single texture.
Now we will look at some options for how we can tile, scale and project textures onto objects using vectors.
So far we have not looked at how we can change the way we map the texture to our object. This is done using vectors.
Vectors demand a bit of explaining. A vector in Blender is three values put together to represent a point in space, given by the X, Y and Z coordinates.
Many times you might hear that a vector is a direction, even if in reality it is a single point in space, and it can’t have a direction since it is only one point. The reason a vector is sometimes referred to as a direction is that a vector requires a coordinate system, and the vector will always be some distance and some direction from the origin point of that coordinate system.
Therefore we need to know two things when dealing with vectors. We need to know the position of the vector. But we also need to know what coordinate system we are using.
Let’s take a look at the texture coordinate node. It is an essential node that is used very often.
The texture coordinate node has no inputs but a handful of outputs. Each of these outputs is a coordinate system with its own origin point that we can map a texture in.
The texture coordinate node is essentially a smorgasbord of coordinate systems to choose from, each of which can tell Blender how we want a texture to appear on the 3D object.
Here are a few examples. The red color corresponds to the X direction and the green color corresponds to the Y direction. The Z is towards or away from the camera. Any black portion represents negative coordinates, since any color that has a value of 0 or lower is black.
With image textures, we are generally bound to two ways of mapping textures: either UV mapping or box projection.
UV Mapping is the most common form of texture projection on to object surfaces. It gives us complete control over where on the model each part of the texture goes.
Box projection, on the other hand, is much less flexible. But it is very quick to work with and does not require a UV map, allowing us to skip the UV unwrapping stage completely.
You can read about UV mapping in my “Definitive tutorial to UV mapping in Blender” article and Box mapping can be learned in the “Blender box mapping workflow, a quick look” article.
Let’s see how we can approach this in the node editor.
The textures we have used so far have been mapped to the object in some way without us specifying how. This is because an image texture will use the default UV map if nothing is plugged into the node. For all the other textures, the procedural ones, the generated output from the texture coordinate node is assumed by default.
We can specify another method if we want by plugging the texture coordinate node into the vector slot of one or more of our image texture nodes.
If we also connect the mapping node between the texture coordinate node and the image texture, we can tweak the transformation of the original vector before it reaches the image texture node.
Note that we can't use the mapping node alone. The mapping node does not have a default input like the texture nodes, so we need to provide one or we won't get a proper vector input for the texture node.
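A Python sketch of the coordinate chain, assuming an existing image texture node with its default name and the Mapping node layout from Blender 2.81 onward:

```python
import bpy

mat = bpy.data.objects["Cube"].active_material
nodes = mat.node_tree.nodes
links = mat.node_tree.links

tex = nodes["Image Texture"]  # an existing image texture node
coords = nodes.new("ShaderNodeTexCoord")
mapping = nodes.new("ShaderNodeMapping")

# Texture Coordinate supplies the coordinate system,
# Mapping transforms it before it reaches the texture.
links.new(coords.outputs["UV"], mapping.inputs["Vector"])
links.new(mapping.outputs["Vector"], tex.inputs["Vector"])

# Tile the texture twice along X and Y.
mapping.inputs["Scale"].default_value = (2.0, 2.0, 1.0)
```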
We have not yet used the Mix shader node or the noise texture from our starter pack. We can use these together with a colorramp to create a mask and combine two sets of textures making up two different materials.
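As a rough sketch of the idea (the second shader is a flat placeholder color here; in a real material it would have its own texture set):

```python
import bpy

mat = bpy.data.objects["Cube"].active_material
nodes = mat.node_tree.nodes
links = mat.node_tree.links
out = nodes["Material Output"]

# Two shaders to blend. The second is a flat placeholder here.
shader_a = nodes["Principled BSDF"]
shader_b = nodes.new("ShaderNodeBsdfPrincipled")
shader_b.inputs["Base Color"].default_value = (0.1, 0.1, 0.4, 1.0)

# Noise -> Color Ramp gives a controllable black and white mask.
noise = nodes.new("ShaderNodeTexNoise")
ramp = nodes.new("ShaderNodeValToRGB")
links.new(noise.outputs["Fac"], ramp.inputs["Fac"])

# The mask drives the Fac of the Mix shader.
mix = nodes.new("ShaderNodeMixShader")
links.new(ramp.outputs["Color"], mix.inputs["Fac"])
links.new(shader_a.outputs["BSDF"], mix.inputs[1])
links.new(shader_b.outputs["BSDF"], mix.inputs[2])
links.new(mix.outputs["Shader"], out.inputs["Surface"])
```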
To look more into how this is done, I suggest that you take a look at this article where I go through how to mask an object in this way and combine multiple layers to create a final material.
Related content: Physically based rendering, blender nodes, with brick texture example
With this knowledge under your belt, you should be able to create most basic materials with image textures. You have also learned what values we actually pass around the node system. It is only values. Sometimes single values and sometimes groups of three, in the form of vectors or colors.
From there the shaders take care of these inputs and convert them to materials with real world properties.
Now, there is much more to learn about materials. A good next step could be to start looking at procedural materials. Just know that it requires a fair bit of math.
Thanks for your time.