Artisticrender is supported by its audience. When you purchase through links on our site, we may earn an affiliate commission. Learn more

How to use the GPU to render with Blender

In this article we cover some common questions about GPU rendering in Blender. By default, Blender does not use the GPU for rendering, so new artists may be missing out on a lot of performance if they don't configure Blender correctly.

In general, the GPU is the preferred device for rendering in Blender thanks to its superior performance. If you have a GPU to use with Blender, go to Edit->Preferences, open the System section, and enable Optix for most Nvidia graphics cards or OpenCL for most AMD graphics cards.

Read on for more details.

How to use the GPU in Blender?

First, why doesn't Blender render with the GPU by default? Blender is configured to use the CPU for rendering, most likely so that it works out of the box on as many different types of hardware as possible. But we can easily enable GPU rendering in just a few steps if we have a dedicated GPU with support for Cuda, Optix or OpenCL; that is, most dedicated AMD and Nvidia graphics cards from the last ten years.

To enable GPU rendering in Blender with Cycles follow these steps.

  • Go to Edit->Preferences
  • Open the System section
  • At the top, find Cycles render devices.
    • For AMD and Intel GPUs, turn on OpenCL. Your GPU should be listed if it is available.
    • For Nvidia GPUs, turn on Optix. Your GPU should be listed if it is available.
    • For older Nvidia GPUs, you may need to use Cuda instead.
  • Make sure that the checkbox next to your graphics card is checked.
  • Check the box next to your CPU if you want to use both the GPU and CPU.
  • Close the preferences window.
  • Go to the properties panel and open the render tab, indicated by the camera icon.
  • Change your render engine to Cycles.
  • Change Device to GPU Compute.
  • GPU rendering is now turned on for Cycles.
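The same configuration can also be scripted with Blender's Python API. The sketch below must be run inside Blender (for example from the Scripting workspace), since the `bpy` module only exists in Blender's bundled Python; swap `'OPTIX'` for `'CUDA'` or `'OPENCL'` depending on your card.

```python
# Must run inside Blender: bpy is only available in Blender's own Python.
import bpy

# Choose the compute backend: 'OPTIX' or 'CUDA' for Nvidia, 'OPENCL' for AMD.
cycles_prefs = bpy.context.preferences.addons['cycles'].preferences
cycles_prefs.compute_device_type = 'OPTIX'

# Render the current scene with Cycles on the GPU.
bpy.context.scene.render.engine = 'CYCLES'
bpy.context.scene.cycles.device = 'GPU'
```

This mirrors the manual steps above: the preference change picks the compute backend, and the two scene settings correspond to selecting Cycles and GPU Compute in the render tab.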

Cuda and Optix are both available for GPU rendering with Nvidia graphics cards. Most recent generations of Nvidia GPUs support Optix, which is the newer and faster option.

Graphics card inside pc
Photo by Vagelis Lnz on Unsplash

Does Blender need a GPU?

Blender does not require a GPU, but the answer is a bit more involved than that. All desktop and laptop computers with a screen have a graphics processor of some kind. So, any regular user already has a graphics card, but it may not be a dedicated graphics card.

So, to work in Blender's interface you obviously need a graphics card, since driving a screen requires one. But Blender can also run from the command line.

You can also start Blender remotely on another computer over the network, and in this case no graphics card is needed, though the functionality is limited. This is, for instance, how render farms work: they fire up multiple instances of Blender on server hardware to compute the render, then send the finished result over the Internet to the customer.

So, if your intention is to run renders on a headless machine as a render station or small-scale render farm, that is possible without a graphics card.
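For example, a headless render can be launched entirely from the command line using Blender's standard flags. The file name below is a placeholder; `-b` runs Blender in the background without the interface:

```shell
# Render frame 1 of scene.blend with Cycles, without opening the UI.
# The // prefix writes output relative to the .blend file's location.
blender -b scene.blend -E CYCLES -o //render_ -f 1

# Render the full animation range defined in the file.
blender -b scene.blend -a
```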

Will my GPU render faster than my CPU?

If your GPU is capable of rendering through Cuda, Optix or OpenCL, it will likely be faster than rendering with the CPU. Both dedicated Radeon and Nvidia GPUs are supported.

There are cases when a CPU is faster at rendering, but it is rare to see this in a PC with a dedicated GPU.

You can find data on rendering speeds by going to opendata.blender.org. Here you can press "search data" and input the devices you want to compare.

What is the benefit and downside of using the GPU?

The primary advantage of using the GPU for rendering is performance. GPUs are usually much faster at rendering than their CPU counterparts, even though there are cases where the CPU may be faster.

There are a handful of downsides to rendering with the GPU, but they are growing fewer as Blender develops.

The most prominent downside is the memory limit. Historically, GPUs have been limited to the graphics memory on the card itself and could not store data in system RAM for use during rendering. This often limited the memory a scene could use to 2, 4 or 8GB, depending on the GPU memory size.

If the scene could not fit, it meant that the GPU could not render the scene at all.

Recently however, graphics cards gained the ability to use system memory if needed to store the scene data, so this is now much less of an issue. If the graphics card needs to use the system RAM, the render process will be slower than if the whole scene can fit into graphics memory, but it is still an improvement over needing to use the CPU instead.

Another downside of rendering with GPU is that some features may not be supported for GPU rendering. Sometimes new features are supported on CPU rendering only before they are developed for GPU rendering.

You can go to this page in the Blender manual to see the features currently unsupported by each GPU rendering technology. Here are some examples.

  • Open Shading Language
  • Advanced volume light sampling
  • Branched Path Tracing (Not supported on Optix)
  • Baking (Not supported on Optix, falls back on Cuda)

Can Blender use multiple GPUs?

Yes, Blender can use multiple GPUs. Multiple GPUs come into play during rendering with Cycles, but not in Eevee. Go to Edit->Preferences and find the System section. If you have selected the correct compute device type, all of your available graphics cards are listed there; make sure they are all checked to use all of them. You can also render with the GPU and CPU at the same time: just select your CPU as well to have it render alongside your GPU(s).
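As a sketch, the same device selection can be done with Blender's Python API. Like any `bpy` script, this only runs inside Blender, and it assumes the compute device type has already been set in the preferences:

```python
# Must run inside Blender: enable every detected Cycles render device,
# GPUs and CPU alike, so Cycles can split the work across all of them.
import bpy

cycles_prefs = bpy.context.preferences.addons['cycles'].preferences
cycles_prefs.get_devices()  # refresh the list of detected devices

for device in cycles_prefs.devices:
    device.use = True
    print(f"Enabled: {device.name} ({device.type})")
```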

Keep in mind that multiple GPUs only come into play during rendering with Cycles. There is no way to use multiple GPUs to render with Eevee.

Does Blender use raytracing?

Yes, Blender uses raytracing. Raytracing is a technique used by a specific type of render engine. Blender has two render engines, Eevee and Cycles. Cycles uses raytracing while Eevee does not; Eevee is instead what is known as a rasterized engine.

Raytracing is more computationally heavy but typically renders much more realistic results, because a raytraced render engine mimics how light behaves in the real world, while a rasterized engine relies on other techniques that maximize speed instead of realism.

Both kinds of engines can be fast, and both kinds of engines can look realistic.

Is rendering bad for the GPU?

In most cases, rendering will not harm your GPU. If your GPU is damaged during rendering, it is most likely caused by other factors. For example, excessive dust buildup inside the GPU cooler can hinder the airflow required during intense processing such as rendering.

If your computer is relatively clean, air can flow properly and the environment the computer is in isn't excessively hot, the risk of the GPU taking damage as a result of heavy load is minimal.

Even if you do run the risk of harming the GPU, your computer has mechanisms in place to shut down if overheating occurs, potentially saving your hardware from damage.

Does Eevee use GPU or CPU?

Eevee uses the GPU only. It renders with OpenGL, but it does not require a dedicated GPU with a compute device such as Optix, Cuda or OpenCL, so any GPU with support for OpenGL 3.3 or later should work.

Final thoughts

In most cases you will want to enable your GPU for rendering in Blender. Only in very special circumstances is the GPU not the better option if you have one.

Author

Erik Selin
3D artist, writer, and owner of artisticrender.com
