Megabyte Softworks
C++, OpenGL, Algorithms

Current series: OpenGL 3.3

Download (3.32 MB)
4620 downloads. 4 comments
16.) Rendering To A Texture

Welcome to the 16th OpenGL 3.3 tutorial. This time we're going to learn how to render to a texture. What's this good for? Imagine a scene with security cameras somewhere, and in another part of the scene a terminal where you want to see the camera image. How would you do that? Exactly! You must look at the scene from the camera's point of view, render it somewhere off-screen, and copy the resulting image into a texture. Then you can apply that texture to the terminal's camera screen. This is probably the most common use case, and it's called rendering to a texture.

Significant changes in the code

This tutorial also brings some significant changes compared to all the previous ones - my opinions on coding style have changed recently and I renamed all functions to begin with a capital letter. You know, it just makes sense to differentiate between functions and variables, and a good way to do that is to have variable names start with a lower-case letter and function names with a capital one. Another small change is that I unified the names of the deleting / releasing functions. Now they all begin with Delete, just like in OpenGL. Previously some classes had releasing functions beginning with Delete and some with Release, and it was inconsistent.

In this tutorial, we'll create SpongeBob watching The Avengers. Well, not the actual Avengers movie, but a rotating and moving Thor model on a TV screen. So let's get familiar with some new terms.

Framebuffers and renderbuffers

If you haven't been introduced to framebuffers and renderbuffers and your reaction to them is like:

then this tutorial should help you (I wanted to put at least one picture into the article other than the screenshot). Framebuffers and renderbuffers are further types of OpenGL objects (so they're created the traditional way, with functions starting with glGen) that allow us to do off-screen rendering. That means you still render, but not onto the screen - onto a virtual screen instead, a.k.a. a framebuffer. Afterwards you can read the framebuffer contents, most commonly the final 2D image, and create a texture from it, which you can apply anywhere. So what's a renderbuffer then? The thing is that a framebuffer consists of multiple renderbuffers, and you already know some of them. The default framebuffer (with name 0), used for normal on-screen rendering, has a color buffer (storing RGBA), a depth buffer (storing pixel depths), then an optional stencil buffer and maybe some other buffers. All these sub-buffers are called renderbuffers. So that's how it is - nothing difficult.

Working with framebuffer

Now we're getting to the part where you'll see how to use these objects. For the purpose of this tutorial, using FBOs can be summarized in these 6 steps:

1. Create a framebuffer object
2. Create an empty texture and attach it to the FBO as its color attachment
3. Create a depth renderbuffer and attach it to the FBO
4. Bind the FBO instead of the default framebuffer
5. Render the scene - the output is written into the attached texture
6. Unbind the FBO and use the resulting texture anywhere

These are all the important steps we need to do. We'll go through them one by one. But first, the framebuffer class that handles everything nicely in one place:

class CFramebuffer
{
public:
   bool CreateFramebufferWithTexture(int a_iWidth, int a_iHeight);

   bool AddDepthBuffer();
   void BindFramebuffer(bool bSetFullViewport = true);
   void SetFramebufferTextureFiltering(int a_tfMagnification, int a_tfMinification);
   void BindFramebufferTexture(int iTextureUnit = 0, bool bRegenMipMaps = false);

   glm::mat4 CalculateProjectionMatrix(float fFOV, float fNear, float fFar);
   glm::mat4 CalculateOrthoMatrix();

   void DeleteFramebuffer();

   int GetWidth();
   int GetHeight();

private:
   int iWidth, iHeight;
   UINT uiFramebuffer;
   UINT uiDepthRenderbuffer;
   CTexture tFramebufferTex;
};
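Typical usage of this class might look like the following sketch (assuming the method names above; the filter constants and scene-rendering calls are hypothetical placeholders):

```cpp
CFramebuffer fbo;

// Initialization (steps 1-3): a 512x256 FBO with a color texture and a depth buffer
fbo.CreateFramebufferWithTexture(512, 256);
fbo.AddDepthBuffer();

// Every frame:
fbo.BindFramebuffer();                 // step 4: subsequent rendering goes into the FBO
// ... render the scene as seen by the "camera" ...
glBindFramebuffer(GL_FRAMEBUFFER, 0);  // step 6: back to on-screen rendering
fbo.BindFramebufferTexture(0, true);   // bind result to texture unit 0, regenerate mipmaps
// ... render the TV screen with this texture applied ...
```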

Let's go through the main functions. CreateFramebufferWithTexture does the first two steps. It calls the glGenFramebuffers function and creates an initially empty texture for the FBO (an empty texture is created using the normal glTexImage2D function, but with a NULL data pointer). An important thing to notice are the width and height parameters. Before using an FBO, you must specify its dimensions. They don't have to be powers of two, it will work with any numbers as well. I chose a 512x256 framebuffer for this tutorial. The important OpenGL call in this function is:

glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, tFramebufferTex.GetTextureID(), 0);

This attaches the texture to the framebuffer. The first parameter must be GL_FRAMEBUFFER, the second tells which part of the framebuffer this texture should store. GL_COLOR_ATTACHMENT0 is the framebuffer's color buffer. An FBO can have multiple color attachments, but GL_COLOR_ATTACHMENT0 is the default and the rendered image is stored there. If you want to have 2 color attachments, for example one for the normal RGB image and one for, let's say, a grayscale image, you can do this, but in the fragment shader, where you have the output color specified, you would have to specify it like this:

out vec4 outputColor;                     // Normal output, GL_COLOR_ATTACHMENT0, no need for layout keyword
layout(location = 1) out float fGrayscale; // One float per pixel for grayscale value, GL_COLOR_ATTACHMENT1

Notice that our texture is only RGB, but we actually output a vec4. It doesn't matter - OpenGL is intelligent enough to copy only the RGB values. It worked on both nVidia and AMD cards, so I don't care about the deepest details of how compatible these outputs and textures must be, as long as it works.
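Put together, the creation done inside CreateFramebufferWithTexture might look roughly like this (a minimal sketch with raw GLuint handles instead of the tutorial's classes; uiGrayscaleTex and the second attachment are hypothetical, error checking omitted):

```cpp
GLuint uiFramebuffer, uiTexture;

// Step 1: create and bind the FBO
glGenFramebuffers(1, &uiFramebuffer);
glBindFramebuffer(GL_FRAMEBUFFER, uiFramebuffer);

// Step 2: create an initially empty 512x256 RGB texture (NULL data pointer) ...
glGenTextures(1, &uiTexture);
glBindTexture(GL_TEXTURE_2D, uiTexture);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 512, 256, 0, GL_RGB, GL_UNSIGNED_BYTE, NULL);

// ... and attach it as the FBO's color buffer
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, uiTexture, 0);

// Optional: a second color attachment (e.g. the grayscale image) must also be
// routed explicitly with glDrawBuffers, otherwise only attachment 0 is written:
// glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT1, GL_TEXTURE_2D, uiGrayscaleTex, 0);
// GLenum aeBuffers[] = { GL_COLOR_ATTACHMENT0, GL_COLOR_ATTACHMENT1 };
// glDrawBuffers(2, aeBuffers);
```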

Let's get to step 3 - adding a depth buffer. It's all in the function AddDepthBuffer. A newly created FBO doesn't have a depth buffer by default, so we must add it there. Therefore, we create one renderbuffer using the function glGenRenderbuffers. After that, we bind it (as with all OpenGL objects we're about to work with) and initialize its size and type with:

glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24, iWidth, iHeight);

The first parameter must be GL_RENDERBUFFER, the second is the renderbuffer type - for the depth buffer I used GL_DEPTH_COMPONENT24, which is a depth buffer with 24-bit precision (3 bytes per pixel) - and the last two important parameters are the renderbuffer's width and height. These two must match the dimensions of the FBO we want to attach the renderbuffer to. The final step is the actual attachment of the renderbuffer to the FBO using the function:

glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, uiDepthRenderbuffer);

The first parameter must be GL_FRAMEBUFFER, the second specifies that we're attaching a depth buffer, the third must be GL_RENDERBUFFER and the last is the ID of the previously generated renderbuffer.
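The whole AddDepthBuffer sequence can be sketched like this (assuming the FBO from the previous step is still bound; the completeness check at the end is a good habit, even though the tutorial text doesn't mention it):

```cpp
GLuint uiDepthRenderbuffer;

// Step 3: create and bind the renderbuffer, then allocate 24-bit depth storage
// whose dimensions match the FBO (512x256 in this tutorial)
glGenRenderbuffers(1, &uiDepthRenderbuffer);
glBindRenderbuffer(GL_RENDERBUFFER, uiDepthRenderbuffer);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24, 512, 256);

// Attach it to the currently bound FBO as its depth buffer
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, uiDepthRenderbuffer);

// Verify the FBO is complete before using it
if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
{
    // handle the error, e.g. log it and bail out
}
```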

Now the FBO is ready to be used. Let's have a look at the RenderScene function. Before we render the real on-screen scene, we're going to render The Avengers scene into our FBO. That's why we bind our FBO with a call to glBindFramebuffer, which has two parameters - the first is always GL_FRAMEBUFFER, and the second is the FBO ID. This is wrapped in the BindFramebuffer function of our class and it's step 4.

Now we're good to proceed with rendering our Avengers scene. We render the normal way, just like rendering on-screen, but the results are stored in the FBO and its associated texture. This is step 5, and after all the rendering is done, our texture is ready - the image has been written directly into it.
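Steps 4 to 6 inside the render function might be sketched like this (uiFramebuffer, the scene-rendering call and the window dimensions are hypothetical names standing in for the tutorial's code):

```cpp
// Step 4: bind the FBO; the viewport must match the FBO size, not the window size
glBindFramebuffer(GL_FRAMEBUFFER, uiFramebuffer);
glViewport(0, 0, 512, 256);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

// Step 5: render the scene the normal way; the image lands in the attached texture
RenderThorScene(); // hypothetical function for the rotating Thor model

// Step 6: FBO ID 0 returns us to normal on-screen rendering
glBindFramebuffer(GL_FRAMEBUFFER, 0);
glViewport(0, 0, iWindowWidth, iWindowHeight); // hypothetical window dimensions
```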

The last step is unbinding the FBO and returning to normal on-screen rendering. This is done using the glBindFramebuffer function with FBO ID 0. Our texture is now ready to be mapped anywhere. Notice, however, that if you want to use filtering with mipmaps, you must recalculate them every frame. That's why the BindFramebufferTexture function of our class takes 2 parameters - the first is the texture unit, and the second is whether the mipmaps should be recalculated. I selected mipmap filtering, even trilinear, so the mipmaps must definitely be recalculated.


What we just programmed looks like this:

and it's not bad. I hope you enjoyed this tutorial, and if you have never rendered to a texture before, I hope it made things clear to you. If you don't understand something, feel free to ask in the comments or mail me. Have a nice day.





7Y8APqljcWHr (at0lbyia2@yahoo.com) on 17.09.2015 01:13:28
Yes. When you create a default OpenGL context you get an OpenGL 4.3 compatibility profile context by default (assuming the machine you work on supports GL 4.3). Then if you only use features that are supported also on Shader Model 2.0 hardware, i.e. GL 2.1 features, it will run on both your netbook and your strong machine. Of course, if you use OpenGL 4 features that are available only on Shader Model 5.0 hardware then it won't run, but that's nothing different from doing the same thing in D3D: if you want to use Shader Model 5.0 features then you have to use feature level 11, and that means it won't run on your netbook (considering that it doesn't support Shader Model 5.0). Not to mention that if you are running XP then you are stuck with D3D 9 anyway, while OpenGL support could be way better (even GL 4.3). The thing that confuses you is that when people refer to using OpenGL 4 they actually mean using GL 4 features that are supported only on Shader Model 5.0 hardware; but again, when people refer to using D3D 11 they also usually mean Shader Model 5.0, i.e. feature level 11, so the same compatibility issue is there. Again, using OpenGL 4 can mean two things: 1. actually using SM 5.0 features, thus limited to new hardware; 2. making an application on a GL 4 implementation that only uses SM 2.0 features (roughly equivalent to GL 2.1 stuff), which will be compatible with old hardware. The same thing applies to D3D, as using D3D 11 can mean two things: 1. actually using SM 5.0 features (feature level 11), which limits the application to new hardware; 2. using feature level 9_3, which will be compatible with old hardware but does not support any new features (technically equivalent to D3D 9 if you don't consider the changed syntax and API convenience).
TheWeepingCorpse on 23.11.2012 00:46:15
Is it possible to use an existing depth buffer? I want to have several render to textures sequences, all using the same depth buffer for image composition.
Hertz (diehertz@gmail.com) on 24.08.2012 13:56:57
As we'll find out later, it's very useful for realistic water and other environment effects, where you want a reflection of the real scene :-)
Have you read some books like GPU Gems? Some real cool techniques described in them :-)
Michal Bubnar (michalbb1@gmail.com) on 25.08.2012 12:15:47
I know them, but I've only read a part of Gems 1, so not that much. And you're right about those effects