Sunday, May 22, 2011

WebGL pipeline

Welcome to my first tutorial about WebGL! For this one I suggest you grab a nice cup of coffee, sit back, and just scroll down a bit... I promise, it's not too hard!

webgl globe
WebGL planet?!



So, before we start getting our hands dirty, let's take a closer look at how WebGL works. The first thing that you have to understand is that any WebGL app can be divided into 3 parts: the vertex shader, the fragment shader and the JavaScript code. Now, what are vertex and fragment shaders? Those are "mini" programs that run on the GPU and make things appear on the screen - without them nothing gets drawn!

Their tasks are fairly simple - the vertex shader has to set the value of a variable named gl_Position, and the fragment shader has to set gl_FragColor. gl_Position says on which pixel of the screen (5x10, 354x785 etc.) to draw, and gl_FragColor defines what color it should be. Note: the vertex shader gets executed first, and the fragment shader second - ALWAYS!

You might ask yourself how they achieve this. I assume you have some basic knowledge about 3D math - if not, next week I will write a tutorial on that, so please be patient. The vertex shader converts the 3D coordinates of your objects (or any geometry) into 2D coordinates on the screen using transformation matrices - one to convert from local space to world space (the worldMatrix), and another one to convert from world coordinates into screen coordinates (the viewProjection matrix). The fragment shader just puts the correct color on those screen coordinates - maybe from a texture, maybe it's a plain color, maybe even a shadow. Fairly simple, right?
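That two-matrix trip from local space to the screen can be sketched in plain JavaScript. The matrix helper and the example numbers below are mine, not WebGL API calls - just the arithmetic the vertex shader does for you:

```javascript
// Multiply a column-major 4x4 matrix by a vec4 (the layout WebGL uses).
function transform(m, v) {
  const out = [0, 0, 0, 0];
  for (let row = 0; row < 4; row++) {
    for (let col = 0; col < 4; col++) {
      out[row] += m[col * 4 + row] * v[col];
    }
  }
  return out;
}

// A world matrix that just moves the object +0.5 on the x axis...
const worldMatrix = [1,0,0,0, 0,1,0,0, 0,0,1,0, 0.5,0,0,1];
// ...and a "camera" that leaves everything in place (identity),
// so the numbers stay easy to follow.
const viewProjectionMatrix = [1,0,0,0, 0,1,0,0, 0,0,1,0, 0,0,0,1];

// A local-space vertex (w = 1.0, exactly like vec4(aVertexPosition, 1.0)).
const local = [0.25, 0, 0, 1];
const world = transform(worldMatrix, local);          // [0.75, 0, 0, 1]
const clip  = transform(viewProjectionMatrix, world); // what gl_Position gets

// After the divide by w, the GPU maps the -1..1 range onto actual
// pixels - here onto a hypothetical 640-pixel-wide canvas.
const ndcX = clip[0] / clip[3];
const pixelX = (ndcX * 0.5 + 0.5) * 640;
console.log(world, clip, pixelX); // pixelX is 560
```

With a real perspective matrix the w component would not stay 1.0, which is why the divide is there at all.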

And where is JavaScript in this whole process? Well, JavaScript actually just sets up the variables in those mini programs and calls their execution. So, you need a worldMatrix in your vertex shader? JavaScript binds it! You need a color or a texture in your fragment shader? JavaScript binds it! You want to render a sphere instead of a cube? JavaScript says "render a sphere now"! To get a better understanding of this process, take a look at the following picture (taken from a Google IO presentation):
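That "JavaScript binds it!" role looks roughly like this sketch. The names `program`, `geometry` and the function itself are made up for illustration; the `gl.*` calls are the actual WebGL API:

```javascript
// A hypothetical per-frame draw function: JavaScript only feeds the
// shaders their inputs and kicks off the draw - the shaders then do
// the per-vertex and per-pixel work on the GPU.
function drawObject(gl, program, geometry, worldMatrix, viewProjectionMatrix) {
  gl.useProgram(program);

  // "You need a worldMatrix in your vertex shader? JavaScript binds it!"
  gl.uniformMatrix4fv(
    gl.getUniformLocation(program, "worldMatrix"), false, worldMatrix);
  gl.uniformMatrix4fv(
    gl.getUniformLocation(program, "viewProjectionMatrix"), false, viewProjectionMatrix);

  // "You want a sphere instead of a cube?" - bind that geometry's buffer
  // to the aVertexPosition attribute...
  gl.bindBuffer(gl.ARRAY_BUFFER, geometry.positionBuffer);
  const loc = gl.getAttribLocation(program, "aVertexPosition");
  gl.enableVertexAttribArray(loc);
  gl.vertexAttribPointer(loc, 3, gl.FLOAT, false, 0, 0);

  // ...and say "render it now!"
  gl.drawArrays(gl.TRIANGLES, 0, geometry.vertexCount);
}
```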

webgl pipeline

Yes, that's WebGL. Cool. Hah.

So there you have it - a vertex and a fragment shader that calculate all the stuff, gl_Position and gl_FragColor as outputs and - OH! - buffers/attributes, uniforms and varyings?! If you have never seen shader programs before you are going to feel a bit lost here (at least I did :O). Let's take it a step at a time, it's not that complicated. Buffers are WebGL objects that contain information about the geometry you want to render - vertex positions, normals, whatever... You can have a few BIIIG buffers that hold all the data together (one buffer holding vertex coordinates, normals, texture mapping etc.), or you can have a lot of smaller ones that each hold a chunk of it - one buffer for the vertices, one buffer for normals... Inside a shader, those buffers are referred to as attributes because that is what they are - properties or attributes of the geometry you want to show on your screen. That is what gets bound when you say "I want a sphere instead of a cube" - the buffer values of a sphere get bound to the vertex shader because those are the attributes that differentiate a sphere from a cube. Got it? Sure hope so :)
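The "few BIIIG buffers vs. a lot of smaller ones" choice just comes down to how you lay the numbers out before handing them to gl.bufferData. The triangle data below is made up, but the stride/offset numbers in the comments are what an interleaved layout really requires:

```javascript
// Option 1: separate buffers - one typed array per attribute.
const positions = new Float32Array([
   0.0,  1.0, 0.0,   // vertex 0
  -1.0, -1.0, 0.0,   // vertex 1
   1.0, -1.0, 0.0,   // vertex 2
]);
const normals = new Float32Array([
  0.0, 0.0, 1.0,
  0.0, 0.0, 1.0,
  0.0, 0.0, 1.0,
]);

// Option 2: one big interleaved buffer - position and normal of each
// vertex sit next to each other: x,y,z, nx,ny,nz, x,y,z, nx,ny,nz, ...
const interleaved = new Float32Array(positions.length + normals.length);
for (let v = 0; v < 3; v++) {
  interleaved.set(positions.subarray(v * 3, v * 3 + 3), v * 6);
  interleaved.set(normals.subarray(v * 3, v * 3 + 3), v * 6 + 3);
}

// With interleaving, the attribute setup needs a stride of 6 floats
// (24 bytes), and the normal starts 12 bytes into each vertex:
//   gl.vertexAttribPointer(posLoc,    3, gl.FLOAT, false, 24, 0);
//   gl.vertexAttribPointer(normalLoc, 3, gl.FLOAT, false, 24, 12);
console.log(interleaved);
```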

Now, the little red box that says uniforms - those are, for example, our world and viewProjection matrices or a texture - stuff that is necessary to calculate the correct position and color of an object, but is not a property of it. So why aren't those attributes too? Well, consider you want to draw a cube 1000 times in different positions and in different colors - all you need is one cube geometry (which is held in buffer objects) and a lot of uniforms to describe the different positions and colors of the 1000 cubes. What would be the alternative? 1000 different cube geometries, and that can take up a looot of memory if you want to render 1 000 000 cubes instead of 1000. Not so bad, those uniforms, are they?
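It's easy to put rough numbers on that memory argument. The 36-vertex cube and the 16-float matrix are the usual sizes; the rest is back-of-the-envelope arithmetic, not anything WebGL measures for you:

```javascript
const FLOAT_BYTES = 4;
// One cube: 12 triangles * 3 vertices, with an x,y,z position each.
const cubeVertexFloats = 36 * 3;
const cubeBytes = cubeVertexFloats * FLOAT_BYTES;

const count = 1000;

// Alternative A: duplicate the whole geometry for every cube.
const duplicatedBytes = count * cubeBytes;

// Alternative B: one shared cube, plus per-cube uniforms - a 16-float
// world matrix and a 4-float color for each instance.
const perInstanceBytes = (16 + 4) * FLOAT_BYTES;
const uniformBytes = cubeBytes + count * perInstanceBytes;

console.log(duplicatedBytes, uniformBytes); // 432000 vs 80432 bytes
```

And the gap only widens as the counts grow - at a million cubes, alternative A is duplicating hundreds of megabytes of identical geometry.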

Now, the last thing here are varying types. As you can see in the picture, they basically serve for passing values from the vertex to the fragment shader (remember, those are different programs!). So, once you have calculated something in the vertex shader, you pass it to the fragment shader by declaring that variable as varying. If you wouldn't do that, you would have to recompute the same thing again, which doesn't really make sense, does it?
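To make that hand-off concrete, here is a bare varying sketch (the name vColor and the color math are mine): the vertex shader writes the value once per vertex, and the fragment shader just reads it, interpolated across the triangle.

```glsl
// --- vertex shader: declare the varying and write to it ---
attribute vec3 aVertexPosition;
varying vec3 vColor;

void main(void) {
   // computed here, once per vertex...
   vColor = aVertexPosition * 0.5 + 0.5;
   gl_Position = vec4(aVertexPosition, 1.0);
}

// --- fragment shader: declare the SAME varying and just read it ---
precision mediump float;
varying vec3 vColor;

void main(void) {
   // ...arrives here interpolated - no recomputation needed
   gl_FragColor = vec4(vColor, 1.0);
}
```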

Let's take a quick look at a veeery simple vertex and fragment shader:

       /**
        * The vertex shader
        */

       //Attribute describing the position
       attribute vec3 aVertexPosition;
       //Uniform to convert to world space
       uniform mat4 worldMatrix;
       //Uniform to convert to screen space
       uniform mat4 viewProjectionMatrix;

       void main(void) {
          //calculate which pixel I want to color
          gl_Position = viewProjectionMatrix * worldMatrix * vec4(aVertexPosition, 1.0);
       }

       /**
        * The fragment shader
        */

       //required in WebGL: a default precision for floats
       precision mediump float;

       //a uniform saying which color
       uniform vec4 color;
       void main(void) {
          //put that color on the pixel, YEAH!
          gl_FragColor = color;
       }


So that would be it for this lesson! Oh, one more thing. If you have issues with WebGL in your Chrome or Firefox browser (like, erm, it's not working) do the following:

       - for Chrome: add "--ignore-gpu-blacklist" to Chrome's command line arguments, to force WebGL on
       - for Firefox: go to about:config, search for webgl, and set webgl.force-enabled to true.

That's it, you're ready to go!!!
