EduGraf Tutorial: Low-level shading

Resources

Introduction

In this tutorial, we want to develop our own low-level shader in GLSL (OpenGL Shading Language). It continues the hello-world tutorial.
Unfortunately, there is quite some math behind shading that is beyond the scope of this tutorial, so we take a more technical perspective here. To read up on the mathematics, refer e.g. to the subsections Transformations and Coordinate Systems.

Uniform color shader

A shader consists of two GLSL programs that are compiled together. The first program, called the vertex shader, defines the transformation of the vertices. The second program, called the fragment shader, defines the subsequent step of determining the color of the pixels, or more precisely, fragments. In the following, vertices and fragments are referred to as primitives when they need not be distinguished. Both programs feature a main()-method as entry point. There are three kinds of parameters: in- and out-parameters are per primitive, while uniform-parameters are the same for all primitives. The out-parameters of the vertex shader are passed on to the fragment shader.
Before starting with GLSL programs, a note about erroneous programs: there are three main error categories, each resulting in different behavior.
  • Most interoperation errors (mainly in parameter passing) between EduGraf and the shader program cause a .NET exception.
  • Syntax errors in the GLSL program are reported by OpenGL in the console window, and no object using the program appears in the scene.
  • Unintended computations in the GLSL program result in potentially visible objects that do not look correct.
The vertex shader is mainly about defining the positions of all vertices. Since we do not cover the theory in this tutorial, the vertex shader is given as follows.

    #version 410

    in vec3 Position;

    uniform mat4 Model;
    uniform mat4 View;
    uniform mat4 Projection;

    void main(void)
    {
        gl_Position = vec4(Position, 1.0) * Model * View * Projection;
    }
gl_Position is an out-parameter that is given by OpenGL. It is interpreted as the vertex position in homogeneous, normalized device coordinates. The computation in the shader calculates gl_Position from the vertex position in model space with a sequence of linear transformations, i.e. matrix multiplications.
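The row-vector convention used above (vec4 * Matrix) can be illustrated numerically. The following is a minimal Python sketch, not EduGraf code: it shows why positions are extended with a homogeneous coordinate of 1, so that the translation stored in the matrix applies to points but not to directions.

```python
# Sketch of the row-vector convention used in the vertex shader:
# a homogeneous row vector is multiplied with a 4x4 matrix.

def vec_mat(v, m):
    """Multiply a row vector v (length 4) with a 4x4 matrix m."""
    return [sum(v[k] * m[k][j] for k in range(4)) for j in range(4)]

# A model matrix that translates by (2, 0, 0); in the row-vector
# convention the translation sits in the last row.
model = [
    [1, 0, 0, 0],
    [0, 1, 0, 0],
    [0, 0, 1, 0],
    [2, 0, 0, 1],
]

position = [1, 0, 0, 1]   # homogeneous point, w = 1
direction = [1, 0, 0, 0]  # homogeneous direction, w = 0

print(vec_mat(position, model))   # [3, 0, 0, 1] - points are translated
print(vec_mat(direction, model))  # [1, 0, 0, 0] - directions are not
```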
The uniform parameters are defined and set by EduGraf. It calculates their values from the position and orientation of the camera.
The fragment shader is very simple in this case, since all fragments feature the same color.

    #version 410

    uniform vec3 color;

    out vec3 fragment;

    void main()
    {
        fragment = color;
    }
The out-parameter declared first defines the color that is used for the fragment when displayed by the graphics pipeline. The color needs to be defined and set by our shading.
First, we need to construct a shading that integrates the GLSL shader programs defined above into EduGraf. This is achieved by creating a new shading class derived from GlShading and passing all required parameters to the base-class constructor. The shaders can be defined as string constants. uniform-parameters can be set using the Set()-method. This must be done in a specific context, which is activated by DoInContext.

    public class UniformShading : GlShading
    {
        public UniformShading(GlGraphic graphic, Color3 color)
            : base("uniform", graphic, VertexShader, FragmentShader)
        {
            DoInContext(() => Set("color", color));
        }
    }
We can now go back to the code of the hello-world tutorial and replace the material, light and shading code with our emissive shading defined above, as follows. We need to pass a graphic of type GlGraphic to our rendering, since, obviously, GLSL is platform specific.

    ...
    var shading = new UniformShading(graphic, new Color3(0, 0.5f, 1));
    ...
When running the program, the earth should now appear as a uniformly colored turquoise ball.

Texture shader

Although correct, the result of our work above looks rather boring. Let us make it more interesting by using a texture. We need to extend the vertex shader with the texture coordinate information. Texture coordinates are passed in per vertex through the geometry. The parameter name is defined by EduGraf as TextureUv. This information needs to be interpolated by OpenGL to yield coordinates per fragment, by passing it through the vertex shader. Extend it with the following code.

    in vec2 TextureUv;
    out vec2 textureUv;
    ...
    textureUv = TextureUv; // in main
    ...
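The interpolation that OpenGL performs on vertex outputs such as textureUv can be sketched with barycentric weights. This is an illustrative Python sketch, not EduGraf code; the function name interpolate is made up for illustration.

```python
# Sketch of how OpenGL interpolates per-vertex outputs (such as
# textureUv) across a triangle using barycentric weights.

def interpolate(uv0, uv1, uv2, w0, w1, w2):
    """Blend three per-vertex UV pairs with barycentric weights summing to 1."""
    return tuple(w0 * a + w1 * b + w2 * c for a, b, c in zip(uv0, uv1, uv2))

# UV coordinates at the three vertices of a triangle.
uvs = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]

# A fragment in the middle of the triangle gets the averaged coordinates.
center = interpolate(*uvs, 1/3, 1/3, 1/3)
print(center)  # approximately (0.333, 0.333)
```

A fragment located exactly at a vertex simply receives that vertex's coordinates, since the other two weights are zero.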
In the fragment shader, we want to pick the color at the current texture coordinates from the texture instead of using the uniform color. This is achieved by the following fragment shader.

    #version 410

    in vec2 textureUv;

    uniform sampler2D textureUnit;

    out vec4 fragment;

    void main(void)
    {
        fragment = texture(textureUnit, textureUv);
    }
A shader can work with multiple textures. Therefore they need to be identified, which happens through the uniform parameter textureUnit in this case. Texture values are looked up with the texture()-function taking the identifier as first and the texture-coordinates as second parameter.
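Conceptually, the texture()-function maps normalized coordinates in [0, 1] to a texel of the image. The following Python sketch (not EduGraf code; the name sample is made up) shows a plain nearest-neighbor lookup; real GPUs additionally apply filtering and wrapping modes.

```python
# Sketch of what texture(textureUnit, uv) does conceptually:
# look up a texel by normalized coordinates (nearest-neighbor only).

def sample(texture, uv):
    height, width = len(texture), len(texture[0])
    x = min(int(uv[0] * width), width - 1)   # clamp at the right edge
    y = min(int(uv[1] * height), height - 1) # clamp at the bottom edge
    return texture[y][x]

texture_2x2 = [
    ["red", "green"],
    ["blue", "white"],
]

print(sample(texture_2x2, (0.0, 0.0)))  # red
print(sample(texture_2x2, (0.9, 0.9)))  # white
```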
Finally, the two new shader programs need to be combined into an EduGraf-shading again as follows.

    public ColorTextureShading(GlGraphic graphic, GlTextureHandle handle)
        : base(
            "color_texture", graphic, VertexShader, FragmentShader,
            new GlNamedTextureShadingAspect("textureUnit", handle))
    {}
Some setup needs to happen to activate the texture unit on the GPU. This is achieved through the GlNamedTextureShadingAspect, whose first argument needs to match the textureUnit-parameter in the fragment shader. Again, replace the shading in the rendering with this one (reusing the world map texture from the preceding tutorial).

Simulate sun light

This looks fine again, but now we want to achieve something beyond the standard means. The vertex shader passes on the world-normal as follows. The model-transformation of the normal does not include the translation part, which resides in the homogeneous coordinate and is stripped by converting the matrix to a 3x3 matrix.

    ...
    in vec3 Normal;
    out vec3 worldNormal;
    ...
    worldNormal = Normal * mat3(Model); // in main
    ...
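Why mat3(Model) is the right thing for normals can be checked numerically. This is an illustrative Python sketch, not EduGraf code, using the same row-vector convention as the shader: the 3x3 upper-left part of the model matrix rotates the normal, while the translation row that would sit in the fourth row of the 4x4 matrix is simply cut off and cannot affect the direction.

```python
# Sketch of worldNormal = Normal * mat3(Model): converting the 4x4
# model matrix to 3x3 drops the translation, which must not move
# directions (row-vector convention, pure Python).

import math

def vec3_mat3(v, m):
    return [sum(v[k] * m[k][j] for k in range(3)) for j in range(3)]

# Upper-left 3x3 of a model matrix rotating 90 degrees about z;
# any translation in the fourth row of the 4x4 matrix is cut off.
c, s = math.cos(math.pi / 2), math.sin(math.pi / 2)
rotation = [
    [c, s, 0],
    [-s, c, 0],
    [0, 0, 1],
]

normal = [1, 0, 0]
world_normal = [round(x, 6) for x in vec3_mat3(normal, rotation)]
print(world_normal)  # [0.0, 1.0, 0.0] - rotated, unaffected by translation
```

Note that this simple conversion is only exact for rotations and uniform scaling; non-uniform scaling would additionally require the inverse-transpose of the matrix, which is not needed for our unit sphere.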
Since the model for the earth is a sphere around the origin with radius one, its model positions correspond to the normal vectors at those positions. The x-coordinate of the world normal yields the cosine of the angle between the normal and the unit vector in direction x, which is equal to the relative intensity of the light depending on the orientation of the surface.

    fragment = max(0, worldNormal.x) * texture(textureUnit, textureUv);

Simulate city lights at night

We also want to show something on the night-side of the earth, namely the city lights. Run the program with the texture provided above. Note that the image is mostly black, so it is a bit difficult to see anything. Now, we want to combine the two textures in a single shader: the map is shown on the day-side and the city lights on the night-side. Also, the city lights shall already be turned on when it becomes dark. This is performed by the following fragment shader. Also, add a second GlNamedTextureShadingAspect and set the corresponding names.

    #version 410

    in vec3 worldNormal;
    in vec2 textureUv;

    uniform sampler2D mapTextureUnit;
    uniform sampler2D lightsTextureUnit;

    out vec4 fragment;

    void main(void)
    {
        float i = normalize(worldNormal).x;
        if (i > 0) fragment = i * texture(mapTextureUnit, textureUv);
        else fragment = vec4(0, 0, 0, 1);
        if (i <= 0.25f) {
            fragment = min(vec4(1, 1, 1, 1), fragment + texture(lightsTextureUnit, textureUv));
        }
    }
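The day/night branching of the fragment shader can be sketched per channel. This is an illustrative Python sketch, not shader code, using single gray-scale floats instead of vec4 colors; the name shade is made up for illustration.

```python
# Sketch of the day/night logic in the fragment shader: i > 0 lights
# the map texture, i <= 0.25 additionally blends in the city lights,
# clamped to white (gray-scale floats instead of vec4).

def shade(i, map_color, lights_color):
    fragment = i * map_color if i > 0 else 0.0
    if i <= 0.25:
        fragment = min(1.0, fragment + lights_color)
    return fragment

print(shade(1.0, 0.5, 0.75))   # 0.5 - full daylight, lights stay off
print(shade(0.1, 0.5, 0.75))   # approximately 0.8 - dusk: dim map plus lights
print(shade(-0.5, 0.5, 0.75))  # 0.75 - night side: city lights only
```

Note the overlap band 0 < i <= 0.25, where both textures contribute, so the lights fade in smoothly as the surface turns away from the sun.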