Initializing A Matrix
Now that we have a nicely packaged library, let's apply it to a sample problem. The initialization of a matrix is a great example because of the straightforward mapping from a matrix to a grid, and because the fragment shader directly computes the matrix elements from the row and column indices without the use of an input texture. This makes the initialization a bit simpler than the later computational examples. This setup of initial conditions is also common to many of the problems that we will examine.
Most of the problem-specific code is in another file, MatrixInitializer.js, which we will cover in a few paragraphs. This modularity keeps the code directly on the page short and straightforward.
This code draws together many of the concepts that we have been talking about and assembles them into a complete application.
"use strict";
var bufferStatus;
var framebuffer;
var gpgpUtility;
var initializer;
var matrixColumns;
var matrixRows;
var texture;
matrixColumns = 128;
matrixRows = 128;
gpgpUtility = new vizit.utility.GPGPUtility(matrixColumns, matrixRows, {premultipliedAlpha:false});
if (GPGPUtility.isFloatingTexture())
{
// Height and width are set in the constructor.
texture = gpgpUtility.makeTexture(WebGLRenderingContext.FLOAT, null);
framebuffer = gpgpUtility.attachFrameBuffer(texture);
bufferStatus = gpgpUtility.frameBufferIsComplete();
if (bufferStatus.isComplete)
{
initializer = new MatrixInitializer(gpgpUtility);
initializer.initialize(matrixColumns, matrixRows);
// Delete resources no longer in use.
initializer.done();
// Tests, terminate on first failure.
initializer.test( 0, 0)
&& initializer.test( 10, 12)
&& initializer.test(100, 100);
}
else
{
alert(bufferStatus.message);
}
}
else
{
alert("Floating point textures are not supported.");
}
The size of our canvas is specified in terms of the size of the problem, in this case matrixColumns and matrixRows. This is much clearer than using generic canvasWidth and canvasHeight variables.
Creating the canvas and the WebGL context is so standard that we manage it entirely within the GPGPUtility class. The GPGPUtility constructor calls our makeGPCanvas function, invokes getGLContext to get and store the WebGL context, and finally calls gl.getExtension to enable the floating point texture extension.
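A sketch of what these constructor steps might look like follows. The method bodies here are assumptions for illustration, not GPGPUtility's actual source.

```javascript
// Hypothetical sketch of GPGPUtility's internal setup helpers.
function makeGPCanvas(width, height)
{
  // The canvas is sized to the problem: one pixel per matrix element.
  var canvas    = document.createElement("canvas");
  canvas.width  = width;
  canvas.height = height;
  return canvas;
}

function getGLContext(canvas, attributes)
{
  // Try the standard context name first, then the older experimental prefix.
  return canvas.getContext("webgl", attributes)
         || canvas.getContext("experimental-webgl", attributes);
}

// In the constructor these are combined, with the floating point
// texture extension enabled last:
//   this.canvas = makeGPCanvas(width, height);
//   this.gl     = getGLContext(this.canvas, {premultipliedAlpha: false});
//   this.gl.getExtension("OES_texture_float");
```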
Immediately after the GPGPUtility constructor returns, we check that floating point textures are enabled. If the check succeeds, the next step is to create our floating point texture with gpgpUtility.makeTexture. The null argument indicates that we supply no data for the texture¹. Later, we will show how to pack floating point values into four unsigned bytes (RGBA) when floating point textures are unavailable.
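Texture creation for GPGPU work follows a common recipe; the following function body is an assumption about what makeTexture might do, not the library's actual source.

```javascript
// Hypothetical sketch of GPGPU texture creation.
function makeTexture(gl, width, height, type, data)
{
  var texture = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, texture);

  // NEAREST filtering and edge clamping: we want exact texel values,
  // never interpolated ones, and no wrapping at the boundaries.
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);

  // With data === null the texture memory is allocated but left unfilled.
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0,
                gl.RGBA, type, data);

  return texture;
}
```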
We need the texture to be the target for the GPU output. We make the texture the rendering target by calling gpgpUtility.attachFrameBuffer, which creates a framebuffer and attaches the texture to it. When our code runs, the results will be placed into the texture rather than drawn to the screen.
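The framebuffer attachment itself is only a few calls; this body is an assumed sketch of attachFrameBuffer, not its actual source.

```javascript
// Hypothetical sketch of attaching a texture as the render target.
function attachFrameBuffer(gl, texture)
{
  var framebuffer = gl.createFramebuffer();
  gl.bindFramebuffer(gl.FRAMEBUFFER, framebuffer);

  // Attach the texture as the color buffer; fragment shader output
  // now lands in the texture instead of on the screen.
  gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                          gl.TEXTURE_2D, texture, 0);

  return framebuffer;
}
```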
We haven't gotten to any of the problem-specific code yet, but we can already ensure that our setup will work on this system. The gpgpUtility.frameBufferIsComplete method returns both the framebuffer status and an explanatory text. When bufferStatus.isComplete is true, we know two things:
- We can create floating point textures
- We can write, or render, to floating point textures
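The completeness check wraps gl.checkFramebufferStatus; the structure and the messages below are assumptions about what frameBufferIsComplete might return.

```javascript
// Hypothetical sketch of the framebuffer completeness check.
function frameBufferIsComplete(gl)
{
  var status = gl.checkFramebufferStatus(gl.FRAMEBUFFER);
  var message;

  switch (status)
  {
    case gl.FRAMEBUFFER_COMPLETE:
      message = "Framebuffer is complete.";
      break;
    case gl.FRAMEBUFFER_UNSUPPORTED:
      // The typical failure when floating point render targets
      // are not supported.
      message = "Framebuffer is unsupported.";
      break;
    default:
      message = "Framebuffer incomplete, status: " + status;
  }

  return { isComplete: status === gl.FRAMEBUFFER_COMPLETE,
           message: message };
}
```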
When the bufferStatus.isComplete check fails, the framebuffer is unusable. In that case we display the explanation from bufferStatus.message and terminate. Later we will use this result to do much more than simply display a message when our setup fails: we will fall back to a different approach to the problem that, while less efficient, does not depend on floating point textures.
The initialization stage of the problem is captured in MatrixInitializer.js, which defines a MatrixInitializer class. The MatrixInitializer continues the pattern of building up small, easily understood blocks into a complete solution to what at first might seem like a complex problem. The constructor consists of only three lines of executable code.
function MatrixInitializer(gpgpUtility_)
{
  /** WebGLRenderingContext */
  var gl;
  var gpgpUtility;
  var heightHandle;
  var positionHandle;
  var program;
  var textureCoordHandle;
  var widthHandle;

  ...

  gpgpUtility = gpgpUtility_;
  gl          = gpgpUtility.getGLContext();
  program     = this.createProgram(gl);
}
The geometry and the vertex shader are the same for all the problems we will approach. This leads to considerable simplification of the program. We use a standard vertex shader and provide only the fragment shader, which is where all the problem-specific code will lie. When setting up the shader we call gpgpUtility.createProgram with a null first argument. This tells the gpgpUtility to use the standard vertex shader.
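The standard vertex shader simply passes the physical and texture coordinates through; a typical pass-through implementation (a sketch, not necessarily GPGPUtility's exact source) looks like:

```javascript
// A sketch of the standard pass-through vertex shader: the quad
// already covers clip space, so positions go through unchanged and
// the texture coordinates are handed on to the fragment shader.
var standardVertexShaderSource =
    "attribute vec3 position;"
  + "attribute vec2 textureCoord;"
  + ""
  + "varying vec2 vTextureCoord;"
  + ""
  + "void main()"
  + "{"
  + "  gl_Position   = vec4(position, 1.0);"
  + "  vTextureCoord = textureCoord;"
  + "}";
```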
/**
 * Compile shaders and link them into a program, then retrieve references to the
 * attributes and uniforms. The standard vertex shader, which simply passes on the
 * physical and texture coordinates, is used.
 *
 * @returns {WebGLProgram} The created program object.
 * @see {https://www.khronos.org/registry/webgl/specs/1.0/#5.6|WebGLProgram}
 */
this.createProgram = function (gl)
{
  var fragmentShaderSource;
  var program;

  // Note that the preprocessor requires the newlines.
  fragmentShaderSource = "#ifdef GL_FRAGMENT_PRECISION_HIGH\n"
                         + "precision highp float;\n"
                         + "#else\n"
                         + "precision mediump float;\n"
                         + "#endif\n"
                         + ""
                         + "uniform float height;"
                         + "uniform float width;"
                         + ""
                         + "varying vec2 vTextureCoord;"
                         + ""
                         + "vec4 computeElement(float s, float t)"
                         + "{"
                         + "  float i = floor(width*s);"
                         + "  float j = floor(height*t);"
                         + "  return vec4(i*1000.0 + j, 0.0, 0.0, 0.0);"
                         + "}"
                         + ""
                         + "void main()"
                         + "{"
                         + "  gl_FragColor = computeElement(vTextureCoord.s, vTextureCoord.t);"
                         + "}";

  // Null first argument to createProgram => use the standard vertex shader.
  program = gpgpUtility.createProgram(null, fragmentShaderSource);

  // position and textureCoord are attributes from the standard vertex shader.
  positionHandle = gpgpUtility.getAttribLocation(program, "position");
  gl.enableVertexAttribArray(positionHandle);

  textureCoordHandle = gpgpUtility.getAttribLocation(program, "textureCoord");
  gl.enableVertexAttribArray(textureCoordHandle);

  // height and width are the problem specific uniforms.
  heightHandle = gpgpUtility.getUniformLocation(program, "height");
  widthHandle  = gpgpUtility.getUniformLocation(program, "width");

  return program;
};
The createProgram method is invoked in the MatrixInitializer constructor. When the constructor finishes, the program has been successfully compiled, and the handles for all the attributes and uniforms are known. We could fold all of MatrixInitializer into a single static method, but this structure, with the program built in the constructor and the actual functionality in separate methods, is more flexible and we will use it again and again.
The fragment shader sets the texel to vec4(i*1000.0 + j, 0.0, 0.0, 0.0). This is an R component valued at i*1000.0 + j, with the other components valued at 0.0. Yes, the G, B, and A texture channels are unused².
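Why does floor(width*s) recover the column index exactly? The fragment shader is invoked at each texel's center, so the s coordinate for column i is (i + 0.5)/width. A quick JavaScript check of that round trip (indexFromCoord is a hypothetical helper for illustration):

```javascript
// Verify the texel-center mapping used by the fragment shader:
// column i is sampled at s = (i + 0.5) / width, and
// floor(width * s) recovers i exactly.
function indexFromCoord(coord, size)
{
  return Math.floor(size * coord);
}

var width = 128;
var ok    = true;

for (var i = 0; i < width; ++i)
{
  var s = (i + 0.5) / width;  // texture coordinate of the texel center
  ok = ok && (indexFromCoord(s, width) === i);
}
// ok is true: every column index round-trips through the coordinate.
```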
The initializer.initialize method runs our program to load values into the texture.
/**
 * Runs the program to do the actual work. On exit the framebuffer &
 * texture are populated with the values computed in the fragment shader.
 * Use gl.readPixels to retrieve texture values.
 */
this.initialize = function(width, height)
{
  gl.useProgram(program);

  gpgpUtility.getStandardVertices();

  gl.vertexAttribPointer(positionHandle,     // The attribute
                         3,                  // The three (x, y, z) elements in each value
                         gl.FLOAT,           // The data type, so each position is three floating point numbers
                         false,              // Are values normalized - unused for float
                         20,                 // Stride, the spacing, in bytes, between beginnings of successive values
                         0);                 // Offset 0, data starts at the beginning of the array
  gl.vertexAttribPointer(textureCoordHandle, // The attribute
                         2,                  // The two (s, t) elements in each value
                         gl.FLOAT,           // The data type, so each coordinate is two floating point numbers
                         false,              // Are values normalized - unused for float
                         20,                 // Stride, the spacing, in bytes, between beginnings of successive values
                         12);                // Offset 12 bytes, data starts after the positional data

  gl.uniform1f(widthHandle,  width);
  gl.uniform1f(heightHandle, height);

  gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4);
};
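The stride of 20 bytes and offset of 12 bytes come from the interleaved layout of the standard geometry: each vertex packs three position floats followed by two texture coordinates. The exact vertex data below is an assumption about what getStandardVertices buffers, but the layout arithmetic is the point.

```javascript
// Interleaved (x, y, z, s, t) data for a full-screen triangle strip.
// Each vertex is 5 floats = 20 bytes, with the texture coordinates
// starting 3 floats = 12 bytes in.
var standardVertices = new Float32Array([
  -1.0,  1.0, 0.0,   0.0, 1.0,   // upper left
  -1.0, -1.0, 0.0,   0.0, 0.0,   // lower left
   1.0,  1.0, 0.0,   1.0, 1.0,   // upper right
   1.0, -1.0, 0.0,   1.0, 0.0    // lower right
]);

var stride = 5 * Float32Array.BYTES_PER_ELEMENT;  // 20 bytes per vertex
var offset = 3 * Float32Array.BYTES_PER_ELEMENT;  // texture coords at byte 12
```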
After we have done all the computations we need with the initializer, we call initializer.done(). This deletes things we no longer need, which for the initializer is only the shader program. Other things, such as the standard geometry, the framebuffer, and the texture, are all used in later calculations.
/**
 * Invoke to clean up resources specific to this program. We leave the texture
 * and frame buffer intact as they are used in follow-on calculations.
 */
this.done = function ()
{
  gl.deleteProgram(program);
};
Of course, as with any reasonable code, we finish with a set of test cases. These tests also serve as an example of how to read data back from the GPU onto the CPU. The test case reads the (i, j) element from the texture, and computes the expected value by evaluating the same expression from the fragment shader in JavaScript. If the values match, the test passes.
/**
 * Read back the (i, j) pixel and compare it with the expected value. The
 * expected value computation matches that in the fragment shader.
 *
 * @param i {integer} the i index of the matrix.
 * @param j {integer} the j index of the matrix.
 */
this.test = function(i, j)
{
  var buffer;
  var expected;
  var passed;

  // One each for the R, G, B, and A components of a pixel.
  buffer = new Float32Array(4);

  // Read a 1x1 block of pixels, a single pixel.
  gl.readPixels(i,        // x-coord of lower left corner
                j,        // y-coord of lower left corner
                1,        // width of the block
                1,        // height of the block
                gl.RGBA,  // Format of pixel data.
                gl.FLOAT, // Data type of the pixel data, must match makeTexture
                buffer);  // Load pixel data into buffer

  expected = i*1000.0 + j;
  passed   = (buffer[0] === expected);

  if (!passed)
  {
    alert("Read " + buffer[0] + " at (" + i + ", " + j + "), expected " + expected + ".");
  }

  return passed;
};
¹ This uninitialized texture memory was the source of early security concerns with WebGL. This memory is now required to be cleared.
² In the long term these other texture components can be used to pack an integer into a set of single byte channels, or to optimize the floating point texture approach.
