WebGL Textures & Vertices
Create an Element Array
The preceding graphic illustrates the first triangle. It begins at element index 3, which points to the fourth vertex in the vertex array. The line proceeds from index 3 to index 2. The triangle then ends at vertex 0. You could declare just the first triangle with the following element index array. When rendered, only the bottom left triangle would appear.
var aIndices = new Uint16Array([ 3,2,0 ]);
Listing 3: Element Array for One Triangle
However, we also want to display the second triangle. Therefore, create an element array with six entries: three for each triangle.
The following listing declares an element array for the square plane. Notice this example reuses the vertices and texels declared earlier; only the element indices are new.
var aIndices = new Uint16Array([
    // triangle 1
    3, 2, 0,

    // triangle 2
    0, 2, 1,
]);
Listing 4: Element Array for One Square
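To see how each group of three element indices selects vertices from the vertex array, the lookup can be sketched in plain JavaScript. The vertex coordinates below are placeholders for illustration; the real values come from the vertex array declared earlier in the lesson.

```javascript
// Hypothetical vertex positions, one [x, y] pair per vertex.
// The actual coordinates were declared earlier in the lesson.
var aVertices = [
    [-1.0, -1.0], // vertex 0
    [ 1.0,  1.0], // vertex 1
    [-1.0,  1.0], // vertex 2
    [ 1.0, -1.0], // vertex 3
];

// The element array from Listing 4.
var aIndices = new Uint16Array([
    3, 2, 0, // triangle 1
    0, 2, 1, // triangle 2
]);

// Walk the element array three entries at a time: each group of
// three indices selects the three vertices of one triangle.
for (var i = 0; i < aIndices.length; i += 3) {
    var triangle = [
        aVertices[aIndices[i]],
        aVertices[aIndices[i + 1]],
        aVertices[aIndices[i + 2]],
    ];
    console.log('triangle ' + (i / 3 + 1) + ':', triangle);
}
```

This is the same lookup WebGL performs internally when drawing with an element array: indices are read in order, and each index fetches one vertex.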
WebGL API: Uint16Array(Array)
WebGL requires typed arrays. A Uint16Array represents an array of 16-bit unsigned integers: non-negative whole numbers which don't exceed sixteen bits of information. That means values range between 0 and 2¹⁶ - 1. The greatest value equals 65,535. Each array entry represents a whole number, without a decimal point. After initialization, the developer can't change the size of a typed array.
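These properties can be observed directly in any JavaScript engine; a small sketch:

```javascript
var a = new Uint16Array([3, 2, 0]);

// Values are 16-bit unsigned integers: 0 through 65,535.
a[0] = 65535;          // greatest representable value
console.log(a[0]);     // 65535

// Out-of-range values wrap around modulo 2^16.
a[0] = 65536;
console.log(a[0]);     // 0

// Fractions are truncated: entries are whole numbers only.
a[1] = 3.9;
console.log(a[1]);     // 3

// The length is fixed at initialization; it cannot grow or shrink.
console.log(a.length); // 3
```

For element arrays this is ample: a Uint16Array can index up to 65,536 vertices, far more than the four needed for the square plane.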