## WebGL Textures & Vertices: Beginner's Guide

### Page Ten

# Create an Element Array

The preceding graphic illustrates the first triangle starting at index number `3`, which points to vertex `(-1.0,-1.0,0.0)`. The line proceeds from index `3` to index `2`. The triangle then ends at the vertex at index `0`. You could declare just the first triangle with the following element index array. When rendered, only the bottom-left triangle would display.

```javascript
var aIndices = new Uint16Array([ 3,2,0 ]);
```

### Listing 3: Element Array for One Triangle
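To see this on screen, you could upload the three-entry array to an element array buffer and issue an indexed draw call. The following is a minimal sketch, not code from this tutorial: it assumes a prepared `WebGLRenderingContext` named `gl`, with the vertex and texel buffers from the earlier pages already set up and bound.

```javascript
// Hypothetical setup: upload the three indices to the GPU.
var aIndices = new Uint16Array([ 3,2,0 ]);

var indexBuffer = gl.createBuffer();
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, indexBuffer);
gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, aIndices, gl.STATIC_DRAW);

// Draw three 16-bit unsigned indices starting at offset 0:
// only the bottom-left triangle displays.
gl.drawElements(gl.TRIANGLES, 3, gl.UNSIGNED_SHORT, 0);
```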

However, we also want to display the second triangle. Therefore, create an element array with six entries, three for each triangle. The following listing declares an element array for the square plane. Notice this example uses the vertices and texels declared for indices `0` and `2` twice.

```javascript
var aIndices = new Uint16Array([
  // triangle 1
  3,2,0,
  // triangle 2
  0,2,1,
]);
```

### Listing 4: Element Array for One Square
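Rendering the full square works the same way; only the index data and the element count passed to `gl.drawElements` change. Again a hedged sketch, assuming the same prepared context `gl` and the element buffer from the previous sketch still bound:

```javascript
// Upload the six-entry array and draw both triangles of the square.
gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, aIndices, gl.STATIC_DRAW);
gl.drawElements(gl.TRIANGLES, 6, gl.UNSIGNED_SHORT, 0);
```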

# WebGL API Cast Uint16Array(Array)

WebGL requires typed arrays. The type `Uint16Array` represents an array of 16-bit unsigned integers. Acceptable `Uint16Array` entries are non-negative whole numbers that fit in sixteen bits of information. That means values range between `0` and `2^16 - 1`. The greatest value equals `65,535`. Each array entry represents a whole number, without a decimal point. After initialization, the developer can't change the size of a `Uint16Array`.
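These conversion and sizing rules are easy to verify in a browser console. The snippet below is illustrative only, not part of the tutorial's rendering code:

```javascript
var a = new Uint16Array([ 0, 65535, 65536, -1, 3.9 ]);

// Values are taken modulo 2^16, so 65536 wraps to 0 and -1 wraps to 65535;
// 3.9 loses its fractional part because entries are whole numbers.
console.log(a.toString()); // "0,65535,0,65535,3"

// The size is fixed at construction; in non-strict code this
// assignment is silently ignored.
a.length = 10;
console.log(a.length); // 5
```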