
About Compressed Textures

Compressed textures differ from compressed images in that they do not have to be decompressed: they can be used directly by a supporting GPU. A compressed texture typically consists of a collection of compressed subimages (representing mipmap levels, cubemap faces, etc.) stored as an array of "binary blobs" in a container file. Only the container file is parsed, extracting metadata and the binary buffers representing the subimages. These binary subimages can then be passed directly to a GPU that knows how to read pixels from them without decompressing them first.

Supercompressed textures are an intermediate format whose subimages are compressed in a common format. This format can be cheaply transcoded into a real compressed texture format supported by the current client, without decompressing and recompressing the texture. This allows a single supercompressed texture to be used portably across multiple platforms, even when those platforms do not support the same compressed texture formats.
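The transcode-target selection step can be sketched as follows. This is a hypothetical illustration, not a real API: the format labels and the preference order are assumptions, and a real application would derive the supported-format list from GPU capability queries.

```javascript
// Hypothetical sketch: pick a transcode target for a supercompressed texture
// based on which compressed formats the current device supports.
// Format labels and preference order are illustrative assumptions.
function selectTranscodeTarget(supportedFormats) {
  const preferenceOrder = ['astc', 'etc2', 'dxt', 'pvrtc'];
  // Fall back to plain uncompressed RGBA if no compressed format is available
  return preferenceOrder.find((format) => supportedFormats.includes(format)) || 'rgba';
}

console.log(selectTranscodeTarget(['dxt', 'pvrtc'])); // 'dxt'
console.log(selectTranscodeTarget([]));               // 'rgba'
```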

Performance Considerations


  • Compressed textures allow significantly more texture data (roughly 4x) to fit in the same amount of GPU memory. This reduces memory bandwidth use and allows more detail, and matters especially on mobile devices, which tend to crash when memory fills up.
  • Compressed textures do not need to be decoded before use, which reduces CPU load. This is noticeable when many textures are loaded.
  • Compressed textures can include pregenerated mipmaps, further reducing CPU load by avoiding the mipmap generation step. This is also noticeable when many textures are loaded.

On the downside:

  • Compressed textures can be somewhat bigger and slower than JPEGs to load over the network. Exact numbers should be verified for your content, but as an example, compressed texture formats might achieve roughly 4-6x compression, compared to perhaps 15x for JPEG.
  • Compression tends to be relatively slow. In combination with some IP issues, this usually makes it impractical to create GPU compressed textures on the fly.
  • Since different devices have different GPUs that support different compressed texture formats, one typically has to provide compressed textures in multiple formats and decide which ones to load at runtime (although supercompressed formats such as Basis avoid this problem).
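The ~4x memory figure above can be illustrated with back-of-the-envelope arithmetic. This is a rough model only: real block-compressed formats have per-block size minimums that matter for the smallest mip levels.

```javascript
// Rough bytes needed for a full mip chain at a given bytes-per-pixel rate.
// Uncompressed RGBA8 is 4 bytes/pixel; a BC3/DXT5-style format averages
// 1 byte/pixel, giving roughly a 4x saving.
function mipChainBytes(width, height, bytesPerPixel) {
  let total = 0;
  let w = width;
  let h = height;
  for (;;) {
    total += w * h * bytesPerPixel;
    if (w === 1 && h === 1) break;
    w = Math.max(1, w >> 1);
    h = Math.max(1, h >> 1);
  }
  return total;
}

const uncompressed = mipChainBytes(1024, 1024, 4); // RGBA8
const compressed = mipChainBytes(1024, 1024, 1);   // DXT5-style rate
console.log(uncompressed / compressed); // 4
```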

Container Formats

This section is based on the information in Dave Evan's helpful Texture Containers article; please refer to it for additional details.

Non-texture image formats do not support storing mipmap chains. When loading a JPG or a PNG, mipmaps must be generated by resizing the original image repeatedly for each required mipmap level.
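For reference, the number of levels in a full mipmap chain (down to 1x1) is floor(log2(max(width, height))) + 1:

```javascript
// Number of mipmap levels needed for a full chain down to 1x1
const mipLevelCount = (width, height) =>
  Math.floor(Math.log2(Math.max(width, height))) + 1;

console.log(mipLevelCount(1024, 512)); // 11
console.log(mipLevelCount(1, 1));      // 1
```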

In contrast, a single texture container can store all the data required for an entire texture: mipmaps, array layers, or cubemap faces. Generating mipmaps offline is important if you use compressed textures, as it’s generally impractical to generate compressed textures at runtime.

The main container formats for compressed textures are the Khronos Texture format (KTX) and Microsoft's DirectDraw Surface (DDS). KTX, being a standard, is better specified and therefore recommended.

KTX (Khronos Texture)

The KTX format is a Khronos Group standard for storing textures. It can store 1D, 2D, 3D, Cubemaps and Array Textures, along with any number of mipmaps for these textures. This makes it ideal for storing almost any kind of texture you could want.

The fields in the KTX header are directly compatible with other Khronos standards such as WebGL. The texture data is described in the glType, glFormat, glInternalFormat, and glBaseInternalFormat header fields. These should match up with the parameters to the gl[Compressed]Tex[Sub]Image* calls used to submit each texture mipmap level’s data.
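A minimal sketch of reading these header fields, following the field order in the Khronos KTX 1.0 specification (12-byte identifier followed by 13 uint32 fields). Real files also carry key/value data and per-level image sizes after the 64-byte header; error handling is kept minimal.

```javascript
// Sketch of parsing a KTX (v1) header with a DataView; field order follows
// the Khronos KTX 1.0 specification.
const KTX1_IDENTIFIER = [0xab, 0x4b, 0x54, 0x58, 0x20, 0x31, 0x31, 0xbb, 0x0d, 0x0a, 0x1a, 0x0a];

function parseKTX1Header(arrayBuffer) {
  const bytes = new Uint8Array(arrayBuffer, 0, 12);
  if (!KTX1_IDENTIFIER.every((byte, i) => bytes[i] === byte)) {
    throw new Error('Not a KTX1 file');
  }
  const view = new DataView(arrayBuffer);
  // The endianness marker 0x04030201 is written in the file's own byte order
  const littleEndian = view.getUint32(12, true) === 0x04030201;
  const u32 = (offset) => view.getUint32(offset, littleEndian);
  return {
    glType: u32(16),
    glTypeSize: u32(20),
    glFormat: u32(24),
    glInternalFormat: u32(28),
    glBaseInternalFormat: u32(32),
    pixelWidth: u32(36),
    pixelHeight: u32(40),
    pixelDepth: u32(44),
    numberOfArrayElements: u32(48),
    numberOfFaces: u32(52),
    numberOfMipmapLevels: u32(56),
    bytesOfKeyValueData: u32(60)
  };
}

// Demonstrate on a synthetic header (0x83f3 is COMPRESSED_RGBA_S3TC_DXT5_EXT)
const buffer = new ArrayBuffer(64);
new Uint8Array(buffer).set(KTX1_IDENTIFIER, 0);
const writer = new DataView(buffer);
writer.setUint32(12, 0x04030201, true);
writer.setUint32(28, 0x83f3, true); // glInternalFormat
writer.setUint32(36, 256, true);    // pixelWidth
writer.setUint32(40, 256, true);    // pixelHeight
writer.setUint32(56, 9, true);      // numberOfMipmapLevels

const header = parseKTX1Header(buffer);
console.log(header.glInternalFormat.toString(16)); // '83f3'
```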

DDS (DirectDraw Surface)

The DDS format is in common use for storing textures (despite DirectDraw being long deprecated). Originally only 2D textures were supported, but the D3D10 header extension added support for texture arrays and D3D10+ features. The format is partially documented on MSDN.

PVR (PowerVR)

The PVR texture compression format defines its own container format.

Compression Formats

As mentioned, the actual compressed subimages are not parsed or modified by the loader; however, the loader attempts to identify their formats using metadata and returns the appropriate format fields to facilitate use in WebGL and WebGPU.

The following is the typical list of compressed texture formats that can be properly tagged:

  • S3TC - S3 texture compression formats
  • S3TC_SRGB - S3 sRGB texture compression formats
  • PVRTC - PowerVR texture compression formats
  • ETC1 - Ericsson texture compression (ETC1) formats
  • ETC - Ericsson texture compression (ETC2/EAC) formats
  • ASTC - Adaptive Scalable texture compression formats
  • ATC - AMD texture compression formats

Recommended Formats

The following could be a starting point for choosing texture formats:

For GPUs with S3TC/DXT support (most desktops):

  • BC3 (DXT5) - transparent textures with full alpha range
  • BC1 (DXT1) - opaque textures

For GPUs with PVRTC support (e.g. iOS devices):

  • PVR4 (PVRTC 4bpp) - transparent textures with alpha
  • PVR2 (PVRTC 2bpp) - opaque textures

For GPUs with ETC/ASTC support (most Android devices):

  • ASTC_4x4, ASTC_8x8 - transparent textures with full alpha range
  • ETC1 - opaque textures
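The recommendations above could be encoded as a simple runtime selection policy. This is a sketch only: the capability flags and the returned labels are placeholders for whatever detection and format naming your application uses.

```javascript
// Hypothetical policy encoding the recommendations above. The capability
// flags would come from e.g. WebGL extension queries in a real application.
function recommendFormat({s3tc, pvrtc, etc, hasAlpha}) {
  if (s3tc) return hasAlpha ? 'BC3 (DXT5)' : 'BC1 (DXT1)';
  if (pvrtc) return hasAlpha ? 'PVR4' : 'PVR2';
  if (etc) return hasAlpha ? 'ASTC_4x4' : 'ETC1';
  return 'RGBA'; // uncompressed fallback
}

console.log(recommendFormat({s3tc: true, hasAlpha: true}));  // 'BC3 (DXT5)'
console.log(recommendFormat({etc: true, hasAlpha: false}));  // 'ETC1'
```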

Using Compressed Textures

Compressed textures are designed to be directly uploaded to GPUs that have the required decoding support implemented in hardware.

JavaScript APIs currently do not provide CPU-side decoding capabilities for compressed textures, meaning that they can only be uploaded directly to GPUs that support them. To obtain decoded pixels on the client, one workaround is to upload the texture to a WebGL context, render it, and read back the rendered result.

Using Compressed Textures in Frameworks

While the loader itself is framework-independent, some frameworks are designed to seamlessly consume the data it loads.

Data returned by any "image" category loader (including texture loaders) can be passed directly to a Texture2D class.

Using Compressed Textures in raw WebGL

To use compressed textures in raw WebGL, upload each mipmap level with compressedTexImage2D:

const texture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, texture);

gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);

// Upload each compressed subimage as one mipmap level
for (let index = 0; index < images.length; ++index) {
  const image = images[index];
  const {width, height, format, data} = image;

  gl.compressedTexImage2D(gl.TEXTURE_2D, index, format, width, height, 0, data);
}

if (images.length > 1) {
  // Multiple levels: enable mipmapped minification filtering
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR_MIPMAP_LINEAR);
} else {
  // Single level: plain linear filtering, no mipmaps
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
}

WebGL Extensions

The following WebGL extensions are used to query whether the GPU supports specific proprietary compressed texture formats.

  • WEBGL_compressed_texture_s3tc - S3 texture compression formats
  • WEBGL_compressed_texture_s3tc_srgb - S3 sRGB texture compression formats
  • WEBGL_compressed_texture_atc - AMD texture compression formats
  • WEBGL_compressed_texture_pvrtc - PowerVR texture compression formats
  • WEBGL_compressed_texture_etc1 - ETC1 compression formats
  • WEBGL_compressed_texture_etc - ETC2/EAC compression formats
  • WEBGL_compressed_texture_astc - ASTC compression formats
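One way to probe these at runtime is shown below. A stub context stands in for a real WebGLRenderingContext (normally obtained via canvas.getContext('webgl')) so the logic can be demonstrated outside a browser.

```javascript
// Return the compressed-texture extensions the given WebGL context supports.
function getSupportedCompressionExtensions(gl) {
  const extensionNames = [
    'WEBGL_compressed_texture_s3tc',
    'WEBGL_compressed_texture_s3tc_srgb',
    'WEBGL_compressed_texture_atc',
    'WEBGL_compressed_texture_pvrtc',
    'WEBGL_compressed_texture_etc1',
    'WEBGL_compressed_texture_etc',
    'WEBGL_compressed_texture_astc'
  ];
  // getExtension returns null when the extension is unavailable
  return extensionNames.filter((name) => gl.getExtension(name) !== null);
}

// Stub context where only S3TC "exists", standing in for a real browser context
const stubGL = {
  getExtension: (name) => (name === 'WEBGL_compressed_texture_s3tc' ? {} : null)
};
console.log(getSupportedCompressionExtensions(stubGL)); // ['WEBGL_compressed_texture_s3tc']
```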

Using Compressed Textures in WebGPU

Support for compressed textures is a work in progress in the WebGPU standard.

At the time of writing, only S3 texture compression has been specified:

    // BC compressed formats usable if "texture-compression-bc" is both
    // supported by the device/user agent and enabled in requestDevice.
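Assuming the WebGPU API as currently specified, enabling BC compression means passing the feature name to requestDevice. The async calls are browser-only, so the sketch below isolates the feature-list computation as a pure helper.

```javascript
// Compute the requiredFeatures list for requestDevice() given the adapter's
// reported features. Only 'texture-compression-bc' is considered here, per
// the state of the spec described above.
function getCompressionFeatures(adapterFeatures) {
  return ['texture-compression-bc'].filter((feature) => adapterFeatures.has(feature));
}

// In a browser this would be used roughly as:
//   const adapter = await navigator.gpu.requestAdapter();
//   const device = await adapter.requestDevice({
//     requiredFeatures: getCompressionFeatures(adapter.features)
//   });
console.log(getCompressionFeatures(new Set(['texture-compression-bc']))); // ['texture-compression-bc']
console.log(getCompressionFeatures(new Set()));                           // []
```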

Creating Compressed Textures

Texture compression code is usually not readily available, particularly not in JavaScript. Compression is typically done by standalone binary programs, e.g. PVRTexTool.

The CompressedTextureWriter can compress textures (under Node.js only) by executing a binary with the appropriate command line and then loading back the output.
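Driving such a binary from Node.js might look roughly like the sketch below. The flag names (-i/-o/-f) are illustrative assumptions, not a real CLI specification; consult the actual compressor's documentation for its options.

```javascript
// Hypothetical: build the argument list for an external texture compressor.
// Flag names are assumptions for illustration only.
function buildCompressorArgs({input, output, format}) {
  return ['-i', input, '-o', output, '-f', format];
}

// In Node.js the binary would then be executed roughly like:
//   const {execFileSync} = require('child_process');
//   execFileSync('PVRTexToolCLI', buildCompressorArgs({input, output, format}));
const args = buildCompressorArgs({input: 'in.png', output: 'out.ktx', format: 'ETC1'});
console.log(args.join(' ')); // '-i in.png -o out.ktx -f ETC1'
```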

IP and Patent Considerations

An issue with compressed texture formats is that they tend to be highly proprietary and patent-encumbered. While this is usually no longer a problem, there can be cases where e.g. royalty requirements come into play when using them.

To side-step patent issues when using these formats an application would typically:

  1. Generate compressed textures in external applications (which should already have licensed any required formats and libraries).
  2. Load them in binary form without touching the content.
  3. Pass them directly to a texture, so that they are processed inside the GPU driver (which should also have licensed the supported formats and libraries).