GPUDevice: createRenderPipelineAsync() method
Experimental: This is an experimental technology
Check the Browser compatibility table carefully before using this in production.
The createRenderPipelineAsync() method of the GPUDevice interface returns a Promise that fulfills with a GPURenderPipeline once the pipeline can be used without any stalling. The resulting pipeline can control the vertex and fragment shader stages and can be used in a GPURenderPassEncoder or GPURenderBundleEncoder.
Note: It is generally preferable to use this method over GPUDevice.createRenderPipeline() whenever possible, as it prevents blocking of GPU operation execution on pipeline compilation.
Syntax
js
createRenderPipelineAsync(descriptor)
Parameters
descriptor
: See the descriptor definition for the GPUDevice.createRenderPipeline() method.
Return value
A Promise that fulfills with a GPURenderPipeline object instance when the created pipeline is ready to be used without additional delay.
Validation
If pipeline creation fails and the resulting pipeline becomes invalid as a result, the returned promise rejects with a GPUPipelineError:

- If this is due to an internal error, the GPUPipelineError will have a reason of "internal".
- If this is due to a validation error, the GPUPipelineError will have a reason of "validation".
A validation error can occur if any of the following are false:
- For depthStencil objects:
  - format is a depth-or-stencil format.
  - If depthWriteEnabled is true or depthCompare is not "always", format has a depth component.
  - If stencilFront or stencilBack's properties are not at their default values, format has a stencil component.
- For fragment objects:
  - targets.length is less than or equal to the GPUDevice's maxColorAttachments limit.
  - For each target, writeMask's numeric equivalent is less than 16.
  - If any of the used blend factor operations use the source alpha channel (for example "src-alpha-saturated"), the output has an alpha channel (that is, it must be a vec4).
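Rejections of both kinds can be handled in one place by catching the error and checking its reason property. The following is a minimal sketch, assuming device is an existing GPUDevice and pipelineDescriptor is a render pipeline descriptor like the one constructed in the Basic example below:

```js
// Sketch: assumes `device` is a GPUDevice and `pipelineDescriptor`
// is a render pipeline descriptor object.
async function createPipelineSafely(device, pipelineDescriptor) {
  try {
    // Resolves only once the pipeline is ready to use without stalling.
    return await device.createRenderPipelineAsync(pipelineDescriptor);
  } catch (err) {
    if (err instanceof GPUPipelineError) {
      // err.reason is either "validation" or "internal".
      console.error(`Pipeline creation failed (${err.reason}): ${err.message}`);
      return null;
    }
    // Not a pipeline-creation failure; rethrow.
    throw err;
  }
}
```

Returning null here is just one possible recovery strategy; an application might instead fall back to a simpler descriptor or surface the error to the user.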
Examples
Note: The WebGPU samples feature many more examples.
Basic example
The following example shows the construction of a valid render pipeline descriptor object, which is then used to create a GPURenderPipeline via a createRenderPipelineAsync() call.
js
async function init() {
// ...
const vertexBuffers = [
{
attributes: [
{
shaderLocation: 0, // position
offset: 0,
format: "float32x4",
},
{
shaderLocation: 1, // color
offset: 16,
format: "float32x4",
},
],
arrayStride: 32,
stepMode: "vertex",
},
];
const pipelineDescriptor = {
vertex: {
module: shaderModule,
entryPoint: "vertex_main",
buffers: vertexBuffers,
},
fragment: {
module: shaderModule,
entryPoint: "fragment_main",
targets: [
{
format: navigator.gpu.getPreferredCanvasFormat(),
},
],
},
primitive: {
topology: "triangle-list",
},
layout: "auto",
};
const renderPipeline = await device.createRenderPipelineAsync(
pipelineDescriptor
);
// ...
}
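Because createRenderPipelineAsync() does not block on shader compilation, an application can also kick off several pipeline compilations concurrently and await them together. This is a sketch under the assumption that device is an existing GPUDevice and that opaqueDescriptor and transparentDescriptor are hypothetical, valid render pipeline descriptors like the one shown above:

```js
// Hypothetical descriptors; assumes `device` is an existing GPUDevice.
// Both compilations proceed in parallel rather than one after the other.
const [opaquePipeline, transparentPipeline] = await Promise.all([
  device.createRenderPipelineAsync(opaqueDescriptor),
  device.createRenderPipelineAsync(transparentDescriptor),
]);
// Both pipelines are now ready to use without stalling.
```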
Specifications
| Specification |
| --- |
| WebGPU # dom-gpudevice-createrenderpipelineasync |
Browser compatibility
See also
- The WebGPU API