rendy seems to be moving closer to being included in Amethyst, so I would like to propose a rethink of the current public API types for rendering. The goal is to make the low-level rendering types more data-driven and flexible, allowing the power of the coming Asset Pipeline to shine through. The main driver is to make assets more reusable and composable by avoiding unnecessary coupling of data.
First, let’s see what data is required to render something. Rendering Team, please let me know if I’m missing something:
- List of static vertex channels (device-local GPU buffers) and associated vertex format
- List of dynamic vertex channels (host memory buffers, requires sync) and associated vertex format
- List of image views + sampler combinations
- List of shaders
- List of constant buffers
- Blend & stencil config
- List of buffer/image attachments/outputs
- Binding metadata for all the elements in all the lists - how do vertex channels, textures and constants bind to things in the shaders?
We can construct a “pure” function that takes these inputs and emits render commands for the graphics backend. It can be used in many different scenarios with different inputs, like drawing UI, 3D objects etc. Each situation is different and may have simplified, cached or ignored parts of these inputs depending on the requirements.
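To make this concrete, here is a rough sketch of what such a declarative input bundle and pure encoding function could look like. All type and field names below are hypothetical placeholders, not existing Amethyst or rendy API:

```rust
// Placeholder types for illustration only; the real types would come from rendy.
struct VertexChannel;      // a buffer plus its vertex format
struct TextureBinding;     // an image view plus a sampler
struct Shader;             // a compiled shader program
struct ConstantBuffer;     // constant (uniform) data
struct BlendStencilConfig; // blend & stencil state
struct Attachment;         // buffer/image attachment or output
struct BindingMetadata;    // how channels/textures/constants map to shader slots
struct RenderCommands;     // backend-agnostic command list

/// The inputs listed above, gathered into a single description.
struct RenderInputs {
    static_vertex_channels: Vec<VertexChannel>,
    dynamic_vertex_channels: Vec<VertexChannel>,
    textures: Vec<TextureBinding>,
    shaders: Vec<Shader>,
    constant_buffers: Vec<ConstantBuffer>,
    blend_stencil: BlendStencilConfig,
    attachments: Vec<Attachment>,
    bindings: BindingMetadata,
}

/// A "pure" function from inputs to render commands for the graphics backend.
fn encode(_inputs: &RenderInputs) -> RenderCommands {
    // translate the declarative inputs into draw calls here
    RenderCommands
}
```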
I would like as many of these inputs as possible to be configurable by assets (and thus hot-reloadable). I don’t expect to be able to define everything in data in every situation, but as much as is reasonable given each component’s constraints would be nice.
The Texture struct in Amethyst is pretty good as-is.
Mesh includes a transform matrix, which seems redundant. The purpose of a transformation matrix is to place an object in the world, and it doesn’t make sense to have this built into the Mesh, as it would require an extra matrix multiplication when rendering any Mesh.
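As a minimal sketch (hypothetical names, not the current Amethyst definition), the Mesh would then own only vertex data, with placement coming from the entity’s separate transform component:

```rust
// Hypothetical sketch; not the current Amethyst Mesh definition.
struct VertexChannel; // placeholder for one GPU buffer + its vertex format

struct Mesh {
    channels: Vec<VertexChannel>,
    // No transform matrix here: placement belongs to the entity's Transform
    // component and goes into the per-object model matrix instead.
}
```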
amethyst::Effect contains shaders and constant buffer values. From an asset perspective, this doesn’t make much sense: if you can define constant buffer values in a shader asset, then they can probably just be real constants within the shader anyway. I would like to create an amethyst::Shader type that only represents a compiled shader program, plus metadata for its possible constant bindings. The existing amethyst::Effect could still exist if there’s demand for it, combining amethyst::Shader and the current API for constant values.
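A rough sketch of what such a Shader type could look like; the field names are assumptions, and the reflection metadata could come from SPIR-V reflection or be authored alongside the asset:

```rust
// Hypothetical sketch of the proposed amethyst::Shader; names are placeholders.
struct ConstantBinding {
    name: String, // e.g. "view_proj", matched against Material entries later
    binding: u32, // set/binding slot in the compiled program
    size: usize,  // expected size of the constant data, in bytes
}

struct Shader {
    /// The compiled shader program (e.g. SPIR-V bytes).
    program: Vec<u8>,
    /// Metadata for its possible constant bindings.
    constants: Vec<ConstantBinding>,
}
```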
amethyst::Material is probably the most opinionated of the existing Amethyst rendering types. It contains “hard-coded” named constants for textures, both sampler bindings (TextureHandle -> sampler) and constants (TextureOffset -> vec2 constant). I would like to make it more generic by having it contain:
- Vec of named constant buffer values (string + constant buffer value)
- Vec of named TextureHandles (string + TextureHandle)
- Blend & stencil config
This would allow Materials to be the primary way to glue together Textures and constant buffers while leaving the binding of Mesh and Shader unspecified, making it easier to compose Materials with Meshes.
When constructing the pipeline, we can look at the Shader’s metadata and bind constants & samplers based on the corresponding string. If the strings are interned or hashed, it will be fast.
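Here is a sketch of how that name-based binding at pipeline construction could look, assuming the Material layout proposed above; all types and the binding function are illustrative only:

```rust
use std::collections::HashMap;

// Placeholder types; not existing Amethyst API.
type TextureHandle = u64;
enum ConstantValue {
    Float(f32),
    Vec2([f32; 2]),
    Vec4([f32; 4]),
}
struct BlendStencilConfig;

// The proposed, more generic Material.
struct Material {
    constants: Vec<(String, ConstantValue)>, // named constant buffer values
    textures: Vec<(String, TextureHandle)>,  // named TextureHandles
    blend_stencil: BlendStencilConfig,
}

// Shader reflection metadata: names mapped to binding slots (interned or
// hashed strings would make these lookups cheap).
struct ShaderMetadata {
    constant_slots: HashMap<String, u32>,
    sampler_slots: HashMap<String, u32>,
}

// At pipeline construction, match Material entries to Shader slots by name.
fn bind(material: &Material, meta: &ShaderMetadata) {
    for (name, _value) in &material.constants {
        if let Some(_slot) = meta.constant_slots.get(name) {
            // write the constant value into the buffer bound at this slot
        }
    }
    for (name, _handle) in &material.textures {
        if let Some(_slot) = meta.sampler_slots.get(name) {
            // bind the image view + sampler at this slot
        }
    }
}
```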
I’m not sure how Mesh vertex channels and Shader should be bound, or whether the relationship should be described with an asset. I would love to get some ideas about this, but generally I think game engines just define a preset vertex attribute enum and bind attributes automatically. Could we improve upon this somehow? I feel that vertex data handling is the least flexible part of most popular game engine rendering pipelines.
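For reference, a sketch of the “preset vertex attribute enum” approach that many engines take, with hypothetical names just to anchor the discussion:

```rust
// Hypothetical sketch of automatic vertex binding via a preset attribute enum.
#[derive(Clone, Copy, PartialEq, Eq)]
enum VertexAttribute {
    Position,
    Normal,
    Tangent,
    TexCoord0,
    Color,
}

// Each Mesh vertex channel is tagged with the attribute it carries.
struct VertexChannel {
    attribute: VertexAttribute,
    // buffer handle, stride, offset, ...
}

// Binding is then automatic: for each attribute the shader declares, find the
// matching channel in the Mesh.
fn find_channel(
    channels: &[VertexChannel],
    wanted: VertexAttribute,
) -> Option<&VertexChannel> {
    channels.iter().find(|c| c.attribute == wanted)
}
```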
Might buffer/image attachments/outputs, global constants, or similar be interesting to include as assets too? I would love some feedback from the org, and primarily from the Rendering Team.