Pipeline State ‐ Details
Input Layouts at the top map almost perfectly to the properties of a Unity Mesh.
POSITION: Mesh.vertices
NORMAL: Mesh.normals
TANGENT: Mesh.tangents
COLOR: Mesh.colors
TEXCOORD: Mesh.uv, Mesh.uv2, etc. There can be up to 8 sets of UVs (Vector2) per mesh, which are combined into Vector4s when sent to the shader, so you may see up to 4 TEXCOORD attributes.
At the bottom, there's a buffer marked index, which maps to Mesh.triangles. The other buffers listed after it are simply how the properties above are delivered to the shader.
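As a rough sketch of that mapping, building a Unity Mesh from C# populates exactly the attributes that show up in the Input Layouts (the quad below is purely illustrative):

```csharp
using UnityEngine;

// Illustrative sketch: a single quad whose properties correspond to the
// Input Assembler attributes RenderDoc displays.
var mesh = new Mesh();
mesh.vertices = new Vector3[] {          // POSITION
    new Vector3(0, 0, 0), new Vector3(1, 0, 0),
    new Vector3(0, 1, 0), new Vector3(1, 1, 0),
};
mesh.normals = new Vector3[] {           // NORMAL
    Vector3.back, Vector3.back, Vector3.back, Vector3.back,
};
mesh.uv = new Vector2[] {                // TEXCOORD0
    new Vector2(0, 0), new Vector2(1, 0),
    new Vector2(0, 1), new Vector2(1, 1),
};
mesh.triangles = new int[] {             // the "index" buffer
    0, 2, 1,  2, 3, 1,
};
```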
The vertex shader stage runs once per vertex. It uses the data configured in the Input Assembler (IA) step as input (also incorporating additional buffers and sometimes textures) and outputs the location of the vertex on screen (or off screen, in which case it will be culled in the next step).
This is the vertex shader used on this draw call. Luckily for us, the original name is included. This is usually the easiest place to identify what is being rendered, other than looking at the mesh itself.
You can view the shader, but only in its "semi-compiled" form called DXBC. This is tough to read and can't be altered and re-compiled. Luckily there are some tools out there that will decompile DXBC into HLSL, which is the language used in Unity shaders.
Buffers and textures used by the shader. The buffers are set using Material.SetBuffer(). The textures are the ones set on the Material.
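As a sketch of how those resources end up bound (the property names here are made up for illustration, and `material`/`someTexture` are assumed to already exist):

```csharp
using UnityEngine;

// Sketch: binding a structured buffer and a texture to a material.
// "_InstanceData" and "_MainTex" are illustrative property names.
var buffer = new ComputeBuffer(1024, sizeof(float) * 4);
material.SetBuffer("_InstanceData", buffer);   // appears under the shader's Resources
material.SetTexture("_MainTex", someTexture);  // textures set on the Material
```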
Just how the shader will sample or filter the textures when they're used. Usually not very interesting.
Additional buffers that can be shared across multiple shaders. Their contents stay constant for the duration of the draw call (hence the name "constant").
The first constant buffer is used by Unity to store the Material properties. Any call to Material.SetFloat(), .SetInt(), or any of the other setters will be sent to the shader in this constant buffer. Global properties set by calls like Shader.SetGlobalFloat() are also included here.
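A minimal sketch of how values land in that first constant buffer (the property names are illustrative and assume the shader declares them; `material` is assumed to exist):

```csharp
using UnityEngine;

// Sketch: per-material values end up in the material's constant buffer.
// "_Emission" and "_UseFancyMode" are made-up property names.
material.SetFloat("_Emission", 2.5f);
material.SetInt("_UseFancyMode", 1);

// Globals are visible to every shader that declares the property.
Shader.SetGlobalFloat("_GlobalWind", 0.8f);
```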
The other constant buffers are built-in Unity-provided variables like the unity_ObjectToWorld transformation, camera position, lighting details, etc.
You can view the contents of each, but without the names of the properties and their locations in the buffer, this can be tricky to read.
These are optional stages that are very rarely used in DSP. Hull+Domain shaders are also referred to as Tessellation shaders.
Vertex position data is transformed, before being sent to the pixel shader, which runs once per pixel on the screen. Triangles that appear off screen are dropped. "Back" culling is on by default and drops the triangles on the backside of the model (which you can't see anyway).
That means for each triangle, the vertex shader ran 3 times (once per vertex) but the pixel shader might run thousands of times if that triangle takes up a decent portion of the screen. 1080p is 1920 × 1080 = 2,073,600 pixels!
DirectX calls this the Pixel shader; OpenGL and Unity's own documentation call it the Fragment shader. The details here are basically the same as the Vertex shader screen, but again, it's running once per pixel instead of once per vertex. It uses the data passed from the vertex shader (through the rasterizer) to determine the color of each pixel. For that reason, you'll see a lot more texture resources included.
Otherwise, refer to the Vertex section above for details on the Shader, Resources, and Constant Buffers sections.
How the output of the pixel shader should be handled. At the simplest level, this just writes the newly rendered object into the "Output to the Screen" texture being built up by each draw call. For transparent objects, the Target Blend section may contain rules for how they're blended with the existing pixels, but opaque objects just overwrite what was already there.
At the bottom, there are details on whether the result is also written to a secondary Depth texture, and whether it should ignore some existing objects in the scene but not others via the Stencil. This usually isn't very interesting.
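As a sketch, the usual Unity pattern for configuring those blend rules from C# looks like this. It assumes the shader exposes the conventional _SrcBlend/_DstBlend/_ZWrite properties (a common Unity convention, not guaranteed for every shader), and that `material` already exists:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Sketch: typical alpha blending setup for a transparent material,
// assuming the shader declares _SrcBlend, _DstBlend, and _ZWrite.
material.SetInt("_SrcBlend", (int)BlendMode.SrcAlpha);
material.SetInt("_DstBlend", (int)BlendMode.OneMinusSrcAlpha);
material.SetInt("_ZWrite", 0);  // transparent objects usually skip depth writes
material.renderQueue = (int)RenderQueue.Transparent;
```

With these settings, the Target Blend section in RenderDoc would show SrcAlpha/OneMinusSrcAlpha rather than the opaque default of One/Zero.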