Introduction to Computer Graphics

I know that you are desperate to render your first 3D object using the Metal API. However, computer graphics can be difficult to grasp in the beginning. To help you out, I wrote this article that talks about the major components in a rendering pipeline.

In this article, I'm going to explain the following concepts:

  • Mesh
  • Rendering Pipeline
  • Attributes
  • Uniforms
  • Matrix Transformations
  • Functions (Shaders)
  • Framebuffer

Learning these concepts before you use the Metal API (or OpenGL) will be beneficial.

What is a Mesh

A mesh is a collection of vertices, edges, and faces that define the shape of a 3D object. The most popular type of polygon mesh used in computer graphics is the Triangle Mesh. Any object, whether a mountain or a game character, can be modeled with triangle meshes.

For example, the image below shows triangle meshes making up a tank.

[Image: triangle meshes making up a tank]

The vertices of a mesh are the inputs to a Rendering Pipeline. These vertices go through several stages, such as coordinate transformation and rasterization, before they end up in the framebuffer, i.e., the screen.
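
For example, a small square mesh can be stored as two triangles that share vertices. Below is a minimal sketch (the positions and index layout are illustrative; real meshes usually come from a modeling tool):

//four corner vertices of a square (x, y, z per vertex)
static const float squareVertices[] = {
    -0.5, -0.5, 0.0,  //vertex 0: bottom-left
     0.5, -0.5, 0.0,  //vertex 1: bottom-right
     0.5,  0.5, 0.0,  //vertex 2: top-right
    -0.5,  0.5, 0.0   //vertex 3: top-left
};

//each group of three indices selects the vertices of one triangle
static const unsigned short squareIndices[] = {
    0, 1, 2,  //first triangle
    2, 3, 0   //second triangle
};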

The Rendering Pipeline

The Rendering Pipeline is the sequence of stages that vertices must go through before appearing on the screen of your device. The GPU, not the CPU, is in charge of the rendering pipeline.

In general, the vertices of a mesh go through the following stages in a rendering pipeline:

  • Vertex Processing
  • Primitive Assembly
  • Rasterization
  • Fragment Processing

Vertex Processing

The Vertex Processing stage processes incoming geometrical data, i.e., vertices. In this step, each vertex is transformed by a space matrix, changing its original coordinate system into a new one.

Primitive Assembly

The Primitive Assembly stage constructs a primitive, such as a triangle, out of three processed vertices from the Vertex Processing stage.

Rasterization

What you see on a screen are pixels approximating the shape of a primitive. This approximation occurs in the Rasterization stage. In this step, pixels are tested to determine whether they fall inside the primitive's perimeter. If they do not, they are discarded.

If they are within the primitive, they are passed on to the next stage. Each pixel that survives this test is known as a Fragment.

Fragment Processing

The responsibility of the Fragment Processing stage is to apply color or texture to the fragment.

Shaders

Long ago, the Rendering Pipeline was fixed, meaning that a developer had no way to influence the outputs of any of the stages in the pipeline.

However, that is no longer the case. Today, you control the outputs of two stages in the pipeline: the Vertex Processing and Fragment Processing stages. These stages are controlled through GPU programs called "Shaders." In Metal, they are known as "Functions."

The GPU program that controls the Vertex Processing stage is known as a Vertex Shader, whereas the program that controls the Fragment Processing stage is known as a Fragment Shader.

Furthermore, you program these shaders with a special kind of programming language called a "Shading Language." In OpenGL, the shading language is known as GLSL (or OpenGL Shading Language). In Metal, the shading language is called "Metal Shading Language."

Transformations

If you hate mathematics but love computer graphics, then you better start hating math a bit less. Mathematics, especially Linear Algebra, plays a huge role in computer graphics, games, graphics effects, etc.

In computer graphics, Matrices are used to transform the coordinate system of a mesh.

For example, rotating a square 45 degrees requires a rotation matrix to transform the coordinate system of each vertex of the square.

Matrices are not only used to rotate a mesh, but also to create the illusion of 3D objects on a screen.

To render a 3D object, you must implement several transformations. For example, the coordinate space of a model is transformed by a view (camera) matrix. The resulting coordinate system is known as the Model-View coordinate system, and it expresses the position of the model with respect to the camera.

A Perspective-Projection matrix further transforms the Model-View coordinate system, thus creating a 3D illusion. This set of transformations is known as the Model-View-Projection transformation.

Thus, if you want to display an object in 3D, the object must be transformed by the Model-View-Projection matrix.
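
To make this concrete, here is a minimal sketch of how these matrices could be composed using GLKit's math functions (the same math header the Metal project later imports). The rotation angle, camera position, and field of view below are just illustrative values:

#import <GLKit/GLKMath.h>

//Model matrix: rotate the model 45 degrees about the z-axis
GLKMatrix4 modelMatrix=GLKMatrix4MakeRotation(GLKMathDegreesToRadians(45.0), 0.0, 0.0, 1.0);

//View matrix: a camera at (0,0,5) looking at the origin, with +y as the up direction
GLKMatrix4 viewMatrix=GLKMatrix4MakeLookAt(0.0, 0.0, 5.0,
                                           0.0, 0.0, 0.0,
                                           0.0, 1.0, 0.0);

//Projection matrix: 65-degree field of view, 4:3 aspect ratio, near and far planes
GLKMatrix4 projectionMatrix=GLKMatrix4MakePerspective(GLKMathDegreesToRadians(65.0), 4.0/3.0, 0.1, 100.0);

//Model-View: the position of the model with respect to the camera
GLKMatrix4 modelViewMatrix=GLKMatrix4Multiply(viewMatrix, modelMatrix);

//Model-View-Projection: adds the perspective (3D) illusion
GLKMatrix4 mvpMatrix=GLKMatrix4Multiply(projectionMatrix, modelViewMatrix);

The resulting mvpMatrix would then be sent to the vertex shader, typically as uniform data (covered below).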

Attributes and Uniforms

Attributes

Elements that describe a mesh, such as vertices, are known as Attributes. Aside from vertices, normal vectors and UV coordinates also define a mesh. Normal vectors are vectors perpendicular to the surface of the mesh and are used to apply lighting to a model. UV coordinates are used for texturing.

An attribute is also defined as an input to a Vertex Shader (function). The Vertex Shader is the only stage that can receive attribute data from the CPU; the Fragment Shader can't receive attribute data directly from the CPU.

If the Fragment Shader needs attribute data, it must be passed down from the Vertex Shader, as the sketch below shows.
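
Here is a minimal sketch in the Metal Shading Language of how a per-vertex attribute (a color) travels from the Vertex Shader down to the Fragment Shader. The struct, function names, and buffer indices are illustrative:

#include <metal_stdlib>
using namespace metal;

//the vertex shader's output, which the fragment shader receives as input
struct VertexOut{
    float4 position [[position]]; //clip-space position consumed by the rasterizer
    float4 color;                 //attribute data handed down to the fragment shader
};

vertex VertexOut passThroughVertexShader(const device float4 *positions [[buffer(0)]],
                                         const device float4 *colors [[buffer(1)]],
                                         uint vid [[vertex_id]]){
    VertexOut out;
    out.position=positions[vid];
    out.color=colors[vid]; //pass the per-vertex attribute along
    return out;
}

fragment float4 passThroughFragmentShader(VertexOut in [[stage_in]]){
    //the rasterizer interpolates the vertex colors across the primitive
    return in.color;
}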

Uniforms

A Vertex Shader deals with both constant and non-constant data. An Attribute is data that changes per vertex and is thus non-constant. Uniforms are data that remain constant during a render pass.

For example, if your mesh has six vertices, the Vertex Shader will process six different attribute values during the render pass, yet it will see the same uniform data throughout.

Transformation matrices are usually sent as uniform data to the GPU.

Unlike attributes, uniform data can be received by both shaders, as the sketch below illustrates.
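
As a sketch, here is how a transformation matrix might be received as uniform data in the Metal Shading Language; the buffer index and names are illustrative:

#include <metal_stdlib>
using namespace metal;

struct Uniforms{
    float4x4 mvpMatrix; //the same value for every vertex in the render pass
};

vertex float4 transformVertexShader(const device float4 *vertices [[buffer(0)]],
                                    constant Uniforms &uniforms [[buffer(1)]],
                                    uint vid [[vertex_id]]){
    //every vertex is transformed by the same (uniform) matrix
    return uniforms.mvpMatrix*vertices[vid];
}

On the CPU side, you would copy the matrix into an MTLBuffer and bind it with setVertexBuffer:offset:atIndex: at the matching index.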

Framebuffer

The destination of a rendering pipeline is a Framebuffer. A framebuffer contains several attachments, such as Color, Depth, and Stencil attachments. However, a framebuffer can display the rendered content on a screen ONLY if a 2D array of memory has been allocated and attached to one of its attachments. This 2D array of memory is known as a Texture image.

[Image: the framebuffer and its attachments]

In Metal, you must ask the system for the next available texture which can be attached to the framebuffer.

Now that you know the basics of computer graphics, Metal or OpenGL will be a lot easier to understand. Metal and OpenGL are simply APIs that allow you to communicate with the GPU.

The next article will get you started with Metal.

Hope this helps.

Getting Started with Metal API

One of my goals for 2017 is to become an expert in Apple's new graphics API, known as Metal. Thus, I started learning Metal and thought it would be nice to share with you what I have learned.

Prerequisite: Before using Metal: Computer Graphics Basics

Objects in Metal

Unlike OpenGL, Metal treats most rendering components as objects. For example, Metal represents a rendering pipeline as a Pipeline object. The shaders, known as functions in Metal, are encapsulated in Library objects. Vertex data is encapsulated in Buffer objects.

Metal requires a set of objects to be created before rendering can occur. The primary objects in Metal are:

  • Device Object
  • Command Queue Object
  • Library/Function Objects
  • Pipeline Objects
  • Buffer Objects
  • Render Pass Descriptor Object
  • Command Buffer Object
  • Render Command Encoder Object

Metal Rendering Process

The Metal Rendering Process consists of the initialization of these objects, which are created once and last for the lifetime of the application:

  • Device Object
  • Command Queue Object
  • Library/Function Objects
  • Pipeline Objects
  • Buffer Objects

And the creation of these objects during each render pass:

  • Render Pass Descriptor Object
  • Command Buffer Object
  • Render Command Encoder Object

The Metal rendering process consists of the following steps:

Initializing Objects

  1. Create a device object, i.e. a GPU
  2. Create a Command Queue object
  3. Create a CAMetalLayer
  4. Create a Library object
  5. Create a Pipeline object
  6. Create buffer objects

Render-Pass

  1. Retrieve the next drawable layer
  2. Create a Render Pass Descriptor object
  3. Create a Command Buffer object
  4. Create a Render Command Encoder object
  5. Present the drawable
  6. Commit the Command Buffer

Initializing Metal Objects

Create a Metal Device

The Metal rendering process starts with the creation of an MTLDevice object. An MTLDevice represents an abstraction of the GPU. A device object is used to create other kinds of objects such as buffers, textures and function libraries.

Create a Command Queue Object

An MTLCommandQueue object is created from an MTLDevice. The Command Queue object provides a way to submit commands/instructions to the GPU.

Create a CAMetalLayer

Next, we must provide a destination texture for the rendering pipeline.

As explained in the computer graphics basics article, the destination of a rendering pipeline is a Framebuffer, which contains several attachments such as Color, Depth, and Stencil attachments. A framebuffer can display the rendered content on a screen ONLY if a texture has been attached to one of its attachments. The CAMetalLayer object provides the texture that is used as the rendering destination.

Create a Library and Function Objects

Next, we must create the Vertex and Fragment functions (Shaders) that will be used by the rendering pipeline. The Vertex and Fragment functions are MTLFunction objects and are created through an MTLLibrary object.

Create a Rendering Pipeline Object

Now it is time to create the rendering pipeline object.

An MTLRenderPipelineState object represents a rendering pipeline. However, unlike other objects, you do not create a pipeline object directly. Instead, you create it indirectly through an object called a Rendering Pipeline Descriptor.

The MTLRenderPipelineDescriptor describes the attributes of a render pipeline state. For example, it defines the Vertex and Fragment functions used by the pipeline, as well as the color attachment properties.

Create Buffer Objects

The next step is to load MTLBuffer objects with vertex data, i.e., vertices, normals, UV coordinates, etc.

The interaction between these objects is illustrated below:

[Image: the interaction between the initialization-stage objects]

Metal Render-Pass

Whereas the objects mentioned in "Initializing Metal Objects" are created once and last for the lifetime of your application, the objects created during the render pass are short-lived and recreated on every pass.

The steps in the rendering-pass stage are as follows:

Retrieve the next drawable layer

As mentioned above, a framebuffer requires a texture before it can display the rendering results to the screen. Thus, you must ask the CAMetalLayer object for the next available texture.

Create a Render Pass Descriptor object

Next, we must describe the various actions that must occur before and after the render pass. For example, you may want to clear the rendering texture to a particular color.

These actions are described through the MTLRenderPassDescriptor object. Moreover, the MTLRenderPassDescriptor object links the texture provided by the CAMetalLayer as the pipeline's destination texture.

Create a Command Buffer object & Encoder object

We then create an MTLCommandBuffer object. An MTLCommandBuffer object stores drawing commands and rendering pipeline states until the buffer is committed for execution by the GPU.

However, these drawing commands and rendering pipeline states must be encoded by an MTLRenderCommandEncoder object before they are stored into the MTLCommandBuffer object. Essentially, the MTLRenderCommandEncoder translates the commands into a format the GPU can understand.

Present the drawable layer

I mentioned previously that the CAMetalLayer provides a texture that serves as the rendering destination. With our commands encoded, we must inform the command buffer that it must present this texture to the screen once rendering is complete.

Commit the Command Buffer

Finally, the Command Buffer is committed and placed in the Command Queue, where it waits to be executed by the GPU.

The render-pass routine is illustrated below:

[Image: the render-pass routine]

In summary, the Metal rendering process consists of these 12 steps:

  1. Create a device object, i.e. a GPU
  2. Create a Command Queue object
  3. Create a CAMetalLayer
  4. Create a Library object
  5. Create a Pipeline object
  6. Create buffer objects
  7. Retrieve the next drawable layer
  8. Create a Render Pass Descriptor object
  9. Create a Command Buffer object
  10. Create a Render Command Encoder object
  11. Present the drawable
  12. Commit the Command Buffer

Your First Metal Application

Let's create a simple Metal application. We are going to render a simple red rectangle on the screen.

For your convenience, the project can be found here.

Download the project so you can follow along.

Note: The project is found in the "MetalBasics" git branch. The link should take you directly to that branch.

Open Xcode and create a new project. Select "Single View Application" as the template and give your project a name. Select "Objective-C" as the language.

Include the following Frameworks into your project through the "Build Phases" tab:

  • Metal
  • UIKit
  • QuartzCore

In the "ViewController.h" file, make sure to import the following headers:

#import <UIKit/UIKit.h>
#import <Metal/Metal.h>
#import <QuartzCore/CAMetalLayer.h>
#import <GLKit/GLKMath.h>

You are going to initialize Metal in the viewDidLoad method. We are going to follow the 12 Metal Rendering steps outlined above.

Step 1. Create a metal device:

mtlDevice=MTLCreateSystemDefaultDevice();

Step 2. Create a command queue object

mtlCommandQueue=[mtlDevice newCommandQueue];

Step 3. Create a CAMetalLayer

metalLayer=[CAMetalLayer layer];
metalLayer.device=mtlDevice;
metalLayer.pixelFormat=MTLPixelFormatBGRA8Unorm;
metalLayer.frame=self.view.bounds;
[self.view.layer addSublayer:metalLayer];

Step 4. Create a library object and function objects

//create a library object
id<MTLLibrary> mtlLibrary=[mtlDevice newDefaultLibrary];

//create a vertex and fragment function object
id<MTLFunction> vertexProgram=[mtlLibrary newFunctionWithName:@"vertexShader"]; 
id<MTLFunction> fragmentProgram=[mtlLibrary newFunctionWithName:@"fragmentShader"];

Step 5. Build the Rendering Pipeline

//build a Render Pipeline Descriptor Object
mtlRenderPipelineDescriptor=[[MTLRenderPipelineDescriptor alloc] init];

//assign the vertex and fragment functions to the descriptor
[mtlRenderPipelineDescriptor setVertexFunction:vertexProgram];
[mtlRenderPipelineDescriptor setFragmentFunction:fragmentProgram];

//specify the target-texture pixel format
mtlRenderPipelineDescriptor.colorAttachments[0].pixelFormat=MTLPixelFormatBGRA8Unorm;

//Build the Rendering Pipeline Object, capturing any compilation error
NSError *pipelineError=nil;
renderPipelineState=[mtlDevice newRenderPipelineStateWithDescriptor:mtlRenderPipelineDescriptor error:&pipelineError];

Step 6. Create Buffer objects and load data into them

We will use the following set of data as the vertices of our rectangle:

static float quadVertexData[] =
{
    0.5, -0.5, 0.0, 1.0,
    -0.5, -0.5, 0.0, 1.0,
    -0.5,  0.5, 0.0, 1.0,

    0.5,  0.5, 0.0, 1.0,
    0.5, -0.5, 0.0, 1.0,
    -0.5,  0.5, 0.0, 1.0
};

The vertices are loaded into the buffer object:

//load quadVertexData into the buffer
vertexBuffer=[mtlDevice newBufferWithBytes:quadVertexData length:sizeof(quadVertexData) options:MTLResourceOptionCPUCacheModeDefault];

At this point, the initialization of the Metal objects is complete. We now need a timer of sorts that calls a render-pass function on every frame. The best way to do this is with a CADisplayLink object.

//Set the display link object to call the renderScene method continuously
displayLink=[CADisplayLink displayLinkWithTarget:self selector:@selector(renderScene)];

[displayLink addToRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];

The system will call the method "renderScene" repeatedly once it is ready for rendering.

Create a method called "renderScene" and include the following:

Step 7. Get the next drawable texture

frameDrawable=[metalLayer nextDrawable];

Step 8. Create a Render Pass Descriptor object

//create a render pass descriptor
MTLRenderPassDescriptor *mtlRenderPassDescriptor =[MTLRenderPassDescriptor renderPassDescriptor];

//set the target texture for the rendering pipeline
mtlRenderPassDescriptor.colorAttachments[0].texture=frameDrawable.texture;

//set the attachment's load and store actions, i.e., clear the texture before each render pass
mtlRenderPassDescriptor.colorAttachments[0].loadAction=MTLLoadActionClear;
mtlRenderPassDescriptor.colorAttachments[0].clearColor=MTLClearColorMake(1.0, 1.0, 1.0, 1.0); 
mtlRenderPassDescriptor.colorAttachments[0].storeAction=MTLStoreActionStore;

Step 9. Create a Command Buffer

id<MTLCommandBuffer> mtlCommandBuffer=[mtlCommandQueue commandBuffer];

Step 10. Create a Command Encoder object

//create a command encoder
id<MTLRenderCommandEncoder> renderEncoder=[mtlCommandBuffer renderCommandEncoderWithDescriptor:mtlRenderPassDescriptor];

//Configure the encoder with the pipeline state
[renderEncoder setRenderPipelineState:renderPipelineState];

//set the vertex buffer object and the index for the data
[renderEncoder setVertexBuffer:vertexBuffer offset:0 atIndex:0];

//Set the draw command
[renderEncoder drawPrimitives:MTLPrimitiveTypeTriangle vertexStart:0 vertexCount:6];

//End encoding
[renderEncoder endEncoding];

Step 11. Present the drawable

[mtlCommandBuffer presentDrawable:frameDrawable];

Step 12. Commit the buffer

[mtlCommandBuffer commit];

The Metal API initialization and render-pass operations are complete. However, we need to set up our function shaders.

Setting up the Function Shaders

Go to File->New and create a new file. Select a "Metal File" and call it "MyShader."

I will not go into the details of how shaders work, but know this: a Vertex shader processes incoming geometrical data; a Fragment shader sets the color of the outgoing fragment.

In step 4, we created two function objects with the names "vertexShader" and "fragmentShader." We need to create two function shaders with the same names in the "MyShader" file.

#include <metal_stdlib>
using namespace metal;

//Vertex Function (Shader)
vertex float4 vertexShader(const device float4 *vertices [[buffer(0)]], uint vid [[vertex_id]]){

    //pass each vertex through unchanged
    return vertices[vid];

}

//Fragment Function (Shader)
fragment float4 fragmentShader(){

    //set the fragment color to red
    return float4(1.0,0.0,0.0,1.0);

}

The vertex function shader receives vertex data through the argument "vertices [[buffer(0)]]".

If you look at step 10, you told the render encoder to use the information in the vertexBuffer (which holds your rectangle vertices) and bind it to the buffer at index 0. The vertex function shader receives this information one vertex at a time.

The fragment function shader sets the color of each fragment to red.

And that is it. If you have an iPhone or an iPad, connect it to your Mac and run the project. You should see a red rectangle on the screen. Note that the iOS simulator does not support Metal yet.

The complete project can be found on my GitHub page.

Related Posts

  • Developing a Math Engine in C++: Implementing Quaternions. Matrices are used to rotate 3D objects, but they tend to be slow and consume too much memory. An alternative to matrices is quaternions. In this post, you will learn how to implement quaternions in the math engine using C++.

  • Developing a Math Engine in C++: Implementing Matrices. In this post, you will learn how to implement matrices in a game engine. Matrices are used to rotate, scale, and skew 3D objects.

  • Developing a Math Engine in C++: Implementing Vectors. A math engine is an API containing functions that allow 3D objects to translate and rotate. In this post, you will learn how to implement a Vector class in C++, which will be used to translate 3D objects across a screen.