The Players

As I’ve said in many of my other posts: when you learn a new technology, it helps to view it as if you’ve just taken on a managerial role at a new sports team. The first thing you need to figure out is who is on your team and what their roles are. When I started with Metal, I spent some time getting to know the rendering pipeline (potentially the most confusing, yet repetitive, part of working with it). In graphics, the players tend to feel really foreign because frameworks usually abstract all of this away. But with knowledge comes control, and working with Metal will enable you to draw literally anything your mind can conjure up.

In this article I’m going to try to simplify, in an approachable way, the Metal players you’ll be working with regularly. Before we get started, though, I want to address why I’m working with Metal in the first place.

Why Metal?

Some people will ask ‘why Metal?’ That’s a good question. Over time I’ve longed to get back to being more creative. Originally I was a jazz musician and had to change career; all that creativity is still there, though, and recently I’ve been amazed at some of the stuff people have come up with. I love design and I love the maths involved in working with graphics. The problem is that I don’t know any of it right now, and one of my 2019 goals was to finally address this and start working with graphics. In particular, I want to grasp how things work at a lower level, rather than just creating and dragging stuff around in Unity/SceneKit (which are insanely good tools), and I’m particularly fond of the idea of writing my own shaders.

With that out of the way, let’s take a look at the players you’ll be working with.

Device

When you want to render stuff using Metal, a great place to start is with an MTKView. It’s similar to a UIView but handles a lot of the Metal API for you. However, to render anything you need access to a ‘device’ (MTLDevice), which is essentially your handle to the GPU. This is why you can’t run Metal in an iOS playground and you’ll have to select ‘macOS’ instead (something that made me wince at first haha). Or you can simply run it on your physical device.
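Here’s a minimal sketch of that setup (the frame size is just an arbitrary example of mine):

```swift
import MetalKit

// A minimal sketch: grab the system's default device (the GPU)
// and hand it to an MTKView that will do the drawing.
guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("This machine doesn't support Metal")
}

let metalView = MTKView(frame: CGRect(x: 0, y: 0, width: 400, height: 400),
                        device: device)
metalView.clearColor = MTLClearColor(red: 0, green: 0, blue: 0, alpha: 1)
```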

Buffers

This one confused me at first, but thanks to a fantastic Ray Wenderlich tutorial by Caroline Begbie (whom I just love), it was quite simple to grasp. If you wanted to work with a model of a black hole, for example, you would store all of its vertices in memory in these things called ‘buffers’ (MTLBuffer). This makes sense when you consider that the GPU accesses memory differently from the pointers we’re used to working with on the CPU side, but just think of buffers as the places where you store the data you’ll be manipulating.
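As a rough sketch, assuming the `device` from the previous step, copying a few vertices into a buffer might look like this (the triangle data is just made-up example data):

```swift
import MetalKit

// A minimal sketch: three triangle vertices copied into GPU-accessible memory.
let vertices: [SIMD3<Float>] = [
    SIMD3( 0.0,  0.5, 0.0),
    SIMD3(-0.5, -0.5, 0.0),
    SIMD3( 0.5, -0.5, 0.0)
]

let vertexBuffer = device.makeBuffer(
    bytes: vertices,
    length: vertices.count * MemoryLayout<SIMD3<Float>>.stride,
    options: []
)
```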

Command Queue

Okay, so another tip for learning something completely new: we memorise things by attaching them to stuff we already know. This is known as ‘scaffolding’ in education. When I first saw ‘Command Queue’ I have to admit I thought it was pretty bizarre-sounding, to say the least. Naturally I just conjured up a mnemonic about a Star Trek commander sending commands to the Starship GPU. If you want to issue low-level rendering tasks to the GPU, you need a way of wrapping them into commands, and you pass those commands via the command queue (MTLCommandQueue). This is a one-time setup you would do in a view controller, for example: you pass it the device and it sets up the queue for you, ready to render any starship you want!
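As a sketch, assuming the `device` from earlier, that one-time setup is a single call:

```swift
// A minimal sketch: create the queue once (in viewDidLoad, say)
// and reuse the same queue for every frame you render.
guard let commandQueue = device.makeCommandQueue() else {
    fatalError("Could not create a command queue")
}
```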

Pipeline State

Every time we render, every single bit of optimisation is going to help us out, so caching whatever we can when break-neck speed is required really pays off. The pipeline state (MTLRenderPipelineState) captures those fixed, up-front decisions about how rendering will happen: how much depth will be taken into account (think of draw distance in games), the colour space and the pixel format we’re currently using.
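A rough sketch of building one, assuming a shader library containing functions I’ve called `vertex_main` and `fragment_main` (those names are my own, not Metal’s):

```swift
import MetalKit

// A minimal sketch: describe the pipeline once, then bake it into an
// immutable state object. Building the state is expensive, so it's cached.
let library = device.makeDefaultLibrary()

let descriptor = MTLRenderPipelineDescriptor()
descriptor.vertexFunction = library?.makeFunction(name: "vertex_main")     // assumed shader name
descriptor.fragmentFunction = library?.makeFunction(name: "fragment_main") // assumed shader name
descriptor.colorAttachments[0].pixelFormat = metalView.colorPixelFormat

let pipelineState = try! device.makeRenderPipelineState(descriptor: descriptor)
```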

Let’s Now Talk About Each Frame

Now that we have all of the above set up, let’s look at the tasks we dispatch to the queue each frame, with the hope of hitting a minimum of 60 frames per second. It’s totally amazing that our devices can do this!

Encoders

Whenever you hear the term ‘encoder’ you probably think of video editing (I had to do tonnes of it for my recent Udemy course haha), however in Metal each rendering task is recorded by an encoder (MTLRenderCommandEncoder, to be precise). An encoder task points at the vertices stored in the buffers and at the vertex and fragment shaders that decide what value each pixel should have, and is then ‘committed’ to the GPU for processing.
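As a sketch, the per-frame encoding might look like this inside the view’s draw callback, assuming the pipeline state and vertex buffer from earlier and a `commandBuffer` (which the next section covers):

```swift
// A minimal sketch of one encoder task, assuming `commandBuffer` (next
// section), plus the view, pipeline state and vertex buffer from earlier.
guard let passDescriptor = metalView.currentRenderPassDescriptor,
      let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: passDescriptor)
else { return }

encoder.setRenderPipelineState(pipelineState)
encoder.setVertexBuffer(vertexBuffer, offset: 0, index: 0)
encoder.drawPrimitives(type: .triangle, vertexStart: 0, vertexCount: 3)
encoder.endEncoding()
```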

Final Step

So we have our encoder tasks, we have the command queue, but how do we actually send a command into the queue? Well, we do this using a special type called a command buffer (MTLCommandBuffer). The command buffer accepts the encoder tasks and dispatches them straight to the GPU via the command queue.
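A sketch of that per-frame step, assuming the `commandQueue` created earlier:

```swift
// A minimal sketch: each frame, ask the queue for a fresh command buffer,
// then record the encoder work shown above into it.
guard let commandBuffer = commandQueue.makeCommandBuffer() else { return }
// ... create the render command encoder from `commandBuffer` here ...
```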

Commit!

Awesome! You’ve just understood the basics of the Metal rendering pipeline. All you would do now is commit the command buffer, the work gets rendered into a view of some sort, and you can reach in and present the view’s drawable. Again, this was a simple overview of the main ‘players’ you’ll be working with and hopefully you found it helpful.
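As a final sketch, assuming the `metalView` and `commandBuffer` from the earlier snippets:

```swift
// A minimal sketch of the final step: present the view's drawable
// and commit the command buffer so the GPU starts the work.
if let drawable = metalView.currentDrawable {
    commandBuffer.present(drawable)
}
commandBuffer.commit()
```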