Camera Recording in Swift & PixelKit

First make sure to check out Getting started with PixelKit.

To access the camera we first need the user's permission. Add an NSCameraUsageDescription entry to the app's Info.plist, with a short message explaining why you need the camera.
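If you want to handle a denial yourself, you can also request access explicitly with plain AVFoundation before creating the camera pix (this is not part of PixelKit; CameraPIX will trigger the system prompt on its own otherwise):

```swift
import AVFoundation

// Ask for camera access up front; the system shows the prompt
// (with your NSCameraUsageDescription text) the first time.
AVCaptureDevice.requestAccess(for: .video) { granted in
    if !granted {
        print("Camera access denied")
    }
}
```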

Then to get the camera feed going add this to your viewDidLoad:

let camera = CameraPIX()

Now create a rec pix. Add the following code:

let rec = RecordPIX()
rec.input = camera

To start recording, hook the following up to a button:

do {
    try rec.startRec()
} catch {
    print("rec failed:", error)
}
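A minimal way to wire this up, assuming rec is stored as a property on the view controller (the button title, frame and method name here are just illustrative):

```swift
// In viewDidLoad: a plain UIButton wired up via target-action.
let recordButton = UIButton(type: .system)
recordButton.setTitle("Record", for: .normal)
recordButton.frame = CGRect(x: 20, y: 40, width: 120, height: 44)
recordButton.addTarget(self, action: #selector(startRecording), for: .touchUpInside)
view.addSubview(recordButton)

// Elsewhere in the view controller:
@objc func startRecording() {
    do {
        try rec.startRec()
    } catch {
        print("rec failed:", error)
    }
}
```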

Then stop the recording and share:

rec.stopRec { url in
    let activity = UIActivityViewController(activityItems: [url],
                                            applicationActivities: nil)
    self.present(activity, animated: true)
}

If you’d like an image instead, use the following to get a UIImage:

let image = camera.renderedImage
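From there you could, for example, save the snapshot to the photo library (this needs an NSPhotoLibraryAddUsageDescription entry in Info.plist; whether renderedImage is optional may depend on the PixelKit version):

```swift
if let image = camera.renderedImage {
    // The last three arguments are an optional completion
    // target, selector, and context pointer.
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
}
```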

Particles in VertexKit & PixelKit

We’ll be creating a noise particle cloud. This tutorial applies to iOS & macOS. VertexKit is an extension of PixelKit. Get started with PixelKit in this tutorial.

Add the VertexKit extension framework as a pod:

pod 'VertexKit'

Now you can import the frameworks:

import LiveValues
import RenderKit
import PixelKit
import VertexKit

In the view controller, add a black background:

// macOS
view.wantsLayer = true
view.layer!.backgroundColor = .black
// iOS
view.backgroundColor = .black

We’ll need high-precision colors, so up the bits to 16:

PixelKit.main.render.bits = ._16

Each particle (vertex) will be represented by a pixel.

Define the particle count resolution:

let pres: PIX.Res = .square(Int(sqrt(1_000_000)))

One million particles in a thousand-by-thousand grid.

First we create the source particle pixels with a noise pix:

let noise = NoisePIX(res: pres)
noise.colored = true
noise.octaves = 5
noise.zPosition = .live * 0.1

The .live value is a continuously counting value in seconds (frames / fps).

Prepare the res for the final render:

// macOS
let res: Resolution = .cgSize(view.bounds.size) * 2
// iOS
let res: Resolution = .fullscreen

Now let’s create the particle system:

let particles = ParticlesUV3DPIX(res: res)
particles.vtxPixIn = noise - 0.5
particles.color = LiveColor(lum: 1.0, a: 0.1)

Finally we create the final pix and add it to the view:

let final: PIX = particles
final.view.frame = view.bounds
final.view.checker = false
view.addSubview(final.view)

Run the app and you should see the particle system!

If you want a resizable window on macOS make the final pix and particle pix global, then add this:

override func viewWillLayout() {
    final.view.frame = view.bounds
    particles.resolution = .cgSize(view.bounds.size) * 2
}

That’s it!

Green Screen in Swift & PixelKit

In this tutorial you can use a live camera feed or just a video file, either works.

Setup your project with PixelKit in this tutorial.

First add the camera:

let content = CameraPIX()

or your video:

let content = VideoPIX()
content.load(fileNamed: "superman", withExtension: "mov")

Then we need a background image:

let image = ImagePIX()
image.image = UIImage(named: "city")

Remember to add your photo to the Assets in Xcode.

Now let’s key the green screen away:

let key = ChromaKeyPIX()
key.input = content
key.keyColor = .green

Then finally we blend the content and the image together:

let blend = BlendPIX()
blend.blendingMode = .over
blend.inputA = image
blend.inputB = key

To view the final result:

let final: PIX = blend
final.view.frame = view.bounds
view.addSubview(final.view)

That’s it, now you can chroma key!

HDR Exposure stacking with Pixels

16bit HDR image from an iPad

I’ve been working on a framework for the past year. It’s a realtime graphics framework called Pixels. It’s built in Swift & Metal and runs on iOS & macOS.

With support for 16 bit image processing I decided to test out exposure stacking. By combining several 8 bit images at different exposures we get a 16 bit image with high dynamic range.
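The core idea can be sketched in plain Swift: normalize each 8 bit sample by its relative exposure, then average. (This is only the per-pixel arithmetic; the real pipeline in Pixels runs on the GPU, and the function below is purely illustrative.)

```swift
// Average several 8-bit samples of the same scene point, each taken at a
// different relative exposure, into one higher-precision linear value.
func stacked(samples: [UInt8], exposures: [Double]) -> Double {
    let linear = zip(samples, exposures).map { sample, exposure in
        (Double(sample) / 255.0) / exposure
    }
    return linear.reduce(0, +) / Double(linear.count)
}

// Two consistent samples: 128 at 1x and 255 at 2x both map to ~0.5.
let value = stacked(samples: [128, 255], exposures: [1.0, 2.0])
```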

I’m modifying the exposure duration and ISO of the iPad camera to change the amount of light being captured. The process takes around 10 seconds for 12 images.
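Setting exposure duration and ISO manually is plain AVFoundation; here is a sketch (the specific duration and ISO values are arbitrary, and how you reach the capture device from Pixels depends on the version):

```swift
import AVFoundation

if let device = AVCaptureDevice.default(for: .video) {
    do {
        try device.lockForConfiguration()
        // 1/100 s shutter at ISO 100; step these values to bracket exposures.
        device.setExposureModeCustom(duration: CMTime(value: 1, timescale: 100),
                                     iso: 100,
                                     completionHandler: nil)
        device.unlockForConfiguration()
    } catch {
        print("exposure config failed:", error)
    }
}
```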

Here’s the result:

Xcode project on GitHub.

Getting started with Metal in PixelKit

Start by reading Getting started with PixelKit to get the PixelKit framework up and running.

Here’s the basic setup for custom Metal shader code:

let metal = MetalPIX(at: .fullscreen, uniforms: [], code:
    """
    pix = float4(u, v, 0.0, 1.0);
    """
)

let final: PIX = metal
final.view.frame = view.bounds
view.addSubview(final.view)

The code above displays a UV map. The float4 represents red, green, blue & alpha.


Now let’s add a live uniform value:

let uniforms: [MetalUniform] = [
    MetalUniform(name: "t", value: .touch),
]
let metal = MetalPIX(at: .fullscreen, uniforms: uniforms, code:
    """
    pix = float4(u, v, in.t, 1.0);
    """
)

Don’t forget to add the uniforms to the MetalPIX.


Now let’s add a custom live uniform value:

let date = Date()
let uniforms: [MetalUniform] = [
    MetalUniform(name: "t", value: .touch),
    MetalUniform(name: "d", value: LiveFloat({ () -> (CGFloat) in
        return CGFloat(-date.timeIntervalSinceNow)
    })),
]
let metal = MetalPIX(at: .fullscreen, uniforms: uniforms, code:
    """
    pix = float4(u, v, in.t, cos(in.d) / 2 + 0.5);
    """
)

Here’s another setup, with touch position:

let uniforms: [MetalUniform] = [
    MetalUniform(name: "x", value: .touchX),
    MetalUniform(name: "y", value: .touchY)
]
let metal = MetalPIX(at: .fullscreen, uniforms: uniforms, code:
    """
    float x = u - in.x - 0.5;
    float y = v - in.y - 0.5;
    float a = abs(x) * abs(y);
    float c = pow(1.0 - a, 10.0);
    pix = float4(c, c, c, 1.0);
    """
)

So far we’ve been creating textures, now it’s time to merge textures.

Here’s a way to merge PIXs:

let circle = CirclePIX(at: .fullscreen)
let polygon = PolygonPIX(at: .fullscreen)
let metalMerger = MetalMergerEffectPIX(uniforms: [], code:
    """
    pix = float4(0.0, inputA[0], inputB[0], 1.0);
    """
)
metalMerger.inputA = circle
metalMerger.inputB = polygon

Learn more on GitHub.

Getting started with PixelKit

PixelKit is a graphics framework for iOS and macOS that’s written in Swift & Metal. Though you only need to know Swift to get started.

First create a new Xcode project (iOS Single View App).

Then we use CocoaPods to install PixelKit from the terminal.

sudo gem install cocoapods
cd ~/Documents/.../Project
pod init

This will create a “Podfile”. Add the following to it:

pod 'PixelKit'

Then run this in the terminal:

pod install

Now you can import PixelKit and add the following code to your viewDidLoad to create a CirclePIX:

let circle = CirclePIX(at: .fullscreen)

let final: PIX = circle
final.view.frame = view.bounds
view.addSubview(final.view)

Then run the app and you should see a circle on your screen.


Now let’s blend in an image.

First add the image to your Assets.xcassets. Then load it like this with an ImagePIX:

let image = ImagePIX()
image.image = UIImage(named: "image")

Then blend the circle and image together with a BlendPIX:

let blend = BlendPIX()
blend.inputA = circle
blend.inputB = image
blend.blendingMode = .multiply

Finally point final to blend and run the app.


Now let’s blur the blend.

Start by making the background color of the circle clear:

circle.bgColor = .clear

Then create a BlurPIX and add in the image:

let blur = BlurPIX()
blur.input = image
blur.radius = 0.5

Then blend the textures like this with a new BlendPIX:

let blendOver = BlendPIX()
blendOver.inputA = blur
blendOver.inputB = blend
blendOver.blendingMode = .over

Finally we can add a LevelsPIX to our texture:

let levels = LevelsPIX()
levels.input = blendOver
levels.brightness = 1.25
levels.gamma = 0.75

The final code should look like this:

let circle = CirclePIX(at: .fullscreen)
circle.bgColor = .clear

let image = ImagePIX()
image.image = UIImage(named: "image")

let blend = BlendPIX()
blend.inputA = circle
blend.inputB = image
blend.blendingMode = .multiply

let blur = BlurPIX()
blur.input = image
blur.radius = 0.5

let blendOver = BlendPIX()
blendOver.inputA = blur
blendOver.inputB = blend
blendOver.blendingMode = .over

let levels = LevelsPIX()
levels.input = blendOver
levels.brightness = 1.25
levels.gamma = 0.75

let final: PIX = levels
final.view.frame = view.bounds
view.addSubview(final.view)

You can also achieve the same result in a more minimal way:

((image._blur(0.5) & (circle * image)) * 1.25) !** 0.75

You can find more effects in the Effect Docs.