In The Walls is an iOS augmented reality app that lets you push your face out from behind real world walls. Here’s what that looks like:
- Get the App
- Reality shaders example project (Xcode project that shows the basic technique I use in this app)
The app uses real-time face tracking, so it really feels like it's your beautiful visage there in the wall. Smile or yawn and your wall face smiles or yawns back. Move your face closer to your phone to push your face out from the wall, or farther away to sink back in. Being a wall has never been so exciting!
Place your face on any wall.
Scale the face up to epic sizes.
Easily record images and videos in the app.
Adjust lighting to get your walfie just right (hey, if Apple can try foisting slofie on us, I'm going all in on walfie).
For devices that don’t support real-time face tracking, the app uses an adjustable static model of a face instead.
In The Walls was inspired by a scene in the original Nightmare on Elm Street in which Freddy pushes his face and gloves out from behind the wall above a sleeping Nancy Thompson. In the film, the wall distorts as if it were made of a stretchy material. It’s a neat effect and I like the simplicity of the practical effect behind it: just a taut sheet of spandex and good lighting. It also got me wondering: could I create a similar effect in real life? Seemed like it just might be possible using the power of ~~imagination~~ augmented reality.
I first tried distorting a 3D model of a face, blending it into the wall by smoothly flattening its sides. The results, however, were pretty laughable, and the approach proved inflexible. There was no way it would support the sort of responsive face tracking I was looking for.
Next, I wasted a good deal of time trying to use a parallax map. The idea was to generate the parallax map from a scene containing the face geometry generated by ARKit. However, the effect still looked flat, and debugging the shaders was driving me crazy.
So I finally decided to try something really stupid: recreate the spandex sheet in software. And lo and behold, cloth simulation actually worked great! Who would have thought that spandex would be the answer for creating this “behind the wall” effect both on film and in software.
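To give a flavor of what “recreating the spandex sheet in software” can mean, here’s a minimal spring-mass cloth sketch in Swift: a grid of particles with distance constraints, the border pinned to the wall, and one point driven outward the way the tracked face would drive it. All names and parameters here are illustrative, not the app’s actual code.

```swift
// A minimal position-based cloth sketch (Verlet integration + constraint
// relaxation). Illustrative only; the real app's simulation is more involved.
struct Particle {
    var position: SIMD3<Float>
    var previous: SIMD3<Float>
    let pinned: Bool
}

struct Cloth {
    let cols: Int, rows: Int, spacing: Float
    var particles: [Particle] = []

    init(cols: Int, rows: Int, spacing: Float) {
        self.cols = cols; self.rows = rows; self.spacing = spacing
        for y in 0..<rows {
            for x in 0..<cols {
                let p = SIMD3<Float>(Float(x) * spacing, Float(y) * spacing, 0)
                // Pin the border so the sheet stays attached to the wall.
                let pinned = x == 0 || y == 0 || x == cols - 1 || y == rows - 1
                particles.append(Particle(position: p, previous: p, pinned: pinned))
            }
        }
    }

    // One step: damped inertia, a driven point, then constraint relaxation.
    mutating func step(push: SIMD3<Float>, at index: Int) {
        for i in particles.indices where !particles[i].pinned {
            let current = particles[i].position
            particles[i].position += (current - particles[i].previous) * 0.9
            particles[i].previous = current
        }
        particles[index].position = push  // the tracked face drives this point

        for _ in 0..<8 {  // relax horizontal and vertical springs
            for y in 0..<rows { for x in 0..<cols {
                if x + 1 < cols { relax(y * cols + x, y * cols + x + 1) }
                if y + 1 < rows { relax(y * cols + x, (y + 1) * cols + x) }
            } }
        }
    }

    // Move both endpoints halfway back toward the spring's rest length.
    private mutating func relax(_ a: Int, _ b: Int) {
        let delta = particles[b].position - particles[a].position
        let dist = max((delta * delta).sum().squareRoot(), 1e-6)
        let correction = delta * (0.5 * (dist - spacing) / dist)
        if !particles[a].pinned { particles[a].position += correction }
        if !particles[b].pinned { particles[b].position -= correction }
    }
}
```

Driving the center point out along z and stepping the simulation a few dozen times produces the characteristic bulge, with the deformation falling off smoothly toward the pinned edges.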
I’ve extracted a small example project that documents the basic AR techniques I use to make it look like the face is distorting real-world walls. The key is placing a virtual plane in the AR scene and texturing the plane with the real-world texture of the area it covers. You can then apply vertex and fragment shaders to the plane, which makes it look like you are applying those shaders to the real-world surface.
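A rough sketch of that setup in ARKit/SceneKit might look like the following: when a vertical plane is detected, cover it with an `SCNPlane`, give it an approximation of the real-world texture, and displace its vertices with a geometry shader modifier. This is my own simplified illustration, not the example project’s code; in particular, using a view snapshot as the texture is a crude stand-in for properly reprojecting the camera image into the plane’s UV space.

```swift
import ARKit
import SceneKit

// Illustrative sketch: cover a detected vertical plane with a textured
// SCNPlane and bulge it with a shader modifier. Not the actual project code.
final class WallShaderDelegate: NSObject, ARSCNViewDelegate {
    weak var sceneView: ARSCNView?

    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor,
              planeAnchor.alignment == .vertical,
              let view = sceneView else { return }

        let plane = SCNPlane(width: CGFloat(planeAnchor.extent.x),
                             height: CGFloat(planeAnchor.extent.z))

        // Crude "real world texture": a snapshot of the current camera view.
        // (Snapshot on the main thread; a real implementation would reproject
        // the captured camera image into the plane's texture coordinates.)
        DispatchQueue.main.async {
            plane.firstMaterial?.diffuse.contents = view.snapshot()
        }

        // Geometry shader modifier: push vertices near the center outward.
        // _geometry and u_time are standard SceneKit shader-modifier symbols.
        plane.firstMaterial?.shaderModifiers = [
            .geometry: """
            float2 uv = _geometry.texcoords[0] - 0.5;
            float bulge = max(0.0, 0.25 - dot(uv, uv)) * (1.0 + sin(u_time));
            _geometry.position.z += bulge * 0.1;
            """
        ]

        let planeNode = SCNNode(geometry: plane)
        // SCNPlane stands upright in local space; rotate it flat onto the anchor.
        planeNode.eulerAngles.x = -.pi / 2
        node.addChildNode(planeNode)
    }
}
```

Because the plane carries a copy of what the wall looks like, any distortion applied in the shader reads as a distortion of the wall itself.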
I worked with a freelancer to take my prototype and get it into a shippable state. Vlad L. over on Upwork did an amazing job building a UI, polishing the AR experience, designing the app’s icon and identity, and creating a 3D explainer for it. If you are looking for an iOS developer, I highly recommend him!
As for what’s coming up for In The Walls? Well, we’ll see. If there’s enough interest in the app, I have a few ideas about extending it. I also want to explore ‘reality shaders’ further.
But until then, give In The Walls a try. Let me know if you have any feature requests too!