The most mental job I’ve ever been involved with. We strapped a PlayStation Move to a camera rigged inside the MK-V AR rig, sucked all the tracking data out of it to give position, orientation and speed, and rendered out the projection mapping in a room to the perspective of wherever the camera was. Essentially like those funny warped adverts on the cricket field that look perfectly straight when you see them from the main camera, except this is in 3D, and moving, and amazing.
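The trick behind those warped cricket adverts is the same one driving this rig: draw each virtual point where the viewer’s sight line hits the wall, and from that one viewpoint the flat image reads as solid 3D. A minimal sketch of that ray–plane intersection (my own illustration in Python/NumPy, not the actual production code):

```python
import numpy as np

def project_to_wall(eye, point, wall_point, wall_normal):
    """Where must a virtual point be painted on a flat wall so it
    looks correct from the eye position?

    Intersect the ray from `eye` through `point` with the wall plane
    (defined by any point on it and its normal). Drawing the point at
    that intersection makes it line up perfectly from the eye."""
    d = point - eye                       # direction of the sight line
    denom = np.dot(wall_normal, d)
    if abs(denom) < 1e-9:
        raise ValueError("sight line is parallel to the wall")
    t = np.dot(wall_normal, wall_point - eye) / denom
    return eye + t * d

# Example: viewer at the origin looking at a virtual point floating
# 2 m away; the wall is the plane z = 4.
eye = np.array([0.0, 0.0, 0.0])
virtual_point = np.array([1.0, 0.0, 2.0])
painted = project_to_wall(eye, virtual_point,
                          wall_point=np.array([0.0, 0.0, 4.0]),
                          wall_normal=np.array([0.0, 0.0, 1.0]))
```

Move the eye and the painted position has to move too, which is exactly why the Move tracking data matters: the render is only correct for one viewpoint at a time.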
Shot these three PlayStation virals with Marshmallow Laser Feast (Dir: Memo Akten, Barney Steel and Robin McNicholas) for Ian Hambleton at Studio Output. This was months of testing, head scratching and ranting ideas to make the final three films. All finally shot in single takes each.
Watch them. Remember there is no post VFX. Everything happens in-camera from the images projected onto white walls. Occasionally there are glitches, errors and mistakes, but it’s all real.
The AR rig was awesome for this, allowing single-take shots to fly anywhere in the room from 1 ft off the ground to 9 ft in the air. Every detail of the shoot was pushed to the extreme, with as much that could go wrong as possible: we used pyrotechnics, puppetry and interactive lights on projectors to deliver the mental. Basically a massively ambitious act of lunacy I am proud to have been a part of.
Projection mapping an environment from the correct perspective of a camera with a PlayStation Move strapped to it. I would love to try and explain it all more… I’ll have a go.

Imagine a virtual PlayStation Move that can see this mad environment inside a 3D package. It knows precisely where it is. In a second 3D package, another virtual Move, whose movements are precisely linked to the first, projects what it sees onto the three white walls in that 3D environment. Inside this second virtual environment there are 5 cameras, each calibrated to exactly match one of the 5 projectors in the real world: lenses, positions, everything. These 5 camera/projectors film the images from their viewpoints (warped, funny-looking, keystoned images) and then exactly those images are reprojected in the real world onto the real walls.

The thing is, from the exact position of the first virtual Move, the real walls now look precisely like the perspective environment in the first virtual world… and that’s exactly where the camera is! Because the camera has a real PlayStation Move strapped to it, and that feeds the data defining where in the world the first virtual Move is. Yes… and the camera can now move and all the perspectives remain perfect… it all looks 3D and real and insane and phew… was that a good explanation?
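The projector half of that pipeline is essentially a pinhole camera in reverse: each calibrated virtual camera maps a point on the wall to the projector pixel that must light it. A rough sketch of that mapping, assuming a simple pinhole model with made-up example intrinsics (fx, fy, cx, cy are illustrative values, not the real projector calibration):

```python
import numpy as np

def wall_to_projector_pixel(world_point, R, t, fx, fy, cx, cy):
    """Map a 3D wall point to the (u, v) pixel a projector must light.

    R, t     : the projector's pose (rotation matrix and translation),
               transforming world coordinates into the projector frame.
    fx..cy   : pinhole intrinsics (focal lengths and principal point).
    """
    p = R @ world_point + t               # wall point in projector frame
    if p[2] <= 0:
        raise ValueError("point is behind the projector")
    u = fx * p[0] / p[2] + cx             # perspective divide, then
    v = fy * p[1] / p[2] + cy             # shift to the principal point
    return np.array([u, v])

# Example: a projector at the origin facing straight down +z,
# lighting a wall point 2 m away and 1 m to the side.
uv = wall_to_projector_pixel(np.array([1.0, 0.0, 2.0]),
                             R=np.eye(3), t=np.zeros(3),
                             fx=800.0, fy=800.0, cx=640.0, cy=360.0)
```

Do this once per projector per frame and the five warped, keystoned images fall out automatically; the calibration step is what guarantees the real projectors and their virtual counterparts agree.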
Maybe this film below will help a little more.
Here are some camera tests we were doing early on as Memo worked out various ways to suck data out of the console and interpret it:
My mate Ceri Stokes did the music for this scratch reel.
Really the amazing stuff was not done by us but by teams of scientists and manufacturing geniuses at PlayStation, bringing technology the military would have marveled at 10 years ago into our hands at a minute fraction of the cost, weight and power consumption, to the point it becomes an amazing toy. Brilliant.
And here are a few photos:
Where do we go next with this technology? Using face recognition to identify where people are in a room and render the environment out for their perspective? I dunno… I’m still trying to get over the headache caused by pulling this job off!
Full credits are coming.