This installation was developed for the Royal Australian Air Force and displayed at the Australian International Airshow – Avalon 2019.
The project consisted of two applications developed in Unity: an Android app running on 16 Samsung tablets, and a PC app running on a high-spec machine driving a 15-metre dome with 8 projectors and speakers.
Some of the technology used for this installation included Omnity for the projection output, KlakNDI for the NDI feed required by the projectionists, UNET for communication between the apps, Curvy Splines for plane paths, and the Post Processing Stack for visual effects. To keep things G-rated, Bad Word Filter PRO was used, and Zenject glued it all together, with its memory pooling handling the spawning/despawning of planes.
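Zenject's memory pools follow the standard object-pool pattern: instead of allocating and destroying a plane on every spawn, despawned instances are kept aside and reused. A minimal, language-neutral sketch of that pattern in Python (the class and method names here are illustrative, not Zenject's actual API):

```python
class Plane:
    """Stand-in for a pooled game object."""
    def __init__(self):
        self.active = False

class PlanePool:
    """Reuses Plane instances instead of allocating on every spawn."""
    def __init__(self):
        self._free = []  # despawned instances waiting for reuse

    def spawn(self):
        # Take a recycled instance if one exists, otherwise allocate.
        plane = self._free.pop() if self._free else Plane()
        plane.active = True
        return plane

    def despawn(self, plane):
        # Deactivate and park the instance for the next spawn.
        plane.active = False
        self._free.append(plane)

pool = PlanePool()
a = pool.spawn()
pool.despawn(a)
b = pool.spawn()
print(a is b)  # → True: the despawned plane was reused, not reallocated
```

With dozens of planes appearing and disappearing continuously, this avoids the per-frame allocation and garbage-collection spikes that plain instantiate/destroy would cause.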
The PC had a 6th-gen i7, 32GB RAM, and an RTX 2080 Ti. The tablets were Samsung Galaxy Tab S2s. The NDI signal ran over a 10GbE card for the bandwidth required. The tablets used a PoE switch for power and connectivity so we did not have to rely on WiFi, which tends to be flaky at public shows like this, and constant connectivity was a must.
The projection app was also built for VR using SteamVR, for internal and client reviews.
Unity 2018.3 was used for the projection app; it was interesting to discover new features and the relocated settings, and it enabled the use of KlakNDI. The Android app used 2018.2.
The installation ran for 6 days on average for 12 hours a day.
Spin the 360 panorama below to see the site before the start of the day (at around 7am!)
Many were involved in the project, including 3D artists, animators, sound artists, producers, technical directors, and myself as the developer and onsite technical director. It was a massive project and a real team effort.
A constant flow of people kept the numbers in the dome anywhere from 10 to 50, except when the fighter jets were blasting past outside and shaking the dome, at which point everyone would rush out to catch the action.
This demo was constructed over a few days for use on an Oculus Go. This was my first app built for the Go, and I am very impressed with its resolution and comfort. Development-wise, I used Easy Input for Gear VR and Oculus Go. It was great to use and had me up and running with the controller very quickly.
“Curator SVV allows you to measure visual-vestibular bias in Virtual Reality. 10 tilted famous paintings are presented on a black background. With a remote gamepad, you will adjust the painting to straighten/level it, as you would do with a painting on a wall.”
This is the description given for the research app I developed for the OpenLab. There is an actual physical test used in hospitals that places people in a pitch-black environment and asks them to horizontally align an object using a remote control. It’s quite a setup and not portable at all.
This app aims to replicate the usage of the hospital setup using a standalone VR headset that can be used anywhere. Currently the app works on Gear VR and Oculus Go – you can find it here – Curator SVV
The results of the test are output to a CSV file; this includes, in degrees, the starting position of each painting, the final corrected position from the user, the user’s head position, the time taken for each painting, and a few other values.
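As an illustration, a small Python sketch of how such a CSV could be post-processed to get each trial's signed error and the mean bias across paintings (the column names and sample values here are assumptions for the example, not the app's actual export format):

```python
import csv
import io

# Hypothetical export: angles in degrees, one row per painting.
# A final angle of 0 would mean the user set the painting perfectly vertical.
sample = """painting,start_deg,final_deg,head_deg,seconds
1,12.0,1.5,0.4,8.2
2,-9.0,-2.0,-0.1,6.7
"""

errors = []
for row in csv.DictReader(io.StringIO(sample)):
    # Signed error: how far from true vertical the user left the painting.
    errors.append(float(row["final_deg"]))

mean_bias = sum(errors) / len(errors)
print(round(mean_bias, 2))  # → -0.25
```

Keeping the error signed (rather than absolute) preserves the direction of the bias, which is the quantity of interest in a subjective-visual-vertical test.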
The app has been slowly evolving since its original release. There are several “conditions” that can now be applied to the scene instead of it being all black: one is a sphere of dots rotating in either direction at various speeds; another surrounds the painting with a frame tilted at a positive or negative angle.
It’s very interesting watching people use the app. For example, with the rotating dots, people almost always start to lean in the direction of the rotation. Also, when the painting rotates in the same direction as the dots, there is a dot rotation speed that almost matches the painting’s rotation speed – people swear the painting is not rotating… until they realise it has!
Being the first app I’ve published to the Oculus Store, it was a learning experience as well. There is a lot of documentation online; you will find most of it on the official site – https://developer.oculus.com/ – but you will have to dig for it.
Working with Code on Canvas I was brought in to develop one of the components of the Westpac Innovation Cave.
The component I built was an Android app that was used to give information to visitors about the technology they were looking at. It retrieved text, images and video from a backend and was easily configurable for the installers.
In mid-2015, working alongside the Imagination Hong Kong office, I was part of the Imagination Sydney team of three developers building applications for the GE Innovation Centre that opened in Istanbul, Turkey.
The element of the project I was involved in was building an Android tablet app that would control other apps throughout the space, as well as educate the user about the space and collate information.
The app was built in Adobe AIR using the Starling framework. It also identified and unlocked different parts of the app depending on your location; this was done by placing beacons around the space and having the tablet react to them.
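The beacon logic essentially maps a detected beacon ID to a zone of the space and unlocks that zone's content when the tablet is close enough. A minimal sketch of that idea in Python (the beacon IDs, zone names, and signal threshold are invented for illustration; the real app was written in ActionScript):

```python
# Hypothetical mapping from beacon IDs to areas of the Innovation Centre.
ZONES = {"beacon-01": "energy", "beacon-02": "healthcare"}

# Only unlock when the signal is strong enough, i.e. the tablet is nearby.
RSSI_THRESHOLD = -70  # dBm; illustrative value

def zones_to_unlock(sightings, unlocked):
    """sightings: list of (beacon_id, rssi). Returns newly unlocked zones
    and records them in the `unlocked` set so they only fire once."""
    new = []
    for beacon_id, rssi in sightings:
        zone = ZONES.get(beacon_id)
        if zone and rssi >= RSSI_THRESHOLD and zone not in unlocked:
            unlocked.add(zone)
            new.append(zone)
    return new

seen = set()
# Close to beacon-01, far from beacon-02: only "energy" unlocks.
print(zones_to_unlock([("beacon-01", -55), ("beacon-02", -90)], seen))  # → ['energy']
```

Tracking already-unlocked zones in a set means walking past the same beacon repeatedly does not re-trigger the prompt.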
Here is a video of highlights from the opening from GE.
A permanent space in the Sydney CommBank office for Innovation to take place
This was a big project that involved many people. I joined for the development phase, though work had been going on for months before; I’ll just mention the parts that I was exposed to and worked on.
My main role on this project was developing a tablet app that visitors to the space would be given to teach them about the space and also to be used as a tool for parts of the space. The app was developed in Adobe AIR using the Starling Framework and Robotlegs 2 to string it all together. It also used Beacon technology to detect where a user was in the space and prompt them with information.
Some of the other interactives in the space included an Oculus Rift experience built in Unity to immerse the user in customer and merchant experiences, and projected applications built in Adobe AIR – one using Leap Motion technology to inform about new CommBank products and the other accessing “big data” to display findings from millions of user transactions. Another AIR application was also developed for an interactive whiteboard that could be controlled via the tablets.
Here is a short video showing the above interactives and the transformation of the site.
It was a great project, working with some extremely talented and hard-working people who pulled it off.