Lykta requires accurate positioning indoors, and the road to a functional solution has included a couple of (educational) dead ends along the way. With the horror experience starting to come together, I think it's time to talk a bit about the challenges we've been facing.

The Move.Me server software debug screen (i.e. what the PS3 that’s tracking the controllers sees). Controllers are visualized as swords.

Many location-aware apps use GPS, but at best this gives accuracy down to a couple of meters outdoors, and it is even less precise indoors. Our current solution is based heavily on Sony's PlayStation Move system and their Move.Me API, which provides sub-centimeter positioning accuracy over an area of about 3.5×4 meters, as well as very accurate orientation tracking. However, it is a rather closed system: all calculations on the data from the controllers and tracking camera are done on an actual PS3 running special server software, with the position and orientation data sent over the network to a computer. The picture below explains our current setup.

A PS3 receives sensor information from the wireless controller attached to the handheld device, and also tracks the position of the glowing sphere through the PS Move camera. The orientation and position of the controllers are computed from this information, and sent over the network to the smartphone in the handheld device.
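On the smartphone side, this boils down to listening for pose updates from the PS3. The real Move.Me wire protocol is proprietary, so the port number and packet layout below (seven little-endian floats: a position in millimeters plus an orientation quaternion) are purely illustrative assumptions, but they show the general shape of such a receiver:

```python
import socket
import struct

# Hypothetical pose packet: x, y, z position followed by a
# quaternion (qw, qx, qy, qz), all little-endian 32-bit floats.
# This layout is an assumption for illustration, not Sony's format.
POSE_FORMAT = "<7f"
POSE_SIZE = struct.calcsize(POSE_FORMAT)

def receive_poses(port=7899):
    """Yield (position, quaternion) tuples from incoming UDP packets."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    while True:
        data, _addr = sock.recvfrom(1024)
        if len(data) >= POSE_SIZE:
            x, y, z, qw, qx, qy, qz = struct.unpack_from(POSE_FORMAT, data)
            yield (x, y, z), (qw, qx, qy, qz)
```

In a game engine this loop would run on a background thread, with the latest pose handed to the render thread each frame.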

When we started the project we had heard about this cool thing called sensor fusion, where data from different sensors, e.g. the gyros, accelerometers and compass of a smartphone, is combined into a more accurate whole. We hoped we could use this to calculate the position and orientation of the phone. We managed to create an algorithm that could track orientation, but it was very sensitive to compass disturbances.
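A textbook way to fuse these sensors for heading is a complementary filter: integrate the gyro for short-term accuracy, and blend in the compass to cancel long-term drift. The sketch below is that generic filter, not our actual algorithm, but it illustrates the failure mode: any disturbance in the compass reading is slowly pulled into the estimate.

```python
import math

def complementary_yaw(yaw, gyro_rate, compass_yaw, dt, alpha=0.98):
    """One step of a simple complementary filter for heading (yaw).

    yaw          -- current estimate in radians
    gyro_rate    -- angular rate around the vertical axis (rad/s)
    compass_yaw  -- absolute heading from the magnetometer (radians)
    alpha        -- trust in the gyro; (1 - alpha) is the compass weight
    """
    # Short-term: integrate the gyro rate over the timestep.
    predicted = yaw + gyro_rate * dt
    # Long-term: nudge toward the compass along the shortest arc.
    error = math.atan2(math.sin(compass_yaw - predicted),
                       math.cos(compass_yaw - predicted))
    return predicted + (1.0 - alpha) * error
```

Run in a loop over sensor samples, the estimate drifts toward whatever the compass reports, which is exactly why magnetic disturbances indoors were such a problem.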

The data from the smartphone sensors was far from accurate enough for position tracking, however. Our first solution used a ceiling-mounted Wiimote as a wireless IR camera to track strong IR LEDs on the handheld unit, but it was hard to cover a wide enough area because the Wiimote camera has a narrow field of view.

The early Lykta prototype as part of the Invisible Showroom project

We had a closer look at the PS Move and realized it had great accuracy for positioning. A camera detects the position and size of the glowing ball on the controller, and from this the controller's position relative to the camera can be calculated. Sony's Move.Me API, which we ultimately chose to use, did not seem to be accessible outside the US, so we started off with the open source PS Move API by Thomas Perl. We wrote a Unity plugin for it and got to try the experimental orientation tracking capabilities. They worked, but at the time they suffered from drift that required constant recalibration, something that would be tricky to do in a real use context.
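The position calculation itself follows from the pinhole camera model: since the sphere's physical radius is known, its apparent radius in pixels gives the depth, and the pixel offset from the image center gives the sideways displacement. A minimal sketch, assuming a calibrated focal length and an approximate 22.5 mm sphere radius (illustrative numbers, not the ones Move.Me uses internally):

```python
def sphere_position(cx, cy, r_px, img_w, img_h, focal_px,
                    sphere_radius_mm=22.5):
    """Estimate the sphere's 3D position (mm) relative to the camera.

    cx, cy   -- detected center of the glowing ball in pixels
    r_px     -- apparent radius of the ball in pixels
    focal_px -- camera focal length in pixels (from calibration)

    Pinhole model: apparent size is inversely proportional to depth,
    so a known physical radius lets us recover the distance.
    """
    z = focal_px * sphere_radius_mm / r_px   # depth along the optical axis
    x = (cx - img_w / 2.0) * z / focal_px    # left/right offset
    y = (cy - img_h / 2.0) * z / focal_px    # up/down offset
    return x, y, z
```

For example, a sphere whose image radius equals its physical radius scaled by the focal length sits exactly one focal-length-equivalent away; as the ball moves farther, its image shrinks and the computed depth grows proportionally.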

Open source PS Move API tracking glowing sphere

When we finally inquired about the Move.Me software we were lucky to get in touch with just the right person at Sony, and got hold of the Move.Me server software surprisingly quickly. It surpassed our expectations: it continually calibrates orientation on the fly by comparing movement seen in the camera image with data from the sensors in the wand. An additional bonus is that position and orientation data are given relative to flat ground. However, one hard-coded limit that hits us hard is the maximum tracking range of 3.5 meters, which makes for a pretty small play area.