I have been the test lead for Cydalion, a product that utilises Tango technology to assist people with visual impairments. These Android devices combine point cloud data from a 3D depth camera with the standard camera, allowing the device to give spoken feedback to users: "there are stairs in front of you", "there is a head-height object three feet away", "there is a trip hazard", and so on. This has the potential to significantly reduce the mobility limitations faced by people who are blind.
Testing this app has been a challenge, since there is very little information available on how to test this new technology. We have utilised automated unit and UI testing, and recruited a cadre of blind and visually impaired people as manual testers. I have also had to get creative in motivating developers and product owners to truly empathise with the target end user, so that we develop for them rather than from our sighted biases.
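To illustrate the automated unit-testing side, logic like hazard classification can be tested in isolation from the camera hardware by feeding it synthetic obstacle measurements. The function name, thresholds, and labels below are a hypothetical sketch for illustration, not Cydalion's actual code:

```python
def classify_hazard(distance_m, height_m):
    """Classify a detected obstacle from its distance and its height
    above the ground plane (both in metres).

    Hypothetical thresholds: objects at or above 1.5 m are treated as
    head-height hazards; objects at or below 0.3 m as trip hazards.
    """
    if height_m >= 1.5:
        return "head-height object"
    if height_m <= 0.3:
        return "trip hazard"
    return "obstacle"


# Unit tests exercise the classification logic with synthetic inputs,
# so no device, camera, or point cloud is needed.
assert classify_hazard(0.9, 1.6) == "head-height object"
assert classify_hazard(1.2, 0.1) == "trip hazard"
assert classify_hazard(2.0, 1.0) == "obstacle"
```

Separating this kind of decision logic from the sensor pipeline is what makes it unit-testable at all; the point cloud processing itself still needs device-level and manual testing.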
Tango-enabled devices and 3D cameras will soon be stock features on mobile hardware, and they present a new challenge for testing. It is no longer a question of whether the camera turns on and takes an image to be processed. The camera is on and sending a significant amount of data that is more than just pixels. Is that information useful, and if so, is it correct? Attendees will come away with a high-level understanding of several ways to approach testing these devices, because within the next few years they will probably have to.