A Semantic Touch Interface for Flying Camera Photography

Abstract

Compared with the handheld cameras widely used today, a camera mounted on a flying drone affords the user much greater freedom in finding the point of view (POV) for a perfect photo shot. In the future, many people may carry compact flying cameras and use their touchscreen mobile devices as viewfinders to take photos. To realize this vision, the interface for photo-taking with flying cameras must provide a satisfactory user experience.

In this thesis, we aim to develop a touch-based interactive system for photo-taking with flying cameras, investigating both user interaction design and system implementation issues. For interaction design, we propose a novel two-stage explore-and-compose paradigm. In the first stage, the user explores the photo space through autonomous drone flight, capturing exploratory photos along the way. In the second stage, the user restores a selected POV with the help of a gallery preview and uses intuitive touch gestures to refine the POV and compose a final photo. For system implementation, we study two technical problems and integrate their solutions into the system: (1) the underlying POV search problem for photo composition via intuitive touch gestures; and (2) the obstacle perception problem for collision avoidance using a monocular camera.

The proposed system has been successfully deployed for photo-taking in indoor, semi-outdoor, and limited outdoor environments. We show that our interface enables a fast, easy, and safe photo-taking experience using a flying camera.

Publication
PhD Thesis