Flexible Vision-Based Control of Rotorcraft - The Case Studies: 2DOF Helicopter and 6DOF Quadrotor
Abstract
Two case studies of a flexible vision-based control structure are presented in this work. The first structure is proposed for a 2DOF (degrees-of-freedom) model helicopter; the second is proposed for a 6DOF quadrotor. Position-based visual control was adopted for both structures, with images acquired from a single camera in an eye-in-hand configuration. The control law proposed for the 2DOF model helicopter is able to track a fast-maneuvering object regardless of its initial position in the image or its distance from the camera. The equations of motion of the model helicopter are presented for the development of joint-level Linear Quadratic Regulators (LQRs). A linearized image Jacobian relating optical flow to the incremental pitch and yaw motions required to correct task errors is discussed. The speed and precision of the control law in tracking a target object are demonstrated through two sets of experiments: in the first, the object is fixed to the background; in the second, it is moved through physical space. In both sets of experiments, the initial position of the object in the image and the distance between the object and the camera are varied.

The control structure proposed for the 6DOF quadrotor has the ability to switch between human-in-the-loop and vision-based control. When activated by the pilot, the task of the vision-based control is to hover the quadrotor at a user-defined pose (position and orientation) relative to an object of interest. An image processing algorithm is developed to extract five non-coplanar feature points from images of the target object in real time. The image coordinates of the feature points are passed to a pose estimation process based on the Pose from Orthography and Scaling with Iteration (POSIT) algorithm. The human-machine control flexibility is demonstrated by switching between the two control modes multiple times in flight. The vision-based controller, when active, is able to bring the quadrotor into the neighborhood of the desired hover pose at a speed comparable to that of a control structure based on a motion capture and tracking system. Sources of UAV pose estimation error and constraints of the proposed control structure are also discussed.
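As a minimal illustration of the 2DOF tracking idea summarized above, the sketch below pairs a joint-level LQR gain computation with a small-angle image-Jacobian mapping from pixel error to incremental pitch and yaw setpoints. All numerical values (the linearized pitch-axis model, the LQR weights, and the focal length) are assumptions introduced for illustration, not the parameters identified in this work.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Hypothetical linearized pitch-axis model, state x = [angle, rate].
# The entries are placeholders, not the thesis' identified parameters.
A = np.array([[0.0, 1.0],
              [0.0, -0.5]])   # assumed damping term
B = np.array([[0.0],
              [2.0]])          # assumed input gain

# Assumed LQR weights; solve the continuous-time algebraic Riccati equation.
Q = np.diag([10.0, 1.0])
R = np.array([[0.1]])
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)   # state-feedback gain, u = -K x

# Linearized image Jacobian for an eye-in-hand camera (pinhole model,
# assumed focal length in pixels).
f = 800.0

def pixel_error_to_joint_increments(du, dv):
    """Map an image-plane error (pixels) to incremental yaw/pitch setpoints.

    Small-angle approximation: a yaw of d_psi shifts the image by roughly
    f*d_psi pixels horizontally, and a pitch of d_theta shifts it by
    roughly f*d_theta pixels vertically, for a target near the optical axis.
    """
    d_yaw = du / f
    d_pitch = dv / f
    return d_pitch, d_yaw

# Example: target detected 40 px right of and 25 px above the image centre.
d_pitch, d_yaw = pixel_error_to_joint_increments(40.0, -25.0)
print("incremental pitch/yaw setpoints (rad):", d_pitch, d_yaw)
print("LQR gain for the pitch joint:", K)
```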
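The quadrotor pose-estimation step can be sketched in a similarly hedged way. The work itself uses the POSIT algorithm on five non-coplanar feature points; the snippet below substitutes OpenCV's solvePnP (EPnP variant), which solves the same 2D-3D correspondence problem, and all point coordinates and camera intrinsics are placeholder values.

```python
import numpy as np
import cv2

# Hypothetical 3D model of the target: five non-coplanar feature points
# expressed in the object frame (metres); illustrative values only.
object_points = np.array([
    [0.00, 0.00, 0.00],
    [0.10, 0.00, 0.00],
    [0.00, 0.10, 0.00],
    [0.10, 0.10, 0.00],
    [0.05, 0.05, 0.08],   # out-of-plane point makes the set non-coplanar
], dtype=np.float64)

# Image coordinates of the same points (pixels), e.g. from the feature
# extraction stage; placeholder values for illustration.
image_points = np.array([
    [320.0, 240.0],
    [400.0, 242.0],
    [322.0, 320.0],
    [402.0, 318.0],
    [361.0, 275.0],
], dtype=np.float64)

# Assumed pinhole camera intrinsics and zero lens distortion.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
dist = np.zeros(5)

# Recover the object pose relative to the camera. POSIT assumes a scaled-
# orthographic projection and iterates toward the perspective solution;
# EPnP is used here as a readily available stand-in for the same problem.
ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist,
                              flags=cv2.SOLVEPNP_EPNP)
R, _ = cv2.Rodrigues(rvec)          # rotation of the object frame in camera frame
print("translation (camera frame, m):", tvec.ravel())
print("rotation matrix:\n", R)
```

The recovered pose would then serve as feedback to hold the user-defined hover pose relative to the target when the vision-based mode is engaged.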