# Optical Flow
Optical Flow uses a downward-facing camera and a downward-facing distance sensor for velocity estimation.

Video: PX4 holding position using the ARK Flow sensor for velocity estimation (in Position Mode).
An Optical Flow setup requires a downward facing camera and a distance sensor (preferably a LiDAR). These can be connected via MAVLink, I2C or any other bus that supports the peripheral.
If connected to PX4 via MAVLink, the optical flow device must publish to the OPTICAL_FLOW_RAD topic, and the distance sensor must publish to the DISTANCE_SENSOR topic.
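As a sketch of what a MAVLink-connected flow driver has to supply, the fields below mirror the OPTICAL_FLOW_RAD message definition (field names and units follow the MAVLink spec; the `OpticalFlowRad` class itself is just an illustrative container, not a PX4 or pymavlink API):

```python
from dataclasses import dataclass

# Illustrative container mirroring the MAVLink OPTICAL_FLOW_RAD message.
# A real driver would send these values with a MAVLink library instead.
@dataclass
class OpticalFlowRad:
    time_usec: int               # timestamp (microseconds)
    sensor_id: int               # sensor ID
    integration_time_us: int     # accumulation period (microseconds)
    integrated_x: float          # integrated flow around X axis (rad)
    integrated_y: float          # integrated flow around Y axis (rad)
    integrated_xgyro: float      # gyro X integral over the same period (rad)
    integrated_ygyro: float      # gyro Y integral (rad)
    integrated_zgyro: float      # gyro Z integral (rad)
    temperature: int             # temperature (0.01 degC)
    quality: int                 # 0: bad, 255: best
    time_delta_distance_us: int  # age of the distance measurement (us)
    distance: float              # distance to ground (m), negative if invalid

msg = OpticalFlowRad(
    time_usec=1_000_000, sensor_id=0, integration_time_us=10_000,
    integrated_x=0.002, integrated_y=-0.001,
    integrated_xgyro=0.002, integrated_ygyro=-0.001, integrated_zgyro=0.0,
    temperature=2500, quality=200, time_delta_distance_us=0, distance=1.5,
)
assert 0 <= msg.quality <= 255
```

Note that the integrated gyro fields are required alongside the flow integrals: the estimator uses them to separate vehicle rotation from translation.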
The output of the flow when moving in different directions must be as follows:
| Vehicle movement | Integrated flow |
| ---------------- | --------------- |
For pure rotations the integrated flow and the integrated gyro values (integrated_x and integrated_xgyro, integrated_y and integrated_y­gyro) have to be the same.
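This rotation check follows from how flow is converted to velocity: subtracting the gyro integral from the flow integral over the same period leaves only the translation-induced flow rate, which scales with height. A minimal one-axis sketch (sign conventions depend on sensor mounting and are deliberately not handled here; `flow_to_velocity` is a hypothetical helper, not a PX4 function):

```python
def flow_to_velocity(integrated_flow_rad, integrated_gyro_rad,
                     integration_time_us, distance_m):
    """Estimate ground-relative speed along one body axis.

    Rotation is compensated by subtracting the gyro integral accumulated
    over the same period (for a pure rotation the two cancel, giving zero
    velocity); the residual angular rate is then scaled by the distance
    to the ground.
    """
    dt = integration_time_us * 1e-6               # integration period (s)
    flow_rate = (integrated_flow_rad - integrated_gyro_rad) / dt  # rad/s
    return distance_m * flow_rate                 # m/s

# Pure rotation: flow integral equals gyro integral -> zero velocity.
v_rot = flow_to_velocity(0.02, 0.02, 10_000, 1.5)

# Pure translation at 1.5 m: 0.01 rad over 10 ms -> 1 rad/s -> ~1.5 m/s.
v_trans = flow_to_velocity(0.01, 0.0, 10_000, 1.5)
```

This is also why a distance sensor is mandatory: without `distance_m` the residual flow rate cannot be scaled into a metric velocity.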
A popular setup is the PX4Flow and Lidar-Lite, as shown below.
Sensor data from the optical flow device is fused with other velocity data sources. The approach used for fusing sensor data and any offsets from the center of the vehicle must be configured in the estimator.
# Flow Sensors/Cameras
# PX4Flow

PX4Flow is an optical flow camera that works indoors and in low outdoor light conditions without the need for an illumination LED. It is one of the easiest and most established ways to calculate optical flow.
# ARK Flow
ARK Flow is a DroneCAN optical flow sensor, distance sensor, and IMU. It has a PAW3902 optical flow sensor, Broadcom AFBR-S50LV85D 30 meter distance sensor, and BMI088 IMU.
# PMW3901-Based Sensors
PMW3901 is an optical flow tracking sensor similar to what you would find in a computer mouse, but adapted to work between 80 mm and infinity. It is used in a number of products, including some from Bitcraze, Tindie, Hex, Thone, and Alientek.
# Other Cameras/Sensors
It is also possible to use a board/quad that has an integrated camera. For this the Optical Flow repo can be used (see also snap_cam).
# Range Finders
You can use any supported distance sensor. However, we recommend using a LiDAR rather than a sonar sensor, because of its robustness and accuracy.
# Estimators

Estimators fuse data from the optical flow sensor and other sources. The settings for how fusion is done, and the sensor's offsets relative to the vehicle center, must be specified for the estimator used.
The offsets are calculated relative to the vehicle orientation and center as shown below:
Optical flow based navigation is enabled by both of the available estimators: EKF2 and LPE (deprecated).
# Extended Kalman Filter (EKF2)
For optical flow fusion using EKF2, set the "use optical flow" flag in the EKF2_AID_MASK parameter, as shown in QGroundControl below:
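EKF2_AID_MASK is a bitfield, so the optical flow flag is set with a bitwise OR that leaves other aiding sources untouched. The sketch below assumes "use optical flow" is bit 1 (value 2) and GPS is bit 0 (value 1), which matches the PX4 parameter reference for firmware versions that use EKF2_AID_MASK; verify the bit positions against your firmware's documentation.

```python
# Assumed bit positions (check your firmware's EKF2_AID_MASK docs):
USE_GPS = 1 << 0           # value 1
USE_OPTICAL_FLOW = 1 << 1  # value 2

def enable_optical_flow(aid_mask: int) -> int:
    """Return the mask with the optical-flow fusion bit set,
    leaving any other aiding bits (e.g. GPS) unchanged."""
    return aid_mask | USE_OPTICAL_FLOW

# Starting from GPS-only aiding (mask = 1):
print(enable_optical_flow(USE_GPS))  # -> 3 (GPS + optical flow)
```

Setting the flag in QGroundControl performs the same operation through the parameter editor's checkbox UI.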
If your optical flow sensor is offset from the vehicle center, you can set this using the following parameters.
| Parameter | Description |
| --------- | ----------- |
| EKF2_OF_POS_X | X position of optical flow focal point in body frame (default is 0.0 m). |
| EKF2_OF_POS_Y | Y position of optical flow focal point in body frame (default is 0.0 m). |
| EKF2_OF_POS_Z | Z position of optical flow focal point in body frame (default is 0.0 m). |
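These parameters can also be set from the PX4 system console (or via QGroundControl's parameter editor). The offsets below are illustrative values only; substitute the measured position of your sensor's focal point relative to the vehicle center:

```
# PX4 system console (values are illustrative)
param set EKF2_OF_POS_X 0.05
param set EKF2_OF_POS_Y 0.00
param set EKF2_OF_POS_Z 0.10
param save
```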