Note: This is a work in progress and will be updated regularly. If you would like more information on a specific section, or would like to suggest a new one, let us know. Future sections are planned to cover all aspects of Dawn, including setup guides, API documentation and more.
Dawn is not picky about what hardware you use. Generally speaking, any Linux-based OS should be capable of running Dawn. That said, which Dawn features you can run, and to what degree you can run them, will depend directly on the capabilities of your hardware. As we continue our testing, we will populate this page with benchmarks for Dawn features across various companion computers. If there is a companion computer you would like to see benchmarked, let us know.
We highly recommend using an NVIDIA TX1- or TX2-based companion computer for deploying Dawn. Both systems give you access to the full suite of Dawn features concurrently.
Dawn relies exclusively on cameras for its interpretation of the world. Cameras connected to Dawn can do multiple things at once. For example, a stereo camera used to gather depth information can also be used to detect aircraft and specific objects.
There is no theoretical limit to how many cameras Dawn can handle. That said, there are many practical limitations, bandwidth and processing power chief among them. To interface with cameras, Dawn uses GStreamer or plain V4L2, depending on the interface type (USB vs. CSI-2, for example); the interface can easily be specified graphically when setting up Dawn.
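As a rough illustration of how a capture pipeline might differ by interface type, the sketch below builds GStreamer launch strings for USB (V4L2) and CSI-2 cameras. The function name, pipeline strings, and the Jetson-specific `nvarguscamerasrc` element are illustrative assumptions, not Dawn's actual configuration API.

```python
# Illustrative sketch only: capture_pipeline() and its pipeline strings are
# assumptions for this example, not part of Dawn's real API.

def capture_pipeline(interface: str, device: str = "/dev/video0",
                     width: int = 1280, height: int = 720, fps: int = 30) -> str:
    """Return a GStreamer launch string for the given camera interface."""
    caps = f"video/x-raw,width={width},height={height},framerate={fps}/1"
    if interface == "usb":
        # USB cameras are exposed as V4L2 device nodes (/dev/videoN).
        return f"v4l2src device={device} ! {caps} ! videoconvert ! appsink"
    if interface == "csi2":
        # On Jetson boards, CSI-2 cameras typically go through the NVMM path.
        return (f"nvarguscamerasrc ! video/x-raw(memory:NVMM),"
                f"width={width},height={height},framerate={fps}/1 ! "
                f"nvvidconv ! videoconvert ! appsink")
    raise ValueError(f"unknown camera interface: {interface}")
```

In Dawn itself this choice is made graphically at setup; the sketch only shows why the interface type matters to the underlying pipeline.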
AEIOU will offer a line of hardware built specifically to run Dawn, including stereo cameras, monocular cameras and a companion computer. Additionally, AEIOU will offer a ready-to-run "plug and play" version of Dawn, where a companion computer, Dawn license and cameras are bundled to allow quick and easy integration of Dawn onto a UAS. AEIOU-specific hardware can be expected to become available for pre-order in August of 2018. Though AEIOU hardware is not in any way required to run Dawn, we highly recommend it for most applications.
Dawn uses both stereo and monocular depth sensing. Through deep learning, Dawn is capable of accurately sampling depth from single frames of monocular vision in real time, across multiple instances. For monocular or stereoscopic depth estimation, cameras must be calibrated using our calibration tools, and the focal length, as well as the baseline in the case of stereoscopic cameras, must be known accurately.
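The calibration requirements above follow from the standard stereo depth relation: for a rectified stereo pair, depth is Z = f · B / d, where f is the focal length in pixels, B is the baseline, and d is the disparity in pixels. A minimal sketch (not Dawn's internal code):

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a point from a rectified stereo pair: Z = f * B / d.

    focal_px     -- focal length in pixels (from calibration)
    baseline_m   -- distance between the two camera centres, in metres
    disparity_px -- horizontal pixel offset of the point between views
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# e.g. a 700 px focal length, 12 cm baseline and 42 px disparity
# place the point 2 m away.
```

The formula makes clear why accurate focal length and baseline matter: any error in either scales every depth estimate proportionally.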
Dawn prioritizes obstacle avoidance based on configurable parameters. These parameters can be configured on the fly with an App developed through the Dawn API, or at setup during basic Dawn configuration. Real-time adjustment of this prioritization is especially useful in applications like surface inspection or subject filming. Available constraints include:
All prioritizations are kinematically configurable to include position, velocity and acceleration.
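To make the prioritization scheme above concrete, here is a hedged sketch of what a kinematically configurable priority might look like as a data structure. The class name, field names and ranking rule are assumptions for illustration; Dawn's actual configuration format may differ.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class AvoidancePriority:
    """Hypothetical avoidance constraint with kinematic limits.

    Each limit is optional; None means the constraint does not bound
    that quantity. Field names are illustrative, not Dawn's schema.
    """
    name: str
    weight: float                              # higher weight = enforced first
    max_position_dev_m: Optional[float] = None # allowed deviation from path
    max_velocity_mps: Optional[float] = None   # speed cap near this hazard
    max_accel_mps2: Optional[float] = None     # comfort/structural limit

def rank(priorities: List[AvoidancePriority]) -> List[AvoidancePriority]:
    """Order constraints so the highest-weight ones are considered first."""
    return sorted(priorities, key=lambda p: p.weight, reverse=True)
```

Under this sketch, an App could raise the weight of a "subject standoff" constraint in real time during filming while leaving terrain limits fixed from setup.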
Dawn defaults to operating via velocity-based control: it will internally determine the best velocity for the drone based on a Speed Profile. Speed Profiles can be configured in real time via the API, or graphically at setup.
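One simple way a speed profile can work is as a piecewise mapping from obstacle clearance to commanded speed. The function below is a hypothetical sketch of that idea, not Dawn's actual Speed Profile format; the thresholds and speeds are made-up example values.

```python
from typing import List, Tuple

# A profile is a list of (min_clearance_m, speed_mps) rows,
# sorted from largest clearance to smallest. These values are
# illustrative only.
Profile = List[Tuple[float, float]]

EXAMPLE_PROFILE: Profile = [
    (20.0, 10.0),  # wide open: cruise at 10 m/s
    (10.0, 5.0),   # obstacles within 20 m: slow to 5 m/s
    (3.0, 1.5),    # close quarters: creep at 1.5 m/s
]

def profile_speed(clearance_m: float, profile: Profile) -> float:
    """Return the commanded speed for the first clearance band matched."""
    for min_clearance, speed in profile:
        if clearance_m >= min_clearance:
            return speed
    return 0.0  # closer than the tightest band: stop
```

Adjusting the profile at runtime via the API would then amount to swapping in a new table of bands.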
Dawn communicates with flight controllers via the MAVLink protocol. Dawn can be configured to output either position or velocity setpoints, though the default velocity-based setpoints are recommended for smoothness.
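In MAVLink, velocity setpoints are typically sent with the SET_POSITION_TARGET_LOCAL_NED message, whose `type_mask` field tells the flight controller which fields to ignore. The bit values below come from the MAVLink POSITION_TARGET_TYPEMASK enum; building a velocity-only mask is just bit arithmetic. (How Dawn assembles its messages internally is not documented here; this only illustrates the protocol mechanics.)

```python
# POSITION_TARGET_TYPEMASK ignore bits, from the MAVLink common message set.
IGNORE_PX, IGNORE_PY, IGNORE_PZ = 1, 2, 4        # position x, y, z
IGNORE_VX, IGNORE_VY, IGNORE_VZ = 8, 16, 32      # velocity x, y, z
IGNORE_AX, IGNORE_AY, IGNORE_AZ = 64, 128, 256   # acceleration x, y, z
# bit 512 is FORCE_SET (treat accel fields as force), not an ignore bit
IGNORE_YAW, IGNORE_YAW_RATE = 1024, 2048

# Velocity-only control: ignore everything except vx, vy, vz.
VELOCITY_ONLY_MASK = (IGNORE_PX | IGNORE_PY | IGNORE_PZ |
                      IGNORE_AX | IGNORE_AY | IGNORE_AZ |
                      IGNORE_YAW | IGNORE_YAW_RATE)
```

A position-setpoint mask would instead clear the position bits and set the velocity ones, which is the essential difference between the two output modes mentioned above.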
Depending on how Dawn is configured by the App being run, Dawn will either handle the entire mission (in the case of subject tracking, for example) or take over only for collision avoidance. For example, Dawn's inspection App, which is provided for free, works with mission files output by ground control software like Mission Planner and tries to fly as close to the desired path as possible, deviating only when necessary to avoid obstacles or aircraft.
Dawn pulls the craft's velocity, position, acceleration and other properties from the flight controller via MAVLink. These properties are used by Dawn internally.
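Conceptually, this amounts to keeping a local cache of vehicle state that is refreshed as telemetry messages arrive; LOCAL_POSITION_NED, for instance, carries both position and velocity in the NED frame. The class below is a hypothetical sketch of such a cache (the class and method names are not Dawn's), with messages represented as plain dicts for illustration.

```python
class VehicleState:
    """Hypothetical cache of craft state, fed by incoming MAVLink telemetry.

    Not Dawn's internal structure; fields mirror the LOCAL_POSITION_NED
    message from the MAVLink common set.
    """

    def __init__(self):
        self.position = (0.0, 0.0, 0.0)  # NED frame, metres
        self.velocity = (0.0, 0.0, 0.0)  # NED frame, m/s

    def handle(self, msg_type: str, fields: dict) -> None:
        """Update cached state from one decoded telemetry message."""
        if msg_type == "LOCAL_POSITION_NED":
            self.position = (fields["x"], fields["y"], fields["z"])
            self.velocity = (fields["vx"], fields["vy"], fields["vz"])
        # Other message types (ATTITUDE, HIGHRES_IMU, ...) would update
        # further properties in the same way.
```

The avoidance and control logic can then read the latest cached state at its own rate, decoupled from the telemetry stream's timing.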