
Nick Parks

Per. 2
Research section

My senior project involves hundreds of people controlling a drone in real time using
online controls. Doing this requires something called a real-time transport protocol. The
Real-time Transport Protocol (RTP) runs on top of UDP and supports delivery of the same
data to multiple destinations, which is useful for feeding a live video stream to a
smartphone and a computer system at the same time. Using this protocol lets us send data in
multiple forms; our most useful forms will be byte buffers and integers, used to carry video
and to encode which directional control is being pressed on the webpage. (Schulzrinne, Henning)
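To make the packet format concrete, here is a small sketch of the 12-byte RTP fixed header defined in RFC 3550, plus a loop that sends one datagram per destination as a simple stand-in for one-to-many delivery. The field layout comes from the RFC; the helper names, the payload type 96, and `send_to_all` are our own illustrative choices, not part of the spec.

```python
import socket
import struct

def build_rtp_header(seq, timestamp, ssrc, payload_type=96):
    """Pack the 12-byte RTP fixed header (RFC 3550):
    version 2, no padding, no extension, no CSRC list, marker clear."""
    return struct.pack("!BBHII", 2 << 6, payload_type & 0x7F,
                       seq, timestamp, ssrc)

def parse_rtp_header(data):
    """Unpack the fixed header fields back out of a received packet."""
    b0, b1, seq, timestamp, ssrc = struct.unpack("!BBHII", data[:12])
    return {"version": b0 >> 6, "payload_type": b1 & 0x7F,
            "seq": seq, "timestamp": timestamp, "ssrc": ssrc}

def send_to_all(packet, destinations):
    """One UDP datagram per destination -- the simplest stand-in for
    the one-to-many delivery RTP gets from IP multicast."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for host, port in destinations:
        sock.sendto(packet, (host, port))
    sock.close()
```

In a real deployment the sequence number and timestamp would advance with each video frame so receivers can detect loss and reorder packets.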
Looking more toward the actual goal of the project, we need to understand how multiple
people can control the drone simultaneously using a real-time protocol. A real-time protocol
can transfer data within milliseconds of the networking call (Schulzrinne, Henning). Using this
near-instant delivery, we can send video buffers directly to a web server, which can then render
the video using an FFmpeg library. This library can decode the buffers byte by byte and render
them using client-side rendering processes (Murphy, Greg). Since this video is live, users will be
able to get a truly instant view from the drone, whether they are across the world or standing
directly under it, all at the same time. The rendering process is well optimized and takes
advantage of the client's graphics processor and whatever resources it can provide. Thanks to
bilayer segmentation, rendering these buffers is not very intensive on most computer systems,
and the video quality is not noticeably affected. (Criminisi, Antonio)
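The buffering step described above can be sketched as splitting the encoded video stream into datagram-sized pieces on the drone side and joining them back on the client before decoding. This is a toy sketch under our own assumptions; the 1400-byte size (to fit under a typical 1500-byte MTU) and the function names are illustrative, and real decoding would be handed off to an FFmpeg-based library rather than done here.

```python
def chunk_stream(data: bytes, size: int = 1400):
    """Split an encoded video byte stream into UDP-sized buffers.
    1400 bytes keeps each datagram under a typical 1500-byte MTU."""
    return [data[i:i + size] for i in range(0, len(data), size)]

def reassemble(buffers):
    """Client side: concatenate received buffers back into the encoded
    stream before handing it to a decoder (e.g. an FFmpeg binding)."""
    return b"".join(buffers)
```

The round trip is lossless as long as every buffer arrives in order; handling loss and reordering is what the RTP sequence numbers are for.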

Moving on to the physical drone itself, we obviously want it to be as stable as possible,
since crashing is not a good thing. By calibrating how much movement is being applied to the
drone during its takeoff procedure, and measuring movements that were not sent by a user, we
can determine how much wind is pushing on the drone. Taking that amount, we can automatically
adjust the drone's pitch so it stays stable while hundreds of feet in the air (LR Garcia). Since
users are controlling the drone, we also have to put in some caution flags so they can't simply
ram the system into a wall. By tracking objects in the system's camera feed, we can determine the
distance of objects from the drone's front. The calculation is fairly simple: take an object's
texture, identify any possible pattern, and then calculate its distance using a couple of buffered
images (Achtelik, Markus). Another caution area is the three axes we have to work with, usually
named X, Y, and Z. Auto-hovering and stabilization are done with internal tools and an altimeter
built directly into the drone hardware. Between the drone's processor and its aerodynamic shape,
the heavy processing internals are mostly handled by the drone manufacturer (LaFleur, Karl).
Building on those inner systems, we can produce an automatic control system that can be partially
controlled by users across the world, along with a safety system that protects the drone from
inexperienced users who might fly it into a wall or another object. By making use of the drone's
built-in emergency system, we can then drop it to a safe altitude and produce a ready-to-go
package for hundreds of users to fly a drone together. (Bristeau)
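The safety ideas in this paragraph can be sketched as three tiny checks: estimating wind as the drift not explained by the user's command, turning the disparity between two buffered images into a distance via the standard stereo relation depth = focal x baseline / disparity, and refusing forward motion below a minimum clearance. This is a toy sketch under our own assumptions; the gain, focal length, baseline, and clearance values are illustrative and are not taken from the cited papers.

```python
def wind_correction(measured_drift, commanded, gain=0.5):
    """Treat drift that the user's command does not explain as wind,
    and return a pitch correction opposing it (simple P-controller)."""
    wind = measured_drift - commanded
    return -gain * wind

def distance_from_disparity(focal_px, baseline_m, disparity_px):
    """Stereo relation: depth = focal * baseline / disparity.
    Zero disparity means the object is effectively at infinity."""
    if disparity_px <= 0:
        return float("inf")
    return focal_px * baseline_m / disparity_px

def safe_to_move_forward(distance_m, min_clearance_m=2.0):
    """Caution flag: block forward commands inside the clearance zone."""
    return distance_m > min_clearance_m
```

In the real system the clearance check would trigger the drone's built-in emergency behavior (dropping to a safe altitude) rather than just returning False.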

Works Cited
LaFleur, Karl, et al. "Quadcopter control in three-dimensional space using a noninvasive motor
imagery-based brain-computer interface." Journal of Neural Engineering 10.4 (2013):
046003.

Achtelik, Markus, et al. "Visual tracking and control of a quadcopter using a stereo camera
system and inertial sensors." 2009 International Conference on Mechatronics and
Automation. IEEE, 2009.
Carrillo, LR Garcia, Alejandro Dzul, and Rogelio Lozano. "Hovering quad-rotor control: A
comparison of nonlinear controllers using visual feedback." IEEE Transactions on
Aerospace and Electronic Systems 48.4 (2012): 3159-3170.
Schulzrinne, Henning, et al. RTP: A transport protocol for real-time applications. No. RFC 3550.
2003.
Schulzrinne, Henning. "Real time streaming protocol (RTSP)." (1998).
Bristeau, Pierre-Jean, et al. "The navigation and control technology inside the AR.Drone micro
UAV." IFAC Proceedings Volumes 44.1 (2011): 1477-1484.
Criminisi, Antonio, et al. "Bilayer segmentation of live video." 2006 IEEE Computer Society
Conference on Computer Vision and Pattern Recognition (CVPR'06). Vol. 1. IEEE, 2006.
Murphy, Greg. "System and method for sending live video on the internet." U.S. Patent No.
6,564,380. 13 May 2003.
