Our aim is to create an application that lets the user control a drone with their hands, via a 3D camera, making drone control easier and more intuitive.
The application window is divided into two sections:
On the right is the user control area: the Intel RealSense camera faces the user, who can see their hand's current position and the section divisions graphically.
On the left is the feed from a regular camera facing the drone, so the user does not need to look up from the screen to see the results of their hand movements.

Demonstration

Controls
The application supports 27 possible movements for the drone (three states on each of the three axes), described in the following manner:
Movement on the Z axis (forward and backward) is indicated by the color of the highlight: forward is red, idle is green, and backward is blue.
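The 27 movements follow from three states (negative, idle, positive) on each of the three axes. A minimal sketch of such a mapping in Python (the dead-zone threshold and the tuple encoding are illustrative assumptions, not the project's actual values):

```python
# Map a normalized wrist offset on each axis to one of three states,
# giving 3 * 3 * 3 = 27 possible movement commands.
DEAD_ZONE = 0.15  # illustrative: offsets within this range count as idle

# Highlight colors for the Z axis, as described above.
Z_COLORS = {-1: "blue", 0: "green", 1: "red"}  # backward, idle, forward

def axis_state(offset: float) -> int:
    """Return -1, 0, or +1 for an axis offset in [-1, 1]."""
    if offset > DEAD_ZONE:
        return 1
    if offset < -DEAD_ZONE:
        return -1
    return 0

def movement_command(x: float, y: float, z: float) -> tuple:
    """Combine the three axis states into one of the 27 commands."""
    return (axis_state(x), axis_state(y), axis_state(z))

# Example: hand pushed forward (z) and slightly off-center on x.
cmd = movement_command(0.4, 0.05, 0.8)   # -> (1, 0, 1)
highlight = Z_COLORS[cmd[2]]             # -> "red" (moving forward)
```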
Approach: We detect a key point of the hand, the wrist, and use it as a joystick to give the user control over the drone. For this we need a 3D camera (we will use Intel's RealSense) and a drone (we will use the LadyBird drone).
Steps:
Connecting the RealSense camera and performing basic actions.
Detecting the wrist point correctly.
Implementing graphic feedback for the user.
Establishing a connection with the drone.
Controlling the drone via its API and the IR cameras in the room.
Linking the position of the wrist to the movements of the drone.
Minimizing movement errors created by the rapid frame rate.
Configuring an additional camera to monitor the drone's movements from within the application.

Project by: Hadas Shahar, Lidya Tonkonogy and Yuval Brave.
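The first steps, connecting the RealSense camera and reading the wrist position as a joystick offset, might look like the sketch below, using Intel's pyrealsense2 library. The stream settings and the fixed wrist pixel are assumptions for illustration; in the real project the wrist pixel comes from the detection step:

```python
def normalize_offset(pixel: int, size: int) -> float:
    """Map a pixel coordinate to a joystick-style offset in [-1, 1]."""
    return (pixel - size / 2) / (size / 2)

def main():
    # pyrealsense2 is Intel's Python wrapper for the RealSense SDK;
    # imported here so this sketch loads even without a camera attached.
    import pyrealsense2 as rs

    pipeline = rs.pipeline()
    config = rs.config()
    # Illustrative stream settings: 640x480 depth at 30 fps.
    config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
    pipeline.start(config)
    try:
        while True:
            frames = pipeline.wait_for_frames()
            depth = frames.get_depth_frame()
            if not depth:
                continue
            # Placeholder: assume the wrist was detected at this pixel.
            wx, wy = 320, 240
            distance_m = depth.get_distance(wx, wy)  # metres to the wrist
            x_off = normalize_offset(wx, 640)
            y_off = normalize_offset(wy, 480)
            print(f"wrist offsets x={x_off:+.2f} y={y_off:+.2f} "
                  f"depth={distance_m:.2f} m")
    finally:
        pipeline.stop()

if __name__ == "__main__":
    main()
```

The normalized offsets are what the joystick-style mapping consumes: the screen center is (0, 0), and the edges are ±1.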
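For the step on minimizing movement errors caused by the rapid frame rate, one common approach (an assumption here, not necessarily the project's exact method) is to smooth the wrist offset with an exponential moving average and only commit a new state after it persists for several consecutive frames:

```python
class SmoothedAxis:
    """Exponentially smooth an axis offset and debounce state changes."""

    def __init__(self, alpha: float = 0.3, hold_frames: int = 5):
        self.alpha = alpha              # smoothing factor; higher = snappier
        self.hold_frames = hold_frames  # frames a new state must persist
        self.value = 0.0
        self.state = 0                  # committed state: -1, 0, or +1
        self._candidate = 0
        self._count = 0

    def update(self, raw: float, dead_zone: float = 0.15) -> int:
        # Exponential moving average damps single-frame jitter.
        self.value = self.alpha * raw + (1 - self.alpha) * self.value
        new = 0
        if self.value > dead_zone:
            new = 1
        elif self.value < -dead_zone:
            new = -1
        # Require the new state for several consecutive frames before
        # committing, so one noisy frame cannot flip the drone command.
        if new == self._candidate:
            self._count += 1
        else:
            self._candidate, self._count = new, 1
        if self._count >= self.hold_frames:
            self.state = self._candidate
        return self.state

axis = SmoothedAxis()
axis.update(1.0)          # a single spike does not change the state yet
for _ in range(20):
    axis.update(1.0)      # a sustained push does commit the new state
```

The `alpha` and `hold_frames` values trade responsiveness against stability and would need tuning against the camera's actual frame rate.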