PATT-E
The Portable Autonomous Tracking Telescope - Experimental


Systems Overview

PATT-E consists of multiple hardware and software subsystems that will enable the device to track a high-power rocket throughout its boost phase. The primary subsystems are Detection and Recording, which consists of the Raspberry Pi and its peripherals, and the Powered Gimbal, which consists of the gimbal and its control hardware. During operation, the Raspberry Pi makes real-time inferences of the tracked object's position using onboard software and transmits motion commands to the Powered Gimbal.

Currently the device is in the prototyping phase and has not yet demonstrated integrated operation.

Detection and Recording Subsystem

Detection and Recording Hardware:

The Detection and Recording Subsystem has relatively few hardware components: a Raspberry Pi High Quality Camera with Telephoto Lens for video collection, a Raspberry Pi for image processing and stage two tracking, and a Coral Edge TPU Accelerator for stage one tracking. During the prototyping phase, an external monitor and a joystick are attached to D&R.

Detection and Recording Software:

The D&R system uses three software modules. A neural network based object detector is used while objects other than the rocket (the launch pad or horizon) are in frame; while this detector is running, the device is in “stage one tracking”. A Canny edge based object detector is used once the rocket is far from the camera and no other objects are in frame; this is referred to as “stage two tracking”. While either tracker is running, the gimbal interface module uses a PID loop to translate inferences into commands for the Powered Gimbal and transmits them via UART, as sketched below.
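
As a rough illustration of that interface, the Python sketch below runs a per-axis PID loop that turns a detection's pixel offset from frame center into speed commands over UART. The gains, serial port path, and ASCII command format here are placeholder assumptions, not PATT-E's actual protocol:

import serial  # pyserial

class AxisPID:
    """PID on the target's pixel offset from frame center (one per axis)."""
    def __init__(self, kp=0.8, ki=0.05, kd=0.1):  # placeholder gains
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error_px, dt):
        self.integral += error_px * dt
        derivative = (error_px - self.prev_error) / dt
        self.prev_error = error_px
        return self.kp * error_px + self.ki * self.integral + self.kd * derivative

port = serial.Serial("/dev/serial0", 115200)  # the Pi's UART; path is an assumption
pan_pid, tilt_pid = AxisPID(), AxisPID()

def send_speeds(cx, cy, frame_w, frame_h, dt):
    """Turn a detection center (cx, cy) into pan/tilt speed commands."""
    pan_speed = pan_pid.update(cx - frame_w / 2, dt)
    tilt_speed = tilt_pid.update(cy - frame_h / 2, dt)
    # Hypothetical command format: "P<pan speed> T<tilt speed>\n"
    port.write(f"P{pan_speed:.1f} T{tilt_speed:.1f}\n".encode())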

Powered Gimbal Subsystem

Powered Gimbal Hardware:

The Powered Gimbal Subsystem is designed around the 3D printed pan/tilt gimbal. The gimbal consists of a stack of turntable bearings and pulleys. The first pulley (white) is connected to the tilt stepper and the tilt gear (also white); through the white gear train and the tilt belt (left), the motion of the tilt stepper is transmitted to the tilt stage (orange). The second pulley (black) is simply connected to the gimbal fork and the pan stepper motor. The name “pan stepper motor” is something of a misnomer: if it alone turns, the camera will both pan AND tilt due to the white gear train. The remaining hardware simply controls and powers the gimbal motors: the Teensy, the stepper drivers, the power supplies, and the interconnect board.

Powered Gimbal Software:

A Teensy 4.0 provides ample speed and GPIO for controlling both steppers and interfacing with the Detection & Recording System. The AccelStepper library manages step timing for the motors, while custom logic translates pan/tilt position and speed commands into actual motion of each motor. This is more complicated than in a typical gimbal because motion of the “pan motor” actually produces a combined pan/tilt motion that the tilt motor must cancel out; the mapping is sketched below. In addition, the software enforces safe operating boundaries on movements and handles edge cases such as one axis hitting its limit while the other has not.
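
To make the coupling concrete, here is a rough sketch of the mapping in Python (the actual firmware is C++ on the Teensy; the step ratios and the sign of the compensation term are placeholder assumptions, not measured values):

PAN_STEPS_PER_DEG = 35.6   # placeholder: pan motor steps per degree of camera pan
TILT_STEPS_PER_DEG = 53.3  # placeholder: tilt motor steps per degree of camera tilt
COUPLING = 1.0             # placeholder: degrees of parasitic tilt per degree of pan

def motor_targets(pan_deg, tilt_deg):
    """Map a desired camera pan/tilt (degrees) to motor step targets.

    Turning only the pan motor drags the white gear train around with the
    fork, tilting the camera as well, so the tilt motor adds a compensating
    term proportional to the commanded pan.
    """
    pan_steps = pan_deg * PAN_STEPS_PER_DEG
    tilt_steps = (tilt_deg + COUPLING * pan_deg) * TILT_STEPS_PER_DEG
    return pan_steps, tilt_steps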


Project Gallery

Electrical Diagram
Mid November 2020

Device CAD (Solidworks)
Mid November 2020

The project started with the creation of hardware CAD and wiring diagrams to ensure that all interfaces were accounted for and to verify hardware feasibility. In addition, time was spent studying the basics of Python and OpenCV, as I had no experience with either coming into the project.

Printed hole guide for MDF base
11/26/20

Gimbal Uprights with Bearings
11/26/20

As soon as I got home for Thanksgiving, I began assembling the hardware according to the CAD. Thanks to the time spent reviewing the model, minimal adjustments were needed. The use of both 2D and 3D printing greatly accelerated this phase.

Teensy and Interconnect Board
11/29/20

Interconnect Board Underside
11/29/20

Belt Sizing and Electronics Integration Testing
11/28/20

While additional motion components were in the mail, time was spent testing belt sizing and motor positioning. Minor modifications to the pan stepper mount were needed due to unforeseen belt sizing troubles.

Pan Motion Testing
12/2/20

First Integrated Motion Test
12/4/20

After this integrated motion test, visible progress slowed drastically as I entered the non-embedded software portion of the project (and finals week of school). While this phase was pretty painful due to my limited software experience, I learned a lot about CV and the Raspberry Pi. One of the important discoveries of this phase was the necessity of TensorFlow and an external TPU (Tensor Processing Unit), which was purchased and integrated. Following simple testing, I determined that there are two significant stages of flight: one where the vehicle is silhouetted against the horizon or trees, and one where the rocket is in the distance and visible as little more than a dot. These phases necessitate different approaches to detecting the object. Near the ground, it is difficult for a simple program to determine where the rocket is, so a TensorFlow object detector is used. In the air, this complex software is not needed: the rocket is (if the flight is legal) the only object in the frame, and it can be tracked by simply finding the highest-contrast, sharpest-edged region of the frame and keeping it in the center. A sketch of that idea follows.
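
A minimal sketch of that stage two idea with OpenCV: run Canny, box-filter the edge map so each pixel holds its neighborhood's edge density, and take the densest point as the rocket. The thresholds and window size here are placeholders, not the tuned values used on the device:

import cv2

def find_rocket(frame, win=31):
    """Return the (x, y) location of the sharpest-edged region of the frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 100, 200)               # placeholder thresholds
    density = cv2.boxFilter(edges, cv2.CV_32F, (win, win))
    _, _, _, max_loc = cv2.minMaxLoc(density)
    return max_loc  # feed (max_loc - frame center) into the PID loop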

Face Detectors - Fun for the Whole Family
12/18/20

Model Launch for collecting training and validation data
12/28/20

RICO (Rockets In Context) Trained MobileNetSSD
12/29/20
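
For reference, single-frame inference with a compiled model like this on the Coral Edge TPU generally looks like the sketch below, using Google's pycoral library. The model and image filenames and the score threshold are placeholders, not the project's actual files:

import cv2
from pycoral.adapters import common, detect
from pycoral.utils.edgetpu import make_interpreter

interpreter = make_interpreter("rico_mobilenet_ssd_edgetpu.tflite")  # placeholder name
interpreter.allocate_tensors()
w, h = common.input_size(interpreter)

frame = cv2.imread("launch_frame.jpg")  # placeholder image
rgb = cv2.cvtColor(cv2.resize(frame, (w, h)), cv2.COLOR_BGR2RGB)
common.set_input(interpreter, rgb)
interpreter.invoke()

for obj in detect.get_objects(interpreter, score_threshold=0.5):
    b = obj.bbox  # coordinates are in the model's input resolution
    print(f"rocket {obj.score:.2f} at ({b.xmin}, {b.ymin})-({b.xmax}, {b.ymax})")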