A proof-of-concept project was devised to produce an autonomous wireless mobile robot platform for sensing, locating, and inspecting a line of rivets. In addition, the robot system had to adhere to an aircraft surface at high tilt angles, so it needed wall-adhesion capability. Early on, a decision was made to use the MSP430 MCU and the CRIM-mote design as the basic sensing and control technology on the robot system, since the CRIM is experienced with this processor and using it minimized the learning curve. Next, a literature search produced examples of autonomous wall-crawling robot platforms, and an Internet search revealed a possible robotic platform solution: a low-cost, off-the-shelf radio-controlled car capable of adhering to a vertical wall. These RC cars were purchased and became the basis of the autonomous wall-attachment robotic system. A Venturi vacuum creates suction on the underside of the chassis, and this suction is strong enough to hold the RC car to a vertical wall. The stock RC system was modified for autonomous control, and in particular for crawling on a vertical wall (see Figure 2).
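The adhesion requirement can be checked with a back-of-envelope friction calculation: on a vertical wall the vacuum's pressure difference supplies the normal force, and friction against the wall must carry the car's weight. The sketch below illustrates the reasoning only; the pressure, pad area, mass, and friction coefficient are illustrative assumptions, not measured values from the actual RC car.

```python
G = 9.81  # gravitational acceleration, m/s^2

def holds_on_vertical_wall(dp_pa, area_m2, mass_kg, mu):
    """On a vertical wall, the vacuum's pressure difference dp over the
    suction area A supplies the normal force N = dp * A; friction mu * N
    must equal or exceed the car's weight m * g for it to hold."""
    normal_force = dp_pa * area_m2
    return mu * normal_force >= mass_kg * G

# Illustrative numbers: 5 kPa over 100 cm^2, 0.5 kg car, friction coeff 0.6
holds_on_vertical_wall(5_000, 0.01, 0.5, 0.6)  # -> True (30 N of grip vs ~4.9 N of weight)
```

The same inequality also shows why a Venturi vacuum is attractive here: modest pressure differences over the whole chassis footprint yield large holding forces.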
A camera was the sensor of choice for locating the simulated rivets in the demonstrator. Aircraft rivets are aluminum and sit on sculpted surfaces (Figure 1), so a Hall-effect sensor was inappropriate for detecting rivet faults and controlling the robot under development. Since rivets are generally circular, and circles can easily be found using image processing, a camera was chosen both to detect rivet faults and to navigate between rivets along a line of rivets. In choosing the camera, it was decided to perform all of the heavy image-processing computation off-board, on a powerful computer suited to the task; for this reason the mote had to be taken out of the feedback loop. A camera with a 2.4 GHz transmitter and receiver appeared to be the best solution, and this was mounted on the platform. All of the off-board processing and the video from the robot were thus handled by a remote computer with significantly more computing power than the onboard processor. This desktop platform ran Matlab, a mathematics software suite, and RoboRealm, a computer vision application, to handle the controls.
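The standard technique for finding circles in an image is the circle Hough transform: each detected edge pixel votes for every candidate center lying one radius away from it, and the true center accumulates the most votes. The minimal pure-Python sketch below illustrates the voting idea on a synthetic "rivet" edge; it is not the actual RoboRealm pipeline, and the point counts and radius are arbitrary.

```python
import math
from collections import Counter

def hough_circle_center(edge_points, radius, n_angles=90):
    """Circle Hough transform for a known radius: each edge point (x, y)
    votes for all integer centers at distance `radius` from it; the
    center with the most votes is returned."""
    votes = Counter()
    for (x, y) in edge_points:
        for k in range(n_angles):
            theta = 2 * math.pi * k / n_angles
            cx = round(x + radius * math.cos(theta))
            cy = round(y + radius * math.sin(theta))
            votes[(cx, cy)] += 1
    return votes.most_common(1)[0][0]

# Synthetic rivet outline: edge pixels on a circle of radius 5 at (10, 10)
pts = [(10 + round(5 * math.cos(a * math.pi / 18)),
        10 + round(5 * math.sin(a * math.pi / 18))) for a in range(36)]
center = hough_circle_center(pts, radius=5)  # recovers (10, 10) to within a pixel
```

A production system would scan a range of radii and threshold the vote count, but the single-radius version above is enough to show why circular rivets are an easy target for vision-based detection.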
Video is streamed over a dedicated 2.4 GHz channel from the robot directly to the computer, where it is analyzed by RoboRealm (Figure 3). RoboRealm processes the image information to determine the locations of circles relative to the x-y coordinate system of the camera image frame, and then feeds a data stream of directional information to Matlab. When the program is in automatic control mode, Matlab uses the RoboRealm data stream and the PID parameters entered through a custom Matlab GUI to generate the left and right PWM values for the robot. In manual mode, the user controls the robot via the same GUI. The GUI both sends data to and receives data from the robot; in particular, it queries the robot's 3-axis accelerometer values (tilt angle) to control the vacuum system.
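The control loop described above can be sketched in a few lines: the horizontal offset of the detected circle from the image center is the PID error, the controller output steers the robot by splitting a base PWM value between the left and right drives, and the accelerometer tilt gates the vacuum. This is a hedged illustration only; the image width, PWM range, gains, and the 20° tilt threshold are assumptions, not values from the actual Matlab program.

```python
import math

IMAGE_WIDTH = 320          # assumed camera frame width, pixels
PWM_CENTER = 128           # assumed mid-scale PWM value (0-255 range)

class PIDSteer:
    """Turn the circle's x-offset from image center into differential
    left/right PWM values via a textbook PID law (sketch only)."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, circle_x, dt):
        error = circle_x - IMAGE_WIDTH / 2          # pixels off center
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        u = self.kp * error + self.ki * self.integral + self.kd * derivative
        left = max(0, min(255, round(PWM_CENTER + u)))
        right = max(0, min(255, round(PWM_CENTER - u)))
        return left, right

def vacuum_needed(ax, ay, az, threshold_deg=20.0):
    """Engage the Venturi vacuum when the tilt from horizontal, estimated
    from 3-axis accelerometer readings, exceeds a threshold angle."""
    tilt = math.degrees(math.atan2(math.hypot(ax, ay), abs(az)))
    return tilt > threshold_deg
```

For example, a circle detected right of center (`circle_x > 160`) drives the left wheel harder than the right, turning the robot toward the rivet line, while `vacuum_needed(1, 0, 0)` (gravity fully in the sensor's x-y plane, i.e. a vertical wall) switches the vacuum on.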