
5 DOF Robotic Arm
ECE 478/578 Embedded Robotics
Fall 2009
Portland State University
Matt Blackmore, Jacob Furniss, Shaun Ochsner

Project Description

This project involves designing and building a mobile robot capable of human-like behaviors. The means by which these behaviors are achieved are contained in three aspects of the robot's design. The first is a mobile base that allows the robot to navigate through an indoor space. The second is a head and neck capable of producing several human-like gestures coordinated with speech. The third is an arm and hand assembly used to supplement the robot's gestures and allow it to grasp and move objects. Each of these design aspects is integrated to complete the first phase of the project and produce a platform for further research. Subsequent phases will involve using existing artificial intelligence, vision, speech, and dialog software to expand the capabilities of the robot. Also, the robot will be used to research a new type of artificial neural network at Ohio University.

This paper details the design and development of the arm and hand assembly within the first phase. The arm and hand, henceforth referred to as the arm, are designed to meet the following requirements. First, it must have the ability to grasp an object and place it in a different location. Second, it must be similar in scale to a human arm and be able to replicate similar motions. The final design should be made with standard components, such that it can be easily reproduced and mirrored to create left and right versions. Finally, the arm should be easily mounted to the mobile base.

The scope of the first phase of the arm's development is defined by the following limitations. The arm is built with as many unmodified components as possible. In addition to reducing the amount of time necessary for fabrication, this allows the final design to be easily reproduced. The ability of the arm to move an object is primarily symbolic. With this in consideration, an object with negligible mass was chosen. This reduces the torque requirements for the arm's actuators and allows the overall design to be built to a larger scale. To simplify the software design, each joint on the arm is manually controlled. Although future phases of the project will likely require more sophisticated motion control, this simplified approach allows the robot's actuators and ranges of motion to be controlled and tested in a straightforward way. As a first step toward higher levels of motion control, the forward kinematic equations for position are included in the first phase.

With these limitations in mind, the details of the project are broken down as follows. The mechanical design of the arm deals with its physical construction and the range of motion of each joint. System modeling relates the position of the hand to the angles of each joint in the arm (kinematics). Software design includes the programming methods for commanding actuators to move a joint to a specified position, in addition to a description of the programming environment. The electrical components encompass the hardware necessary to control and power the system: the actuators and the devices that send command signals to them. The first step is to complete the mechanical design, the details of which are described below.

Mechanical Design

This project involves using an existing head and neck and modifying an existing mobile base. The arm, however, is designed and built from scratch. For this reason, the majority of work on the arm in the first phase revolves around its mechanical design and construction.

The first step in the mechanical design of the arm is to define its degrees of freedom. A degree of freedom, or DOF, is an independent displacement associated with a particular joint. Joints can be either prismatic or revolute, or both. Prismatic joints are capable of linear motion while revolute joints are capable of rotation. In this case each of the arm's joints is revolute, and thus each degree of freedom is a rotation. Each of these DOFs is controlled by an actuator.

The human arm is considered to have seven degrees of freedom: three rotations at the shoulder, one at the elbow, and three rotations at the wrist. The actuators that control the shoulder and, to a lesser degree, the elbow have to carry the load of the entire arm, hand, and payload. These actuators must be capable of producing substantially greater torque than the actuators at the other joints. To reduce the number of high-torque actuators required, the shoulder is designed with only two DOFs. Although the wrist does not have to carry a high load like the shoulder, space at this point on the arm is limited. For this reason, the wrist is also given only two DOFs. This leaves a total of five degrees of freedom for the arm instead of seven.

The human hand has twenty-seven degrees of freedom, most of which are associated with the fingers. To grasp a simple object, the motions of the fingers are not needed. This assumption allows the hand to be designed with one degree of freedom, greatly simplifying the design. A simple representation of the arm is shown in Figure 1 below. The red arrows represent the axes that each DOF rotates about. Although the hand is shown, its DOF is not labeled.

Figure 1: Robot arm's degrees of freedom.

As mentioned above, it is important that the final robot design be easy to reproduce and mirror. This is facilitated by using TETRIX components whenever possible. TETRIX is a component system originally designed for use in high school robotics competitions. The system consists of a variety of prefabricated aluminum components that are designed to be easily modified and connected to one another. Also included are high-torque DC gear motors, servos, and motor drivers. These components are compatible with the LEGO Mindstorms system. The LEGO system not only includes components for building robots, but also a series of plug-and-play sensors and peripherals in addition to a controller and programming environment. Together these systems allow a designer to quickly build robot prototypes with little or no fabrication. The details of the LEGO controller, programming environment, and electronic components are described in later sections. Figure 2 shows the basic TETRIX robotics kit.

Figure 2: TETRIX robotics kit.

Although the use of TETRIX components reduces the effort and time required to design and build the system, not all of the components were initially available. Thus, the arm needed to be designed before the components were acquired. The SolidWorks CAD tool was used to accomplish this. SolidWorks is a modeling and simulation environment capable of representing three-dimensional shapes in space in addition to material properties. An online CAD library was used to acquire models of most of the TETRIX and LEGO components. These individual models are combined in an assembly that defines the spatial and kinematic relationships between them. The resulting virtual assembly is used to evaluate the moments of inertia, mass, volume, physical dimensions, etc. at a component or system level. Also, this assembly is used to simulate motion between the components. This allows the designer to check for collisions between parts, analyze the range of motion of the entire system, and visualize its performance before anything is physically made. Approximately eight design iterations were investigated with this tool before parts were ordered, and very little was changed from the CAD model once the arm was actually built. Figure 3 shows the final model of the arm without any custom-fabricated parts.

Figure 3: Solid model of robot arm and hand assembly.

This model does not include some of the hardware necessary to complete the assembly, in addition to the hand. Figure 4 shows the complete TETRIX assembly and hand.

Figure 4: Final TETRIX robot arm and hand assembly.

In addition to the stock TETRIX components, welding rod was used to fashion the fingers of the hand. The plate that the fingers attach to was also fabricated. A list of the modified TETRIX components is in the appendix.

Electronic Components

The electronic components used to operate the arm consist of two DC motors, four servo motors, one motor controller, one servo controller, and the NXT brick. The motors were used on the elbow and shoulder joints to provide more torque and stability, while servo motors were used to control the hand, wrist, and arm rotation. All of these components are part of the LEGO TETRIX robotics line. Using the TETRIX parts along with the NXT brick allowed for less time spent integrating and developing drivers, because when programmed with RobotC the drivers and control functions are already integrated into the system, allowing for more of a plug-and-play environment. This saved time in developing code for controlling the arm.

The main control of our arm is done by the NXT brick. This control unit is run by a 32-bit ARM7 microprocessor and an 8-bit AVR microcontroller. It has four six-wire input ports and three six-wire output ports. It also contains a USB port for programming and debugging. It is mainly programmed using the NXT graphical interface language (NXT-G), LabVIEW, or RobotC. We chose to use RobotC, a subset of the C programming language, since that is what our group was most familiar with. This will be discussed further later in the report. The RobotC interface allowed us to download and run programs on the NXT unit; once downloaded, a program can be run directly from the NXT without it being hooked to a computer. Because we were using TETRIX products to interface with the NXT, we ran all of our components from sensor port 1 of the NXT. The NXT allows up to four controllers to be daisy-chained to each sensor port. These controllers can be a combination of servo controllers and motor controllers, which are discussed later. Any sensor used for additional arm control will also be plugged into the NXT.

The motors we used were TETRIX DC motors available from LEGO Robotics. The motors run at 152 rpm at full power, provide 300 oz-in of torque, and require 12 V to operate. Within the software the speed can be controlled by setting a percentage of the motor speed to lower the RPM of the shaft. This gives the motors more versatility in projects where more torque than a servo can provide is needed, but the slower speed of a servo is still desired. This was useful in our application: a servo motor would not have been able to hold up the weight of our robotic arm, but we still needed slower movement for a more realistic appearance and more control for the user. The disadvantage of using motors in this situation is that they are heavy and more difficult to mount than a servo would be. We installed encoders for position control, but we did not use them for this part of the project. The operation of the encoders is discussed later in the report.

The motors are powered and controlled using a HiTechnic DC motor controller. This motor controller interfaces the motors with the NXT brick as well as providing power to the motors themselves. Each motor controller can operate two 12 V TETRIX motors and can interface with motor encoders, which are discussed below. It is this motor controller that allows the motor speed to be adjusted, by changing the power level supplied to the motor using an internal PID algorithm.

Encoders are installed on the two motors used on the robot. These encoders are made by US Digital. They are used to allow position control of the motors so they can perform similarly to servos. The encoders used are optical quadrature encoders, which use two output channels (A and B) to sense position. Using two code tracks with sectors positioned 90 degrees out of phase, the two output channels indicate both position and direction of rotation. If A leads B, for example, the disk is rotating in a clockwise direction. If B leads A, then the disk is rotating in a counter-clockwise direction. The encoder also allows the system to use PID control to adjust the speed of the shaft.
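Although the encoders were not used for motion control in this phase, RobotC exposes the accumulated count through the nMotorEncoder array. The following is a minimal sketch, not code from the project, of how an encoder count could be used to move a motor to a target position, servo-style; the power level, target count, and the assumed resolution of 1440 counts per output-shaft revolution are illustrative values that would need to be checked against the encoder data sheet.

    #pragma config(Hubs,  S1, HTMotor, HTMotor, HTServo, none)
    #pragma config(Motor, mtr_S1_C1_1, motorD, tmotorNormal, openLoop, encoder)

    task main()
    {
      nMotorEncoder[motorD] = 0;            // Reset the count at the current position.
      motor[motorD] = 30;                   // Low power for a controlled move.
      while (nMotorEncoder[motorD] < 360)   // 360 counts = 90 degrees, if 1440 counts/rev.
      {
        // Busy-wait until the target count is reached.
      }
      motor[motorD] = 0;                    // Stop at the target position.
    }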

The servo motors used were three HS-475HB servos and one HS-755HB, all made by Hitec. Both servo types are three-pole with Karbonite gears and can be run at 4.8 V or 6 V. The 475HB provides about 80 oz-in of torque and the 755HB provides 183 oz-in. The 755HB is a larger servo than is normally used with the TETRIX system, but the servo wire is the same for both servo types, so they can both be used with the servo controller. The downside of this servo type not being available for the TETRIX system is that no mounting hardware is available, so a mount had to be fabricated to attach the servo to the TETRIX stock parts. The servos have a position range of 0 to 255, so they give excellent position control. The motors inside the servos only hold position when powered, so when power is removed any weight-bearing servo releases. The wrist on the robot is an example of this: while the program is running the wrist servo supports the hand, but as soon as power is removed or the program is ended the hand falls to one of the servo extremes.

Like the motors, in order to interact with the NXT device the servos must attach to a HiTechnic servo controller. The servo controller requires a 12 V supply, which it divides down to 6 V to operate the individual servos. Each servo controller can hold up to six servos, and like the motor controllers they can be chained together to allow the use of more servos than one controller could handle.

Software

The programming of the arm movement was done using the RobotC language. RobotC was developed by Carnegie Mellon University and is a subset of the C programming language. It uses the same syntax as C but does not have access to the same libraries, so the command availability is somewhat limited. There are specific libraries for some aftermarket parts, and libraries can be made to incorporate new parts for use with the NXT. The PSP-Nx-lib.c library was used in order to use a PS2 controller to operate the arm.

The software to control the arm can be broken up into three sections: controlling the DC motors, controlling the servos, and integrating the PS2 controller. The code for each was tested prior to being compiled into the final program. We will start with describing how the DC motors are programmed, followed by the servos, the controller integration, and finally how the finished control program works.

The software allows the DC motors to be turned on and off and their power level set, as well as allowing encoders to be used. In order to use the motors, configuration code must be entered at the top of the program. The line "#pragma config(Hubs, S1, HTMotor, HTMotor, HTServo, none)" sets sensor port 1 (S1) and configures it to have two motor controllers and one servo controller chained together. After that, each hub must be set.

To configure a motor, you use the line "#pragma config(Motor, mtr_S1_C1_1, motorD, tmotorNormal, openLoop, reversed, encoder)". This line sets the first controller (C1) on sensor port 1 (S1) as a DC motor plugged into the motor 1 slot. The parameter motorD sets the name of the motor to be used in the program (motorA, motorB, and motorC are reserved for NXT motors), and tmotorNormal sets the motor in normal mode. The motor can be set to openLoop, or to PID to use the internal PID controller; PID mode can only be used if an encoder is attached to the motor and activated. The motor can also be switched between forward and reversed modes in this line. Once these lines are entered, the motor control commands become available. The following code is a sample motor program:

    #pragma config(Hubs,  S1, HTMotor, HTServo, none, none)
    #pragma config(Motor, mtr_S1_C1_1, motorD, tmotorNormal, openLoop)

    task main()
    {
      motor[motorD] = 75;    // Motor D is run at a 75 power level.
      wait1Msec(4000);       // The program waits 4000 milliseconds.
      motor[motorD] = -75;   // Motor D is run at a -75 power level (reversed).
      wait1Msec(750);        // The program waits 750 milliseconds.
    }

The code runs the motor forward for 4 seconds and backwards for 0.75 seconds.

Servos are programmed in a similar way. The hub must be configured for a servo controller in one of the spots. The line "#pragma config(Servo, srvo_S1_C3_1, , tServoNormal)" sets the third controller (C3) on sensor port 1 (S1) as a servo plugged into the servo 1 slot. Unlike the motors, tServoNormal is the only mode that needs to be entered, but an empty placeholder spot still has to be left. The following code is a sample servo program:

    #pragma config(Hubs,  S1, HTServo, HTServo, none, none)
    #pragma config(Servo, srvo_S1_C1_1, , tServoNormal)

    task main()
    {
      while (true)
      {
        if (ServoValue[servo1] < 128)        // If servo1 is closer to 0 (than 255):
        {
          while (ServoValue[servo1] < 255)   // While the ServoValue of servo1 is less than 255:
          {
            servo[servo1] = 255;             // Move servo1 to position 255.
          }
        }
        wait1Msec(1000);                     // Wait 1 second.
        if (ServoValue[servo1] > 128)        // If servo1 is closer to 255 (than 0):
        {
          while (ServoValue[servo1] > 0)     // While the ServoValue of servo1 is greater than 0:
          {
            servo[servo1] = 0;               // Move servo1 to position 0.
          }
        }
        wait1Msec(1000);                     // Wait 1 second.
      }
    }

This program reads the servo's current value and sweeps it back and forth between the two end stops, pausing for a second at each extreme.

The controller we used required the add-on library "PSP-Nx-lib.c" to make the buttons respond properly. A wireless PSP controller was used to control the robot, with one button controlling each degree of freedom. The names and layout of the controller buttons are as follows:

    L1       R1
    L2       R2
    d        triangle
    a        square
    c        circle
    b        cross
    l_j_b    r_j_b
    l_j_x    r_j_x
    l_j_y    r_j_y

The line "PSP_ReadButtonState(SensorPort, Addr, currState)" checks to see if any of the buttons have been pressed, using a Boolean state: 0 for pressed, 1 for not pressed. The joysticks return 0 at center and have a range from -100 to 100.

Combining the above, we were able to create a program to run all of the components of the arm. Motor 1 controls the shoulder, motor 2 controls the elbow, servo 1 controls the wrist up and down, servo 2 controls the wrist left and right, servo 3 opens and closes the hand, and servo 4 moves the entire arm left and right. The pseudo code for the control program is as follows (a sketch of how it maps onto the library calls appears after the list):

If triangle is pressed, move shoulder up
If square is pressed, move shoulder down
If circle is pressed, move elbow up
If x is pressed, move elbow down
If joystick2 is pushed up, move wrist up
If joystick2 is pushed down, move wrist down
If joystick2 is pushed left, move wrist left
If joystick2 is pushed right, move wrist right
If R1 is pushed, close hand
If L1 is pushed, open hand
If R2 is pushed, move arm right
If L2 is pushed, move arm left
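The sketch below shows how one pair of these pseudo code branches could look in RobotC. The PSP_ReadButtonState call and port setup follow the appendix code; the state struct's type name (psp) and button field names are assumptions based on the button list above and should be checked against PSP-Nx-lib.c, and the power levels are arbitrary.

    #pragma config(Hubs,  S1, HTMotor, HTMotor, HTServo, none)
    #pragma config(Motor, mtr_S1_C1_1, motorD, tmotorNormal, openLoop)

    const ubyte Addr = 0x02;
    const tSensors SensorPort = S2;        // PSP-Nx receiver port, as in the appendix code.
    #include "PSP-Nx-lib.c"

    psp currState;                         // Assumed state struct from PSP-Nx-lib.c.

    task main()
    {
      while (true)
      {
        PSP_ReadButtonState(SensorPort, Addr, currState);
        if (currState.triangle == 0)       // 0 means pressed.
          motor[motorD] = 25;              // Shoulder up.
        else if (currState.square == 0)
          motor[motorD] = -25;             // Shoulder down.
        else
          motor[motorD] = 0;               // Stop when neither button is held.
        wait1Msec(50);                     // Brief delay between polls.
      }
    }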

System Modeling

As explained previously, this phase of the project is limited to manually controlling each degree of freedom. The operator moves each joint to a new angle, and this places the arm in a new configuration. Each configuration moves the hand to a specific location and orientation. The equations that relate the arm's configuration to the hand's location and orientation are called the forward kinematic equations for position. What is more useful, however, is the ability to determine the arm configuration that will achieve a desired hand location and orientation. In other words, the position and orientation of the hand must be defined in terms of the joint angles. This is called inverse kinematics. The forward kinematic equations for the arm are developed below, followed by some possible solution techniques for the inverse kinematic problem. Developing these equations is the first step to implementing a more sophisticated method of motion control. Although this development is not an exhaustive description of the mathematics involved, it highlights the basic concepts. References are given in the appendix.

Before developing the forward kinematic equations it is necessary to describe how a frame in space can be represented by a matrix, and how a transformation matrix can map a frame with a particular position and orientation to another. The following 4x4 matrix represents a frame in Cartesian space:

\[
F = \begin{bmatrix}
n_x & o_x & a_x & P_x \\
n_y & o_y & a_y & P_y \\
n_z & o_z & a_z & P_z \\
0 & 0 & 0 & 1
\end{bmatrix}
\]

Here, the P elements are the components of a position vector that defines the location of the frame relative to a fixed frame. The n, o, and a elements are the components of unit vectors that define the x, y, and z axes of the frame, respectively. These vectors determine the frame's orientation relative to the fixed frame. The bottom row is necessary to keep the matrix square.

A transformation matrix, in this context, defines the necessary translations and rotations to move from one such reference frame to another. These transformations can be combined for a series of reference frames such that the resulting relationship defines the last frame relative to the first. In the case of the robot arm, the first frame is the fixed origin and the last is the hand. This is done by simply post-multiplying each transformation matrix with the next. For example, if $T_{12}$ represents the transformation between frames 1 and 2 and $T_{23}$ represents the transformation between frames 2 and 3, the total transformation between 1 and 3 can be calculated as follows:

\[ T_{13} = T_{12} \, T_{23} \]

Using this methodology, a reference frame can be assigned to each joint on the robot arm. Through successive transformations between each frame, the total transformation can be determined, starting at the fixed base of the arm and ending at the hand. This will define the absolute position and orientation of the hand and be the basis for the forward kinematic equations.
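To make the composition rule concrete, consider a toy example (the numbers are chosen for illustration and are not taken from the robot). Let frame 2 be rotated 90° about the fixed z axis and located at (2, 1, 0), and let frame 3 be a pure translation of 3 units along frame 2's x axis:

\[
T_{12} = \begin{bmatrix} 0 & -1 & 0 & 2 \\ 1 & 0 & 0 & 1 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}, \quad
T_{23} = \begin{bmatrix} 1 & 0 & 0 & 3 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}, \quad
T_{13} = T_{12}\,T_{23} = \begin{bmatrix} 0 & -1 & 0 & 2 \\ 1 & 0 & 0 & 4 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}
\]

Because frame 2's x axis points along the fixed y axis, the 3-unit translation carries the final frame to (2, 4, 0) while the orientation is unchanged.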

The Denavit-Hartenberg representation specifies a systematic method for assigning these reference frames such that the form of the transformation matrix between successive frames is always the same. The details of this method are not described here, but the assignment of each frame according to this convention is shown in Figure 5. It is important to note that, although this robot has only revolute joints, the Denavit-Hartenberg method works for prismatic joints or a combination of the two. It will not, however, model robots with motions in the y-direction.

Figure 5: Reference frames based on the Denavit-Hartenberg representation.

Using the schematic above, the so-called DH parameters are determined. These are shown in Table 1.

Table 1: DH parameters for robot arm.

Indices for each degree of freedom are listed on the left. The values of each DOF are represented by the θ values, which are unknown. The joint offset is represented by d; this is zero in all cases for this robot because each joint lies in the same plane. The lengths of each link, in meters, are listed in the column labeled a. The last column lists the angles between the x axes of successive frames.

These parameters are used to define the transformation matrix between successive frames. The general form of this matrix is shown below:

\[
A_i = \begin{bmatrix}
\cos\theta_i & -\sin\theta_i \cos\alpha_i & \sin\theta_i \sin\alpha_i & a_i \cos\theta_i \\
\sin\theta_i & \cos\theta_i \cos\alpha_i & -\cos\theta_i \sin\alpha_i & a_i \sin\theta_i \\
0 & \sin\alpha_i & \cos\alpha_i & d_i \\
0 & 0 & 0 & 1
\end{bmatrix}
\]
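As a hypothetical example of how a table row generates its A matrix (the link length used here is illustrative, not one of the arm's actual values), a row with joint angle $\theta_1$, $d = 0$, $a = 0.1$ m, and $\alpha = 90°$ gives, since $\cos 90° = 0$ and $\sin 90° = 1$:

\[
A_1 = \begin{bmatrix}
\cos\theta_1 & 0 & \sin\theta_1 & 0.1\cos\theta_1 \\
\sin\theta_1 & 0 & -\cos\theta_1 & 0.1\sin\theta_1 \\
0 & 1 & 0 & 0 \\
0 & 0 & 0 & 1
\end{bmatrix}
\]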

Using this matrix, the following relationship defines the forward kinematic equation, where each A matrix is written in terms of the corresponding parameters from the table above:

\[ T_H = A_1 A_2 A_3 A_4 A_5 \]

The individual equations for each element in terms of the joint angles are given in the appendix, in addition to MATLAB code that can be used to compute the result for a given set of angles.

As can be seen from the resulting equations in the appendix, the inverse kinematic solution will be difficult to achieve. Each equation involves multiple coupled angles, which makes the problem difficult to solve analytically. A closed-form solution for a simple five-DOF robot such as this does exist, but in general the solution must be found numerically.

An attempt was made to use an artificial neural network to map the desired location and orientation to the corresponding joint angles. This was implemented using MATLAB's Neural Network Toolbox. A two-layer, feed-forward network with 20 neurons was trained using the Levenberg-Marquardt method. This was done with a built-in GUI tool. The results of this experiment were not accurate. Without a better understanding of neural network theory, these results cannot be further interpreted. A link to the MATLAB code is listed in the appendix.
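To give a concrete flavor of the numerical route mentioned above, the following standard C sketch applies the Jacobian-transpose method to a simplified two-link planar arm rather than the full five-DOF chain. The link lengths, target point, gain, and iteration limit are illustrative assumptions only, not values from this robot.

    /* Minimal numerical inverse kinematics sketch: Jacobian-transpose
     * method on a two-link planar arm. All constants are assumed values. */
    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        const double L1 = 0.30, L2 = 0.25;   /* assumed link lengths (m) */
        const double tx = 0.35, ty = 0.20;   /* assumed target hand position (m) */
        double t1 = 0.1, t2 = 0.1;           /* joint angles, initial guess (rad) */
        const double gain = 0.5;             /* step size for the update */

        for (int i = 0; i < 1000; i++) {
            /* forward kinematics of the planar arm */
            double x = L1 * cos(t1) + L2 * cos(t1 + t2);
            double y = L1 * sin(t1) + L2 * sin(t1 + t2);
            double ex = tx - x, ey = ty - y; /* position error */
            if (ex * ex + ey * ey < 1e-10)
                break;

            /* delta-theta = gain * J^T * error, J = 2x2 position Jacobian */
            double j11 = -L1 * sin(t1) - L2 * sin(t1 + t2);
            double j12 = -L2 * sin(t1 + t2);
            double j21 =  L1 * cos(t1) + L2 * cos(t1 + t2);
            double j22 =  L2 * cos(t1 + t2);
            t1 += gain * (j11 * ex + j21 * ey);
            t2 += gain * (j12 * ex + j22 * ey);
        }
        printf("theta1 = %.3f rad, theta2 = %.3f rad\n", t1, t2);
        return 0;
    }

The same update generalizes to the five-DOF arm by stacking the partial derivatives of the full forward kinematic equations into a larger Jacobian.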

Performance

Structural

The mechanical arm is built almost entirely from prefabricated TETRIX aluminum components, two DC gear motors, and several small-scale servos, and is built to full-scale human arm size. Because of this, it takes a minimal amount of torque to cause vibration in, or possibly warp, the base components. This means that the mechanical arm cannot carry large amounts of weight; it is estimated that it can pick up slightly less than three pounds at full extension. However, the design is robust and allows a large range of movement without detrimental effects on the structure, thus providing the possibility for very human-like interaction with this arm.

Position Control

Currently there are encoders attached to the two DC motors which control the shoulder and elbow vertical movements; however, they are not used. The encoders cause difficulty with the motors because the motors resist instantaneous position correction and lock up. Currently all position control is manual and user-operated through an RF wireless joystick controller.

Object Grasping

The hand attached to the mechanical arm is designed to mirror a human hand. Currently it has only one DOF: its ability to open and close by moving the thumb. This, however, is sufficient for grasping and picking up objects. The movements are relatively slow so that they are somewhat more realistic. Additionally, if the servo speed is increased, accuracy is lost.

Future Work

Sensors

Future work that can be performed on this mechanical arm includes adding sensors. These could include a sonar range finder, stereo (two-camera) vision, or even an experimental smell detector. This would allow automation of the entire robot.

Specify End-Effector Position

Additionally, and more specifically for the mechanical arm, future work could involve solving the inverse kinematic equations for all degrees of freedom in the arm. This would allow the user, or an automated intelligent program utilizing sensors, to specify a position and orientation for the hand. All the angles of rotation for each motor and servo would be automatically calculated, and the arm would move to that position.

Trajectory Planning

With the addition of sensors, the arm and hand could potentially utilize trajectory planning. This would entail sensing an object coming toward the robot, calculating its speed and trajectory, and moving the arm and hand to a position along that trajectory to potentially deflect or catch the incoming object. The movement of the arm and hand would have to be sped up, and position control accuracy would have to be increased, for this to be possible.

Genetic Search Algorithm

As long as memory in the NXT brick allows, genetic algorithms could be implemented to allow for room mapping and searching. This would allow the robot, and more specifically the arm, to know the position of objects in the room and potentially interact with them.

Image Processing

Image processing would be an essential upgrade to accompany vision sensors, so that the incoming data could be interpreted properly. Intelligent processing would allow more accurate readings and would provide optimized responses.

Conclusion

Overall, the goals of the first phase of the robot arm have been met. The final assembly is made almost entirely of stock TETRIX components. The parts that were modified were done in such a way that they would be easy to reproduce. Although the method of motion control is limited in its current state, it serves as a strong foundation on which to test the performance and interface of the electronic components. The forward kinematic equations have been developed, and the process has been well documented for future research with this robot.

Appendix

References:

Niku, S.B. (2001). Introduction to Robotics: Analysis, Systems, Applications. Upper Saddle River, New Jersey: Prentice Hall.

Braunl, T. (2008). Embedded Robotics: Mobile Robot Design and Application with Embedded Systems. Berlin Heidelberg: Springer-Verlag.

Hartenberg, R. and Denavit, J. (1964). Kinematic Synthesis of Linkages. New York: McGraw-Hill.

Useful Links and Downloads:

TETRIX solid model library (SolidWorks 2009):
http://web.cecs.pdx.edu/~furnissj/tetrix_models.zip

Source for TETRIX solid models: ?arg tetrix

Solid model of robot (SolidWorks 2009):
http://web.cecs.pdx.edu/~furnissj/final_assembly.zip

Optical encoder manufacturer's website: ental/rotary/kit/e4p/

Optical encoder data sheet: mental/rotary/kit/e4p/

TETRIX gear motor data sheet:
http://web.cecs.pdx.edu/~furnissj/Tetrix%20DC%20drive%20motor.pdf

C code:
http://web.cecs.pdx.edu/~furnissj/armjoystick.c

Photos:
/72157622850565385/

Video of final demo:
http://web.cecs.pdx.edu/~furnissj/robot_arm.wmv

MATLAB code for neural network inverse kinematic solution:
http://web.cecs.pdx.edu/~furnissj/neural_network_MATLAB.zip

RobotC code:

#pragma config(Hubs,  S1, HTMotor, HTServo, none, none)
#pragma config(Motor, mtr_S1_C1_1, motorD, tmotorNormal, openLoop)
#pragma config(Motor, mtr_S1_C1_2, motorE, tmotorNormal, openLoop)
#pragma config(Servo, srvo_S1_C1_1, , tServoNormal)
//*!!Code automatically generated by 'ROBOTC' configuration wizard!!*//

/************************************************************************/
/* Program Name: PSP-Nx-motor-control.c                                 */
/* Copyright (c) 2008 by mindsensors.com                                */
/* Email: info ( at ) mindsensors ( dot ) com                           */
/*                                                                      */
/* This program is free software. You can redistribute it and/or modify */
/* it under the terms of the GNU General Public License as published by */
/* the Free Software Foundation; version 3 of the License.              */
/* Read the license at:                                                 */
/************************************************************************/

const ubyte Addr = 0x02;
const tSensors SensorPort = S2;   // Connect PSP-Nx sensor to this port!!

#include "PSP-Nx-lib.c"

int nL
[The remainder of the listing is truncated in the original; the full program is available from the armjoystick.c link above.]
