TV remote too far (◐▂◑ )
💡Let me use REMOTEMATE to grab the remote! 😉
WOOOO 😝
This project proposes to design and build a fully remotely controllable robotic arm that mimics human arm movement, can pick up and clamp onto objects, and provides tactile feedback based on the pressure exerted on objects. The arm's movement is controlled via a wearable glove worn by the operator, and a central web application controls the arm's remaining features, including clamp pressure options, movement controls for the extra axes of freedom, and movement sensitivity options.
This project was developed as a final-year capstone by a group of students.
The following diagram shows a general block-diagram overview of the entire system, illustrating the end-to-end control of the robotic arm. Highlighted in blue is the glove control, which consists of an MPU6050, two flex sensors, and Laptop A, producing a total of four values that control four motors on the robotic arm. Laptop A serves as the transmitter for this data as well as the general control station. The wireless data transfer, highlighted in cyan, is the wireless channel: a server, a WLAN network provided by a router or mobile hotspot, and Laptop B, which receives the transmitted data. The web application is served by the server, is viewed by the operator at Laptop A, and provides WASD control for the remaining two motors along with other toggle features. The robotic arm block, highlighted in red, is the receiving control station that handles all of the robotic arm's movement. Finally, the feedback block, highlighted in green, handles the video stream sent back to the operator and emulates vision; it includes a 2D object-identification algorithm that uses a reference object.
- MPU6050
- Yaw, pitch, and roll values control three joints on the robotic arm: the shoulder, elbow2, and the wrist. Movement is smooth and fairly accurate considering the hardware used.
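As an illustration, the orientation-to-joint mapping can be sketched in plain C++ as below; the angle ranges, the linear scaling, and the 0–180° servo span are assumptions for the sketch, not the project's calibrated values.

```cpp
#include <algorithm>

// Hypothetical mapping from an MPU6050 orientation angle (degrees) to a
// servo command. Each of the shoulder, elbow2, and wrist joints would use
// its own [minDeg, maxDeg] range; the values here are illustrative only.
int toServoAngle(float sensorDeg, float minDeg, float maxDeg) {
    // Clamp the reading to the joint's usable range...
    float c = std::clamp(sensorDeg, minDeg, maxDeg);
    // ...then scale it linearly onto a 0-180 degree servo span.
    return static_cast<int>((c - minDeg) * 180.0f / (maxDeg - minDeg));
}
```

For example, a wrist mapped from roll in [-90°, 90°] would rest at the servo midpoint (90°) when the hand is level.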
- Flex Sensors
- Opening and closing of the fist controls the clamp.
- Gimbal Lock Check
- Robotic arm control successfully locks when the user's hand is out of bounds. The implementation of the x-y-z hierarchy is successful.
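A minimal sketch of the out-of-bounds lock is shown below: when the glove's orientation leaves the usable envelope, the last valid command is held instead of forwarding new ones. The pitch/roll thresholds are assumptions for illustration, not the project's actual bounds.

```cpp
// Illustrative gimbal-lock guard. Near +/-90 degrees of pitch the yaw and
// roll axes align (gimbal lock) and readings become unreliable, so the arm
// locks well before that point and holds its last accepted pose.
struct ArmLock {
    float lastYaw = 0, lastPitch = 0, lastRoll = 0;

    bool inBounds(float pitch, float roll) const {
        // Thresholds chosen for the sketch; real bounds would be tuned.
        return pitch > -75 && pitch < 75 && roll > -90 && roll < 90;
    }

    void update(float yaw, float pitch, float roll) {
        if (inBounds(pitch, roll)) {            // accept only valid poses
            lastYaw = yaw; lastPitch = pitch; lastRoll = roll;
        }                                       // else: keep last command
    }
};
```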
- Moving Average Filter
- The moving average filter applied to the sampled flex sensor values is successful: clamp control is less jittery and the user has smoother control.
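The smoothing can be sketched as a plain C++ equivalent of a movingAvg-style filter: a fixed-size ring buffer whose running mean is the filtered output. The window size of 8 is an illustrative choice, not necessarily the project's value.

```cpp
#include <cstddef>

// Ring-buffer moving average over the last N flex sensor samples.
class MovingAverage {
    static const std::size_t N = 8; // window size (assumed for the sketch)
    int buf[N] = {0};
    std::size_t idx = 0, count = 0;
    long sum = 0;
public:
    int add(int sample) {
        sum -= buf[idx];        // drop the oldest sample from the running sum
        buf[idx] = sample;      // overwrite it with the new reading
        sum += sample;
        idx = (idx + 1) % N;
        if (count < N) ++count; // window grows until it is full
        return static_cast<int>(sum / count);
    }
};
```

Keeping a running sum makes each update O(1), which matters when sampling the flex sensors in a tight loop.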
- Spike filter
- The filter function on the receiving side successfully discards incoming values whose increment is too large (beyond a threshold value). Robust control of the arm is maintained.
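A sketch of that receive-side check: a new command is discarded when it differs from the last accepted one by more than a threshold increment. The threshold of 20 units is an assumption for illustration.

```cpp
#include <cstdlib>

// Receive-side spike rejection: hold the previous command when the new
// value jumps by more than `threshold` in one step.
struct SpikeFilter {
    int last;
    int threshold;
    SpikeFilter(int initial, int thresh) : last(initial), threshold(thresh) {}

    int filter(int value) {
        if (std::abs(value - last) > threshold)
            return last;        // spike: ignore and hold the previous value
        last = value;           // plausible step: accept it
        return value;
    }
};
```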
- WASD
- WASD control successfully drives the remaining two joints on the robotic arm: the base and elbow1.
- Precision
- Precision has three settings: “Low”, “Medium”, and “High”. These settings successfully change the per-keypress increment of the WASD control for the base and elbow1.
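The interaction between the precision setting and the WASD step can be sketched as follows; the increment values and the 0–180° joint range are assumptions for illustration (one axis, the base, is shown).

```cpp
// Each precision setting maps to a per-keypress increment (assumed values).
enum class Precision { Low, Medium, High };

int stepSize(Precision p) {
    switch (p) {
        case Precision::Low:    return 10;  // coarse movement
        case Precision::Medium: return 5;
        case Precision::High:   return 1;   // fine movement
    }
    return 5;
}

// 'a'/'d' rotate the base ('w'/'s' would drive elbow1 the same way).
int applyKey(int baseAngle, char key, Precision p) {
    if (key == 'a') baseAngle -= stepSize(p);
    if (key == 'd') baseAngle += stepSize(p);
    if (baseAngle < 0) baseAngle = 0;       // stay within the servo range
    if (baseAngle > 180) baseAngle = 180;
    return baseAngle;
}
```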
- Clamp Feedback Toggle
- The feedback toggle is successfully sent to the robotic arm. When enabled, the clamp takes the FSR sensor force values into consideration and compares them to the user input coming from the flex sensors. The clamp successfully stops applying further motor torque/pressure once the threshold is surpassed.
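The comparison described above can be sketched as a single decision function; the FSR threshold and the angle semantics (larger angle = tighter grip) are assumptions for the sketch.

```cpp
// Clamp feedback: forward the flex-sensor command only while the measured
// FSR force stays under the threshold; beyond it, hold position so no
// further torque is applied. Opening the clamp is always allowed.
int clampCommand(int requestedAngle, int currentAngle,
                 int fsrReading, int fsrThreshold, bool feedbackOn) {
    bool closing = requestedAngle > currentAngle;
    if (feedbackOn && closing && fsrReading >= fsrThreshold)
        return currentAngle;    // object gripped firmly enough: stop here
    return requestedAngle;      // otherwise follow the user's flex input
}
```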
- The web application can successfully retrieve the video stream hosted by the camera on the robotic arm's receiving side, displayed through a hosted YouTube stream.
- Incoming and outgoing data are successfully handled through a WebSocket connection. Latency depends on signal strength and possible interference or network congestion.
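The exact wire format used over the WebSocket is not documented here; as an assumed illustration, the four glove values could travel as a comma-separated text message that the receiving side parses before writing the values out over serial.

```cpp
#include <sstream>
#include <string>
#include <vector>

// Parse a hypothetical "yaw,pitch,roll,flex" text message into integers.
// The comma-separated format is an assumption for this sketch.
std::vector<int> parseMessage(const std::string& msg) {
    std::vector<int> values;
    std::stringstream ss(msg);
    std::string field;
    while (std::getline(ss, field, ','))    // split the message on commas
        values.push_back(std::stoi(field)); // convert each field to an int
    return values;
}
```

A plain text format keeps the sender and receiver trivially interoperable across the Node.js server and the firmware side.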
The following section discusses the performance of the integrated prototype as a whole. The prototype is fully functional and all promised features have been delivered. It contains two major components: the input/output control of the robotic arm and the web application.
The final prototypes for the glove and robotic arm modules are shown below. The glove serves as the input control unit and the robotic arm serves as the output control unit.
The final web application design along with its implemented features are shown below.
Things to improve:
- Control the robotic arm through a Raspberry Pi instead of two laptops
- Make it automated and more aesthetically pleasing
- Admin, Site. “HCPCA9685 - Library for PCA9685 16ch 12bit PWM Controller.” Hobby Components, 13 June 2016, forum.hobbycomponents.com/viewtopic.php?t=2034.
- Christensen, Jack. “MovingAVG.” GitHub, Mar. 2012, github.com/JChristensen/movingAvg.
- Einaros, et al. “Node.JS Websocket Library.” NPM, www.npmjs.com/package/ws.
- Reconbot, and HipsterBrown. “Serialport.” NPM, www.npmjs.com/package/serialport.