Robotics Graduate for Hire
PROJECTS
I've been part of many projects involving the design of robotic control systems, such as autonomous navigation and vision-based manipulation. My latest projects are listed below.
Latest Projects

AMAZON PICKING CHALLENGE
We are a team of seven competing in the Amazon Picking Challenge at ICRA 2015. The objective of the event is automated picking, which involves object and pose recognition, compliant manipulation, task planning, and error detection and recovery. The task is to autonomously detect an item in a complex scene, pick it, and place it in a delivery bin.
Our team implemented the MOPED (Multiple Object Pose Estimation and Detection) framework to detect objects and estimate their 6-degrees-of-freedom (x, y, z, roll, pitch, yaw) poses in the real world. We generated 3D models of new objects using BundlerPy and detected the modeled objects, with their poses, in a scene. Each pose is sent to the Baxter controller to position the end effector of the redundant 7-DoF arm at those Cartesian coordinates. Given the pose, MoveIt! calculates the inverse kinematics and sends joint commands to actuate the arm; forward kinematics is computed for error detection and recovery. The main issue of removing the bins from the region of interest was resolved with the segmentation module of the Point Cloud Library (PCL), which splits the point cloud into distinct clusters so that each object can be processed independently. The project is well on course, and we are now running tests on different objects to find the best grasp position for each. Hopefully we will rock the stage at ICRA 2015!
This video shows the Baxter robot detecting, localizing, and grasping an object from a cubby for the Amazon Picking Challenge.
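The 6-DoF pose handed to the Baxter controller can be pictured as a 4x4 homogeneous transform. Below is a minimal, self-contained sketch of that conversion; the yaw-pitch-roll (Z-Y-X) rotation order is an illustrative assumption, not necessarily the convention MOPED uses:

```python
import math

def pose_to_matrix(x, y, z, roll, pitch, yaw):
    """Convert a 6-DoF pose (x, y, z, roll, pitch, yaw) into a 4x4
    homogeneous transform, using the Z-Y-X (yaw-pitch-roll) convention."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    # Rotation block is R = Rz(yaw) @ Ry(pitch) @ Rx(roll);
    # the last column carries the translation.
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr, x],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr, y],
        [-sp,     cp * sr,                cp * cr,                z],
        [0.0,     0.0,                    0.0,                    1.0],
    ]
```

In practice the controller consumes the pose directly, but the matrix form is what makes chaining camera-to-robot frame transforms straightforward.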

KINODYNAMIC RRT FOR NEEDLE STEERING ROBOT
The ability to steer the path of an asymmetric-tip flexible needle inside soft tissue offers great clinical benefits. Underactuated control of an asymmetric-tip needle steering robot is a complex task because the kinematics of the bevel-tip needle must be modeled as a non-holonomic system. As a team of four, we developed a motion planning framework using kinodynamic RRT for optimal trajectory generation for a steerable needle robot. The needle has two degrees of freedom: insertion and rotation. The intensive motion planning required to steer the needle is first simulated in a platform such as MATLAB: the planner takes a desired 3D target position as input and outputs the needle trajectory, which the simulator then follows to maneuver the needle. The model of the needle is created in CAD software such as SolidWorks, and a stereo-vision camera system monitors the three-dimensional real-time position of the needle.
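To give a flavour of the planner, here is a heavily simplified planar sketch of a kinodynamic RRT for a bevel-tip needle. This is a 2D toy, not our MATLAB implementation: the tip follows constant-curvature arcs, bevel rotation is collapsed to flipping the arc direction, and the curvature, workspace bounds, and goal bias are illustrative values:

```python
import math
import random

def step(state, direction, s, k=0.2):
    """Propagate the needle tip along a constant-curvature arc of length s;
    the bevel direction (+1 / -1) sets the sign of the curvature k."""
    x, y, th = state
    kd = direction * k
    th2 = th + kd * s
    x2 = x + (math.sin(th2) - math.sin(th)) / kd
    y2 = y - (math.cos(th2) - math.cos(th)) / kd
    return (x2, y2, th2)

def kinodynamic_rrt(start, goal, iters=2000, tol=0.5, seed=0):
    """Grow a tree by sampling workspace points, extending the nearest node
    with a random feasible control, and stopping near the goal."""
    rng = random.Random(seed)
    nodes = [start]
    parents = {0: None}
    for _ in range(iters):
        # Sample a workspace point, biased toward the goal 10% of the time.
        if rng.random() < 0.1:
            sx, sy = goal
        else:
            sx, sy = rng.uniform(-1, 8), rng.uniform(-4, 4)
        # Nearest tree node by Euclidean distance of the tip position.
        i = min(range(len(nodes)),
                key=lambda j: (nodes[j][0] - sx) ** 2 + (nodes[j][1] - sy) ** 2)
        # Apply a random control: bevel direction plus insertion length.
        child = step(nodes[i], rng.choice((-1, 1)), rng.uniform(0.5, 2.0))
        parents[len(nodes)] = i
        nodes.append(child)
        if math.hypot(child[0] - goal[0], child[1] - goal[1]) < tol:
            break
    best = min(nodes, key=lambda n: math.hypot(n[0] - goal[0], n[1] - goal[1]))
    return nodes, best
```

The real 3D planner works on the full non-holonomic model, but the structure (sample, find nearest, forward-simulate a control, add to the tree) is the same.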


SMART STOOL FOR ASSISTIVE LIVING
This was the course project for 'Model Based Design'.
There is increasing concern about elderly people living alone and the dangers associated with it. For this reason, there is a desire to assist the elderly living at home with smart materials and smart systems called Cyber-Physical Systems (CPS). The goal of our CPS work is to develop the core system science needed to engineer complex cyber-physical systems upon which people can depend with high confidence. Elderly people often cannot lift and move a stool, despite needing one for a variety of purposes: placing objects, seating, or use as a footrest. Stools serve as an extension of a person, providing flexibility, increasing comfort, and offering support. It is important to help the elderly improve their quality of life.
Our team successfully developed a differentially driven autonomous indoor navigator, the 'Smart Stool'. A TurtleBot 2 with a Kobuki base acts as the stool and runs the Robot Operating System (ROS). The TurtleBot maps the entire room, including static obstacles such as furniture, using the Simultaneous Localisation and Mapping (SLAM) and Adaptive Monte Carlo Localisation (AMCL) packages of ROS. Given the map, the A* algorithm plans the robot's trajectory; the position from which the user summons the TurtleBot provides its goal coordinates.
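The A* planning step can be sketched on an occupancy grid as follows. This is a generic 4-connected grid version for illustration; the real planner runs on the ROS costmap:

```python
import heapq
import itertools

def astar(grid, start, goal):
    """A* on a 4-connected occupancy grid (0 = free, 1 = obstacle).
    Returns the list of cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    h = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])  # Manhattan heuristic
    tie = itertools.count()  # tiebreaker so the heap never compares cells
    open_set = [(h(start), next(tie), 0, start, None)]
    came_from, g_best = {}, {start: 0}
    while open_set:
        _, _, g, cell, parent = heapq.heappop(open_set)
        if cell in came_from:
            continue  # already expanded with a better cost
        came_from[cell] = parent
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nb in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nb
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_best.get(nb, float("inf")):
                    g_best[nb] = ng
                    heapq.heappush(open_set, (ng + h(nb), next(tie), ng, nb, cell))
    return None
```

The admissible Manhattan heuristic keeps the search optimal on a unit-cost grid while expanding far fewer cells than Dijkstra.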
The Estimote beacons used for indoor localization broadcast over Bluetooth LE to provide indoor location features such as proximity-gate functionality, context awareness, temperature, and motion sensing. The Estimotes transmit information such as the UUID (Universally Unique Identifier), Major, Minor, and RSSI (Received Signal Strength Indicator). The process of calculating the user location from the received RSSI is given in the attached project report.
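One standard way to turn beacon RSSI into a position fix is a log-distance path-loss model followed by trilateration; the exact procedure is in the attached report. The sketch below is a generic illustration, and the calibration constants (the 1-metre reference power `tx_power` and path-loss exponent `n`) are assumed values, not our measured ones:

```python
import math

def rssi_to_distance(rssi, tx_power=-59, n=2.0):
    """Log-distance path-loss model: tx_power is the expected RSSI at 1 m
    (the beacon's calibrated 'measured power'), n the path-loss exponent."""
    return 10 ** ((tx_power - rssi) / (10 * n))

def trilaterate(b1, b2, b3):
    """Each argument is ((x, y), distance). Subtracting the three circle
    equations pairwise gives a linear 2x2 system, solved by Cramer's rule."""
    (x1, y1), d1 = b1
    (x2, y2), d2 = b2
    (x3, y3), d3 = b3
    a1, c1 = 2 * (x2 - x1), d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, c2 = 2 * (x3 - x1), d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    e1, e2 = 2 * (y2 - y1), 2 * (y3 - y1)
    det = a1 * e2 - a2 * e1
    return ((c1 * e2 - c2 * e1) / det, (a1 * c2 - a2 * c1) / det)
```

In practice raw RSSI is noisy, so the distances are usually smoothed (e.g. with a moving average) before trilateration.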
The smart stool can perform several secondary tasks such as picking up mail, clearing a path, checking floor uniformity, and so on.

OBJECT TRACKING WITH A QUADROTOR BY RECURSIVE BAYESIAN ESTIMATION
This was my Bachelor's thesis, and our team received warm applause from the department head and senior staff.
The main aim of the project was to realize an object tracking algorithm on an autonomous Unmanned Aerial Vehicle (UAV) that enables it to track a moving ground object. A camera fitted to the UAV monitors the ground object and continuously tracks it. The tracking algorithm used in this project is a particle filter in combination with SURF feature detection. The particle filter, also known as the condensation algorithm, is based on Bayesian inference: it represents the pdf (probability density function) as a set of particles and uses a colour histogram of the reference image as the observation model. SURF (Speeded Up Robust Features) is an improved version of SIFT (Scale-Invariant Feature Transform). It applies mathematical operations to the reference image and stores the keypoints in the form of descriptor vectors. These keypoints, also called features, are highly distinctive for any given image, and SURF is rotation, scale, and illumination invariant. In our hybrid approach we use both algorithms: normally SURF keeps tracking the object, but at times SURF can return false positives. Our program checks whether the object detected by SURF is the object to be tracked; if not, the particle filter takes over. The processor used here is an advanced digital signal processor, the Analog Devices Blackfin BF561 EZ-Kit.
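The particle-filter half of the hybrid tracker can be illustrated with a minimal 1D version. The real tracker weights particles with a colour-histogram observation model on 2D image positions; the Gaussian likelihood here is a stand-in, and all noise parameters are illustrative:

```python
import math
import random

def particle_filter_track(observations, n_particles=500, motion_std=1.0,
                          obs_std=2.0, seed=0):
    """Minimal 1D condensation-style particle filter: predict with a random
    walk, weight each particle by a Gaussian likelihood of the measurement,
    then resample. Returns the per-step weighted-mean estimates."""
    rng = random.Random(seed)
    particles = [rng.uniform(-10, 10) for _ in range(n_particles)]
    estimates = []
    for z in observations:
        # Predict: diffuse particles with motion noise.
        particles = [p + rng.gauss(0, motion_std) for p in particles]
        # Update: Gaussian observation likelihood (the colour histogram
        # plays this role in the real tracker).
        weights = [math.exp(-0.5 * ((p - z) / obs_std) ** 2) for p in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        estimates.append(sum(p * w for p, w in zip(particles, weights)))
        # Resample proportionally to the weights.
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return estimates
```

Because the posterior is carried by the particle cloud rather than a single hypothesis, the filter can recover the target after SURF reports a false positive.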
Please find the complete project documentation along with the entire openCV code in the Appendix section of the project report.

HAND GESTURE BASED WHEELCHAIR NAVIGATION EQUIPPED WITH A ROBOTIC PICK AND PLACE ARM
This project was funded by the Center for Technology Development and Transfer (CTDT) of Anna University, India, under the Innovative Student Research Support Scheme, a prestigious grant awarded to research projects with novel ideas.
The main objective of this project was to construct a wheelchair fitted with a robotic arm and to control both using hand gestures captured by an overhead camera, with a vision-based scheme for gesture recognition. The project is aimed at patients who are paralyzed in the legs or have weak reflexes. It enables them to move freely by navigating the wheelchair and, using the same gestures, to pick up any object in the vicinity by controlling the robotic arm. Images taken by the camera are processed on a laptop, and serial commands are sent to an Arduino, which acts as the main processor for both wheelchair navigation and robotic arm control. A relay circuit that we designed drives the wheelchair actuators from the Arduino's signals. The arm uses inverse kinematics to compute joint angles from the desired gripper position in 3-D coordinate space and drives the servo motors of the arm accordingly. We successfully developed the wheelchair and the robotic arm. However, the arm did not have sufficient force control, so it was not able to pick up an object and bring it to the user. From this we learned that the maximum torques of the servo motors play an important role in the design, and that adaptive PID should be implemented to obtain reliable force control.
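The inverse kinematics step can be illustrated with the classic closed-form solution for a planar 2-link arm. This is a simplification of the real arm: the link lengths are placeholder values, and forward kinematics is included only to verify the solution:

```python
import math

def two_link_ik(x, y, l1=10.0, l2=10.0):
    """Closed-form inverse kinematics for a planar 2-link arm.
    Returns (shoulder, elbow) joint angles in radians, or None if the
    target is out of reach."""
    r2 = x * x + y * y
    # Elbow angle from the law of cosines.
    c2 = (r2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        return None  # target outside the workspace
    elbow = math.acos(c2)
    # Shoulder angle: target bearing minus the offset caused by the elbow.
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

def forward(shoulder, elbow, l1=10.0, l2=10.0):
    """Forward kinematics, used here to check the IK solution."""
    x = l1 * math.cos(shoulder) + l2 * math.cos(shoulder + elbow)
    y = l1 * math.sin(shoulder) + l2 * math.sin(shoulder + elbow)
    return x, y
```

The actual arm adds a base rotation joint for the third dimension, but the in-plane solve follows the same law-of-cosines structure.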
Please find the complete project documentation along with the entire openCV code in the Appendix section of the project report.