Ongoing projects

I have yet to coin a cool name for this robot, so for now it's just a 4WD (four-wheel-drive) mobile robot.
The inspiration for this project was a class I took in the spring of 2015 with Prof. Alonzo Kelly, which left me fascinated by the theory of mobile robot design. Subsequently, my internship at Near Earth Autonomy (an R&D startup based in Pittsburgh) gave me a glimpse into the world of perception and the use of lidars to perceive the environment.
Lidars, although extremely efficient and precise, are very expensive. However, I came across this small pulsed-light lidar on RobotShop's website. It is basically a ranging device that uses pulsed light, and it generates zero-dimensional data: a single point, with a corresponding range value in 3D space. Adding a servo (to pan) and sampling points at equally spaced intervals generates 1D data (a line of points, i.e., an array of ranges). Adding another servo (to tilt while simultaneously panning) generates 2D data. Correlating the range data with the pose values (the angular positions of the servos at the instant each range was sampled) and doing some coding in ROS will produce a scan. That is the idea. For now, I am generating only 1D data. I have also been able to build the drive system: it is based on an L298D driver and has encoders with 8 steps. That's where I am at now.
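To make the correlation step concrete, here is a minimal sketch of turning paired servo pan angles and range samples into 2D Cartesian points. The function name and the x-forward/y-left frame convention are my own assumptions, not anything fixed by the hardware:

```python
import math

def scan_to_points(pan_angles_deg, ranges_m):
    """Convert paired servo pan angles (degrees) and lidar range
    samples (meters) into 2D points in the sensor frame."""
    points = []
    for angle_deg, r in zip(pan_angles_deg, ranges_m):
        theta = math.radians(angle_deg)
        # Polar-to-Cartesian: x points forward, y points left
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

# One pan sweep: ranges sampled at equally spaced servo angles
print(scan_to_points([0, 45, 90], [2.0, 1.5, 1.0]))
```

The same idea extends to 2D scans by adding the tilt angle and doing the conversion in spherical coordinates.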
I was able to make some progress after my graduation. Currently, I can control the robot using an rqt plugin in ROS. This will come in handy when I interface a joystick. Given a fast and reliable network, I should be able to stream video over the network and do some sort of primitive SLAM. Updates are on the horizon...
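When the joystick does get interfaced, the mapping from stick axes to a velocity command might look something like this sketch (the velocity limits below are placeholders, not my robot's actual values):

```python
def joy_to_cmd(forward_axis, turn_axis, max_lin=0.5, max_ang=1.5):
    """Map joystick axes in [-1, 1] to (linear m/s, angular rad/s)
    velocities, suitable for filling a ROS Twist message."""
    # Clamp so a miscalibrated stick can never exceed the limits
    fwd = max(-1.0, min(1.0, forward_axis))
    turn = max(-1.0, min(1.0, turn_axis))
    return fwd * max_lin, turn * max_ang

print(joy_to_cmd(1.0, -0.5))  # full forward, half turn -> (0.5, -0.75)
```

The returned pair would go into the `linear.x` and `angular.z` fields of a `geometry_msgs/Twist` message published on `/cmd_vel`, the same topic the rqt plugin drives.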

Sorry about the poor lighting in the video. I tried to adjust the contrast and this happened...


You can't see the GUI anymore... I will upload better videos as and when there is progress.

Autonomous Indoor Navigation

This is another project that I am pursuing in parallel. Basically, the idea is to have an MAV (micro aerial vehicle) navigate indoor environments autonomously.

MAV configuration:

1. 2250 KV brushless motors
2. 12 A BLHeli ESCs
3. 6x4 props
4. CC3D flight controller
5. 250 mm frame

For now, the CC3D flight controller is being used to test the functionality of the quadrotor. Eventually, this flight controller will be replaced by the Pixhawk to exploit its "off-board" control functionality. There is also a really good interface between ROS and the Pixhawk stack that can be leveraged. Up until now, I have been successful in building the quadrotor. It flies (just to be clear). I was also able to interface the Pixhawk with ROS. There is a bunch of initial setup needed to interface the Pixhawk with ROS and establish communication between them; the gory details are not important. What is important is that the Pixhawk can be commanded from an external computer (a single-board computer) to change its attitude. This can be leveraged to control the quadrotor in GPS-denied environments by means of other sensors such as monocular cameras, lidars, and so forth. The plan is to interface the Pixhawk stack with ROS to send "waypoints" (points in space to fly to) to the quadrotor and have the quadrotor follow these waypoints (it's probably not going to happen in the first 100 trials).
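As a sketch of the waypoint-following logic that would sit on top of the ROS/Pixhawk interface, the idea is to keep publishing the current setpoint and advance to the next one once the vehicle is close enough. The function names and the 0.3 m tolerance here are assumptions for illustration, not anything from the actual stack:

```python
import math

def dist(p, q):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def advance_waypoint(position, waypoints, index, tolerance=0.3):
    """Return the index of the waypoint to publish next: move on
    once the current one is within `tolerance` meters."""
    if index < len(waypoints) and dist(position, waypoints[index]) < tolerance:
        index += 1
    return index

waypoints = [(0.0, 0.0, 1.0), (1.0, 0.0, 1.0), (1.0, 1.0, 1.0)]
# The vehicle has reached the first waypoint, so the target advances
print(advance_waypoint((0.0, 0.05, 1.0), waypoints, 0))  # -> 1
```

In practice this loop would read the vehicle pose from a ROS topic and publish `waypoints[index]` as a position setpoint at a steady rate; the function above is just the bookkeeping in between.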
I will update this post as and when I make progress.
 
