Hands-on session

        All of the above concepts would be implemented by the students on the Fire Bird II robotic platform. Fire Bird II is a universal robotic research platform used at IIT Bombay to teach concepts of embedded systems.
        This session will start with an introduction to ICC AVR and AVR Studio, the IDEs used to program the AVR series of microcontrollers. Using these environments, we will teach the students to program the microcontroller on the Fire Bird II platform while interfacing it with all of its peripherals.
· I/O Port Programming
· Motion and Position Control
· LCD Interfacing
· Robot Sensor Interfacing
· Serial Communication
· Wall Hugging
· White Line Following
· Adaptive Cruise Control

I/O Port Programming
        To start off with the basics, we will guide the students through configuring the ports as input or output ports. A simple program to turn a buzzer ON and OFF is written, downloaded onto the Fire Bird II and executed. Then, programming the ports to read input from the bump sensors, we make the buzzer beep every time a bump sensor is pressed.
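The configure-then-poll pattern described above can be sketched in C. On the real board this would write the AVR's DDRx/PORTx/PINx registers; here hypothetical byte variables stand in for them so the bit manipulation can be shown (and run) anywhere, and the pin numbers are assumptions, not the Fire Bird II wiring.

```c
#include <stdint.h>

/* Stand-ins for the AVR I/O registers (DDRx, PORTx, PINx); the real
   register names and pin numbers depend on the board's wiring. */
uint8_t ddr  = 0;   /* data direction: 1 = output, 0 = input */
uint8_t port = 0;   /* output latch                          */
uint8_t pin  = 0;   /* input readings                        */

#define BUZZER_BIT 3u   /* assumed buzzer pin      */
#define BUMP_BIT   6u   /* assumed bump-sensor pin */

void io_init(void)
{
    ddr |=  (1u << BUZZER_BIT);  /* buzzer pin as output */
    ddr &= ~(1u << BUMP_BIT);    /* bump pin as input    */
}

void buzzer_on(void)  { port |=  (1u << BUZZER_BIT); }
void buzzer_off(void) { port &= ~(1u << BUZZER_BIT); }

/* One pass of the polling loop: beep while the bump sensor is pressed. */
void poll_bump(void)
{
    if ((pin >> BUMP_BIT) & 1u)
        buzzer_on();
    else
        buzzer_off();
}
```

On the microcontroller, `poll_bump` would simply be called from an endless `while (1)` loop in `main`.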

Motion and Position Control
        Next we move on to driving the actuators on the robot. The two DC geared motors on board are programmed to move the robot forward, backward, left and right in a differential drive configuration. But simply moving the robot isn't enough: how do we control the distance it needs to travel or the angle it needs to turn? That brings us to the notion of a closed loop system. A feedback loop is required to control the exact position of the robot, and this is achieved using shaft/position encoders. The encoders feed their pulses to the interrupts on the microcontroller, providing the system with closed loop control. In this session participants will also learn the concepts of pulse width modulation (PWM) and interrupts.
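Two of the calculations behind this session can be sketched as plain C: scaling a speed request to an 8-bit PWM compare value (what would be written to an OCRx register), and converting a requested travel distance into the number of encoder pulses the interrupt routine should count. The pulses-per-revolution and wheel-circumference figures here are illustrative assumptions, not the board's datasheet values.

```c
#include <stdint.h>

/* Assumed geometry, for illustration only: 30 encoder pulses per wheel
   revolution and roughly 160 mm of travel per revolution (check the
   Fire Bird II manual for the real figures). */
#define PULSES_PER_REV  30u
#define WHEEL_CIRCUM_MM 160u

/* Map a speed request in percent to an 8-bit PWM duty value. */
uint8_t pwm_duty(uint8_t speed_percent)
{
    if (speed_percent > 100u) speed_percent = 100u;
    return (uint8_t)(((uint16_t)speed_percent * 255u) / 100u);
}

/* Closed-loop position: how many encoder interrupts correspond to a
   requested travel distance in millimetres. */
uint32_t counts_for_distance(uint32_t dist_mm)
{
    return (dist_mm * PULSES_PER_REV) / WHEEL_CIRCUM_MM;
}
```

On the robot, the motion routine would start the motors and stop them once the interrupt-driven pulse counter reaches `counts_for_distance(dist_mm)`.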

LCD Interfacing
        We now move on to interfacing the LCD with the microcontroller. The LCD is one of the most essential parts of a mobile robot for robot-human interaction. Writing code for the LCD from scratch is often tedious, since driving it requires lengthy and intricate code. In this session we will teach how to program the LCD, made simple through our libraries, which will be available to the students, along with the tricks and simplified methods of LCD interfacing. The LCD is used later to display various sensor values and other parameters required for the proper functioning of the robot.
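The low-level LCD driver stays inside the workshop libraries, but a small helper in the same spirit is shown below: it formats a labelled sensor value into a fixed 16-character line, the width of a typical 16x2 character LCD. The function name and the 16x2 display size are our assumptions for illustration, not the library's API.

```c
#include <stdio.h>
#include <string.h>

/* Format "label:value" into a fixed 16-character LCD line, padding with
   spaces and truncating if needed (a 16x2 character LCD is assumed). */
void lcd_line(char out[17], const char *label, int value)
{
    char tmp[32];
    snprintf(tmp, sizeof tmp, "%s:%d", label, value);

    size_t n = strlen(tmp);
    if (n > 16) n = 16;          /* truncate to the display width */
    memcpy(out, tmp, n);
    memset(out + n, ' ', 16 - n); /* pad the rest of the row */
    out[16] = '\0';
}
```

A display routine would then write these 16 characters to the chosen LCD row, refreshing them as the sensor values change.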

Robot Sensor Interfacing
        Fire Bird II carries a large array of sensors: five bump sensors, three Sharp linear distance measuring sensors, three white line sensors, two shaft encoders and one directional light sensor. Most of these sensors give analog outputs, which are converted to digital data by the microcontroller's inbuilt analog to digital converter. In this session we learn how to acquire sensor values, process them and display them on the LCD.
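Once the ADC has done a conversion, processing a sensor value starts with scaling the raw reading back to a voltage. The sketch below assumes the AVR's 10-bit ADC with a 5 V reference; the register-level trigger-and-wait sequence is replaced by a plain parameter so the arithmetic can be checked on any machine.

```c
#include <stdint.h>

/* Convert a 10-bit ADC reading (0..1023) to millivolts, assuming a
   5 V reference. On the AVR, the raw value would be read from the ADC
   data registers after a conversion completes. */
uint16_t adc_to_mv(uint16_t raw)
{
    if (raw > 1023u) raw = 1023u;
    return (uint16_t)(((uint32_t)raw * 5000u) / 1024u);
}
```

The millivolt value is what would then be mapped to a physical quantity (e.g. distance for the Sharp sensors) and printed on the LCD.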

Serial Communication
        We learn how to control the robot through its serial port. This gives the students a clear idea of the fundamentals of serial communication and the intricacies involved in working with the serial port, which they can then use for several other control and data acquisition applications.
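Setting up the AVR's USART largely comes down to computing the baud-rate divisor with the standard formula UBRR = F_CPU / (16 * baud) - 1 for normal asynchronous mode. A 7.3728 MHz crystal is assumed below because it divides evenly into the common baud rates; the actual Fire Bird II clock should be taken from its manual.

```c
#include <stdint.h>

#define F_CPU 7372800UL   /* assumed crystal frequency in Hz */

/* Divisor written to the AVR's UBRR registers for a given baud rate,
   in normal (16x oversampling) asynchronous mode. */
uint16_t ubrr_for_baud(uint32_t baud)
{
    return (uint16_t)(F_CPU / (16UL * baud) - 1UL);
}
```

With a clock that is not an integer multiple of 16 x baud, the same formula introduces a small baud-rate error, which is why such crystals are popular for serial work.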

Wall Hugging
        Our next step is to integrate these acquired skills and combine sensing, intelligence and actuation: once given intelligence, the robot can react to its surrounding environment. For this experiment we will use the bump sensors, controlling the motion of the robot as each sensor is activated. This is our first step towards giving intelligence to the robot.
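This reactive behaviour, each bump sensor triggering a different evasive motion, reduces to a mapping from the sensor bitmask to a drive command. The bit layout and the particular reactions below are assumptions chosen for illustration.

```c
#include <stdint.h>

typedef enum { GO_FORWARD, GO_BACK, TURN_LEFT, TURN_RIGHT } motion_t;

/* Assumed bit layout of the bump-sensor byte:
   bit 0 = left bumper, bit 1 = centre, bit 2 = right. */
#define BUMP_LEFT   (1u << 0)
#define BUMP_CENTRE (1u << 1)
#define BUMP_RIGHT  (1u << 2)

/* React to contact: back away from a head-on hit, turn away from a
   side hit, otherwise keep driving forward. */
motion_t react_to_bump(uint8_t bumps)
{
    if (bumps & BUMP_CENTRE) return GO_BACK;
    if (bumps & BUMP_LEFT)   return TURN_RIGHT;
    if (bumps & BUMP_RIGHT)  return TURN_LEFT;
    return GO_FORWARD;
}
```

The main loop would read the bump pins each cycle and hand the returned command to the motor routines from the motion-control session.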

White Line Following
        Guiding the robot to follow a path, i.e. defining its motion on the go depending upon the road ahead, is a great asset. We do this using closed loop control. The analog values from the white line sensors are used as inputs for controlling the robot: it is put into a continuous loop where it scans the sensor values and makes its own decision, traversing forward along the white line on the ground.
        From here we come to the concept of localization, i.e. navigating the robot on a grid of white lines. This is among the problem statements of various competitions like ROBOCON and inter-collegiate Techfests, where the mission is to program an autonomous robot to navigate a grid and perform the required tasks. The robot detects nodes, i.e. intersections, using the white line sensors and takes turns at the Nth node depending upon the route chosen. These decisions can also take into account obstacles detected by the bump sensors, giving the robot well defined and accurate control.
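The decision step of the line-following loop, and the node detection used for grid navigation, can be sketched as below. The threshold value is an assumption that would be calibrated against the real sensors, as is the convention that a higher reading means the sensor is over the white line.

```c
#include <stdint.h>
#include <stdbool.h>

typedef enum { LINE_FORWARD, LINE_LEFT, LINE_RIGHT, LINE_LOST } steer_t;

/* Reading above this means "over the white line"; the value is an
   assumed calibration point, not a measured one. */
#define LINE_THRESHOLD 512u

/* One decision of the follow loop, from the three white-line sensors. */
steer_t line_decision(uint16_t left, uint16_t centre, uint16_t right)
{
    bool l = left   > LINE_THRESHOLD;
    bool c = centre > LINE_THRESHOLD;
    bool r = right  > LINE_THRESHOLD;

    if (c)       return LINE_FORWARD;  /* line under the middle sensor */
    if (l && !r) return LINE_LEFT;     /* drifted right: steer left    */
    if (r && !l) return LINE_RIGHT;    /* drifted left: steer right    */
    return LINE_LOST;                  /* no line: stop or search      */
}

/* A node (grid intersection) puts the line under all three sensors. */
bool at_node(uint16_t left, uint16_t centre, uint16_t right)
{
    return left > LINE_THRESHOLD && centre > LINE_THRESHOLD
        && right > LINE_THRESHOLD;
}
```

Counting how many times `at_node` fires while following the line gives the "Nth node" at which the route logic decides the next turn.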

Adaptive Cruise Control
        Combining the above programs, we take on a challenging task: teaching the robot to cruise on its own, maintaining a safe distance and following a specified route. In this experiment the distance acquired from the Sharp sensors is used to maintain a safe gap from the robot in front, while the white line sensors are used to follow the white line.
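The distance-keeping half of this task can be sketched as a simple proportional speed law: cruise at full speed while the gap is safe, and slow down in proportion as the measured gap shrinks. The gap threshold and cruise speed below are illustrative assumptions, and the gap itself would come from converting the Sharp sensor's ADC reading to millimetres.

```c
#include <stdint.h>

#define SAFE_GAP_MM  200u   /* assumed safe following distance      */
#define CRUISE_SPEED 80u    /* assumed cruise speed, in percent PWM */

/* Proportional cruise control: full speed when clear, scaling
   linearly down to zero as the gap to the robot ahead closes. */
uint8_t cruise_speed(uint16_t gap_mm)
{
    if (gap_mm >= SAFE_GAP_MM)
        return CRUISE_SPEED;
    return (uint8_t)(((uint32_t)gap_mm * CRUISE_SPEED) / SAFE_GAP_MM);
}
```

In the full experiment this speed would be fed to the PWM routines while the white-line decision logic steers, combining the two behaviours each pass of the control loop.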
        This workshop would thus give you a clear idea of how to use the different sensors, and even construct a few of your own. Compounded together, these sensors can perform several complex tasks as required, and that is the real basis of this entire workshop: to move robotics to the next level, a step ahead of the traditional wired and wireless robots, which are becoming obsolete, towards something that looks to the future.