The Guidance, Control and Decision Systems Laboratory (GCDSL) is situated in the Department of Aerospace Engineering at the Indian Institute of Science in Bangalore, India. The Mobile Robotics Laboratory (MRL) is its experimental division. Both are headed by Dr. Debasish Ghose, Full Professor.[1]
| Type | Public |
|---|---|
| Established | 2002 |
| Campus | Indian Institute of Science |
| Website | guidance |
GCDSL was established in 1990 (the MRL in 2002) and is considered one of the leading robotics research centers in India. GCDSL/MRL has close research collaborations with eminent academic groups in countries such as the United States, the United Kingdom, Israel and South Korea, and it also holds multiple industry project grants.
GCDSL was started with the primary aim of performing research in the fields of swarm robotics, multi-robot systems and cooperative robotics, with applications to tasks such as cooperative transportation, robotic formations, cooperative search and rescue, and odor source localization. In the MRL, several robotic platforms have been built in-house and used for real-world experiments to validate algorithms related to some of these research problems.
The group is dedicated to creating intelligent systems that can operate autonomously in complex and diverse scenarios. It is interested in the mechatronic design and control of vehicles that efficiently adapt to different situations and perform in dynamic environments. This includes the development of novel methods and tools for perception, mapping and path planning.
Over the years, research has extended into the fields of Simultaneous Localization and Mapping (SLAM), aerial robotics and machine vision. Recently, there has been an emphasis on computer vision and machine learning for improving the versatility and cognitive abilities of robotic platforms.
MBZIRC 2020 is based on autonomous aerial and ground robots carrying out navigation and manipulation tasks in unstructured outdoor and indoor environments. All the sub-challenges involve cooperation between multiple UAVs and swarm abilities. The challenges are: (1) grab a ball swinging from a fast-moving drone, (2) have three UAVs and one UGV pick up bricks and build a wall, and (3) use a set of four vehicles (three UAVs and one UGV) to douse a series of simulated fires in a high-rise building using a pressurized canister. These missions are at the frontier of intelligent aerial robotics technology and are aimed at real-world application.[2] The IISc-TCS team was selected for an interim, milestone-based award of US$100,000.[3]
The project focuses on using UAVs to gather information about an unfolding flooding disaster, allowing emergency response units to prioritise resources and deploy them effectively. It will also address the challenges associated with flying UAVs in difficult situations, as well as how the data can be combined with accelerated flood inundation models to generate detailed evacuation plans, build community flood resilience, save lives and reduce economic damage.[4]
The glowworm swarm optimization (GSO) algorithm is an optimization technique developed for the simultaneous capture of multiple optima of multimodal functions.[5] The algorithm uses agents called glowworms, which carry a luminescent quantity called luciferin to (indirectly) communicate the function-profile information at their current location to their neighbors. Each glowworm relies on a variable local decision domain, bounded above by a circular sensor range, to identify its neighbors and compute its movements. A glowworm selects, using a probabilistic mechanism, a neighbor with a luciferin value higher than its own and moves towards it. These movements, based only on local information, enable the swarm of glowworms to split into disjoint subgroups, exhibit simultaneous taxis behavior towards, and rendezvous at, multiple optima (not necessarily equal) of a given multimodal function. The algorithm was tested on a custom-designed system of robots called Kinbots.
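The core GSO update can be illustrated with the following minimal sketch: luciferin decay and reinforcement, probabilistic movement towards a brighter neighbor, and adaptation of the local decision range. The function name `gso_step`, the parameter values and the test function are assumptions chosen for illustration, not taken from the lab's implementation.

```python
import numpy as np

def gso_step(pos, luciferin, r_d, objective,
             rho=0.4, gamma=0.6, step=0.03,
             r_s=3.0, beta=0.08, n_t=5, rng=None):
    """One iteration of a basic GSO update (all constants are illustrative)."""
    if rng is None:
        rng = np.random.default_rng()
    # 1. Luciferin update: decay the old value, reinforce with the current fitness.
    luciferin = (1 - rho) * luciferin + gamma * np.array([objective(p) for p in pos])

    new_pos = pos.copy()
    for i in range(len(pos)):
        # 2. Neighbors: inside the local decision range and brighter than glowworm i.
        dists = np.linalg.norm(pos - pos[i], axis=1)
        nbrs = np.where((dists < r_d[i]) & (luciferin > luciferin[i]))[0]
        if len(nbrs) > 0:
            # 3. Probabilistic choice, weighted by the luciferin difference.
            w = luciferin[nbrs] - luciferin[i]
            j = rng.choice(nbrs, p=w / w.sum())
            d = pos[j] - pos[i]
            new_pos[i] = pos[i] + step * d / np.linalg.norm(d)
        # 4. Adapt the local decision range, bounded above by the sensor range r_s.
        r_d[i] = min(r_s, max(0.0, r_d[i] + beta * (n_t - len(nbrs))))
    return new_pos, luciferin, r_d

# Example: glowworms splitting into subgroups around the two peaks of a bimodal function.
f = lambda p: np.exp(-np.sum((p - 2.0) ** 2)) + np.exp(-np.sum((p + 2.0) ** 2))
pos = np.random.default_rng(0).uniform(-5, 5, size=(40, 2))
luci, r_d = np.full(40, 5.0), np.full(40, 3.0)
for _ in range(200):
    pos, luci, r_d = gso_step(pos, luci, r_d, f)
```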
Histogramic intensity switching (HIS) is a vision-based obstacle avoidance algorithm developed in the lab. It makes use of histograms of images captured by a camera in real time and does not rely on any distance measurements to achieve obstacle avoidance. An improved variant called HIS-Dynamic Mask Allocation (HISDMA) has also been designed. The algorithms were tested on an in-house, custom-built robot called VITAR.
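The description above does not include implementation details, so the following is only a rough sketch of the general idea of histogram-based switching: compare the intensity histogram of the current frame with a free-space reference and switch heading towards the image half that looks freer. The function name, the histogram-intersection similarity measure, the bin count and the threshold are all assumptions, not the lab's HIS formulation.

```python
import numpy as np

def his_steering(gray_frame, free_hist, threshold=0.25, bins=32):
    """Return 'forward', 'left' or 'right' from one grayscale camera frame.

    gray_frame : 2-D uint8 image.
    free_hist  : normalized intensity histogram (same bin count) recorded
                 while the robot faces free space.
    """
    def hist(region):
        h, _ = np.histogram(region, bins=bins, range=(0, 256))
        return h / max(h.sum(), 1)

    def similarity(a, b):
        # Histogram intersection: 1.0 means identical intensity distributions.
        return np.minimum(a, b).sum()

    # Scene still resembles the free-space reference: keep going.
    if similarity(hist(gray_frame), free_hist) > 1.0 - threshold:
        return "forward"

    # Obstacle suspected: switch towards the image half that looks freer.
    mid = gray_frame.shape[1] // 2
    left, right = hist(gray_frame[:, :mid]), hist(gray_frame[:, mid:])
    return "left" if similarity(left, free_hist) >= similarity(right, free_hist) else "right"
```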
This research explores occupancy grid mapping using a miniature mobile robot equipped with a set of five infrared ranging sensors. Bayesian methods are used to update the map. Another variant of this technique uses a single IR range sensor to obtain ranges to distinctive features in the surrounding environment and uses these readings to make the SLAM estimate converge. These techniques are to be extended to a swarm of robots that communicate using the ZigBee protocol among themselves and with a global coordinator (a PC) responsible for map merging. Simulation experiments are being carried out using the Player/Stage software. The robotic platform is built from a custom-designed set of swarm robots called Glowworms.
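A minimal sketch of the Bayesian occupancy grid update (in log-odds form) for a single IR range reading is shown below. The grid resolution, the inverse sensor model values and the maximum range are illustrative assumptions, not the parameters used in the lab.

```python
import numpy as np

# Inverse sensor model in log-odds form (illustrative probabilities 0.7 / 0.3).
L_OCC, L_FREE = np.log(0.7 / 0.3), np.log(0.3 / 0.7)
RES = 0.05  # metres per grid cell (assumed)

def update_grid(logodds, robot_xy, heading, bearing, rng_m, max_rng=0.8):
    """Mark cells along one IR beam as free and the cell at the hit point as occupied."""
    angle = heading + bearing
    # Cells the beam passed through: evidence of free space.
    for k in range(int(min(rng_m, max_rng) / RES)):
        d = k * RES
        cx = int((robot_xy[0] + d * np.cos(angle)) / RES)
        cy = int((robot_xy[1] + d * np.sin(angle)) / RES)
        if 0 <= cx < logodds.shape[0] and 0 <= cy < logodds.shape[1]:
            logodds[cx, cy] += L_FREE
    # A return inside the maximum range: evidence of an obstacle at the hit point.
    if rng_m < max_rng:
        hx = int((robot_xy[0] + rng_m * np.cos(angle)) / RES)
        hy = int((robot_xy[1] + rng_m * np.sin(angle)) / RES)
        if 0 <= hx < logodds.shape[0] and 0 <= hy < logodds.shape[1]:
            logodds[hx, hy] += L_OCC
    return logodds

# The occupancy probability of each cell is recovered as p = 1 / (1 + exp(-logodds)).
```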
A quadrotor micro-air-vehicle (MAV) is a rotor-based craft with four rotors, usually placed at the corners of a square frame. The four motor speeds (and hence thrusts) are the control inputs which produce the motion of the quadrotor. The dynamics of this vehicle are fast and highly coupled, and hence present a challenging control problem.[6]
A quadrotor and a control test-bed have been fabricated in-house at the Mobile Robotics Lab. Control experiments are being conducted on the quadrotor, beginning with yaw, pitch and roll stabilization.
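Yaw, pitch and roll stabilization of a quadrotor is commonly done with independent per-axis PID loops whose outputs are mixed into the four motor commands. The sketch below illustrates this standard scheme; the gains, the "+" motor configuration and the sign conventions are assumptions, not the controller used in the lab.

```python
import numpy as np

class AxisPID:
    """Single-axis PID used for roll, pitch and yaw stabilization (gains are illustrative)."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.i, self.prev = 0.0, 0.0

    def __call__(self, error, dt):
        self.i += error * dt
        d = (error - self.prev) / dt
        self.prev = error
        return self.kp * error + self.ki * self.i + self.kd * d

def mix(thrust, u_roll, u_pitch, u_yaw):
    """Map collective thrust and axis commands to four motor commands ('+' frame assumed)."""
    return np.clip(np.array([
        thrust + u_pitch + u_yaw,   # front motor
        thrust - u_roll  - u_yaw,   # right motor
        thrust - u_pitch + u_yaw,   # rear motor
        thrust + u_roll  - u_yaw,   # left motor
    ]), 0.0, 1.0)

roll_pid  = AxisPID(0.08, 0.001, 0.02)
pitch_pid = AxisPID(0.08, 0.001, 0.02)
yaw_pid   = AxisPID(0.05, 0.000, 0.01)

def stabilise(att_setpoint, att_measured, hover_thrust=0.5, dt=0.01):
    """att_* = (roll, pitch, yaw) in radians; returns normalized motor commands."""
    err = np.array(att_setpoint) - np.array(att_measured)
    return mix(hover_thrust,
               roll_pid(err[0], dt), pitch_pid(err[1], dt), yaw_pid(err[2], dt))
```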
A robotic platform consisting of four wheeled mobile robots has been developed in the lab for multi-robot testing. The robots are similar in principle to Braitenberg vehicles and use simple perception, interaction and actuation techniques to keep individual vehicle complexity low while producing effective group behavior through cooperation. These robots have been used to test the GSO algorithm.
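The Braitenberg-style coupling mentioned above maps sensor readings almost directly to wheel speeds. A minimal illustrative sketch follows; the cross-coupled wiring, the gain and the assumption of normalized sensor readings are illustrative, not the Kinbots' actual control law.

```python
def braitenberg_drive(left_sensor, right_sensor, base_speed=0.2, gain=0.8):
    """Braitenberg-style cross-coupling: each wheel is driven by the opposite
    sensor, so the vehicle turns towards a stimulus (a negative gain would make
    it turn away). Sensor readings are assumed normalized to [0, 1]."""
    left_wheel = base_speed + gain * right_sensor
    right_wheel = base_speed + gain * left_sensor
    return left_wheel, right_wheel
```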
The Glowworms are miniature robots developed based on the Kinbots.[5]
VITAR (Vision-based Tracked Autonomous Robot) consists of a tracked mobile robot equipped with a pan-tilt-mounted vision sensor, an onboard PC, driver electronics, and a wireless link to a remote PC. It has been used to test vision-based algorithms such as HIS and HISDMA.