Current work includes evolutionary approaches to gait generation and optimization based on both inverse and forward kinematics.
Another project investigates low-level motion control and action selection (e.g. having the robot learn which kick to perform in which situation).
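To illustrate the inverse kinematics side of gait generation, here is a minimal sketch of closed-form IK for a planar two-link leg; the link lengths and target coordinates are hypothetical and not taken from the actual robot:

```python
import math

def two_link_ik(x, y, l1, l2):
    """Closed-form IK for a planar two-link leg: returns hip and knee
    angles (radians) that place the foot at (x, y). Link lengths l1, l2
    are illustrative parameters, not the robot's real geometry."""
    d2 = x * x + y * y
    # Law of cosines gives the knee angle from the foot distance.
    cos_knee = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if abs(cos_knee) > 1.0:
        raise ValueError("target out of reach")
    knee = math.acos(cos_knee)
    # Hip angle: direction to the foot minus the offset caused by the knee bend.
    hip = math.atan2(y, x) - math.atan2(l2 * math.sin(knee),
                                        l1 + l2 * math.cos(knee))
    return hip, knee
```

A gait optimizer would then evolve foot trajectories in Cartesian space and use such a routine to convert them into joint angles.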
RobotControl is a tool used in conjunction with the code developed for the robot. It allows us to connect to the robot via WLAN and view its internal states (e.g. processed images, percepts, the world model), to record sensor data into log files, and to simulate up to 8 robots. This simplifies off-line development and debugging of our code.
The software is part of the code released by the GermanTeam each year after the RoboCup event.
The Extensible Agent Behavior Specification Language (XABSL) is an XML dialect for describing the behaviors of autonomous agents. In XABSL, an agent consists of a hierarchy of behavior modules called options that contain state machines for decision making. The options are arranged in a directed acyclic graph with basic behaviors at the leaves of the graph.
XABSL was developed for use in the RoboCup domain and was first used by Aibo Team Humboldt at the GermanOpen 2002. A largely improved version of XABSL was the basis for the behavior development of all the German universities in the GermanTeam for the GermanOpen 2003 and RoboCup 2003.
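The option/state-machine structure described above can be sketched in a few lines of Python. This is a hypothetical simplification, not the actual XABSL engine: each option is a state machine whose active state selects a subordinate option or basic behavior, with basic behaviors at the leaves producing the final action.

```python
class BasicBehavior:
    """Leaf of the option graph: would directly produce motor commands."""
    def __init__(self, name):
        self.name = name
    def execute(self, world):
        return self.name  # placeholder for a real motion request

class Option:
    """A behavior module: a state machine whose states each select a
    subordinate behavior (a much-simplified stand-in for XABSL's
    decision trees)."""
    def __init__(self, name, states, initial):
        self.name = name
        self.states = states      # state name -> (transition fn, subordinate)
        self.current = initial
    def execute(self, world):
        transition, _ = self.states[self.current]
        self.current = transition(world)        # state transition
        _, subordinate = self.states[self.current]
        return subordinate.execute(world)       # delegate down the graph

# Hypothetical example: approach the ball, kick when close enough.
go_to_ball = BasicBehavior("go-to-ball")
kick = BasicBehavior("kick")

def decide(world):
    return "kick" if world["ball_distance"] < 0.1 else "go_to_ball"

play_soccer = Option("play-soccer",
                     {"go_to_ball": (decide, go_to_ball),
                      "kick": (decide, kick)},
                     "go_to_ball")
```

Executing `play_soccer.execute({"ball_distance": 0.5})` selects the go-to-ball behavior; once the (hypothetical) ball distance drops below the threshold, the state machine switches to the kick.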
Perception and Active Vision
Active vision is currently used to verify world model hypotheses. Future work will investigate the possibilities of implementing low-level camera control based on biological examples such as the vestibulo-ocular reflex and saccades. These will be based on robot movement and low-level image information (e.g. contrast) rather than actual world model information.
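As a rough illustration of saccade control driven only by low-level image information, the sketch below picks the image tile with the highest local contrast as the next gaze target. The tiling, contrast measure, and grey-value input are all assumptions for the sake of the example:

```python
def saccade_target(image, tile=8):
    """Return the (x, y) center of the tile with the highest local
    contrast (max minus min grey value) in a 2-D grey-value image.
    A stand-in for saccade selection from low-level image cues only."""
    best, best_contrast = (0, 0), -1
    h, w = len(image), len(image[0])
    for ty in range(0, h, tile):
        for tx in range(0, w, tile):
            pixels = [image[y][x]
                      for y in range(ty, min(ty + tile, h))
                      for x in range(tx, min(tx + tile, w))]
            contrast = max(pixels) - min(pixels)
            if contrast > best_contrast:
                best_contrast = contrast
                best = (tx + tile // 2, ty + tile // 2)
    return best
```

A real implementation would additionally compensate the gaze direction for the robot's own head and body motion, in analogy to the vestibulo-ocular reflex.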
Auto-adapting Image Processing
Currently under development is an approach to image processing based on a qualitative color table that is adapted in real time using knowledge about the content of the camera images.
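One simple way such an adaptive color table could work is sketched below: pixels are classified by the nearest class reference color, and a reference is drifted toward pixels known from context to belong to that class (e.g. samples from a region believed to be field green). The color classes, reference values, and update rule are illustrative assumptions, not the actual algorithm under development:

```python
def classify(pixel, references):
    """Assign a pixel to the color class with the nearest reference
    color (squared Euclidean distance in color space)."""
    return min(references,
               key=lambda c: sum((p - r) ** 2
                                 for p, r in zip(pixel, references[c])))

def adapt(references, cls, pixel, rate=0.05):
    """Drift a class reference toward a pixel known to belong to it,
    so the table tracks changing lighting conditions."""
    references[cls] = tuple(r + rate * (p - r)
                            for p, r in zip(pixel, references[cls]))

# Hypothetical references for two RoboCup colors (RGB).
refs = {"green": (0.0, 200.0, 0.0), "orange": (255.0, 128.0, 0.0)}
```

Feeding field-green samples through `adapt` under darker lighting would gradually pull the green reference toward the observed values while classification keeps working.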