Lab 1 - Agents
Note: For this lab, you can work together in teams of up to 4 students, although 2-3 is probably ideal. You can use Piazza to find teammates and discuss problems.
You will need a Java Development Kit (JDK) and a Java IDE (a text editor should do as well).
Problem Description
Implement the control program for a vacuum cleaner agent. The environment is a rectangular grid of cells. Some cells may contain dirt. The agent is located somewhere in this grid and facing in one of the four directions: north, south, east or west. The agent can execute the following actions:
- TURN_ON: This action initialises the robot and has to be executed first.
- TURN_RIGHT, TURN_LEFT: lets the robot rotate 90° clockwise or counter-clockwise, respectively
- GO: lets the agent attempt to move to the next cell in the direction it is currently facing.
- SUCK: sucks up the dirt in the current cell
- TURN_OFF: turns the robot off. Once turned off, it can only be turned on again after emptying the dust-container manually.
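If it helps to pin down this vocabulary in code, the action set can be written as a simple Java enum. This is purely illustrative; the provided framework may represent actions differently (for example as plain strings), so check the project sources before relying on these names.

    // Illustrative only: the lab framework may use strings or its own
    // constants for actions; the names below simply mirror the list above.
    enum VacuumAction {
        TURN_ON, TURN_RIGHT, TURN_LEFT, GO, SUCK, TURN_OFF
    }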
The robot is equipped with a dust sensor and a touch sensor. If there is dirt at the current location of the robot, the agent will sense “DIRT”. If the robot bumps into an obstacle or wall, the agent will sense “BUMP”. The goal is to clean all cells and return to the initial location before turning the robot off. Note that a full charge of the robot's battery will only last for a limited number of actions.
To make this a bit easier you can use the following assumptions:
- The room is rectangular (not necessarily square). It has only 4 straight walls that meet at right angles. There are no obstacles in the room. That is, the strategy “Go until you bump into a wall, then turn right and repeat” will make the agent walk straight to a wall and then around the room along the wall (a minimal code sketch of this rule follows the list).
- The room is fairly small, so that 100-200 actions are enough to visit every cell, suck all the dirt and return home given a halfway decent algorithm.
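To illustrate the “go until you bump, then turn right” rule from the first assumption above, the core decision could look like the following sketch. It assumes that percepts and actions are plain strings named as in the problem description, which may not match the provided framework exactly, and it is not a complete agent: it never turns the robot on or off and never decides that it is finished.

    import java.util.Collection;

    // Sketch of the naive "go until bump, then turn right" rule. Percept and
    // action names are assumed to be plain strings; adjust them to whatever
    // the provided framework actually uses. Not a complete agent.
    class NaiveWallFollower {
        String nextAction(Collection<String> percepts) {
            if (percepts.contains("DIRT")) {
                return "SUCK";        // clean the current cell first
            }
            if (percepts.contains("BUMP")) {
                return "TURN_RIGHT";  // hit a wall: rotate and continue along it
            }
            return "GO";              // otherwise keep moving forward
        }
    }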
Tasks
- Characterise the environment (is it static or dynamic, deterministic or stochastic, …) according to all 6 properties mentioned on slide 13 (Agents) or section 2.3.2 in the book.
- Develop a strategy for the agent such that it cleans every cell and outline the agent function.
- Implement the missing parts of the vacuum cleaner Java program (see below) such that it encodes your agent function.
- Test your program with all three provided environments. Record the number of steps it takes to finish each environment and the resulting points.
- Is your agent rational? Justify your answer.
Material
The Java project for the lab can be found on Skel; see the instructions below on how to get it.
The ZIP archive contains code for implementing an agent in the src directory. The agent is actually a server process which listens on some port and waits for the real robot or a simulator to send a message. It will then reply with the next action the robot is supposed to execute.
The zip file also contains the description of three example environments (vacuumcleaner*.gdl) and a simulator (gamecontroller-gui.jar). To test your agent:
- Start the simulator (run gamecontroller-gui.jar either by double-clicking it or with “java -jar gamecontroller-gui.jar” on the command line).
- Set up the simulator as shown in the accompanying picture.
- If there is a second role called “RANDOM” in the game, leave it as “Random”.
- Run the “Main” class in the project. If you added your own agent class, make sure it is used in the main method of Main.java. You can also execute “ant run” on the command line if you have Ant installed.
- The output of the agent should say “NanoHTTPD is listening on port 4001”, which indicates that your agent is ready and waiting for messages to arrive on the specified port.
- Now push the “Start” button in the simulator and your agent should get some messages and reply with the actions it wants to execute. At the end, the output of the simulator tells you how many points your agent got: “Game over! results: 0”. In the given environment you will only get non-zero points if you manage to clean everything, return to the initial location, and turn off the robot within 100 steps. If the output of the simulator contains any line starting with “SEVERE”, something is wrong. The two most common problems are a broken network connection between the simulator and the agent (e.g., due to a firewall) and the agent sending illegal moves.
- The output of the simulator shows the true state of the environment (which your agent cannot see). Use that for debugging, e.g., to check whether you managed to return to the home position.
You can see here what the example environment looks like. Of course, you shouldn't assume any fixed room size, initial location, or dirt locations in your implementation; this is just an example environment.
Hints
For implementing your agent:
- Add a new class that implements the “Agent” interface. Look at RandomAgent.java to see how this is done.
- You have to implement the method “nextAction” which gets a collection of percepts as input and has to return the next action the agent is supposed to execute.
- Before you start programming a complicated strategy, think about it. The things your agent has to do are:
- execute TURN_ON
- visit every cell and suck up any dirt it finds on the way
- return to the initial location
- TURN_OFF
- For this, your agent needs an internal model of the world. Figure out what you need to remember about the current state of the world and the agent (see the skeleton sketched after this list).
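To make these hints concrete, a rough skeleton of such an agent class is sketched below. The “Agent” interface shown here is only an assumed shape; the real interface, the percept representation and the action names are defined in the provided src directory (RandomAgent.java shows how it is implemented), so adapt the signature accordingly. The fields are merely examples of what an internal model could keep track of.

    import java.util.Collection;

    // Assumed shape of the provided interface -- check the real declaration
    // in the lab's src directory; RandomAgent.java shows how it is used.
    interface Agent {
        String nextAction(Collection<String> percepts);
    }

    // Hedged skeleton of an agent with an internal world model. The exact
    // percept strings and action names must match the provided framework.
    class MyVacuumAgent implements Agent {

        // Internal model: examples of state worth remembering.
        private int x = 0, y = 0;              // position relative to the start cell
        private int heading = 0;               // 0..3, direction relative to the start
        private boolean turnedOn = false;      // TURN_ON has to be the very first action
        private boolean returningHome = false; // switch to the "go home" phase when done

        @Override
        public String nextAction(Collection<String> percepts) {
            if (!turnedOn) {
                turnedOn = true;
                return "TURN_ON";
            }
            if (percepts.contains("DIRT")) {
                return "SUCK";                 // always clean the cell we are on
            }
            // TODO: choose GO / TURN_LEFT / TURN_RIGHT from the model, update
            // x, y and heading after each successful move, detect walls via
            // "BUMP", and return "TURN_OFF" once everything is clean and the
            // agent is back at (0, 0).
            return "GO";
        }
    }

Keeping position and heading relative to the start cell makes the last step simple: returning home just means driving (x, y) back to (0, 0) before turning off.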
As a general hint: “Days of programming can save you minutes of thinking.” Think of a strategy, the rules to implement it and the information you need to decide on the actions before you start implementing it.
Handing In
Connect to skel.ru.is using your favorite ssh client and unpack the assignment into your home directory by running the following commands:
[student14@skel ~]$ tar xvf /labs/arti16/lab1/lab1.tar
[student14@skel ~]$ cd arti16/lab1
[student14@skel hw1]$ ls
answers  build.xml  dist  Makefile  questions  src
You can copy the code to your own machine for development by using any SCP or SFTP client (e.g., WinSCP). However, you need to copy it back to skel into the same place before handing in.
To answer the questions, run “make answers” while in the directory containing the Makefile. The individual answers will be put into files called answers/answerX.Y.txt, which you can also edit.
Finally, to hand in your answers, run “make handin” while in the directory containing the Makefile.
This should produce a file “/labs/art16/.handin/lab1/student14/handin.tar.gz”. You can check if it exists using the ls command.