AI and communication

We still don't have anything new to show you right now, but that doesn't mean we're slacking. We've started working on two different things that together should enable us to create a working prototype for our mapping robot.

I (Josef) am working on an AI for the robot so it can navigate through the room while scanning. I tried to create my own navigation class that would calculate the position of the robot based only on the rotation of the wheels, but for some reason it isn't working. The algorithm that calculates the angle it's facing seems to work, but only for certain values. It looks like a rounding error somewhere, but I don't know where.

public double getRobotangle() {
    // Change in heading since the last call, read from the compass sensor
    robotnewangle = robotangle - Math.toRadians(CS.getDegreesCartesian());
    robotangle = Math.toRadians(CS.getDegreesCartesian());

    // How far each wheel has turned since the last call (radians)
    leftwheelangle = Math.toRadians(Motor.A.getTachoCount()) - leftwheeloldangle;
    rightwheelangle = Math.toRadians(Motor.B.getTachoCount()) - rightwheeloldangle;

    // Distance travelled by each wheel: arc length = radius * angle
    leftdist = (wheeldiameter / 2) * leftwheelangle;
    rightdist = (wheeldiameter / 2) * rightwheelangle;

    return robotangle;
}

The algorithm that calculates its position doesn't seem to work at all, even if I use values from the compass sensor; for some reason it never returns any data.

public Point getRobotpos() {
    float x, y, hypotenuse;

    // Straight-line distance covered since the last update
    hypotenuse = (float) Math.sqrt(2 * Math.pow(getRobotAverageDist() / robotnewangle, 2));

    // Move the stored position along the current heading
    x = (float) (robotpos.x + Math.cos(getRobotAverageangle()) * hypotenuse);
    y = (float) (robotpos.y + Math.sin(getRobotAverageangle()) * hypotenuse);

    robotpos.x = x;
    robotpos.y = y;

    return robotpos;
}

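For reference, the textbook differential-drive odometry update we're aiming for looks roughly like this. It's a plain-Java sketch, not our actual class: trackwidth (the distance between the wheels) is a parameter the snippets above don't show, and the names are placeholders.

```java
public class OdometrySketch {
    // Robot state: position in the plane plus heading in radians
    double x, y, heading;
    final double trackwidth;  // distance between the wheels (assumed known)

    OdometrySketch(double trackwidth) {
        this.trackwidth = trackwidth;
    }

    // leftdist/rightdist: distance each wheel travelled since the last
    // update, e.g. (wheeldiameter / 2) * wheel rotation in radians
    void update(double leftdist, double rightdist) {
        double dist = (leftdist + rightdist) / 2;            // distance the robot centre moved
        double dtheta = (rightdist - leftdist) / trackwidth; // change in heading

        // Move along the average heading over the step, then turn
        x += dist * Math.cos(heading + dtheta / 2);
        y += dist * Math.sin(heading + dtheta / 2);
        heading += dtheta;
    }
}
```

Driving both wheels the same distance moves the robot straight along its heading, while equal and opposite wheel distances spin it in place without moving the centre.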
Feel free to check them both out on our Google Code site and share tips if you have any. Right now the code is a bit unstructured, though. The algorithm that calculates the angle based on the rotation of the wheels is commented out right now in favor of a similar method that uses the compass sensor instead.

When my own class didn't work I was forced (for the time being, anyway) to use the navigation class in leJOS, which unfortunately limits me to using only the methods provided by that class to navigate the robot. I'd rather be able to manipulate the motors at will, for example to follow a wall easily.

I'm using behavior programming for my work, a really smart way of creating AIs, since it makes it so easy to implement, edit or remove different parts of the behavior. I recommend reading the leJOS tutorial about it if you're interested.
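If you haven't seen the pattern before, here is a plain-Java sketch of the idea. The interface mirrors leJOS's Behavior interface (takeControl/action/suppress), but the arbitrate method and the example behaviors are just illustrations of how an arbitrator picks behaviors, not the real leJOS API.

```java
// Sketch of the subsumption pattern behind behavior programming.
// The interface mirrors leJOS's Behavior; the rest is illustrative.
interface Behavior {
    boolean takeControl();  // does this behavior want to run right now?
    void action();          // what it does when it has control
    void suppress();        // called when a higher-priority behavior takes over
}

public class SubsumptionSketch {
    // Pick the highest-priority behavior that wants control.
    // (In leJOS, the Arbitrator class does this in a loop.)
    static Behavior arbitrate(Behavior[] byRisingPriority) {
        for (int i = byRisingPriority.length - 1; i >= 0; i--) {
            if (byRisingPriority[i].takeControl()) {
                return byRisingPriority[i];
            }
        }
        return null;
    }

    public static void main(String[] args) {
        // Hypothetical sensor reading: true means we bumped into a wall
        final boolean[] bumped = { true };

        Behavior driveForward = new Behavior() {
            public boolean takeControl() { return true; }  // always willing
            public void action()   { System.out.println("drive forward"); }
            public void suppress() { /* stop the motors */ }
        };
        Behavior backOff = new Behavior() {
            public boolean takeControl() { return bumped[0]; }
            public void action()   { System.out.println("back off"); bumped[0] = false; }
            public void suppress() { }
        };

        // Later entries have higher priority, as in leJOS's Arbitrator
        Behavior[] behaviors = { driveForward, backOff };

        arbitrate(behaviors).action();  // bumped, so backOff wins
        arbitrate(behaviors).action();  // bump cleared, driveForward runs
    }
}
```

The nice part is exactly what makes it easy to edit: adding or removing a behavior is just changing the array, without touching the other behaviors.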

Peter is working to improve our communication, which, as you could see in our movie, already works, but could be made a lot smoother and more structured. The idea of the new communication class is that you add data that you want to send to a queue. The class then processes one item at a time and sends it over either USB or Bluetooth (depending on what you want to use). The receiver adds each received item to a list, and you can then process them whenever you want. There is one client (NXT) part and one server (PC) part to this.
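A minimal sketch of that queue idea in plain Java: the sender enqueues items, they are drained one at a time, and the receiver collects them into a list for later processing. The class and method names are placeholders, and a shared in-memory queue stands in for the actual USB/Bluetooth transport Peter is writing.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Sketch of the planned communication class. Items are queued on the
// sending side, processed one at a time, and collected into a list on
// the receiving side. The in-memory queue here stands in for the
// USB/Bluetooth stream.
public class CommSketch {
    private final BlockingQueue<String> outbox = new LinkedBlockingQueue<>();
    private final List<String> received = new ArrayList<>();

    // Client side (would run on the NXT): just enqueue and return
    public void send(String item) {
        outbox.add(item);
    }

    // Server side (would run on the PC): drain one item at a time
    public void processPending() {
        String item;
        while ((item = outbox.poll()) != null) {
            received.add(item);  // real code would read from the stream here
        }
    }

    public List<String> getReceived() {
        return received;
    }

    public static void main(String[] args) {
        CommSketch comm = new CommSketch();
        comm.send("scan:distance=42");
        comm.send("pos:x=1,y=2");
        comm.processPending();
        System.out.println(comm.getReceived());  // both items, in order
    }
}
```

The point of the queue is that the sender never blocks on the link: scanning can keep producing data while the communication class works through the backlog at its own pace.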

Hopefully, when we're done with these things, or at least have something that works, we can combine them with the scanning algorithm we already have and, with small tweaks to the graphical interface, get a crude prototype that can move around, scan a room, and paint it up on a computer screen in real time.

