Improved UI

Today I improved the UI that controls the robot and that displays the map.
It now looks like:

It supports zooming the map by scrolling the mouse wheel or by dragging the slider on the left panel, and you pan the map simply by dragging it.
It works much like Google Maps.
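The zoom-under-cursor behavior boils down to a little coordinate math: scale the view, then shift the pan offset so the world point under the mouse stays put. A minimal stand-alone sketch (all names here are hypothetical, not the actual UI code):

```java
// Sketch of zoom-toward-cursor math (hypothetical names, not the real UI code).
// A screen point relates to a world point as: screen = pan + world * scale,
// so the world point under the cursor is world = (screen - pan) / scale.
// After changing the scale we solve for a new pan that keeps that point fixed.
public class MapViewport {
    double scale = 1.0;            // pixels per world unit
    double panX = 0.0, panY = 0.0; // screen offset of the world origin

    // wheelTicks > 0 zooms in; each tick scales by 10 %
    public void zoomAt(double mouseX, double mouseY, int wheelTicks) {
        double factor = Math.pow(1.1, wheelTicks);
        double worldX = (mouseX - panX) / scale;
        double worldY = (mouseY - panY) / scale;
        scale *= factor;
        panX = mouseX - worldX * scale; // keep the point under the cursor fixed
        panY = mouseY - worldY * scale;
    }

    public double[] toScreen(double worldX, double worldY) {
        return new double[] { panX + worldX * scale, panY + worldY * scale };
    }
}
```

Hooking this up to a mouse-wheel listener is then just a matter of calling zoomAt with the event's position and wheel rotation.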

The left panel has also been restructured, and you can now easily choose between USB and Bluetooth and select which NXT to connect to.

I also refactored the code so that it is a little better structured :)

/Peter F


A working prototype!

We are happy to announce that we now have a working prototype. Our robot can scan its surroundings and return data to a server which in turn visualizes it.

This was our first attempt ever at creating a real map, and the results aren't very impressive. Fortunately we've already managed to improve on it quite a bit, although there is still very much left to be done.

A simple AI steers the robot around so that it can scan. Because of bugs in the current release of leJOS we have been unable to use the compass sensor, and as a consequence accuracy has suffered. This is something we hope will improve fairly soon.

Also, the AI is so far very simple, but thanks to the behavior-based programming we're using it will be relatively easy to keep extending what we have until, some day, we have an algorithm that can tackle most problems with reasonable accuracy.
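The core of behavior-based programming is a fixed priority order: every behavior says whether it wants control, and the highest-priority one that does gets to act. The real leJOS API has a Behavior interface and an Arbitrator class for this; the self-contained sketch below is only a simplified stand-in to illustrate the idea:

```java
// Simplified stand-in for leJOS-style behavior-based (subsumption) control.
// The real leJOS API has its own Behavior interface and Arbitrator class;
// this sketch only illustrates the priority mechanism.
interface Behavior {
    boolean takeControl(); // does this behavior want to run right now?
    String action();       // what it does when it wins (returns a label here)
}

public class MiniArbitrator {
    private final Behavior[] behaviors; // index 0 = lowest priority

    public MiniArbitrator(Behavior[] behaviors) {
        this.behaviors = behaviors;
    }

    // One arbitration step: the highest-priority behavior that
    // wants control wins and its action is performed.
    public String step() {
        for (int i = behaviors.length - 1; i >= 0; i--) {
            if (behaviors[i].takeControl()) {
                return behaviors[i].action();
            }
        }
        return "idle";
    }
}
```

On the robot, action() would drive the motors; adding a smarter exploration behavior later is then just one more entry in the array, on top of the default drive-forward one.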

The bar to the left is a control bar which shows some data from the robot and also has options for what you want painted. The labels are pretty self-evident, except for "Bumps", which marks the points where a behavior has taken over from the default one of driving forward in a straight line. Blue indicates that the ultrasonic sensor in front has detected an obstacle, and red that the robot has crashed into something with its bumper.

Today's arena was a (roughly) 3 by 1.5 meter rectangle, and you can clearly see the shape of the area, although it's far from perfect. Considering how early in the project we are, we're quite satisfied. In fact, we're following our schedule pretty closely: where we are now is roughly where we planned to be by the end of next week.

Now we will continue to work mainly on the AI, to give it an intelligent algorithm instead of the almost random one we have now. Naturally we will keep improving the other aspects as well, so stay tuned!

/Josef and Peter


New Robot

As promised, we have constructed a new robot that will probably be the final one, or at least very similar to the final product. It is based on the Explorer robot from Nxtprograms.

Our requirements for the physical design were that the robot has to be able to rotate around its own axis (a requirement for the leJOS navigation classes) and that the IR sensor has to have a 360 degree field of view. A bonus with this model compared to others we have considered is that the IR sensor sits directly on a motor. In earlier versions we had to convert between the rotation of the motor and that of the sensor, something we no longer have to worry about.
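For comparison, a motor-to-sensor conversion like the one just described boils down to dividing the motor's tacho count by the gear ratio (the 1:3 ratio below is a made-up example):

```java
// Converting motor rotation to sensor heading through a gear train.
// Illustrative only: the 1:3 ratio is a made-up example value.
public class SensorGearing {
    static final double GEAR_RATIO = 3.0; // motor turns per sensor turn

    // Motor tacho count (degrees) -> sensor angle normalized to [0, 360)
    static double sensorAngle(double motorDegrees) {
        double angle = (motorDegrees / GEAR_RATIO) % 360.0;
        return angle < 0 ? angle + 360.0 : angle;
    }
}
```

With the sensor mounted directly on the motor, the ratio is 1:1 and the whole conversion (and its rounding errors) disappears.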

Aside from replacing the sensor on top with our IR sensor, we have added a compass sensor and fitted the ultrasonic sensor in front instead. We also slightly repositioned the top motor and sensor so that their rotational axis is between the wheels, something that will simplify calculating the positions of obstacles.
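Putting the rotation axis between the wheels means the sensor effectively shares the robot's position, so an obstacle's position follows directly from the robot pose, the sensor angle and the measured distance. A sketch of that calculation (names and units are ours, chosen for illustration):

```java
// Sketch: obstacle position from robot pose + sensor reading.
// Because the sensor's rotation axis sits between the wheels, the sensor
// position equals the robot position and no mounting offset is needed.
public class ObstacleMapper {
    // heading and sensorAngle in degrees, distance in cm (units illustrative)
    static double[] obstaclePosition(double robotX, double robotY,
                                     double headingDeg, double sensorAngleDeg,
                                     double distance) {
        double angle = Math.toRadians(headingDeg + sensorAngleDeg);
        return new double[] {
            robotX + distance * Math.cos(angle),
            robotY + distance * Math.sin(angle)
        };
    }
}
```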

Today we have also run several tests to determine how accurate the navigation classes are, and the results are somewhat disappointing. For some reason, navigation with the help of the compass sensor is actually less accurate than a simple tacho navigator that only uses the rotation of the wheels. Also, bugs in both classes severely limit the number of methods we can use while maintaining reasonable accuracy. For now we have settled on using Forward(), Backward() and Rotate(), which are accurate enough for our needs. Hopefully we (or leJOS, which is still in beta after all) will solve some of the problems so we won't be limited to these forever.
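Even restricted to forward/backward moves and in-place rotation, the pose can still be tracked by simple dead reckoning. A stand-alone sketch of the bookkeeping (our own code, not the leJOS navigator classes):

```java
// Dead-reckoning pose tracker for a robot restricted to forward/backward
// moves and in-place rotation (a sketch, not the leJOS navigator classes).
public class PoseTracker {
    double x, y;       // position
    double headingDeg; // 0 = along +x axis, counter-clockwise positive

    void rotate(double degrees) {
        headingDeg = (headingDeg + degrees) % 360.0;
    }

    void forward(double distance) {
        double h = Math.toRadians(headingDeg);
        x += distance * Math.cos(h);
        y += distance * Math.sin(h);
    }

    void backward(double distance) {
        forward(-distance);
    }
}
```

The weakness of this approach is exactly what the tests showed: every wheel slip goes straight into the pose and the error only ever accumulates, which is why a working compass navigator would still be welcome.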

/Josef and Peter


Communication framework and a broken NXT

When I first bought the NXT, the sound from its speaker was not good at all. At first I thought it was supposed to be like that: it's just Lego, and probably a cheap speaker. Then, just a couple of days later, there was no sound from it at all, so a couple of weeks ago I sent it in to be repaired/replaced, and now the new one has arrived :) That's why we haven't been able to build and write much lately.

PenemuNXTFramework 0.1

Anyhow, what I have done while the NXT was away is write a communication framework that acts as a middle layer between the existing communication classes in leJOS and the programmer. It is based on a set of interfaces that you implement, and it then handles a lot of the work for you.

Some key features are:
Queue: Lets you set up a queue of data sets to send. This allows data to be sent when the NXT has time, for example when it is busy with other work at the moment.
Priority: Gives priority to a set of data, meaning it will be sent before all other items in the queue and processed first at the receiver. Good for shutdown commands.
Consistent: The syntax and classes used are exactly the same on the NXT as on the PC.
Choose type: When you set up the communication you specify whether to use USB or Bluetooth. You only specify it once and don't have to change anything else.
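The queue-plus-priority idea can be illustrated with a standard priority queue: normal items go out in FIFO order, while priority items jump ahead of all of them. A minimal stand-alone sketch (not the framework's real classes):

```java
import java.util.Comparator;
import java.util.PriorityQueue;
import java.util.concurrent.atomic.AtomicLong;

// Minimal sketch of a prioritized send queue (not the framework's real
// classes): normal items keep FIFO order, priority items jump ahead.
public class SendQueue<T> {
    private static class Entry<T> {
        final T data; final boolean priority; final long seq;
        Entry(T data, boolean priority, long seq) {
            this.data = data; this.priority = priority; this.seq = seq;
        }
    }

    private final AtomicLong counter = new AtomicLong();
    private final PriorityQueue<Entry<T>> queue = new PriorityQueue<>(
        Comparator.<Entry<T>>comparingInt(e -> e.priority ? 0 : 1)
                  .thenComparingLong(e -> e.seq));

    public void send(T data) {
        queue.add(new Entry<>(data, false, counter.getAndIncrement()));
    }

    public void sendPriority(T data) {
        queue.add(new Entry<>(data, true, counter.getAndIncrement()));
    }

    // Called whenever the NXT has time to transmit the next item;
    // returns null when the queue is empty.
    public T next() {
        Entry<T> e = queue.poll();
        return e == null ? null : e.data;
    }
}
```

The sequence number is what keeps same-priority items in insertion order, since a plain priority queue makes no ordering promise for equal keys.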

This is how the base part might look:
// Set up data factories
// They produce empty instances of the data objects
NXTCommunicationDataFactories DataFactories = new NXTCommunicationDataFactories(
		new ServerMessageDataFactory(), new TiltSensorDataFactory());

// Set up ..
NXTCommunication NXTC = new NXTCommunication(true, DataFactories,
		new NXTDataStreamConnection());

// .. and start the communication

// Set up a data processor
// It will be given the incoming queue of data and handle it
ServerMessageDataProcessor SMDP = new ServerMessageDataProcessor(NXTC, DataFactories);

// Add some data to the send queue
NXTC.sendData(new TiltSensorData(TS.getXTilt(), TS.getYTilt(), TS.getZTilt()));

A data factory might look like this:
public class SensorDataFactory implements INXTCommunicationDataFactory {
	public SensorData getEmptyInstance() {
		// status flag plus empty (zero) placeholder sensor values
		return new SensorData(NXTCommunicationData.MAIN_STATUS_NORMAL, 0, 0, 0);
	}

	public INXTCommunicationData getEmptyIsShuttingDownInstance() {
		return new SensorData(NXTCommunicationData.MAIN_STATUS_SHUTTING_DOWN, 0, 0, 0);
	}

	public INXTCommunicationData getEmptyShutDownInstance() {
		return new SensorData(NXTCommunicationData.MAIN_STATUS_SHUT_DOWN, 0, 0, 0);
	}
}
And part of a data processor like this:
public void ProcessItem(INXTCommunicationData dataItem,
		NXTCommunication NXTComm) {
	SensorData SensorDataItem = (SensorData) dataItem;

	Acceleration.add(new AccelerationValues(SensorDataItem.getAccX(),
			SensorDataItem.getAccY(), SensorDataItem.getAccZ()));
}

It's not quite finished yet, but it will be soon, I hope. Either way, this will make things much easier for us when we want to send data to the computer and back.

Right now we are rebuilding the robot and hopefully we will upload some pictures later this evening.
We are basing the new model on the Explorer from NXTPrograms.com; it's much more stable than our previous construction.

/Peter Forss