2009-11-12

Latest improvements

We've made some improvements to the project. I'll list them below:
  • Renaming of the projects
    • PenemuNXTExplorer
    • PenemuNXTExplorerClient
    • PenemuNXTExplorerServer

    • PenemuNXTFramework
    • PenemuNXTFrameworkClient
    • PenemuNXTFrameworkServer
  • Rearranged UI
    • New colors, logo, menu, margins...
  • Timeline
    • We can now "play back" saved data and jump to specific positions in it
  • Save and open
    • Save and open maps as files (*.penemunxtmap and *.penemunxtmapxml)
  • Filters
    • A function that filters out irrelevant data points and draws lines between the remaining valid ones (see the sketch after this list)
  • View raw data
    • A table showing the raw data (sensor values from the robot) that the map is based on
  • Hotspots
    • Marks places on the map where we have accurate data with large red circles and less accurate data with small green circles
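
To give a rough idea of what the filter does, here is a minimal sketch of a neighbor-comparison outlier filter on distance readings. The class name and threshold are assumptions for illustration, not the actual PenemuNXT code:

    import java.util.ArrayList;
    import java.util.List;

    // Hypothetical sketch of the filtering idea: drop readings that
    // disagree with both of their neighbors before the map is drawn.
    public class DistanceFilter {

        // Maximum plausible jump (in cm) between consecutive readings;
        // an assumed value, not taken from the real application.
        private static final int MAX_JUMP = 25;

        public static List<Integer> filter(List<Integer> readings) {
            List<Integer> result = new ArrayList<Integer>();
            for (int i = 0; i < readings.size(); i++) {
                int current = readings.get(i);
                boolean agreesWithPrevious = i == 0
                        || Math.abs(current - readings.get(i - 1)) <= MAX_JUMP;
                boolean agreesWithNext = i == readings.size() - 1
                        || Math.abs(current - readings.get(i + 1)) <= MAX_JUMP;
                // Keep a reading only if at least one neighbor backs it up;
                // lone spikes are treated as irrelevant data and dropped.
                if (agreesWithPrevious || agreesWithNext) {
                    result.add(current);
                }
            }
            return result;
        }
    }
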
Screenshots

[Screenshots of the updated application]

/Peter F

2009-11-10

Improvements and show at the school

We've made loads of improvements over the last few days. The reason is that tomorrow we will show the robot live @ our school for the first time. It's only a demo of the prototype.

Screenshots

[Screenshots of the improved prototype]

I will write a more detailed description of the improvements tomorrow.

If you want to see us demo the robot you should be @ Berzeliusskolan tomorrow (11/11) @ 18.00.

/Peter F

2009-11-05

Save and Open map data

I've now implemented save and open functionality. This enables us to save the raw data (positions, headings, distances, etc.) as an XML file that we can later open and import into the application.
This is a useful function in many ways. We can now work on the server functionality even when we don't have the NXT with us (for example when we are at school): we just use saved data and let the server app process it instead of data retrieved over Bluetooth.

For now all the data is imported at once, but I will implement an emulation feature that adds the saved data with the same time delays as when it was recorded. This will let us play back exactly what happened. I think I will implement this by adding one more connection type: instead of choosing between USB and Bluetooth you will also be given the choice File, and instead of passing the name of the NXT you pass the path to a file containing the raw values.
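
As a rough sketch of what such a File connection type could look like, here is a minimal replay loop. The class name, record format and handler interface below are hypothetical illustrations, not the actual PenemuNXT code:

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;

    // Hypothetical sketch of a "File" data source that replays saved raw
    // values with the same delays as when they were recorded. Assumes one
    // record per line: "<timestampMillis>;<rawValues...>".
    public class FileDataSource {

        private final String filePath;

        public FileDataSource(String filePath) {
            this.filePath = filePath;
        }

        public void replay(DataHandler handler)
                throws IOException, InterruptedException {
            BufferedReader reader = new BufferedReader(new FileReader(filePath));
            try {
                long previousTimestamp = -1;
                String line;
                while ((line = reader.readLine()) != null) {
                    int separator = line.indexOf(';');
                    long timestamp = Long.parseLong(line.substring(0, separator));
                    String rawValues = line.substring(separator + 1);
                    // Sleep for the recorded gap so the server sees the
                    // data at the same pace as during the live run.
                    if (previousTimestamp >= 0) {
                        Thread.sleep(timestamp - previousTimestamp);
                    }
                    previousTimestamp = timestamp;
                    handler.handle(rawValues);
                }
            } finally {
                reader.close();
            }
        }

        // Stand-in for whatever the server app uses to consume raw data.
        public interface DataHandler {
            void handle(String rawValues);
        }
    }
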

One of our goals is also to publish the data to a webserver so that a client can connect and see the map being created in real time. The export function is one step towards that.

This is what the new control panel looks like:

Intelligent A.I.

So far we have put all our work into making a prototype that can achieve our basic goal, and we have more or less succeeded. However, the AI that controls the robot has so far been pretty stupid, to say the least. Simply put, it has driven in a straight line until it gets too close to an obstacle and then turned. Now we have started working on something that can manage a little more than that.

The strategy we've planned for the robot is the following:

  • Start by either scanning around or simply driving forward until it approaches an obstacle.
  • Follow the outline of that obstacle until the robot reaches a position it has already visited (see the sketch after this list).
  • If the first obstacle was an object the robot has driven around, let the robot simply drive in another direction and find a new obstacle. If it was the boundary of the area (such as the walls of a room), the robot will begin scanning the remaining unexplored area until it is sure it has found all objects.
  • Eventually we will hopefully implement something that uses the map it has created, such as pathfinding or covering the area efficiently (something that could be of use for, e.g., automated vacuum cleaners).
This is still a very rough draft and will most likely be subject to significant changes, but it's a start nonetheless.
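
To make the "position it has already visited" check in the second point concrete, here is one way it could be done, sketched with a coarse grid of visited cells. The class and the 10 cm resolution are assumptions for illustration, not our actual implementation:

    import java.util.HashSet;
    import java.util.Set;

    // Hypothetical sketch: remember visited positions on a coarse grid so
    // the robot can tell when it has come back to where it has been before.
    public class VisitedMap {

        // Cell size in cm; an assumed resolution, coarse enough to
        // absorb small odometry errors.
        private static final double CELL_SIZE = 10.0;

        private final Set<Long> visited = new HashSet<Long>();

        // Pack the cell coordinates into one long to use as a set key.
        private long cellKey(double x, double y) {
            long cx = (long) Math.floor(x / CELL_SIZE);
            long cy = (long) Math.floor(y / CELL_SIZE);
            return (cx << 32) ^ (cy & 0xffffffffL);
        }

        // Marks the cell as visited; returns true if the robot had
        // already been in this cell before the call.
        public boolean visit(double x, double y) {
            return !visited.add(cellKey(x, y));
        }
    }
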

Right now we're on the second point: we're working on an algorithm that will follow the curves and turns of a wall or the side of an object. The first iteration will consist of the behavior we already have (turn left when too close to something), a behavior to align the robot parallel to the wall, and a behavior to detect when the wall turns away from the robot. If done right, this should be enough to work in the vast majority of cases. A sketch of how such behaviors can be combined follows below.
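
Here is a minimal sketch of how two of these behaviors could be combined with the leJOS NXJ subsumption API (Behavior and Arbitrator). The sensor port, motor wiring and distance threshold are assumptions, and the align-parallel behavior would slot in as a third Behavior:

    import lejos.nxt.Motor;
    import lejos.nxt.SensorPort;
    import lejos.nxt.UltrasonicSensor;
    import lejos.robotics.subsumption.Arbitrator;
    import lejos.robotics.subsumption.Behavior;

    // Hypothetical sketch of the wall-following behaviors using the
    // leJOS NXJ subsumption API; ports and thresholds are assumed.
    public class WallFollower {

        static UltrasonicSensor sonar = new UltrasonicSensor(SensorPort.S1);
        static final int TOO_CLOSE = 20; // cm, assumed threshold

        // Default behavior: drive forward along the wall.
        static Behavior driveForward = new Behavior() {
            private volatile boolean suppressed;
            public boolean takeControl() {
                return true; // lowest priority, always ready to run
            }
            public void action() {
                suppressed = false;
                Motor.A.forward();
                Motor.B.forward();
                // Keep driving until a higher-priority behavior takes over.
                while (!suppressed) {
                    Thread.yield();
                }
                Motor.A.stop();
                Motor.B.stop();
            }
            public void suppress() {
                suppressed = true;
            }
        };

        // Higher priority: turn left when too close to something.
        static Behavior turnAway = new Behavior() {
            private volatile boolean suppressed;
            public boolean takeControl() {
                return sonar.getDistance() < TOO_CLOSE;
            }
            public void action() {
                suppressed = false;
                // Spin left until the obstacle is out of range.
                Motor.A.backward();
                Motor.B.forward();
                while (!suppressed && sonar.getDistance() < TOO_CLOSE) {
                    Thread.yield();
                }
                Motor.A.stop();
                Motor.B.stop();
            }
            public void suppress() {
                suppressed = true;
            }
        };

        public static void main(String[] args) {
            // The Arbitrator gives the behavior with the highest index
            // the highest priority.
            Behavior[] behaviors = { driveForward, turnAway };
            new Arbitrator(behaviors).start();
        }
    }
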