Lab 2: Programming in LEGO Mindstorms NXT


We tested the sensors and motors in the context of a semester thesis. Afterwards we designed and programmed our own robot that explores an unknown room and stores a map of this room in the Targa (TGA) image format. The semester thesis is divided into the following subsections, reporting on the experiences and knowledge gained during the work.

The first step on our list of tasks was to get a profound understanding of the NXT's sensors and motors. Of primary interest was the ultrasonic sensor, the only one that uses digital communication via its own built-in micro-controller. Despite its generally high accuracy, the sensor seems to have some weak points when measuring certain distances. The remaining sensors were tested as well, although not as extensively as the ultrasonic one; the results are shown in the following subsections.

In the end we took a closer look at the revised motors, which now include rotation sensors. The following parts of the NXT kit were tested as part of the semester thesis. To get an idea of the ultrasonic sensor's accuracy and its rounding behavior, we examined the sensor's behavior towards objects at small distances. The object we used was a small cardboard box. Distances smaller than 3 cm cannot be measured.

The biggest deviance of 2. The sensor's mean deviance of 0. The second test's purpose was to gain knowledge of the sensor's field of vision. The same box as in the first test was used; it was moved to different distances and angles relative to the ultrasonic sensor while noting the resulting distance readouts. The sensor was placed in a horizontal as well as in a vertical position (Figure 1). The results show that the ultrasonic sensor should always be placed in a horizontal position; other positions decrease both the field of vision and the sighting distance of the sensor.

The sensor seems to be a bit 'blind' on the left eye, which can be explained by the fact that the left eye is actually the receiver of the ultrasonic wave while the right eye is the sender.

After these experiments on the static behavior of the ultrasonic sensor we moved on to some dynamic tests. The diagram in Figure 1 shows the results. The data for this diagram was gathered by writing a program in the LEGO software (based on LabVIEW) that stored the current ultrasonic value, along with the current angle of one of the moving motors (we used the TriBot model), into a file on the NXT brick.

This file was then downloaded and processed. The dynamic test revealed two weaknesses of the ultrasonic sensor. The first issue is that there are some areas where the sensor tends to measure a wrong value instead of the actual distance. The second, even more important, issue is the critical area between 25 cm and 50 cm, where the sensor has a high probability of returning the wrong value of 48 cm.

The primary goal concerning the light sensor was to see to what extent it is able to distinguish different colors. The measurements were done in the sensor's reflected-light mode. The results show that the sensor readout depends on the distance to the measured object, which makes it difficult to generally assign certain colors to sensor values. The touch sensor was examined with regard to the force needed to close the touch circuit. Adding weights to the vertically positioned sensor showed that at 34 grams the sensor appears as touched.

So in general a force of about 0.3 N (corresponding to the 34 g weight) is needed to close the touch circuit. The motor was tested for linearity between its power setting and rotation speed. A very rudimentary Perl script that can download a file from the brick was written and is available in the Thesis Results section.

The Perl script uses the Win… As part of the thesis, we had to plan and conduct a project. The goal of the project was to build a fancy robot that uses the capabilities of the new "LEGO Mindstorms NXT" generation to the limits of the included sensors and motors.

We decided to go for a map-building robot that can explore any given room, hence the name Explorer. The project is divided into the following subsections. The walls of the unknown room should be aligned at a more or less regular angle to each other, as concluded by J. Map building, motion planning, collision avoidance and localization were implemented separately and in the given order, but the localization function was omitted because of lack of time.

The program was elaborated through the following stages: Explorer 1, Explorer 2 and Explorer 3. A robot deemed fit for the requirements was built.

The robot then travels forward 5 cm, repeats the first step, and so on. Motion planning, collision avoidance, start positioning: the main idea here was to explore the room in a clockwise direction by always following the wall on its left side, as sketched below. Once the robot reaches the position it started from, the program stops.
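A rough sketch of this exploration loop is shown below. The Explorer itself was programmed with the graphical LEGO software, so this Ada-style rendering is purely illustrative and every subprogram name in it is hypothetical.

    --  Illustrative sketch only; none of these subprograms exist in the thesis code.
    procedure Explore is
       Desired_Cm : constant Integer := 15;   --  assumed target distance to the left wall
    begin
       loop
          --  Keep roughly a constant distance to the wall on the left.
          if Left_Wall_Distance_Cm > Desired_Cm then
             Turn_Slightly_Left;
          elsif Left_Wall_Distance_Cm < Desired_Cm then
             Turn_Slightly_Right;
          end if;

          --  Advance a short, fixed distance and repeat.
          Drive_Forward_Cm (5);

          --  Stop once the robot is back at its start position (one clockwise round).
          exit when At_Start_Position;
       end loop;
    end Explore;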

The result we achieved with our Explorer project was satisfying. It got worse, though, the lower the batteries got. The thesis and its PowerPoint summary are written in German and are available for download. Further below are the source code and binary of the Explorer project, followed by the Perl script used to communicate with the NXT brick.


See also the lab introduction slides. In this assignment, you will learn how to do basic real-time programming on an embedded device with a runtime that supports real-time tasking. The device runs an Ada runtime system based on the Ravenscar Small Footprint Profile.

Solve this assignment in your groups. The lab should be done in groups of 3 people, or in exceptional cases in groups of 2 people. Submissions by a single student will normally not be accepted. All students participating in the group shall be able to describe all parts of the solution. Each group receives a box with the LEGO kit; the box includes all the necessary parts for solving the assignment, and the group is responsible for handing the package back at the end of the course.

Solutions have to be submitted via the student portal; the deadline for submission is September 26th. No submissions will be accepted after this point. Hand-ins with non-indented code will be discarded without further consideration. The last part of this assignment consists of building a robot car that can follow a car on a track (see below).

All groups have to show that their car is able to complete the tour. Please sign up for the demonstration times on this Doodle poll.

The NXT brick can be used to control actuators, such as an integrated sound generator, lights, and motors, and to read input from various sensors, such as light sensors, pressure sensors, rotation sensors, and distance sensors. The Ravenscar Small Footprint Profile (SFP) supports a subset of the original Ada language suitable for predictable execution of real-time tasks in memory-constrained embedded systems.

The Ada runtime system it uses is very small! In this lab, we will not use a real-time operating system, only a runtime system supporting Ada tasking and scheduling features. Ada programs run in RAM, so after turning off the robot the program will be gone and you need to upload it again for the next run. Unfortunately there is no proper API documentation for this driver library.

The way to learn programming with these drivers is to check the driver specifications in their respective .ads files. You can find some packages of drivers and example code as part of the getting started session below. The compilation toolchain first compiles the Ada file into an ARM binary and then generates the whole system's binary by merging the driver binaries with it. This includes definitions of all tasks, resources, event objects, etc.

All software necessary to work with Ada and the NXT platform is installed on the Windows machines in the lab. This includes software for flashing the firmware, compiling programs, and uploading them.

Cygwin is a shell program which emulates a Unix environment inside Windows. In order to compile Ada NXT programs, all you need to do is have an appropriate makefile in the current directory.

It is recommended that you use a different subdirectory for each part of the assignment. For compiling, use the "make all" command. The compiler will compile all the required drivers and at the end will generate a compressed file with the same name as the main procedure (no extension in Windows).

In order to start with the lab, you first need to change a setting in the original firmware of the robot. Now put it into reset mode by pressing the reset button at the back of the NXT (upper left corner, beneath the USB connector) for more than 5 seconds. The brick will start ticking shortly after.

This means your robot is ready for uploading the code into its RAM. In this lab, the robot will always be in reset mode when you upload a program, as the code of the previous run cannot remain in RAM after the robot is turned off. The original firmware can be flashed back with the help of a TA; please do this before handing back the box. A successful upload will show something similar to the following (the address may be different): Image started at 0xc…

Now you can disconnect the robot and try testing it. You can turn off the robot by pressing the middle orange button. If you would like to work at home, you can install the compilation and upload toolchain yourself. Since this depends heavily on your setup, we can't give you any direct support. However, installation on Windows is fairly simple, and instructions for Windows and Linux installations can be found in the instruction file. The program you will write is a simple "hello world!" program.

Note that the main procedure is assigned the lowest priority by using the attribute 'First, which yields the first value of a range (here Priority'First). The body of the main procedure simply calls the procedure Background, the main procedure of the package Tasks; a small sketch of this structure is shown below.
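As an illustration, a minimal main procedure along these lines could look as follows. The package Tasks and its procedure Background come from the lab skeleton described above; the with-clauses and file layout are assumptions.

    with System;
    with Tasks;   --  package providing the Background procedure

    procedure Main is
       pragma Priority (System.Priority'First);   --  lowest priority for the environment task
    begin
       Tasks.Background;   --  hand control to the background procedure; it never returns
    end Main;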

In essence, your code should initialize the light sensor, read it periodically, and print the measured values on the display. Light sensors are a bit tricky to initialize. Try the different procedures of the nxt-display package to master output on the display. For more advanced display output you can use the nxt-display-concurrent package from facilities. Make sure your code compiles without errors and executes as desired on the NXT brick.
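A minimal sketch of such a background loop is given below. The driver package and subprogram names (NXT.Light_Sensors, NXT.Display, Initialize, Read_Value, Put, Newline) are only assumptions made for illustration; check the actual .ads specifications of the driver library for the real names and parameter profiles.

    --  Sketch only: every NXT.* name below is an assumption, not the documented API.
    with Ada.Real_Time;      use Ada.Real_Time;
    with NXT.Display;        --  assumed, see nxt-display.ads
    with NXT.Light_Sensors;  --  assumed driver package for the light sensor

    package body Tasks is

       procedure Background is
          Next  : Time := Clock;
          Value : Integer;
       begin
          NXT.Light_Sensors.Initialize;   --  light sensors need explicit set-up (assumed call)
          loop
             Value := NXT.Light_Sensors.Read_Value;   --  assumed read routine
             NXT.Display.Put ("light: ");             --  assumed display routines
             NXT.Display.Put (Value);
             NXT.Display.Newline;

             Next := Next + Milliseconds (200);
             delay until Next;   --  Ravenscar allows only absolute delays
          end loop;
       end Background;

    end Tasks;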

Try to measure the light values of different surfaces (light ones, dark ones, ...).

In this part, you will learn how to program event-driven schedules with the NXT. The target application will be a LEGO car that drives forward as long as you press a touch sensor and it senses a table underneath its wheels with the help of a light sensor.

For this purpose, build a LEGO car that can drive on wheels. You may find inspiration in the manual included in the LEGO box. Further, connect a touch sensor to one of the sensor inputs using a standard sensor cable. Ideally, events generated by external sources are detected by interrupt service routines (ISRs). This allows the system to react immediately to signals from various sources. Unfortunately, most of the sensors on the NXT work in polling mode: they need to be asked for their state again and again, instead of becoming active themselves when something interesting happens.

Our workaround for this is to create a small second task that checks the sensors periodically, about every 10 ms. If the state of a sensor has changed, it generates the appropriate event for us.

This protected object can be used by different tasks to communicate with each other; for example, a task can block on receiving an event. In order to do this, declare and implement a task EventdispatcherTask. It should call the appropriate API function to read the touch sensor and compare the value to its old state. One possible shape of the Event protected object, with its Integer event-data field and its Signalled flag, is sketched below.
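The following is one plausible reconstruction of this protected object, matching the Integer event-data field and the Signalled flag mentioned above; the skeleton handed out in the lab is authoritative if the names differ.

    protected Event is
       procedure Signal (Current_Data : Integer);   --  store the data and release a waiting task
       entry Wait (Received_Data : out Integer);    --  block until the event has been signalled
    private
       Data      : Integer := 0;        --  event data declaration
       Signalled : Boolean := False;    --  barrier variable
    end Event;

    protected body Event is
       procedure Signal (Current_Data : Integer) is
       begin
          Data      := Current_Data;
          Signalled := True;
       end Signal;

       entry Wait (Received_Data : out Integer) when Signalled is
       begin
          Received_Data := Data;
          Signalled     := False;   --  consume the event
       end Wait;
    end Event;

Note that this shape also respects the Ravenscar restrictions: at most one entry per protected object and a barrier that is a single Boolean variable.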

A variable that keeps its value between loop iterations may be useful for that. If the state changed, the task should release the corresponding event by using the Signal procedure of the Event protected object. Just as in part 1, put your code in an infinite loop with a delay at the end of the loop body. As suggested by the names of the events, the idea is that they should occur as soon as the user presses and releases the attached touch sensor.
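Under the same assumptions, the dispatcher task could be structured roughly as follows. The touch sensor call, the event identifiers and the priority value are placeholders; only the 10 ms polling pattern and the use of Event.Signal follow the text.

    --  Assumes: with System; with Ada.Real_Time; use Ada.Real_Time;
    Press_Event   : constant Integer := 1;   --  placeholder event identifiers carried
    Release_Event : constant Integer := 2;   --  as the Integer event data

    task EventdispatcherTask is
       pragma Priority (System.Priority'First + 1);   --  placeholder; keep it below MotorcontrolTask
    end EventdispatcherTask;

    task body EventdispatcherTask is
       Period       : constant Time_Span := Milliseconds (10);   --  poll about every 10 ms
       Next         : Time    := Clock;
       Pressed_Now  : Boolean;
       Pressed_Last : Boolean := False;   --  state remembered between iterations
    begin
       loop
          Pressed_Now := Touch_Sensor_Is_Pressed;   --  stand-in for the real driver call
          if Pressed_Now and not Pressed_Last then
             Event.Signal (Press_Event);            --  sensor went down
          elsif Pressed_Last and not Pressed_Now then
             Event.Signal (Release_Event);          --  sensor went up
          end if;
          Pressed_Last := Pressed_Now;

          Next := Next + Period;
          delay until Next;
       end loop;
    end EventdispatcherTask;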

In order for MotorcontrolTask to have priority over EventdispatcherTask, make sure to assign a lower priority to the latter. Otherwise, the infinite loop containing the sensor reading would just make the system completely busy and it could never react to the generated events.
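With the same placeholder names, the motor-control side then simply blocks on the event object and switches the motors accordingly; Start_Driving and Stop_Driving stand in for whatever motor driver calls you end up using.

    task MotorcontrolTask is
       pragma Priority (System.Priority'First + 2);   --  placeholder, higher than the dispatcher
    end MotorcontrolTask;

    task body MotorcontrolTask is
       Received : Integer;
    begin
       loop
          Event.Wait (Received);             --  block here until some event is signalled
          if Received = Press_Event then
             Start_Driving;                  --  hypothetical helper: switch the motors on
          elsif Received = Release_Event then
             Stop_Driving;                   --  hypothetical helper: stop the motors
          end if;
       end loop;
    end MotorcontrolTask;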

Further, add some nice status output on the LCD. This should complete your basic event-driven program. Compile and upload the program and check whether the car reacts to your commands. Now attach a light sensor to your car, somewhere in front of the wheel axis, close to the ground, pointing downwards.

Extend the program to also react to this light sensor. The car should stop not only when the touch sensor is released, but also when the light sensor detects that the car is very close to the edge of a table.

You may need to play a little bit with the "Hello World!" program from part 1 to find suitable light sensor values. The car should only start moving again when the car is back on the table and the touch sensor is pressed again. The edge detection should happen in EventdispatcherTask and be communicated to MotorcontrolTask via the event protected object. Use two new events for that purpose.

Make sure you define and use all events properly. Further, the display should provide some useful information about the state of the car. Please hand in only the source of the full second program that includes the light sensor code.

Make sure you include brief explanations and that your source is well-commented. Note that hand-ins without meaningful comments will be discarded directly.

Real-time schedulers usually schedule most of their tasks periodically. This usually fits the application: sensor data needs to be read periodically, and reactions in control loops are also calculated periodically and depend on a constant sampling period.

Another advantage over purely event-driven scheduling is that the system becomes much more predictable, since load bursts are avoided and very sophisticated techniques exist to analyze periodic schedules. You will learn about response-time analysis later during the course.
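The usual way to obtain such a constant sampling period under the Ravenscar profile is a task that advances an absolute release time with delay until, as in the sketch below; the task name, the 50 ms period and the Do_Periodic_Work helper are placeholders.

    --  Assumes: with Ada.Real_Time; use Ada.Real_Time;
    task body PeriodicTask is
       Period : constant Time_Span := Milliseconds (50);   --  placeholder period
       Next   : Time := Clock;
    begin
       loop
          Do_Periodic_Work;        --  placeholder for the sampling and control code
          Next := Next + Period;   --  advance by a constant period, so the schedule does not drift
          delay until Next;        --  Ravenscar permits only absolute delays
       end loop;
    end PeriodicTask;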

The target application in this part will make your car keep a constant distance to some given object in front of it. Additionally, the touch sensor is used to tell the car to move backwards a little bit in order to approach the object again. Note that this is a new program again, so for now, do not just extend the program from the event-driven assignment part.
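As an illustration of what such a periodic task could compute in each step, here is a very rough proportional controller for the distance-keeping behaviour. The target distance, the gain and the driver calls Read_Distance_Cm and Set_Motor_Power are assumptions, not part of the lab material.

    --  One possible body for the Do_Periodic_Work placeholder sketched above.
    procedure Do_Periodic_Work is
       Target_Cm : constant Integer := 20;   --  desired distance to the object (placeholder)
       Gain      : constant Integer := 5;    --  proportional gain, to be tuned on the real car
       Error     : Integer;
       Power     : Integer;
    begin
       Error := Read_Distance_Cm - Target_Cm;   --  positive: too far away, negative: too close
       Power := Gain * Error;

       if Power > 100 then                      --  clamp to the motor power range
          Power := 100;
       elsif Power < -100 then
          Power := -100;
       end if;

       Set_Motor_Power (Power);   --  forward when too far away, backward when too close
    end Do_Periodic_Work;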