Dyalog APL now available for the Raspberry Pi!

Although the news had not yet appeared on the Dyalog webpage when this was written, the CTO blog has access to exclusive sources and is therefore able to present this scoop: The big day has finally come – Dyalog APL version 13.2 is now available to anyone with a Raspberry Pi, and can be downloaded immediately from http://packages.dyalog.com! We will of course be making official announcements via various channels over the next few days, so keep an eye on our web page – but remember that you saw it here first!

In the above clip, you can see that the Dyalog C3Pi got a little over-excited: while celebrating (and testing) the new release with an autonomous drive on my kitchen floor (in the middle of dinner preparations), the robot got a bit too close to an obstacle and dragged the on/off switch along a pillow, switching itself off in the process! A single infra-red sensor doesn’t give much information for autonomous driving, but we have placed orders for a high-definition sonar, which should arrive next week. Stay tuned for further developments!

A Full Implementation of Dyalog APL

Note that, although the Pi version of Dyalog APL is free for educational and non-commercial use, it is not technically restricted in any way – it has exactly the same features as any other 32-bit Linux-based (Unicode) version of Dyalog APL.

You are also welcome to take a look at the User Guide before installing the software – it also contains useful links to other resources that you can use to learn about Dyalog APL. If you would like to take a look at APL but do not (yet) have a Raspberry Pi, educational and non-commercial licenses are also available for Linux/x86 and Windows – and you can also try APL online at http://tryapl.org (apologies in advance if you have a tablet; good support for tablets is coming soon to TryAPL).

Fun with APL on the Raspberry Pi (without a Robot)

The official release of Dyalog APL for the Raspberry Pi now looks as if it is going to happen on Friday! In preparation for this, we have been working on some examples to demonstrate things you can do on your Pi without a set of wheels attached – like making lights blink!


Quick2Wire Interface Boards

It is possible to connect input and output devices directly to your Pi. However, we have elected to do our experiments using the interface boards from Quick2Wire – to be precise the “Port Expander Combo“, which protects your Pi from damage caused by wiring/soldering mistakes, and makes a number of interfaces more easily accessible. The photo below shows the LED bar used in the above video, attached to the combo:


Raspberry Pi with Quick2wire expansion and bar LED boards

The LED Pattern Generation Language (LEDPGL)

The Bar LED board has 8 individually controllable LEDs. APL is actually a neat language for generating Boolean patterns, for example:

      3↑1    ⍝ "3 take 1"
1 0 0
      8⍴3↑1 ⍝ "8 reshape 3 take 1"
1 0 0 1 0 0 1 0
      ¯1⌽8⍴3↑1 ⍝ "negative 1 rotate" of the above
0 1 0 0 1 0 0 1 
      ⍪¯1 ¯2 ¯3⌽¨⊂8⍴3↑1 ⍝ "columnise the neg 1, 2 and 3 rotations..."
 0 1 0 0 1 0 0 1
 1 0 1 0 0 1 0 0
 0 1 0 1 0 0 1 0

Once you get into the habit of working with APL, you quickly learn to create small, functional “Embedded Domain Specific Notations” to work with your data. The LEDPGL namespace contains a small “pattern generation language”, which allows expressions like:

       LEDPGL.(0 4 shift 4 / 1 0)
 1 1 1 1 0 0 0 0   0 0 0 0 1 1 1 1
       LEDPGL.(show 0 4 shift 4 / 1 0)
 ⎕⎕⎕⎕....
 ....⎕⎕⎕⎕

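As a flavour of how such a notation can be built, the shift and show functions used above might be defined along these lines – a minimal sketch for illustration, not the shipped LEDPGL code:

      shift←{⍺⌽¨⊂⍵}            ⍝ one rotation of pattern ⍵ per amount in ⍺
      show←{↑{'.⎕'[⎕IO+⍵]}¨⍵}  ⍝ render patterns: 1→⎕, 0→.
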
Finally, the namespace contains a Demo function which takes a time interval in seconds as its argument (or 0 to use the “show” method to display the patterns in the session log). This is what was used to create the video at the top of this post:

       LEDPGL.Demo 0.1
 0 4 shift 4/1 0 ⍝ Left Right
 cycle 4 4/1 0   ⍝ Barber pole
 4 cycle 8 repeat head 4 ⍝ Ripple
 binary ⍳256             ⍝ Counter
 mirror cycle head 4     ⍝ Halves-in
 6 repeat 3 repeat x∨ reverse x←mirror cycle head 4 ⍝ Half-flip
 12 repeat ¯1↓x,1↓reverse x←mirror cycle head 4     ⍝ Out-In-Out


LED Morse Code

The Quick2wire expansion board (the largest board, closest to the Pi in the first picture) has a single LED mounted, which can be controlled via GPIO. Our first example, included in the "Getting Started" guide that ships with Dyalog APL for the Pi – and also available on GitHub – is an encoder for Morse code. For example, the video below was generated by passing a character vector containing the only Morse message that most people would be able to read:

      GPIO.Morse 'SOS'
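
The heart of such an encoder is the translation of dots and dashes into LED on-times. Here is a minimal sketch of the idea, assuming a hypothetical SetLED helper in place of the real GPIO call:

      durations←{1+2×'-'=⍵}'...---...'  ⍝ dot=1 time unit, dash=3
      durations
1 1 1 3 3 3 1 1 1
      {SetLED 1 ⋄ ⎕DL ⍵×0.1 ⋄ SetLED 0 ⋄ ⎕DL 0.1}¨durations  ⍝ blink each element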

Visualising Sensor Data using APL on the Robot

As described in a recent post, our robot now has an infra-red distance sensor, which allows us to measure the distance from the front of the robot to the nearest obstacle. This sensor will be the cornerstone of the autonomous navigation code that we wish to write! In order to evaluate the performance of the sensor, we surrounded the robot with obstacles and commanded it to rotate slowly in an anti-clockwise direction, while IR data was collected 20 times per second:


Surrounded by obstacles, C3Pi rotates anti-clockwise and returns IR distance data every 0.05 seconds

Collecting the Data

We initialized the workspace by loading first the “RainPro” graphics package (which is included with Dyalog APL on the Pi), and then the robot code:

    )load rainpro                   Loads the graphics workspace
    ]load /home/pi/DyaBot           Loads the robot control code

The following function loops 300 times (once every 0.05 seconds), repeatedly collecting the value of the robot’s IRange property (which contains the current distance measured by the IR sensor). The call to the UpdateIRange method of the bot ensures that a fresh sample has just been taken (otherwise, the robot will update the value automatically every 100ms).

      ∇ r←CollectIRangeData bot;i;rc
 [1]    r←⍬
 [2]    :For i :In ⍳300
 [3]        :If 0=1⊃rc←bot.UpdateIRange
 [4]            r←r,bot.IRange
 [5]            ⎕DL 0.05 ⍝ Wait 1/20 sec
 [6]        :Else
 [7]            ∘∘∘ ⍝ Intentional error if update fails
 [8]        :EndIf
 [9]    :EndFor
      ∇

We can now perform the experiment as follows:

      iBot←⎕NEW DyaBot ⍬           ⍝ Instance of the robot class
      iBot.Speed←40 0              ⍝ 40% power on right wheel only 
      ⍴r←CollectIRangeData iBot    ⍝ Check shape of collected values 
300
      iBot.Speed←0                 ⍝ Let the robot rest its batteries
      1⍕10↑r                       ⍝ First 10 observations to 1 decimal 
15.3 12.8 13.7 12.7 13.7 11.4 9.9 11.1 10.4 9.4

Now that we have the data, the following function calls will create our first chart:

     data←r                         ⍝ Copy the collected observations
     ch.Set 'head' 'IR Sensor Data' ⍝ Set Chart Header
     ch.Set 'ycap' 'Distance (cm)'  ⍝     Y caption
     ch.Set 'xcap' 'Time (s)'       ⍝     X caption
     ch.Set 'xfactor'(÷0.05)        ⍝ Scale the x-axis to whole seconds
     ch.Plot data                   ⍝ Create the chart
     '/home/pi/irline.svg' svg.PS ch.Close ⍝ Render it to SVG

Raw IR sensor data plotted against time

Removing the Noise with a Moving Average

The chart above suggests that the robot performed a complete rotation every 5 seconds or so with just under 100 observations per cycle. The signal seems quite noisy, so some very simple smoothing would probably make it easier to understand. The following APL function calculates a moving average for this purpose – it does this by creating moving sums with a window size given by the left argument, and dividing these sums by the window size:

       movavg←{(⍺ +/ ⍵) ÷ ⍺}  ⍝ Define the function 
       3 movavg 1 2 3 4 5     ⍝ Test it
 2 3 4

We can re-use the existing chart settings and plot smoothed data as follows:

      #.ch.Plot 7 movavg data      
      '/home/pi/irline.svg' svg.PS ch.Close

IR sensor data smoothed with a 7-sample moving average

Making Sense of It

The pattern is now nice and clear – but how does the map compare to the territory? We can use a “Polar” chart of the distance to see how the measured distances compare to reality:

     ∇ filename PolarDistance data;⎕PATH;mat;deg;window;smoothed;angles;movavg
[1]   ⍝ Polar IR Distance plot - "data" is one cycle of observations
[2]   ⍝ Note the -ve right margin to get the chart off-centre!
[3]
[4]    ⎕PATH←'#.ch'      ⍝ Using RainPro ch namespace
[5]    window←7          ⍝ Smoothing window size
[6]    deg←⎕UCS 176      ⍝ Degree symbol
[7]    movavg←{(⍺+/⍵)÷⍺} ⍝ Moving average with window size ⍺
[8]
[9]    angles←360×(⍳⍴data)÷⍴data ⍝ All the way round
[10]   smoothed←window movavg data,(¯1+window)↑data
[11]   data←(⌈window÷2)⌽data     ⍝ Rotate data so centre of window is aligned with moving average
[12]
[13]   Set'head' 'Infra-Red;Distance;Measurement'
[14]   Set('hstyle' 'left')('mleft' 12)('mright' ¯60)
[15]   Set'footer' 'Distance measured;every 0.05 seconds;while 3Pi was rotating'
[16]   Set'style' 'lines,curves,xyplot,time,grid,hollow'
[17]   Set'lines' 'solid'
[18]   Set'nib' 'medium,broad'
[19]   Set('yr' 0 60)('ytick' 10)
[20]   Set('xr' 0 360)('xtick' 15)('xpic'('000',deg))
[21]   Set('key' 'Measured' '7 MovAvg')('ks' 'middle,left,vert')
[22]
[23]   Polar angles,data,⍪smoothed
[24]   filename svg.PS Close  
     ∇

To align the chart with the picture, we need to:

  1. Extract the first 99 observations – corresponding to one rotation
  2. Reverse the order of the data, because the robot was rotating anti-clockwise
  3. Finally, rotate the data by 34 samples to align it with the photograph (the recording started with the robot in the position shown on the photograph)

We can do these three operations using the expression on the next line, and then pass this as an argument to the PolarDistance function, which creates another SVG file:

     onerotation←¯34⌽⌽99↑r
     '/home/pi/irpolar.svg' PolarDistance onerotation

Polar chart of the measured and smoothed IR distance data for one rotation

If we compare the red line to the picture we started with, and take into account the fact that the robot was rotating quite fast and the IR sensor probably needs a little time to stabilise, it looks quite reasonable. The accuracy isn’t great, but with a little smoothing it does seem that we should be able to stop little C3Pi from bumping into too many things!

Stay tuned for videos showing some autonomous driving, and the code to do it…


C3Pi Opens Eyes at the APL Moot

Sam Gutsell, Shaquil Sidiki and James Greeley (aka “Three Blind Mice”) at the APL Moot at the YHA in the Lee Valley

This weekend, the Dyalog C3Pi reached the final stop on the European spring tour, attending the British APL Association’s Annual General Meeting and “Moot” just north of London, where the robot met the famous mice from Optima. On the Thursday before, the C3Pi also travelled to an OSHUG meeting in London, where Romilly Cocking was talking about quick2link. Alas, poor C3Pi was confined to a cardboard box due to problems with wiring up its new “eyes”:

Sharp "GP2Y0A21YK0F" IR sensors attached to the Dyalog C3Pi

Resistance is Futile!

Asimov’s third law of robotics states that a robot must protect its own existence. Thanks to the addition of a SHARP infrared distance measuring sensor, our robot is now capable of at least not running head first into walls or other obstructions (and if the obstruction is a spectator, we’re also providing some support for the first law)!


C3Pi obeying the First and Third Laws of Robotics

Connecting the Sensor to the Raspberry Pi

The sensor is attached to an analog input pin on the Arduino (we picked pin #0). Our Arduino command interpreter, which allows APL on the Raspberry Pi to use the Arduino as a “controller” for analog and digital I/O, was extended with an “Analog Read” command consisting of the letter “a” followed by a byte giving the pin number and a dummy pad byte to ensure that the command length is 3 bytes (the fixed length simplifies the interpreter). Thus, if APL transmits (97 0 0) to address 4 on the I2C bus and then issues an I2C read command, it will receive a string containing the current voltage measured (up to 5v, in 1024ths) – for example “a0:410;” if the input is about 2v. We elected to include a confirmation of the pin number in the result, and separators which will allow us to send several sensor input values in a single string as we add more sensors to the robot.
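
In APL, the exchange might look something like this – a sketch, assuming I2C.Write and I2C.Read as names for the interface functions (the actual names in the I2C namespace described below may differ):

      I2C.Write 4 (97 0 0)              ⍝ 'a'=97: read analog pin 0, pad byte
      r←I2C.Read 4                      ⍝ reply such as 'a0:410;'
      {5×1024÷⍨⍎¯1↓(⍵⍳':')↓⍵}'a0:410;'  ⍝ parse a reply and scale to volts
2.001953125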

In the APLBot GitHub repository, the DyaBot class has been extended to run a background thread which updates the value of a new property called “IRange” every 100ms (a public method UpdateIRange can be called at any time to refresh the value). The input voltage is converted to a distance in centimetres, using the tables from the sensor datasheet. The next blog post will illustrate some of the data that we are now able to collect from the sensor.
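
A conversion of this kind can be sketched as linear interpolation over a small calibration table. The sample points below are illustrative only – they are not the actual datasheet values used by DyaBot:

      VoltsToCm←{                       ⍝ convert one voltage reading to cm
          vt←0.4 0.45 0.5 0.6 0.75 0.9 1.3 1.65 2.3 3.1 ⍝ volts, ascending
          dt←80 70 60 50 40 30 25 20 15 10              ⍝ corresponding cm
          i←1⌈(¯1+⍴vt)⌊+/vt≤⍵                           ⍝ table interval for ⍵
          dt[i]+(dt[i+1]-dt[i])×(⍵-vt[i])÷vt[i+1]-vt[i] ⍝ interpolate
      }
      VoltsToCm 2
17.30769231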

The DrivePi game code also runs a background thread which monitors the value of IRange, and stops the wheels if the measured distance drops below 20cm – but this code still needs some testing and tweaking.
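
The idea of that monitoring thread is simple enough to sketch (assumed code for illustration – not the shipped DrivePi thread):

      ∇ Guard bot
 [1]   ⍝ Stop the wheels when an obstacle is nearer than 20cm
 [2]    :Repeat
 [3]        :If bot.IRange<20
 [4]            bot.Speed←0 0
 [5]        :EndIf
 [6]        ⎕DL 0.1 ⍝ Check 10 times per second
 [7]    :EndRepeat
      ∇

It could be started in the background against an instance with Guard&iBot.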

Accelerometer and Gyro on the Way

My experience to date is that controlling the robot is a little bit tricky, because the output of the wheels seems quite variable (it changes from day to day, and from hour to hour). Determining the position of the robot using “dead reckoning” based on the voltage applied to the wheels seems unlikely to succeed. We either need to find some more reliable wheels – or find some sensors that can help us understand what is going on.

I have been able to lay my hands on an MPU-6050 motion tracking device from InvenSense. We’ll be trying to wire this up over the next couple of weeks and see whether it allows us to accurately track the motion of the robot. We will also soon take delivery of an ultra-sonic sonar mounted on a servo, so we can start measuring longer distances accurately (the IR sensor is really only suitable for collision avoidance). Once these are wired up, all we have to do is write “a little” APL code to do the localization and mapping!

Driving my Pi

Back from the APL meeting in Hamburg, where my C3Pi made its first appearance on German soil (and a few days of meetings with APL users in Milan). I’ve extended the control program which was used to make last week’s figure-8 video to give me some “hand controls”. I do appreciate that it is inadvisable for humans to control motor vehicles directly, but in the privacy of my own home I have risked a little careful driving:

The APL Code

You can see the latest code at https://github.com/APLPi/APLBot. However, beware that significant changes will be made next week – both to the APL code and the Arduino / I2C interface layers. We will be adding support for our first sensor – an Infra-Red sensor that will allow us to measure the distance to the nearest obstacle (in front of the ‘bot) – and this requires some extensions.

The APL code currently consists of three files: I2C.dyalog, DyaBot.dyalog and DrivePi.dyalog. These need to be placed in the same folder on your C3Pi.

The I2C.dyalog file relies on the library libi2c-com.so, which will be installed along with Dyalog APL for the Pi when this becomes available.

The final piece of the puzzle is the control program for the Arduino. That’s in APLPi/I2CArduinoComm. Install it by following the relevant part of Jason’s instructions on building your own C3Pi.

The three APL code files implement the following layers:

I2C: A namespace which loads libi2c-com.so and makes four interface functions available in the active workspace. Strictly speaking, this file belongs in the libi2c-com repo, and will move there when I have time to co-ordinate rewriting some test cases with Liam.
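
Bindings of this kind are typically made with ⎕NA; schematically, something like the line below (the exported name and signature here are placeholders – they are not the actual libi2c-com.so exports):

      'I2C_Open'⎕NA'I libi2c-com.so|i2c_open I'  ⍝ hypothetical export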

DyaBot: A class, built upon I2C, which allows APL applications to control the robot by setting a field named “Speed” to a 2-element vector containing values between ¯100 (full reverse) and 100 (full speed ahead) for each of the 2 wheels.

DrivePi: A namespace, built upon DyaBot, containing a function Run which provides a very simple “game interface” for driving the C3Pi, and a function Play which accepts 3-column matrices of (Right Wheel Speed, Left Wheel Speed, Duration) records, which are “played back”.
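
For example, a short scripted drive could be played back as follows (the speeds and durations are illustrative):

      route←3 3⍴60 60 2 60 ¯60 1 0 0 0  ⍝ ahead for 2s, spin for 1s, stop
      DrivePi.Play route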

What next?

The ultimate goal of our project is to write some code which allows the C3Pi to perform Simultaneous Localization and Mapping (SLAM): the robot should be able to examine its surroundings, build a map, and accept high-level instructions to move from point to point, avoiding obstructions. As mentioned above, we’ll be adding the first collision-avoidance sensor next week. Once that is done, I’ll be back to discuss the other sensors that we are planning to add to the C3Pi in the coming weeks. I hope that some of you will join us in building your own robots and helping write the software!

APL – Coming to a Robot Near You

APL on the C3Pi

One of the biggest joys of being the CTO of Dyalog is that you get to write code to help evaluate prototypes of proposed language features, or new tools and tutorials that we are planning to make available. This past week has been one of the most exciting for a very long time, because it started with Jason and Liam presenting me with the first fully-assembled Dyalog-built C3Pi clone!

The first Dyalog-built C3Pi clone

Jason Rivers has been constructing the hardware, and Liam Flanagan is responsible for the port of Dyalog APL to the Raspberry Pi, which will be the seat of the “brains” of the robot. We have been talking to the folks at Quick2Wire over the winter about this project, primarily with Romilly Cocking, who is one of those select individuals who can boast that he has already achieved the dream of every programmer.

This week, all the bits came together:
• Jason finished assembling the first C3Pi mockup (we are building 4 units to play with, while we wait for the finished item from Quick2Wire).
• The CTO purchased and charged two sets of six AA batteries.
• Dyalog APL version 13.2 (Unicode edition only) passes our tests on the Pi, except for an issue with denormal floating-point numbers caused by a known issue in the Linux kernel. We aim to officially release a free non-commercial version for the Raspberry Pi by mid-May; contact us for an advance copy if you can’t wait.
• Liam has built a shared library which interfaces APL to the I2C bus, making life a little easier for an APL developer trying to use I2C. We use I2C to talk to the Arduino, which controls up to 11 outputs and 6 inputs, if I am reading the specification correctly. At the moment, we use three outputs for each of the two motors: one analog output to control the speed and two digital outputs to activate the motor going either forwards or back.
• Liam has also constructed some temporary code we are using while we wait for the updated Quick2Link code to become available: this is a “command interpreter” running on the Arduino which accepts 3-byte commands (pin type, pin number, output value) over the I2C interface, providing a very easy mechanism for APL to drive the Arduino outputs – as sketched below.
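
As an illustration, APL might set up one motor with three such commands (the pin-type codes, pin numbers and the I2C.Write name are all assumptions for illustration):

      I2C.Write 4 (1 3 200)  ⍝ analog pin 3: speed 200 (of 255)
      I2C.Write 4 (0 4 1)    ⍝ digital pin 4: forward line on
      I2C.Write 4 (0 5 0)    ⍝ digital pin 5: reverse line off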

Now, that’s a pretty exciting package to find on your desk on a Monday morning! OK, Monday afternoon, to be honest – the morning was spent with Fiona Smith, our first full-time Technical Writer – another person who will soon be contributing to our “Pi” project. Needless to say, I shortened every meeting for the rest of the week by 5 minutes, and thus managed to find the time to write some APL code capable of driving the robot (press “play” below).

We’ll be keeping the various bits of code up-to-date in repositories under https://github.com/APLPi. If you think you might want to build your own hardware and play along with us, Jason will soon be starting his own series of blog posts about how to build a C3Pi. I will write a bit more about the APL code, and the longer term goals of this project, in my next post.