Aligning Diff Output

‘Bots are off limits this week so here is a story from this year’s Iverson College – a fantastic week spent in the company of a wonderful mixture of array and functional language gurus and newbies, all learning from each other. One evening, Dhru Patel presented a problem that he was working on which involved displaying the results of a “diff” side by side with the matched rows aligned. For example, the input might be:

      OLD NEW
┌────────┬────────┐
│This    │This    │
│is the  │original│
│original│is not  │
│text    │        │
└────────┴────────┘

Edited by Yoda, the text was. The selection of rows to be aligned requires finding the longest sequence of rows from the original data that matches rows in the edited data without ever skipping backwards through either set of data. In this case, it is the original first and third row, matching the first two edited rows. We will return to how we might identify these rows in a future blog entry (meanwhile, try to grok John Scholes’ YouTube video on Depth First Searching and think about it). For now, we will provide this information as a left argument, a vector of Boolean vectors that marks the location of the matched rows:

      (1 0 1 0)(1 1 0) AlignMatched OLD NEW
┌────────┬────────┐
│This    │This    │
│is the  │        │
│original│original│
│text    │        │
│        │is not  │
└────────┴────────┘

At Iverson College there was general agreement that “there should be a non-looping solution”, and Devon McCormick immediately stated that he would bet that it involved grade (⍋). Let us explore:

      masks←(1 0 1 0)(1 1 0)
      ⎕←matched←∊masks ⍝ The two match masks catenated together
 1 0 1 0 1 1 0 
      ⎕←origin←(≢¨OLD NEW)/0 1 ⍝ 0 for items from the old array, 1 for new
 0 0 0 0 1 1 1 
      ⎕←block←∊+\¨masks ⍝ running count of matched rows
 1 1 2 2 1 2 2   
      ⎕←data←block,(~matched),origin,OLD⍪NEW
1 0 0 This    
1 1 0 is the  
2 0 0 original
2 1 0 text    
1 0 1 This    
2 0 1 original
2 1 1 is not  

Our goal is to create an expansion mask for each argument, which will insert blank lines at the points where a non-matched row from the other argument appears. The next step is to reorder everything by ascending block number and, within each block, move the matched rows to the front (ascending by ~matched), as follows:

      ⎕←data←data[⍋data[;1 2];]
1 0 0 This    
1 0 1 This    
1 1 0 is the  
2 0 0 original
2 0 1 original
2 1 0 text    
2 1 1 is not 

This contains all the items from both texts in the order that they would need to appear in the final results. We can extract the reordered flag vectors:

       matched←~data[;2] ⋄ origin←data[;3]

Now, (origin=0) is an expansion mask that would expand OLD to match the above, and (origin=1) would do the same for NEW:

      0 1 {(origin=⍺)⍀⍵}¨OLD NEW
┌────────┬────────┐
│This    │        │
│        │This    │
│is the  │        │
│original│        │
│        │original│
│text    │        │
│        │is not  │
└────────┴────────┘

To align the matched rows, we need to eliminate inserted blanks that correspond to matched rows from the other side:

      0 1{((~matched∧origin≠⍺)/origin=⍺)⍀⍵}¨OLD NEW
┌────────┬────────┐
│This    │This    │
│is the  │        │
│original│original│
│text    │        │
│        │is not  │
└────────┴────────┘

In the final function, which collects the relevant lines of code from our experiments, we do not create the temporary “data” matrix, but reorder the origin and matched vectors individually – and instead of doing grade up on a two-column matrix, we compute an integer vector that will sort by descending matched within ascending block (because we know that grade up on a simple small-range integer vector will run like greased lightning):

 AlignMatched←{                ⍝ align matched rows of 1⊃⍵ and 2⊃⍵
     matched←∊⍺                ⍝ matches are marked by 1⊃⍺ and 2⊃⍺
     origin←(≢¨⍵)/0 1          ⍝ identify origin of items in matched (0=old, 1=new)
     block←∊+\¨⍺               ⍝ running count of matched rows
     order←⍋(2×block)-matched  ⍝ Order so matching rows are adjacent and order of
     (origin matched)←(⊂⊂order)⌷¨origin matched ⍝ items following matched row is preserved
     0 1{((~matched∧origin≠⍺)/origin=⍺)⍀⍵}¨⍵ ⍝ Expand excluding matched from "other" list
 }
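
As a quick check that the integer key really does produce the same ordering as grading the two-column matrix, we can recompute the order using the block and matched vectors from the walkthrough:

      block←1 1 2 2 1 2 2 ⋄ matched←1 0 1 0 1 1 0
      ⍋(2×block)-matched
1 5 2 3 6 4 7

These are exactly the row indices in which the rows of the “data” matrix were rearranged above.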


Reviving Lost Arts

The algorithm above makes use of techniques that were well-known in APL circles in the 1980s, but atrophied after nested arrays arrived on the scene and applications tended to keep parts of the data in separate leaves of an array rather than using simple data structures.

If you would like to read up on some of the old techniques, you might enjoy browsing the FinnAPL Idiom Library, and Bob Smith’s immortal Boolean Functions and Techniques. Although nested arrays might have made them a little less relevant in the 90s and 00s, the search for high-performance parallel solutions could bring them back, as explained in this session from Dyalog ’12, on Segmented Scans and Nested Data Parallelism by Andrzej Filinski.

The Blog is Back!

It is now 3 weeks since we shipped Dyalog version 14.0 and released the new Dyalog web site, so it’s probably time to stop celebrating and get back to work. The ‘bot batteries have been recharged and the ‘bots are learning to work as a team using v14.0 futures and isolates. That’s all I can say at this time as the ‘bots are rehearsing for a gig at the J Conference on Friday 25th July and I have been sworn to secrecy until after the show.


Bot 04 and Bot 00 hanging out in Rochester NY rehearsing for the J Conference and considering whether to return to Toronto with me for the IPSA 50th reunion on October 4th this year.

The next major step in the robot project is to make use of the tiny red board attached to Bot 00 (on the right) – an MPU-6050 accelerometer. At the Dyalog Seminar in New York last Thursday I finally had the pleasure of meeting @alexcweiner in person, and we vowed to crack this nut together; since @romillyc has promised to join in as well, failure is not an option. Stay tuned to hear more about that adventure in the weeks to come!

Welcome to The Development Team Blog

The really good news is that this blog is no longer simply “the CTO blog” but a blog that will be shared by the entire development team as well as invited guests. We look forward to sharing details of the things we are working on with you all…

APL-Controlled Robot Performs Death-Defying Stunts Using PiCam

Regular readers will remember my whining about the poor precision of both infra-red and ultrasonic sensors. But today, the Raspberry Pi / Dyalog APL-controlled “DyaBot” was observed driving on a dinner table – where the slightest navigational error could mean a 3-foot plunge and certain death! How can this be?

The answer is: the “PiCam” finally arrived last week!!!

Capturing Images with The PiCam

I no longer need to complain about sonar beams being up to 30 degrees wide: the PiCam has a resolution of up to 1080×1920 pixels! So all we need is some software to interpret the bits… First, we capture images using the “raspistill” command:

raspistill -rot 90 -h 60 -w 80 -t 0 -e bmp -o ahead.bmp

The parameters rotate the image 90 degrees (the camera is mounted “sideways”), set the size to 60 x 80 pixels (don’t need more for navigation), take the picture immediately, and store the output in a file called “ahead.bmp”. (Documentation for the camera and related commands can be found on the Raspberry Pi web page.) Despite the small number of pixels, the command takes a full second to execute – anyone who knows a way to speed up the process of taking a picture, please let me know!
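
From the APL session on the Pi, the same command can be issued by shelling out – a minimal sketch (no error handling):

      ⎕SH'raspistill -rot 90 -h 60 -w 80 -t 0 -e bmp -o ahead.bmp'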

In the video, each move takes about 3 seconds, this is simply because each cycle is triggered by a browser refresh of the page after 3 seconds. Capturing the image takes about 1 second, the “image analysis” about 40 milliseconds, so we could be driving a lot faster with a bit of Javascript on the client side (watch this space).

Extracting BitMap Data

Under Windows, Dyalog APL has a built-in object for reading BitMap files, but at the moment, the Linux version does not have an equivalent. Fortunately, extracting the data using APL is not very hard (after you finish reading about BMP files on WikiPedia):

tn←'ahead.bmp' ⎕NTIE 0     ⍝ Open "native" file
(offset hdrsize width height)←⎕NREAD tn 323 4 10 
                           ⍝ ↑↑↑ Read 4 Int32s from offset 10
data←⎕NREAD tn 80 (width×height×3) offset ⍝ Read chars (type 80)
data←⊖⎕UCS(height width 3)⍴data ⍝ Numeric matrix, reverse rows

The above gives us a 60 by 80 by 3 matrix containing (R,G,B) triplets. This code assumes that the BMP file is in the 24-bit format created by raspistill; I will extend the code to handle all BMP formats (2, 16 and 256 colours) and post it when complete – but the above will do for now.

Where’s the Edge?

At Iverson College last week, I demonstrated the DyaBot driving on a table with a green tablecloth, using code which compared the ratio between R,G,B values to an average of a sample of green pixels from one shot. Alas, I prepared the demo in the morning and, when the audience arrived, there was much more (yellowish) light coming in through the window. This changed the apparent colour so much that the bot decided that the side of the table facing the window was now unsafe, and it cowered in the darkness.

Fortunately, I was talking to a room full of very serious hackers, who sent me off to Wikipedia to learn about “kernels” as a tool for image processing. Armed with a suitable edge detection kernel, I was able to test this new algorithm on a dozen shots taken at different angles with the PiCam. Each pair of images below has the original on the left, and edges coloured red on the right. Notice that, although the colour of the table varies a lot when viewed from different directions – especially when there is background glare – the edges are always correctly identified:

Edge Detection Examples

Except for the image at the bottom left, where the bot is so close to the edge that the table is not visible at all (and the edge is the opposite wall of the room, ten feet away), we seem to have a reliable tool.

The PiServer Page

The code to drive the bot is embedded in a “PiServer” (MiServer running on a Raspberry Pi) web page. Each refresh of the page takes a new picture, extracts the bits, and calls the main decision-making function. The suggested action (turn or drive straight ahead) is displayed in the form, and the user has the choice of pressing OK to execute the command, or pressing the “Nah” button to take a new photo and try again (after moving the bot, changing the lighting in the room, or editing the code). There are also four “manual” buttons for moving the Bot. After testing the decision-making abilities of the code, brave programmers press the “Auto” button, allowing the robot to drive itself without waiting for confirmation before each command (see the video at the start of this post)!

Robot Views Table at Dyalog House

The Code

The central decision-making function and the kernel computation function are listed below. The full code will be uploaded to the MiServer repository on GitHub, once it is finally adjusted after I find a way to attach the PiCam properly, rather than sticking it to the front of the Bot with a band-aid!

∇ r←StayOnTable rgb;rows;cols;table;sectors;good;size;edges
 ⍝ Based on input from PiCam, drive at random, staying on table
 ⍝ Return vector containing degrees to turn and 
 ⍝               #seconds to drive before next analysis

 (rows cols)←size←2↑⍴rgb
 ⍝ First detect edges in each colour separately
 edges←EdgeDetectAll∘AK¨⊂[1 2]rgb
 ⍝ Call it an edge if any r, g or b result is >75% of original
 edges←∨/(↑[0.5]edges)>0.75×1 1↓¯1 ¯1↓rgb
 ⍝ Look for lowest edge in each column 
 table←+⌿∧⍀~⊖edges
 ⍝ Divide into 3 equally sized sectors (left, centre, right) 
 sectors←((⍴table)⍴(⌈(⍴table)÷3)↑1)⊂table
 ⍝ Find the LOWEST edge in each sector
 good←⌊/¨sectors

 :If good∧.>15 ⍝ More than 15 pixel rows of table in all sectors
    r←0,0.1⌈(good[2]-15)÷25 ⍝ Carry straight on for a while
 :Else ⍝ Some sectors have less than 15 pixel rows
    :If 0≠1↑previous ⋄ r←previous 
        ⍝ Once started turning, keep turning the same way
    :Else 
        r←((1+>/good[1 3])⊃45 ¯45),0 ⍝ Turn in "best" direction
    :EndIf
 :EndIf
 previous←r ⍝ Remember last turn for next decision
∇
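
Before wiring it up to the buttons, the function can be exercised directly in the session. A sketch (my own framing, not the original page code): previous must exist before the first call, and rgb is the 60×80×3 matrix produced by the BMP-reading code above:

      previous←0 0        ⍝ no turn in progress yet
      r←StayOnTable rgb   ⍝ r is (degrees to turn)(seconds to drive),
                          ⍝ e.g. 0 1.2 = straight ahead, ¯45 0 = turn right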

The function AK (Apply Kernel) and the kernel are defined as follows:

∇ r←kernel AK data;shape
   shape←⍴kernel
   r←(1-shape)↓⊃+/,kernel×(¯1+⍳1↑shape)∘.⊖(¯1+⍳1↓shape)∘.⌽⊂data
∇

 EdgeDetectAll ⍝ Our 3x3 kernel
¯1 ¯1 ¯1
¯1  8 ¯1
¯1 ¯1 ¯1

Note the similarity of AK to the APL Code for Conway’s Game of Life!
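
As a quick sanity check (assuming ⎕IO←1), applying the kernel to an image containing a single bright pixel hands back the kernel itself (strictly, its 180-degree rotation – the same thing for this symmetric kernel):

      EdgeDetectAll AK 5 5⍴(⍳25)=13 ⍝ a single 1 in the centre of a 5×5 image
¯1 ¯1 ¯1
¯1  8 ¯1
¯1 ¯1 ¯1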


PiServer at Thor8.dk

I’ve been wanting to run a Web Server from my home for many years. In fact, I have been paying for the domain Thor8.dk (my street address is Thorsvænget 8) and a fixed IP address for 5 years now – without getting it implemented. But today, with a little help from Jason Rivers, who set it up to start automatically on reboot, I have been able to deploy a MiServer – a web framework written entirely in Dyalog APL – on my Raspberry Pi (the one that is not embedded in the DyaBot) at the address http://thor8.dk. Please be gentle with it; it is a very small machine on a regular ISP connection with only 1Mb upload capacity!

MiServer + Raspberry Pi = PiServer

The MiServer is a project that has been evolving for the last several years, with the goal of putting web application development in easy reach of those APL developers who are not also comfortable with mainstream web technologies like Microsoft IIS / ASP.NET, Apache, PHP etc. It is a web framework written entirely by APLers, for APLers – as an open-source project which is now available at https://github.com/Dyalog/MiServer. The motto of this project is that “Anyone who is able to write an APL function should be able to host it on the web”.

The site at “thor8” is a slightly modified version of the demo site that is included with the MiServer installation. Note that one of the features of the site is that, from any page, if you click on the Viking amulet in the top left corner, you can see the source code for the page. If you’d like to learn more about the MiServer, the User Guide is also available on GitHub.

The Mortgage Example

A contributing factor to my finally getting my act together this week is that Nick Nickolov’s NGN/APL project, an APL interpreter written in CoffeeScript, sparked a thread that has been running on LinkedIn for a while, which included a discussion of alternative mechanisms for implementing web apps in APL. While I think Nick’s project is very cool, and I look forward to meeting him at Iverson College next week, I could not avoid feeling a powerful urge to demonstrate that it is now straightforward to write interactive web sites using a state-of-the-art, industrial-strength APL system. Many thanks to Brian Becker, who has done most of the work on recent features of the MiServer (and is currently working on MiServer 3.0, which aims to make the development of web pages much more similar to desktop application development), for his assistance in putting together the example. If the PiServer is still up and running you can view the example at http://thor8.dk/mortgage. Here is an image, just in case it isn’t:

PiServer Mortgage

If you modify the principal amount, interest rate or the term, the calculator will calculate a new monthly payment. If you adjust the payment to something that you can afford, the principal amount will be adjusted accordingly.

The Code

As previously mentioned, if you click on the orange amulet at the top left, the code for the page will be displayed. I’d like to explain selected lines of the code here:

:Class mortgage : MiPage

Every MiServer Page (MiPage) is a Dyalog APL class which derives from the base class MiPage. This base class adds the CSS (style sheet) and behaviour like the display of code.

Public Fields

    :Field Public event         ⍝ the triggered event
    :Field Public what          ⍝ the id of the element
    :Field Public prin←'100000' ⍝ principal field
    :Field Public rate←'4.5'    ⍝ rate
    :Field Public term←'30'     ⍝ term (years)
    :Field Public pmt←''        ⍝ payment

The main job of the MiServer is to move values arriving from the browser, in the form of fields in an HTML form or JSON data, into instance variables in your class. You need to name the fields that you want the MiServer to look for in order for this to happen (you can also retrieve other values using API function calls, but that is a longer story).

Application Code

tonum←{{(,1)≡1⊃⍵:2⊃⍵ ⋄ ''}⎕VFI ⍵} ⍝ check for a single number
calcpmt←{0::'Err' ⋄ p r n←⍵÷1 1200(÷12) ⋄ .01×⌈100×p×r÷1-(1+r)*-n}
calcprin←{0::'Err' ⋄ r n m←⍵÷1200(÷12)1 ⋄ .01×⌈100×m÷r÷1-(1+r)*-n}

These three lines of code are the actual application code – a utility to verify that the input is a single number, a function to compute the payment based on principal, rate and term, and another to calculate the principal based on the other three values.
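
As a quick check of calcpmt (assuming the functions above are in the workspace): a 100,000 loan at 4.5% over 30 years gives the familiar monthly payment, rounded up to whole cents by the ⌈ in the function:

      calcpmt 100000 4.5 30
506.69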

Defining The Form

 inputs←1 2⍴'Principal'('prin'Edit prin) 
 inputs⍪←'Interest Rate'('rate'Edit rate)
 inputs⍪←'Term (years)'('term'Edit term)
 inputs⍪←'<b>Payment</b>'('pmt'Edit pmt)
 html,←'id="mortgage"'('POST'Form)Table inputs

Within the function Render, which is called when the page is rendered, these lines of code create a 4×2 nested array containing labels in the first column and HTML edit fields in the second column. The function Table wraps an HTML table around this array, and the Form operator wraps that inside a form which will use the POST method (although in this case that never actually happens – we drive all activity off change events on the edit fields).

Wiring up the Callbacks

  selector←'#mortgage input'
  event←'change'
  returndata←'formdata' '#mortgage' 'serialize'
  html,←req #.JQ.On selector event returndata

These four lines of code set up a client-side JavaScript callback using a library called jQuery, which will post a serialised copy of the form back to the server using an “AJAX” style transaction (we call it “APLJAX”). In a larger form we might want to be more selective about what we send back, but in this case the entire serialised form is still a very small amount of data.

Updating the Form

    :CaseList 'prin' 'rate' 'term' 
       resp←('execute'('$("#pmt").val("',(⍕calcpmt p r n),'")'))
    :Case 'pmt' ⍝ payment changed, calculate principal
       resp←('execute'('$("#prin").val("',(⍕calcprin r n m),'")'))

The above lines are “the beef” in terms of the interaction: A response is constructed which will cause the value of either the pmt or the prin field to be updated, by sending a snippet of JavaScript back to the client to be executed dynamically.

We are actually working hard to eliminate the use of the above feature (the ability to execute JavaScript), because we don’t think most APLers will want to learn even this amount of JS. If you can make do with two buttons marked “Calculate Payment” and “Adjust Principal”, rather than reacting each time a field is changed, no JS is required – as is the case for most of the other MiServer sample pages. Version 3.0 of MiServer will hopefully have a mechanism to update values of selected parts of the DOM without injecting JS. However, this example should demonstrate that you can write an APL server-side application which does *anything* using this technology.

Please take it for a spin – we are happy to receive comments on this blog, or via e-mail to apltools at dyalog dot com.

The DyaBot will of course be running a PiServer too. More about that next week – the bot is also coming to Iverson College.


Comparing Sonar and InfraRed Data

Of course, it was an illusion that I would be able to get straight back to the robot after vacation; there were a few other jobs waiting, like the Dyalog’13 Conference Programme (the DyaBot will be making a couple of appearances, of course!). However, it appears I have now made enough flour to be able to remove my nose from the grindstone for a bit and spend more time with the robot, so I expect to be back to posting every week or so from now on, as we prepare to demonstrate an autonomously navigating robot at the conference!

Anyhow, here is a chart with the first recorded data from the LV-EZ4 ultrasonic sensor (sonar) – using RainPro, as demonstrated in the post on visualising sensor data. Many thanks to Stefano Lanzavecchia for help with the math required to generate the co-ordinates of a rectangle in a polar chart:

Sonar and IR Data

The robot was placed in my kitchen (the setting for several robot videos), about 80cm from the back wall, 240cm from the front, and roughly halfway between the sides. The green rectangle labelled “actual” above shows the location of the walls relative to the bot. It was commanded to rotate through 360 degrees, recording the readings from the IR (red line) and Sonar (blue) sensors, which were both pointing straight ahead. Ideally, the blue line should have traced an identical rectangle. The red line essentially traces an 80cm circle plus noise; this is because the SHARP GP2Y0A21YK0F infrared sensor has a maximum range of 80cm.
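
For the curious, here is one way the “actual” rectangle can be traced in polar form – a sketch of my own (not Stefano’s actual code), with wall distances in cm and 75cm to each side as a guess, since the bot was only “roughly halfway between the sides”:

 RectRange←{                   ⍝ ⍺: front right back left wall distances (cm)
     t←○⍵÷180                  ⍝ ⍵: bearings in degrees; convert to radians
     c←↑(2○t)(1○t)(-2○t)(-1○t) ⍝ component of each bearing towards each wall
     ⌊⌿↑⍺÷¨↓0.000001⌈c         ⍝ range to the nearest wall in each direction
 }
      240 75 80 75 RectRange 0 90 180 270
240 75 80 75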

Sonar Issues

The ultrasonic sensor has a range of at least 6-7m, and is our choice for long-range distance measurements. However, what I believe the graph shows is that we have a serious problem with reflections. Essentially, it appears that when the sonar beam hits a wall at certain angles, there is very little direct return; instead, the beam is reflected. The return signals that we are getting from roughly 60 and 315 degrees look like reflections of the back wall – these reflections may be enhanced by the fact that there is a refrigerator at 60 and a dishwasher at 315 (both metal surfaces). The signals at 135 and 220 degrees could be reflections of the robot itself, the beam having been reflected twice in a corner of the room.

Another thing that is not immediately evident from the above image, but took me by surprise when studying the data sheet, was that the sonar beam is quite a bit wider than I expected – as much as 30 degrees at close range, getting narrower as the distance increases. In other words, we have some work to do in order to generate accurate maps of the universe based on the current set of sensors.

Will we succeed? To be continued …

Return to the Robot


The Anna Elise

It has been more than a month since my last blog post, and you’d be forgiven for thinking that I’d lost interest in robots. It *is* true that I have been rather distracted by customers, including a trip to the USA. It is also holiday season, and I have been to a music festival. However, except for last week when I was skipper and this week when I am chilling out on the Anna Elise, the robot has been in my suitcase on every trip, and I have actually been working hard on it. I just haven’t been able to get anything new to work until very recently.

One problem was that robots do not travel particularly well. Put them in a suitcase every 3-4 days and bits quickly start falling off (if you have ever looked out the window of your plane and watched a baggage handling crew at work, that should be no surprise). I now have both 220v and 110v soldering irons (Thank You RadioShack!), and have so far been able to put Humpty Dumpty together again on arrival. But my last desperate attempt to collect some sonar data that I could publish before going sailing ended like this:

Rewriting the Raspberry Pi – Arduino Comms Program

The BIG reason for the lack of visible progress is that I decided that the time had come to rewrite the control program for the Arduino. The robot uses a Raspberry Pi running Dyalog APL as the main “brain”, but because a high-level language on a Pi isn’t a “real time” programming environment, we are using an Arduino to control the I/O pins. The idea is that, by implementing a very simple “front-end processor” on the Arduino, we will have a tool that APL can control at a high level, leaving timing-dependent work like controlling pulse-width-modulated output pins, which need to be continuously updated, to the Arduino.

The old control program had a number of limitations; most importantly, it was only really possible to monitor a single input pin. This was good enough until the sonar was added to the infra-red sensor. The program also contained hard-coded information about how pins were being used. Finally, it was quite simply a badly-written piece of Arduino C, subject to intermittent timing-related glitches because the original authors were unaware of certain limitations of the Arduino.

So I decided to write about 5 pages of new C code, and it took me a month and a bit to get back to blogging. Apart from the distraction of having to pretend to still be the CTO of Dyalog Ltd, the real problem is that I am not (usually) a C programmer. C is a rather unforgiving language. Or rather, it is a VERY forgiving language – it allows you to write code that will compile without warnings but just not work, because you unwittingly asked it to do odd things like truncate the content of a variable that you have used in a context which assumes a shorter type (etc, etc, and etc).

To stack the odds further against me, the default Arduino environment has no real debugger. In addition, the I2C bus that we are using to communicate between the Pi and the Arduino is very timing-dependent, which means that any attempt to step through the code will necessarily cause it to fail. Even the standard Arduino practice of monitoring diagnostic messages written to the serial interface cannot be used, as it alters the timing enough to cause many I2C requests to fail. So after 2-3 weeks of wailing and gnashing of teeth, I implemented my own simple logging mechanism to return diagnostic info via I2C and finally managed to move very slowly forward and complete the new control program. And I even RTFM, and read a few web pages written by people who had done this kind of thing before – in particular, this post by Wayne Truchsess was extremely helpful.

I was hoping to be able to publish some results of using the new Sonar, but I tried to do this on a jet-lagged Sunday morning sandwiched between returning from the USA and heading off to sail, and absolutely everything I did went wrong. The batteries died in the middle of my data collection and, to make the day perfect, when I found a new set of batteries, a wheel fell off (see the video above)… so I won’t be able to blog about the sonar data until I am back from the sailing trip at the end of July. However, I would like to talk a bit about the new control program.

I2CArduinoComm v0.2

As previously mentioned, we have a piece of Arduino C code (available on GitHub), which supports a simple command language that APL on the Raspberry Pi uses to issue instructions to set or read I/O pin values. The BIG difference compared to the old version is that ALL code that either reads or sets pin values has been moved to the loop() function, which (if my understanding is correct) is dispatched by the Arduino “operating system” when the O/S is not busy with other tasks. The old program would read and update pins immediately upon receipt of I2C messages. The I2C bus is quite critical with respect to timing, and the old code was spending too much time reading pin values when a data request arrived – so we were missing the timing window for getting a response back in time, and possibly interfering with other timing-dependent activities on the Arduino. The new program constantly maintains arrays containing the latest input values (done in the “loop” function), so all an I2C read request needs to do is transmit the current values.

Command Language Extensions 

From a functional point of view, the main improvement is that the pin usage is completely configurable using a new “setup” command. At a lower level, the command language has been made more flexible. We no longer use a fixed command length; instead the first byte transmitted declares the length of the command (the first byte of all results is also the length of the response). The following commands are supported:

I (Identify) – Returns two bytes containing the major and minor version number of the ArdCom program (currently 0 2), followed by two bytes per defined pin (pin #, pin type).

R (Reset) – Clears all pin definitions.

S (Setup) – Declares pin types: the command character is followed by a triplet of (pin #, pin type, additional info) for each pin (see the pin type table below for details). “S” can be sent several times in succession if the overall length of the command would otherwise exceed 32 bytes.

W (Write) – Sets pin values: the command character is followed by pairs of (pin #, value).
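
To make the framing concrete, here is a sketch of how a command frame might be assembled on the APL side (the variable names are mine, and whether the length byte counts itself is my assumption – check the source on GitHub for the definitive layout):

      cmd←(⎕UCS'W'),5 128    ⍝ "W" command: set pin 5 to value 128
      (1+≢cmd),cmd           ⍝ prefix the frame with its length
4 87 5 128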

Pin Setup

Each pin declaration included in a setup command consists of a triplet of bytes, containing the pin#, type character, and “additional info”. Only the “p” pin type currently makes use of the additional info – for other pin types this value is ignored.

A (Analogue output) – Uses the analogWrite function to set the value.

D (Digital output) – Uses digitalWrite.

S (Servo output) – Uses Servo.write to update the pin value.

a (Analogue input) – Uses analogRead.

d (Digital input) – Uses digitalRead.

p (Pulse input) – Uses pulseIn to read a pulse-width modulated input. If “additional info” is set, then this should be a digital pin number which is given a 10ms HIGH signal to tell the device to provide an input pulse.

See the Arduino Reference for descriptions of analogWrite, digitalWrite and the other functions mentioned above.

Reading Input Values

When an I2C read request is made (by APL on the Pi), the current values for all pins defined as inputs (in ascending order) are returned in a single transmission. The data stream contains two bytes (Most Significant Byte followed by LSB) for each input pin. If the control program wants to return diagnostic information, then the first data byte will have the value 254 and the rest of the transmission will be textual “log” data. 254 is an impossible value for the MSB of any input, because the maximum input value is 1023, which has an MSB value of 3. 254 was chosen rather than 255, which frequently occurs when there are transmission failures or other errors.
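
On the APL side, unpacking such a response might look like this (a sketch based on the description above; the function name is mine):

      Decode←{254=⊃⍵:⎕UCS 1↓⍵ ⋄ (((≢⍵)÷2) 2⍴⍵)+.×256 1} ⍝ log text, or MSB,LSB pairs
      Decode 3 255 0 100     ⍝ two input pins reading 1023 and 100
1023 100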

The ArdCom Class

A new APL class has been written to communicate with the new Arduino code. You can find the source code in the file ArdCom.dyalog in the GitHub repository. The DyaBot class has been updated to make use of the ArdCom encapsulation; the new version can be found in the Examples folder in the repository. I will describe it in detail in my next blog post – probably not for another week, as the weather forecast looks superb for the next few days (probable route Faaborg-Aabenraa-Augustenborg-Gråsten-Flensburg-Sønderborg-Ærøskøbing).