ACKNOWLEDGEMENT: My thanks to Andy Shiers for simplifying the examples.
Visitors to APL from other languages are sometimes a bit sniffy about our use of numbers (1 0) for Boolean values True and False. They’re further bemused by our fondness for moving conditional processing out of code and into data. For example, instead of:
0=2|⍵: ⍵÷2 ⋄ ⍵ ⍝ even number: half
we might say:
⍵ ÷ 1+0=2|⍵ ⍝ half if even
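To see the arithmetic at work, here is the same expression evaluated for 6 and 7 (a small worked illustration, not part of the original examples):

      6 ÷ 1+0=2|6    ⍝ 6 is even: 0=2|6 is 1, so we divide by 2
3
      7 ÷ 1+0=2|7    ⍝ 7 is odd: 0=2|7 is 0, so we divide by 1
7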
At first blush, this seems a bit loony; such transformations appear to produce code which is both slower and more obscure! Here’s another example:
2|⍵: 1+3×⍵ ⋄ ⍵ ⍝ odd: one more than its triple
vs:
p ← 2|⍵ ⍝ parity
p + ⍵ × 1+2×p ⍝ one more than triple, if odd
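Once again the Boolean result of 2| does the selecting (another small worked illustration):

      p ← 2|7          ⍝ 7 is odd, so p is 1
      p + 7 × 1+2×p    ⍝ 1 + 7×3
22
      p ← 2|6          ⍝ 6 is even, so p is 0
      p + 6 × 1+2×p    ⍝ 0 + 6×1
6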
But think arrays! Data-driven processing of conditionals is inherently parallel and so performs significantly better for large arrays.
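For instance, both data-driven expressions above apply unchanged to a whole vector at once (v is just an illustrative name):

      v ← 1 2 3 4 5 6
      v ÷ 1+0=2|v      ⍝ halve the even items in one go
1 1 3 2 5 3
      p ← 2|v
      p + v × 1+2×p    ⍝ one more than triple for the odd items
4 2 10 4 16 6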
The Collatz conjecture asserts that repeatedly applying “half if even, else one more than triple if odd” always reaches 1. For example, starting from the number 7 the sequence is:
7 22 11 34 17 52 26 13 40 20 10 5 16 8 4 2 1
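One way to generate such a sequence is with a recursive dfn; the name collatz and this particular definition are mine, offered only as a sketch:

      collatz ← {⍵=1: ⍵ ⋄ ⍵, ∇ {2|⍵: 1+3×⍵ ⋄ ⍵÷2} ⍵}
      collatz 7
7 22 11 34 17 52 26 13 40 20 10 5 16 8 4 2 1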
To test this for every starting value up to a million, we can borrow time, a coarse-grained timing operator, from dfns.dws:
scalar←{
    {               ⍝ scalar function
        2|⍵:1+3×⍵   ⍝ odd
        ⍵÷2         ⍝ even
    }⍣{⍺=1}¨⍵       ⍝ applied under each
}

vector←{
    {               ⍝ vector function
        p←2|⍵       ⍝ parity
        u←⍵÷2-p     ⍝ half of evens
        v←p+u×1+2×p ⍝ 1+3× odds
        ∪v~1        ⍝ without 1s and duplicates
    }⍣{⍺≡⍬}⍵        ⍝ applied to whole vector
}
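To see what a single pass of vector's inner function does, here it is repackaged as a stand-alone dfn (the name step is mine, for illustration only):

      step ← {p←2|⍵ ⋄ u←⍵÷2-p ⋄ ∪(p+u×1+2×p)~1}
      step ⍳10     ⍝ every item replaced by its successor; 1s and duplicates dropped
4 10 2 16 3 22 28 5

The surrounding ⍣{⍺≡⍬} repeats this until nothing remains, at which point every starting value has reached 1.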
      scalar time ⍳1e6     ⍝ scalar processing: 8¾ minutes
08:46.03
      vector time ⍳1e6     ⍝ array processing: 8¾ seconds
08.74
In the above, time and ⍣ are higher-order “operators”. The defined operator time takes a function left operand, which it applies to its argument, returning the execution time in minutes and seconds. The primitive operator ⍣ (power, or fixpoint) applies its left operand function repeatedly until its right operand function, which is supplied with the result (⍺) and argument (⍵) of each application, returns True (1).
For further discussion of the Collatz function, see http://dfns.dyalog.com/n_osc.htm.