13 Feb 2012 20:05

## Re: [stack] Jon Purdy: Why Concatenative Programming Matters

On 02/13/2012 12:08 PM, William Tanksley, Jr wrote:

> description, but doesn't communicate very well. I don't know
> what else to say.

Apologies: This email is a bit of a brain dump and poorly proofread.

I think maybe the most interesting thing about concatenative
languages isn't stressed enough in the article: unlike almost
every other kind of programming language on the planet, there is
no syntax for function application. All you get is composition
and, if the language is higher-order, quotation.

I think the second most interesting thing is that all functions
operate on possibly empty sequences containing arbitrary amounts
of program state. It's this "all the state" element that gives
concatenative languages their character. For example, Backus's FL
does not do this; functions that need one number take just that
number and functions that need two take a pair; never do
functions take the entire state of a program as they usually do
in Forth or Factor.

For example, a tail-recursive 'length' function in a
concatenative language might be:

    length = [0] dip loop where
        loop = [null?] [drop] [[1 +] dip tail loop] if
    end
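To make the stack-threading concrete, here's a rough Python
simulation of this 'length' (the encoding is mine, not any
particular concatenative language): the whole program state is a
single list used as a stack, and each word maps stack to stack.

```python
# Rough simulation of the concatenative 'length' above: the entire
# program state is one stack, and every word transforms it.

def length(stack):
    xs = stack.pop()      # [0] dip: tuck a counter 0 under the list
    stack.append(0)
    stack.append(xs)
    return loop(stack)

def loop(stack):
    # [null?] [drop] [[1 +] dip tail loop] if
    if not stack[-1]:     # null? -- the list on top is empty
        stack.pop()       # drop it, leaving just the counter
        return stack
    xs = stack.pop()      # [1 +] dip: increment the counter below
    stack[-1] += 1
    stack.append(xs[1:])  # tail
    return loop(stack)    # loop

print(length([[10, 20, 30]]))  # -> [3]
```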

And in FL:

    def length ≡ f ∘ [id, ~0] where {
        def f ≡ isnull ∘ xs → n; f ∘ [tl ∘ xs, + ∘ [~1, n]]
        def xs ≡ s1
        def n ≡ s2
    }

A quick guide to the FL program:

- '∘' is composition
- '~' is constant: '~x' ignores its argument and returns x
(i.e. '~' is \x y -> x)
- the square brackets denote "construction": '[f, ..., g]' returns
a function that applies each of the given functions to the same
input and collects the results in a list (e.g. '[f, g]:x ==
<f:x, g:x>', where ':' is FL's syntax for application and angle
brackets denote lists)
- conditionals are written 'a → b; c', where 'b' and 'c' are the
functions conditionally applied to the input to get the result
- 's1' and 's2' select the first and second elements from a list
- 'tl' is the tail operation
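For comparison, the FL combinators above can be rendered roughly in
Python (the names and encodings here are mine; FL itself isn't
directly executable in this form):

```python
# Hypothetical Python rendering of the FL combinators used above.

def compose(f, g):        # '∘': (f ∘ g):x == f:(g:x)
    return lambda x: f(g(x))

def const(c):             # '~c': ignore the argument, return c
    return lambda _x: c

def construct(*fs):       # '[f, g]:x == <f:x, g:x>'
    return lambda x: [h(x) for h in fs]

def cond(p, t, e):        # 'p → t; e'
    return lambda x: t(x) if p(x) else e(x)

s1 = lambda pair: pair[0]         # selectors
s2 = lambda pair: pair[1]
tl = lambda xs: xs[1:]            # 'tl'
isnull = lambda xs: len(xs) == 0
plus = lambda pair: pair[0] + pair[1]

# def length ≡ f ∘ [id, ~0], with xs = s1 and n = s2
xs, n = s1, s2

def f(state):
    return cond(compose(isnull, xs), n,
                compose(f, construct(compose(tl, xs),
                                     compose(plus, construct(const(1), n)))))(state)

length = compose(f, construct(lambda x: x, const(0)))

print(length([10, 20, 30]))  # -> 3
```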

These two programs are, to me, radically different. The
concatenative program is decidedly "flat" and feels very
imperative. The FL program, on the other hand, is "nested" like
applicative programs usually are and feels, at least to me,
rather declarative; the fact that you can consider each function
within the two constructions to be completely independent is
probably the main reason for this. The independence of functions
also allows you to write aliases for selectors as I did above. In
fact, FL offers a form of pattern matching for conveniently
locally-defining such selector functions via double-square
brackets:

    def length ≡ f ∘ [id, ~0] where {
        def f ≡〚xs, n〛→
            isnull ∘ xs → n; f ∘ [tl ∘ xs, + ∘ [~1, n]]
    }

This is very different from concatenative languages where
combinators like 'bi@' allow the functions provided to step on
each other as the second function runs on top of the stack
returned from the first. This is the reason I gave up on
concatenative languages; the equational reasoning possible when
all functions operate on "the stack" just isn't great. I think
the fact that faking variables in FL is trivial while faking
variables in something like Factor requires a rather complex
transformation is evidence of this. I also think the lack of
useful equational reasoning is why things feel so imperative once
you get past the pretty examples in the Joy literature. Maybe I
just need more combinators.

To be fair, I should point out that having all functions work on
"the stack" does have its advantages. In particular, higher-order
programming in FL is pretty gnarly. In fact, doing it requires
pervasive use of "lifting" which is, essentially, a way of
modifying functions so that they take an additional argument full
of all the other stuff you want to use (and hence reminds me a

http://theory.stanford.edu/~aiken/publications/trs/FLProject.pdf

I guess what I'm trying to say is that "every function takes the
entire program state as input", or at least a good chunk of it
(as with Joy's 'infra'), is *really important*. While this fact
is arguably implicit in the definition on Wikipedia, it's not
spelled out at all. Maybe I can add some insight to it in the
next few days.

- jn

13 Feb 2012 21:20

### Re: [stack] Jon Purdy: Why Concatenative Programming Matters

John Nowak <john@johnnowak.com> wrote:
> I think the second most interesting thing is that all functions
> operate on possibly empty sequences containing arbitrary amounts
> of program state. It's this "all the state" element that gives
> concatenative languages their character.

Indeed. And you're right when you elsewhere say that this gives
concatenative languages an imperative character.

You're right that any arbitrary function must be able to access all of
the state; but it need not be true that any specific function must be
able to access all the state. Modern concatenative languages use
typechecking to make sure that no function attempts to consume stack
state that doesn't match the function's declaration; but other types
of checking don't have such an obvious typechecking solution. (Were
you the one who designed a concatenative language with side-effect
checking?)

It seems obvious to me that in order to add effect checking, you have
to formally define the effect. Forth's dictionary could be formally
defined and the checking implemented from that definition.

Of course, this is easier to say than to do, and I haven't even said it.

I also admit that by saying this, I have conceded your point. Having
every function act on all the state does make reordering hard.

> This is very different from concatenative languages where
> combinators like 'bi@' allow the functions provided to step on
> each other as the second function runs on top of the stack
> returned from the first. This is the reason I gave up on
> concatenative languages; the equational reasoning possible when
> all functions operate on "the stack" just isn't great. I think

That specific example need not retain all of its problems, obviously
-- you can define similar combinators to require the forks to have
identical static types, and then effectively isolate the forks from
each other. Factor chooses not to do that, and thereby adds semantics
which are useful for its purposes.

> (as with Joy's 'infra'), is *really important*. While this fact
> is arguably implicit in the definition on Wikipedia, it's not
> spelled out at all. Maybe I can add some insight to it in the
> next few days.

Fair point. We await your input. Fthagn.

> - jn

-Wm

13 Feb 2012 22:23

### Re: [stack] Jon Purdy: Why Concatenative Programming Matters

On 02/13/2012 03:20 PM, William Tanksley, Jr wrote:

> You're right that any arbitrary function must be able to access
> all of the state; but it need not be true that any specific
> function must be able to access all the state. Modern
> concatenative languages use typechecking to make sure that no
> function attempts to consume stack state that doesn't match the
> function's declaration;

This is certainly true; or at least it would be if the languages
actually excited. With types, something like 'bi' can be
restricted so that the second function uses no more than one
element and thus you can consider both functions independently.
(For the record, I mistakenly referred to 'bi@' in a previous
email. I meant 'bi'.)

Even without types, this is possible if you have 'infra' to run a
function on a stack-on-the-stack and some sort of 'concat'
operation to join a stack-on-the-stack onto the main stack
itself. Assuming curly braces denote stacks, we need only the
following primitives:

    {A} [B] infra == {A B}
    A {B} concat == A B
    {A} B push == {A B}

This is enough to write a "safe" 'bi':

    bi = [keep] dip [{} swap push] dip infra concat

With this version of 'bi', you'll get a stack underflow error if
the second quotation to 'bi' attempts to use more than one
element. Since we're not using types, you might as well pass an
argument to some generic version of 'bi' that says how many
arguments to reserve for the second function; just push that
many onto the stack before calling 'infra'.
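Under the same assumptions (quotations modeled as host-level
functions from stack to stack, a stack-on-the-stack as a nested
list), the primitives and the "safe" 'bi' can be sketched roughly
in Python; the encoding is mine:

```python
# Rough Python model of the primitives and the safe 'bi' above.
# Quotations are functions from stack to stack; a stack-on-the-stack
# is just a nested list.

def keep(stack, q):    # x [B] keep == x B x
    x = stack[-1]
    q(stack)
    stack.append(x)

def infra(stack, q):   # {A} [B] infra == {A B}
    inner = stack.pop()
    q(inner)
    stack.append(inner)

def concat(stack):     # A {B} concat == A B
    stack.extend(stack.pop())

def push(stack):       # {A} B push == {A B}
    x = stack.pop()
    stack[-1].append(x)

def bi(stack, f, g):
    # bi = [keep] dip [{} swap push] dip infra concat
    keep(stack, f)                                # x f x
    stack.append([])                              # {}
    stack[-1], stack[-2] = stack[-2], stack[-1]   # swap
    push(stack)                                   # seal x in its own stack: {x}
    infra(stack, g)                               # g runs on {x} alone; it
                                                  # underflows if it tries to
                                                  # use more than one element
    concat(stack)

stack = [5]
bi(stack, lambda s: s.append(s.pop() + 1),   # f: 1 +
          lambda s: s.append(s.pop() * 2))   # g: 2 *
print(stack)  # -> [6, 10]
```

Here f's result (6) is already on the stack while g runs, but g is
sealed inside {5} and cannot touch it.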

> Were you the one who designed a concatenative language with
> side-effect checking?)

Yes, although it is just a little trick of attaching a variable
to function types that can either be "effectful" or "not
effectful". You can infer it via unification. Not much to it.
It's a bit coarse though because it doesn't offer any form of
local effects with a pure interface as you can get with the ST
monad or with linear or uniqueness types (though they could be

- jn

13 Feb 2012 22:25

### Re: [stack] Jon Purdy: Why Concatenative Programming Matters

On 02/13/2012 04:23 PM, John Nowak wrote:

> This is certainly true; or at least it would be if the languages
> actually excited.

Existed. Sorry.

- jn

13 Feb 2012 23:05

### Re: [stack] Jon Purdy: Why Concatenative Programming Matters

John Nowak <john@johnnowak.com> wrote:
> William Tanksley, Jr wrote:
>  > You're right that any arbitrary function must be able to access
>  > all of the state; but it need not be true that any specific
>  > function must be able to access all the state. Modern
>  > concatenative languages use typechecking to make sure that no
>  > function attempts to consume stack state that doesn't match the
>  > function's declaration;

> This is certainly true; or at least it would be if the languages
> actually excited.

Fair point. (I misread this word as "exited", and thought you were

> With types, something like 'bi' can be
> restricted so that the second function uses no more than one
> element and thus you can consider both functions independently.
> (For the record, I mistakenly referred to 'bi@' in a previous
> email. I meant 'bi'.)

Well, you can also copy the needed elements into place -- type
checking can handle the rest. That way you can provide every branch
with the data it needs according to the documented semantics of the
forking word (whether each branch sees the same stack, or the branches
see sequential data from the initial stack).

And you're right that the easy way is to use "infra"... Much less
infrastructure.

>  > Were you the one who designed a concatenative language with
>  > side-effect checking?)
> Yes, although it is just a little trick of attaching a variable
> to function types that can either be "effectful" or "not
> effectful". You can infer it via unification. Not much to it.
> It's a bit coarse though because it doesn't offer any form of
> local effects with a pure interface as you can get with the ST
> monad or with linear or uniqueness types (though they could be

Ah, I see. Yes, I was thinking more of defining structures on which
effects could occur, and checking those effects.

Forth has a dictionary on which effects can occur, for example.

> - jn

-Wm

