
Y combinator has INET normal form!

Kesanov opened this issue 7 years ago · 8 comments

evaluating /Y id reduces to

0 '1:1 0:1 0:2 0:0'
1 '1:2 0:0 1:0 3:1'
2
3
4
5
6
7
8
9

evaluating Y reduces to

{ loops: 41, rules: 10, betas: 4, dupls: 4, annis: 6 }
0 '1:0 0:1 0:2 0:0'
1 '0:0 10:0 6:2 1:3'
2 '9:1 4:1 4:2 2:3'
3
4 '5:2 2:1 2:2 4:2'
5 '9:2 6:1 4:0 2:1'
6 '10:1 5:1 1:2 1:3'
7
8
9 '10:2 2:0 5:0 1:2'
10 '1:1 6:0 9:0 3:3'
11
12
13
14
15

How can INET find a normal form for a recursive term?

Note that:

  • Calling fromNet on it raises "Maximum call stack size exceeded".
  • @succl /Y #r #x /r /succ x also has a normal form.
  • @succr /Y #r #x /succ /r x does not have a normal form. It raises "JavaScript heap out of memory".

Kesanov · May 04 '18 07:05

I'm really sorry, but for the last ~2 days the algorithm was completely buggy because of the last optimization: it turns out the machine wasn't moving the way I expected. The good news is that I spent a while digging in and figured out the issue. The algorithm is now much cleaner; I tested it with 1M λ-terms and it seems to be working flawlessly. In fact, the older stable version had a bug, so this whole thing was worth it.

Anyway, is the problem still present in the latest (0.1.23) version?

VictorTaelin · May 05 '18 14:05

/Y I, /Y #r #x /r /succ x, and /Y #r #x /succ /r x all loop endlessly (the reduction never finishes), as expected. Interestingly, though, all of those infinite terms use constant memory (~100 nodes).

Evaluating Y still reduces to a normal form

0 '1:0 0:2 0:1 1:0'
1 '0:0 10:0 6:2 1:1'
2 '9:1 4:1 4:2 2:2'
3
4 '5:2 2:1 2:2 4:2'
5 '9:2 6:1 4:0 2:2'
6 '10:1 5:1 1:2 1:2'
7
8
9 '10:2 2:0 5:0 1:1'
10 '1:1 6:0 9:0 3:1'
11
12
13
14
15

and calling fromNet on it still results in a stack overflow.

Kesanov · May 05 '18 14:05

Turns out the Y-combinator has a normal form, surprisingly. Here is how it looks in ASCII (I really need a good renderer...):

           |
        __/_\__
        |     |
     __/_\__  |
     |     |  |
    /X\    |  |
   |   |   |  |    y-combinator's normal
    \ /   /X\ |       form (as a net)
    / \   | |_|
   |___|  |  
    \Y/   |  
     |____|  

If you try to manually convert that net to a term (emulating fromNet() on paper), you'll get the following output:

λf. f (f (f (f ...

I.e., a lambda with f applied infinitely many times, which makes sense, because that's exactly what the Y-combinator is! The conclusion is that there are finite-size nets that correspond to infinite λ-terms. If JavaScript were lazy, fromNet() wouldn't overflow the stack; it would instead return the equivalent of:

y_combinator = Lam "f" infinite_fs where
  infinite_fs = App (Var "f") infinite_fs
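
For reference, here is a minimal Haskell sketch of that idea (toy names mirroring the pseudocode above, not the actual fromNet() implementation): the cyclic term is a finite, self-referential heap object, and only forcing a finite prefix of it terminates.

-- A lazy, cyclic representation of λf. f (f (f ...)).
data Term = Lam String Term | App Term Term | Var String

y_combinator :: Term
y_combinator = Lam "f" infinite_fs
  where infinite_fs = App (Var "f") infinite_fs  -- ties the knot: refers to itself

-- Forcing only a finite prefix terminates; forcing the whole term would not.
showPrefix :: Int -> Term -> String
showPrefix 0 _            = "..."
showPrefix n (Lam v body) = "λ" ++ v ++ ". " ++ showPrefix (n - 1) body
showPrefix n (App f a)    = showPrefix (n - 1) f ++ " (" ++ showPrefix (n - 1) a ++ ")"
showPrefix _ (Var v)      = v

main :: IO ()
main = putStrLn (showPrefix 6 y_combinator)  -- λf. f (f (f (f (... (...)))))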

I don't know what to think about that, but this is very interesting. Thanks for pointing that out.

Edit: I just realized, perhaps a little bit late, that due to η-equivalence, the two dup nodes on the left of that drawing aren't necessary and can be replaced by a wire. The result is still the Y-combinator.

VictorTaelin · May 05 '18 20:05

This is the form David Turner's rules used to evaluate the reduction of Y. It is also the only one of his rules that introduces cycles in the heap.

ekmett · May 06 '18 14:05

@ekmett sorry, could you link to the exact thing you mention? I don't know the context!

VictorTaelin · May 06 '18 14:05

No problem.

See Turner's 1979 paper "A New Implementation Technique for Applicative Languages" on the use of combinators for SASL. In it, he mentions that it is worth giving Y special handling, and defines Y f = let x = f x in x, creating a cycle in the graph.
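
As a rough illustration (my own toy version, not code from the paper), that special handling amounts to tying a knot in the heap rather than unfolding Y f = f (Y f) over and over:

-- One heap node x whose field points back at itself, instead of repeated unfolding.
yCyclic :: (a -> a) -> a
yCyclic f = let x = f x in x

-- Ordinary recursion through that fixed point still works:
factorial :: Integer -> Integer
factorial = yCyclic (\rec n -> if n == 0 then 1 else n * rec (n - 1))

main :: IO ()
main = print (factorial 10)  -- 3628800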

Paul Hudak's 1984 paper on "A combinator-based compiler for a functional language" builds on that approach.

In my mental model I find it interesting that, although there are reductions for S, K, I, etc. that mutate the graph, none of them introduces new cycles into it: if you start with a DAG before doing reductions, you end with a DAG. This means you can use hash-consing to gain additional sharing, even in a lazy language. I spent some time playing around with defining an evaluator with a notion of maximal sharing that used such hash-consing to build the heap, and which treated Y f specially, keeping things "mostly unique" in the presence of recursion by giving Y f a "name" based on f (since, when you start, you are still in a DAG). This opens up all sorts of other options, like managing storage with reference counting despite the existence of cycles. I'm looking to give a talk on the topic at YOW! LambdaJam in a couple of weeks.
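
To make the hash-consing idea concrete, here is a tiny toy sketch (names and representation are mine, not the evaluator described above): structurally equal sub-graphs are interned to the same node id, which is sound precisely because the heap stays a DAG.

-- Hash-consing a small applicative heap: identical shapes get the same node id.
import qualified Data.Map.Strict as M
import Control.Monad.State

data Shape = Leaf String | App Int Int   -- App holds the ids of its children
  deriving (Eq, Ord, Show)

type Heap = (M.Map Shape Int, Int)       -- shape -> id, next fresh id

hashCons :: Shape -> State Heap Int
hashCons s = do
  (tbl, next) <- get
  case M.lookup s tbl of
    Just i  -> return i                  -- already allocated: reuse the node
    Nothing -> do
      put (M.insert s next tbl, next + 1)
      return next

-- Build (f x) (f x): both occurrences of "f x" collapse to a single node.
example :: State Heap Int
example = do
  f  <- hashCons (Leaf "f")
  x  <- hashCons (Leaf "x")
  fx <- hashCons (App f x)
  hashCons (App fx fx)

main :: IO ()
main = print (runState example (M.empty, 0))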

ekmett · May 07 '18 01:05

Here are the slides of Edward's talk. Edward, the slides alone do not seem sufficient to understand your ideas. Is there a recording?

lukaszlew · Jun 12 '18 22:06

The YOW! Lambda Jam 2018 recording seems to be private.

lukaszlew · Jun 12 '18 22:06