Today’s lecture was on code abstraction. In one part, Professor Danvy gave an example of how recursion can be viewed from a more mathematical point of view, by demonstrating the $\mathbb{N}$-bijective nature of a certain set of procedures. Here is the idea:

```
(define f0
  (lambda (a)
    (errorf 'f0 "is not defined for ~s" a)))

(define f1
  (lambda (a)
    (if (zero? a)
        1
        (* a (f0 (- a 1))))))

(define f2
  (lambda (a)
    (if (zero? a)
        1
        (* a (f1 (- a 1))))))

(define f3
  (lambda (a)
    (if (zero? a)
        1
        (* a (f2 (- a 1))))))
```

As more procedures are enumerated in this manner, we can compute the factorial on an ever-larger subset of the natural numbers: each $f_n$ handles every input up to $n$. There isn’t really any recursion going on, since no procedure refers to itself. And this can continue to infinity. But, as anyone with common sense would know, this goes only as far as $\aleph_0$, by virtue of the fact that the set of such procedures is bijective to $\mathbb{N}$.
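For contrast, the whole tower collapses into a single definition the moment we allow self-reference; a minimal sketch (the name `fact` is mine, not from the lecture):

```
;; A genuinely recursive factorial: the body refers to fact itself,
;; so one definition covers every natural number at once.
(define fact
  (lambda (a)
    (if (zero? a)
        1
        (* a (fact (- a 1))))))
```

Here `(fact 5)` evaluates to 120, whereas in the enumerated family only `f5` and later procedures could compute it.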

So a curious question arises: let $F$ be some set of procedures that can be enumerated like the one above. Is it possible to construct a set of procedures $F'$, similar to $F$ (the notion of it being “similar” is rather tricky here: let’s just say it is similar in that it is related to recursion, or that it is a generalisation of $F$), that has a cardinality equal to or greater than that of $\mathbb{R}$? Since such a set of procedures cannot be enumerated, as Georg Cantor demonstrated more than a century ago (O! the whirligig of time), it would need to be constructed in a different manner.
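Cantor’s diagonal argument can itself be phrased as a procedure. A sketch, with names of my own choosing: suppose `e` is an enumeration, i.e. a procedure mapping each natural number $n$ to the $n$-th procedure on naturals. Then the following procedure differs from `(e n)` at input `n` for every `n`, so it lies outside the enumeration:

```
;; Given an enumeration e of procedures on naturals,
;; build a procedure that disagrees with (e n) at input n.
(define diag
  (lambda (e)
    (lambda (n)
      (+ 1 ((e n) n)))))
```

No matter which enumeration `e` we pick, `(diag e)` is a perfectly good procedure that `e` misses, which is why a set of procedures of cardinality beyond $\aleph_0$ cannot arise by enumeration.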

The same question can be asked about Turing machines: can we formulate a model of computation, similar to Turing’s (or rather a generalisation of it), such that there exists a single object, U, that can simulate every object in the model (similar to how every Turing machine can be simulated by a universal Turing machine), wherein every $r \in \mathbb{R}$ can be used to encode a distinct object? Perhaps in such a model there exists some object that can solve a subset of the Halting problem previously unsolvable (i.e. those machines which halt within a set of steps with a cardinality less than $\aleph_1$ — and now we have successfully drawn the continuum hypothesis into the discussion). But does such a model of computation even exist in the realm of computing science as we know it?