Jacques Mattheij

Technology, Coding and Business

The Lambda Calculus

I’m trying to get my head wrapped around the Lambda Calculus because I want a more thorough understanding of functional programming. Over the years there have been many times when I thought it was high time to learn ‘lisp’. A little while ago I had a long look at the source of ‘news.arc’, and it started to become a bit clearer how lisp and other functional languages get real work done, but to say that I fully grok it would be a serious overstatement.

For me, I can only say that I really understand a language once I’ve implemented it, or at least once I’m confident that I could. So the goal is to implement a minimal version of lisp, and in order to do that the Lambda Calculus comes first.

Right now I come at all of this from an ‘imperative’ programmer’s perspective: I see functional programming as a way to achieve results purely by evaluation, without any side effects.

That’s a pretty difficult concept when you’ve been conditioned all your life to an environment with a strict division between ‘functions’ and ‘statements’.

In good old ‘BASIC’ the functions were different from the statements in a very clearly defined way. Statements ‘did’ stuff, functions computed stuff.

Other languages, like C and Java, blur the distinction between the two and use the word ‘function’ interchangeably, to mean both a ‘function’ that computes something and a ‘function’ that has side effects.

In pure functional programming there are no side effects; there is only the reduction of an expression until it can’t be reduced any further. Whatever remains is then the mathematical equivalent of an answer. Or at least, that’s how I understand it right now.
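To make that contrast concrete for myself, here is a small sketch (my own illustration, in Python rather than lisp) of the same computation written both ways: once as statements that mutate state, and once as a single expression whose value is the answer.

```python
# Imperative style: statements 'do' stuff by mutating state step by step.
def imperative_sum(numbers):
    total = 0            # mutable state
    for n in numbers:
        total += n       # each iteration is a side effect on 'total'
    return total

# Functional style: the whole body is one expression; evaluating
# (reducing) it yields the answer, and nothing is ever mutated.
def functional_sum(numbers):
    return (numbers[0] + functional_sum(numbers[1:])) if numbers else 0

print(imperative_sum([1, 2, 3]))   # 6
print(functional_sum([1, 2, 3]))   # 6
```

Both produce the same value, but only the second one could in principle be read as a chain of reductions: `functional_sum([1,2,3])` → `1 + functional_sum([2,3])` → `1 + 2 + functional_sum([3])` → and so on down to `6`.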

After reading the wikipedia article (but not understanding it very well, I’ll have to re-read it a few more times):

http://en.wikipedia.org/wiki/Lambda_calculus

and

http://safalra.com/science/lambda-calculus/introduction/

I think that the key is that the rules for transformation (enumerated as the alpha, beta and eta conversions) are formalized and allow you to ‘normalize’ any expression in the lambda calculus to its most reduced form.
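As an experiment in thinking in code instead of symbols, here is a rough sketch of beta reduction on a tiny tuple-based term representation. Everything here (the `('var', …)` / `('lam', …)` / `('app', …)` encoding and the function names) is my own invention, not standard notation, and this naive version skips alpha-renaming, so it can capture variables if names collide:

```python
# Terms: ('var', name), ('lam', name, body), ('app', fn, arg)

def substitute(term, name, value):
    """Replace free occurrences of 'name' in 'term' with 'value'."""
    kind = term[0]
    if kind == 'var':
        return value if term[1] == name else term
    if kind == 'lam':
        # An inner binding of the same name shadows the outer one.
        if term[1] == name:
            return term
        return ('lam', term[1], substitute(term[2], name, value))
    # kind == 'app'
    return ('app', substitute(term[1], name, value),
                   substitute(term[2], name, value))

def beta_reduce(term):
    """Keep applying beta reduction until no redex remains."""
    if term[0] == 'app':
        fn = beta_reduce(term[1])
        if fn[0] == 'lam':
            # (λx.body) arg  →  body[x := arg]
            return beta_reduce(substitute(fn[2], fn[1], term[2]))
        return ('app', fn, beta_reduce(term[2]))
    if term[0] == 'lam':
        return ('lam', term[1], beta_reduce(term[2]))
    return term

# The identity function applied to y:  (λx.x) y  →  y
identity = ('lam', 'x', ('var', 'x'))
print(beta_reduce(('app', identity, ('var', 'y'))))  # ('var', 'y')
```

Alpha conversion would be the missing piece that renames bound variables to avoid capture, and eta conversion would let you simplify `λx.(f x)` to `f`; this sketch only does the beta step.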

Tricky stuff. I never had more math than the 4th year of high school, and even though I feel that once I get it this will probably seem quite trivial, for the moment the notation puts me off. That’s one of the things I’ve always struggled with: I think in code, not in mathematical symbols.

I’ll keep coming back to this bit and edit it as I get a better understanding of it.

