Wednesday, October 3, 2012

Dynamic vs static ... vs system f vs prototype inheritance vs macros vs eval vs ...

It's general knowledge in the programming world that there are two major types of programming languages: dynamically typed languages and statically typed languages.

It used to seem like there was an ongoing argument about which type of language was better.  However, recently it seems to me that no one really talks about this anymore.  People tend to use dynamic languages for some tasks and static languages for other tasks.  

A dynamically typed language will tend to look something like this:

function add( a, b )
    return a + b
end

class A
    _value
    function +( o )
        return o + _value
    end
end

class B
    _value
    function +( o )
        return 5 + _value
    end
end

add( B.new, A.new )

While a statically typed language will tend to look something like this:

int add( IAddable a, IAddable b )
{
    return a.add( b );
}

class A : IAddable
{
    ...
}

class B : IAddable
{
   ...
}

add( new A(), new B() );



They basically look the same. The major differentiator is that a statically typed language makes you classify your constructs ahead of time and refuses to compile your program if the classifications do not match up correctly, whereas a dynamically typed language lets your program run without checking your classifications and shuts it down if a construct is asked to do an operation it cannot perform.
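
To make the difference concrete, here is a minimal Python rendering of the dynamic example, with classes shaped like the pseudocode above (the names and values are only illustrative). Nothing is checked until a line actually executes, so a mismatch only surfaces at run time:

class A:
    def __init__(self, value):
        self._value = value
    def __add__(self, o):
        # "o + _value" from the pseudocode: defer to the other operand
        return o + self._value

class B:
    def __init__(self, value):
        self._value = value
    def __add__(self, o):
        return 5 + self._value

def add(a, b):
    return a + b

print(add(B(1), A(2)))    # 6: B knows how to respond to +
print(add(A(1), "oops"))  # TypeError at run time, not at compile time

The statically typed version would have rejected that second call before the program ever started.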

There are of course a bunch of different variations that make things even more complex.  The picture I have painted in the code above tells a story about writing a compiler with the machine in mind.  Classes are memory layout specifications, objects are pointers to memory slots, and functions are basically dereferences.  In this story there really isn't any difference between static and dynamic languages.  The static language is more verbose, but it protects you from simple errors and gives compiler-mandated hints to programmers about what is going on.  The dynamic language is more permissive, but it does nothing to prevent trivial failures and is a maze of symbols without meaning.
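
As a rough illustration of that slots-and-dereferences story (sticking with Python, and a made-up Point class), attribute access is close to a dictionary lookup, and a method call is a lookup followed by a call:

class Point:
    def __init__(self, x, y):
        self.x = x
        self.y = y
    def norm(self):
        return (self.x ** 2 + self.y ** 2) ** 0.5

p = Point(3, 4)
print(p.__dict__)                 # {'x': 3, 'y': 4}: the object's named slots
print(p.__dict__['x'])            # 3, the same value p.x dereferences
print(Point.__dict__['norm'](p))  # 5.0: look the method up on the class, then call it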

I think things get much more interesting when you look at the variations.  Smalltalk, Self, and Lua all tell a story about ideal objects and messages that get delivered to them.  There's really no such thing as an operation on an object ... it's more like asking an entity "leche" and seeing if it understands what to do.  Or seeing what it will do.  On the other hand, ML and Haskell tell a story about mathematical rigor and functions.  You are no longer performing operations on anything, really ... you're evaluating functions much the same way you would with pen and paper instead of silicon and electrons.
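
Python is not Smalltalk, but as a rough sketch of the message-passing story, __getattr__ lets an object decide at run time what to do with a message it was never explicitly taught (the Entity class and the "leche" message below are made up for illustration):

class Entity:
    def __init__(self, known):
        self._known = known   # the messages this entity understands

    def __getattr__(self, message):
        # only reached when normal lookup fails -- roughly Smalltalk's doesNotUnderstand:
        if message in self._known:
            return lambda: "responding to " + message
        return lambda: "no idea what " + message + " means, improvising"

cow = Entity({"leche"})
rock = Entity(set())

print(cow.leche())   # responding to leche
print(rock.leche())  # no idea what leche means, improvising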

Finally, you add in "other features" and see how people begin to grow weary of the process.  The familiar slots and memory addresses melt away and much more complex questions arise.  Metaobjects allow objects to decide, in a Turing-complete way, how they will react to your program itself.  You thought you knew what subclassing meant ... let's see how else we might define that.  Macros and syntax extensions change the very meaning of the programming language at compile time ... what is the best language to solve *this* problem?  What if we define data types not by their memory slots, but model them after algebra?  What happens if we try to generalize that process?  If all our functions are pure, doesn't that mean our language forms a category?  I wonder if that allows us to do anything useful.  Maybe we want to find a loophole in the halting problem ... depending on how hard you are willing to work, there's a way around that pesky mathematical proof.  Or maybe you want to go the other way and evaluate arbitrary code after your program is already finished.
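
Two of those "other features" are easy to sketch in Python, with the caveat that the Logged metaclass and the Account class below are invented for illustration: a metaclass (Python's flavor of metaobject) runs arbitrary code while a class is being defined, and eval runs code that did not exist until run time:

class Logged(type):
    def __new__(mcls, name, bases, namespace):
        # the metaobject gets to inspect (or rewrite) the class definition itself
        print("defining", name, "with members", sorted(namespace))
        return super().__new__(mcls, name, bases, namespace)

class Account(metaclass=Logged):
    def deposit(self, amount):
        return amount

source = "1 + 2 * 3"     # code the program itself never contained
print(eval(source))      # 7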

When it comes to most of the programmers I have interacted with, they seem to have a pretty good grounding in memory slots and operations upon them.  And, for the most part, this seems sufficient to create software that can be sold.  However, I feel a bit weary of trying to model more and more complex problems by thinking about how to fit them into a computer.  I would much rather be thinking about how to get the computer to fit my problem.
