Monday, May 6, 2013

Industry Languages VS Super Languages (part 1)


I'm not sure that it is entirely accurate to describe myself as a programmer.  I have a very strong interest in understanding how things work.  This trait makes me an effective programmer and software engineer, but understanding how things work is more important to me than programming or engineering.

I started off using C at Purdue.  Java, Perl, and some other languages showed up.  But the vast majority of everything was C.  When I started working, I used a lot of C#.  These are the typical programming languages that people are used to.  It's what you're likely to see if you walk into an arbitrary software development company or division.  However, because I'm very interested in how things work, I spent a lot of time investigating more exotic programming languages.  

Almost immediately I encountered languages like OCaml, Forth, and Lisp.  These languages have many interesting features that are typically completely unrepresented, or at the very least shunned, in the industry standard languages.  I spent a bunch of time getting my head wrapped around complex features like macros and highly expressive type systems.  And I also spent a bunch of time trying to wrap my head around why these features aren't present in the industry standard languages.  Apparently, a bunch of people who really like the less popular languages have also spent a bunch of time trying to explain why their super languages aren't more popular.

I suspect that my perspective is different from most people's.  Even among the outliers, my focus has been different, and I believe that has influenced my outlook to be different too.  I sought out different programming languages because I was unsatisfied with what the traditional languages offered, but most of my motivation was pure curiosity.  As I moved from language to language, I never found an environment that made me feel comfortable and satisfied with what was possible.  I found that I was much more interested in learning and understanding the languages than in actually using them for anything.  Most of the defenses of Lisp and Haskell that I've seen come from the perspective of people who actually use them to do things.  Mine is not quite a defense but more of a categorization of Lisp and Haskell (and other languages in a similar position), coming from the perspective of trying to figure out how these languages relate to the rest of reality.

There are two obvious categories you can place a language into: Turing complete and non-Turing complete.  This distinction isn't really that useful to people, because it's not obvious, even to people who know the definitions, what the difference is.  A Turing machine is able to perform the most powerful calculations that we know how to calculate.  We can propose and describe calculations which would require a machine more powerful than a Turing machine, but we don't know how to actually perform those calculations; they are mathematical problems that we don't know how to solve.  A programming language being Turing complete means that it can perform all of the same calculations that a Turing machine is able to perform.  Or in other words, it means that it is able to perform the most powerful calculations that we know how to calculate.  A non-Turing complete language is then a language which is not able to perform all of the calculations that a Turing machine can.  We could call it a sub-Turing complete language because its abilities are below those of a Turing complete language.
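To make that distinction a little more concrete, here's a small Haskell sketch of my own (not tied to any particular sub-Turing language): the first function recurses on a value that always shrinks, so a termination checker in a "total" language could accept it, while the second performs an unbounded search of exactly the kind that only full Turing completeness lets you write down without a termination proof.

    -- Structurally recursive: the argument shrinks toward the base case,
    -- so this always terminates (for non-negative input).  A sub-Turing,
    -- "total" language can accept definitions of this shape.
    sumTo :: Integer -> Integer
    sumTo 0 = 0
    sumTo n = n + sumTo (n - 1)

    -- Unbounded search: the argument can grow before it shrinks, and nobody
    -- has proved it always terminates.  Being allowed to write this down
    -- as-is is what Turing completeness buys you.
    collatzSteps :: Integer -> Integer
    collatzSteps 1 = 0
    collatzSteps n
      | even n    = 1 + collatzSteps (n `div` 2)
      | otherwise = 1 + collatzSteps (3 * n + 1)

    main :: IO ()
    main = print (sumTo 10, collatzSteps 27)   -- (55,111)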

Now, in practice even people who get the gist of what Turing complete means wouldn't actually be able to describe the difference between calculations that require a Turing machine and calculations that can be done in a sub-Turing complete language.  But the idea of Turing completeness is useful, at the very least, because it's an objective measure.  We can mathematically show that a given language is Turing complete or not.  Unfortunately, we do not really have any other objective measures for programming languages.  Don't misunderstand: there are other objective-ish things we can say about programming languages, but few people actually agree which of those objective things are better and which are worse.  And when the pedants and/or experts get involved, the first thing they do is muddy the waters as to what commonly understood features actually mean (because at the end of the day, things are always more complicated than we would prefer).  At the very least with Turing complete versus not, we are able to say that if a language is Turing complete then it can do all of the calculations that we know how to do, and if it is not, then it cannot.  So if you think about it, you probably want the Turing complete language, because that way if the problem is solvable at all, then it is solvable in your language.  (Of course, there are some interesting situations where you might actually want a sub-Turing complete language, but let's ignore that for now.)

Basically every language is Turing complete.  There are a couple of outliers, but by and large all of them are Turing complete.  So Java is Turing complete and so is Lisp.  This is a problem for people who really like Lisp (and/or the other misfit languages), because when they try to convince other people of the superiority of their language, the best objective measure (Turing completeness) indicates that Lisp is in fact equivalent to the boring language that everyone else is using.  Now, I think that part of the problem is that different people think in different ways.  Some people will work better in Lisp and some people will work better in Java.  It's difficult to argue about superiority when what you are actually trying to express is a very specific thought process that is unique to yourself.  It seems that people usually resort to poetry.

There's the Turing tarpit, "where everything is possible but nothing is easy."  A language that doesn't change the way you think isn't worth learning.  You can write the solution to your problem in your language, or you can write the language that solves your problem.  And this is probably even applicable to language slogans.  Consider Perl's "there's more than one way to do it" and Python's "there's only one way to do it."  I didn't even have to look these up; they're just in my DNA because of how many times they show up when people start to talk about the differences between languages.  And while many of these sound nice, at the end of the day they're not much more than poetry.  It's people trying to tell a story or convey a personal experience.  And even if these people are correct, they are really only correct for themselves and people like them.  After all, the Perl slogan is diametrically opposed to the Python slogan.  When you express something personal, you run the risk that what you say is only true for yourself.  Of course that doesn't mean that you shouldn't still try.  After all, communicating with other people is vitally important to living.  And, like with any art, by expressing yourself you open the possibility of profoundly affecting and changing the lives of others.

I've said a lot here and I haven't quite gotten to what I actually started out to say.  I think if I continue, I'll end up making this post too long and too fragmented.  So next time: what does David say is the real difference between languages like Java and Lisp?

2 comments:

  1. It is possible to give a rigorous account of the relative expressiveness of programming languages that goes beyond mere calculability considerations (Turing-complete or not). A nice tool to think about the relative power of languages is the study of how to translate programs of one language into programs in another (possibly a closely related variant of the first one).

    We have good informal intuitions about the fact that some translations are "simple", "syntactic sugar", and that this means that the target language is at least as expressive as the origin language. On the other hand, some translations (such as rewriting all possibly-mutually-recursive functions into loops) are global, non-local, amount to a "compilation pass" done by hand, and are the sign of a loss of expressivity.

    Matthias Felleisen has formalized a certain notion of expressivity that follows from these ideas. His work is not intended to be the last word spoken on programming language comparisons, but to open the door to more rigorous comparisons that go beyond the mere calculability point of view. See his article, "On the Expressive Power of Programming Languages", 1992 ( http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.51.4656 ); it's mostly a rigorous presentation of what I've said before about the distinction between "local", "structure-preserving" transformations on one hand, and "global" ones on the other.

    Replies
    1. Okay, so we're talking about kind of a more complicated type of Levenshtein distance for programming languages. (Although obviously instead of just having insert, delete, or change, you would need a different set of transformations.)

      Hmm, of course going from C to Haskell is going to look like you are losing expressivity. But if you go from Haskell to C, then it's also going to look like you're losing expressivity. (However, now that I'm thinking about it, you could probably implement all the traditional C keywords as functions that get passed a giant byte array ... and then maybe hide it with a "system memory pointer monad" or something.) Perhaps there's more than one type of expressivity ... interesting thought if it turns out to be true.
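      Just to sketch what I mean by that (a purely hypothetical toy; the names peek, poke, and the Map-backed "memory" are mine for illustration, not any existing library), something like a State monad over a fake address space gets you most of the way there:

        import Control.Monad.State
        import Data.Word (Word8)
        import qualified Data.Map as M

        -- "System memory" as a map from addresses to bytes; a real attempt
        -- would use a mutable array, but a Map keeps the sketch short.
        type Addr   = Int
        type Memory = M.Map Addr Word8
        type Mem a  = State Memory a

        peek :: Addr -> Mem Word8
        peek a = gets (M.findWithDefault 0 a)

        poke :: Addr -> Word8 -> Mem ()
        poke a v = modify (M.insert a v)

        -- A tiny "C-flavored" program: *p = 42; *q = *p + 1;
        demo :: Mem (Word8, Word8)
        demo = do
          poke 0 42
          x <- peek 0
          poke 1 (x + 1)
          (,) <$> peek 0 <*> peek 1

        main :: IO ()
        main = print (evalState demo M.empty)   -- (42,43)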

      Either way, very cool concept, I'll need to look into it sometime.

      Okay, so my next blog post is about how a better way to divide languages is by looking at the aspects that they are composed from and then indicating which of those aspects are Turing complete or less than Turing complete (and in most cases the features are fixed in nature). This way of categorizing languages is much more discrete than what you're talking about. Additionally, you probably won't be able to see nice relationships between languages.

      Also, my premise is that most people aren't quite sure how to deal with all the possibilities that Turing completeness brings to the table, so adding more places that are Turing complete is only going to trip them up more.

      However, what you're talking about sounds like it would form a lattice structure of languages, especially if there's only one type of expressiveness (i.e., it really is "easier" to implement C in Haskell than Haskell in C). That makes for a much more satisfying way of comparing languages. Although I would be interested to find out how easy it is to match a problem's difficulty level with a programming language's expressiveness level. (I guess once you can rate which programming language is more expressive than which other programming language, you can just use a similar technique to decide how much work you're doing to implement the problem, and thus rate the difficulty of the problem in a way which perfectly matches the expressiveness rating. At the same time, I would think that if you use my fixed, sub-Turing feature list technique to rate a language, an engineer would find it more useful in deciding what language to use, because it gives him discrete and concrete features that he can consider in deciding what tools are required for the problem at hand, as opposed to an abstract algebraic structure. But of course, even as I say that, I find myself really wanting to sink my teeth into the differences between a local structure-preserving transformation and a global one ... seems like something that would make reality a bit more clear.)
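      (As a toy illustration of that last distinction, purely my own example and not Felleisen's: desugaring a list comprehension is local, you rewrite it in place and nothing else changes, whereas simulating a missing feature like failure/exceptions with Maybe is global, because the Maybe leaks into the type of every function on the call path and every caller has to change.)

        -- Local, structure-preserving translation: a list comprehension can be
        -- rewritten into map/filter right where it sits, touching nothing else.
        evensSquared :: [Int] -> [Int]
        evensSquared xs = [x * x | x <- xs, even x]

        evensSquared' :: [Int] -> [Int]
        evensSquared' xs = map (\x -> x * x) (filter even xs)

        -- Global translation: simulating failure without real exceptions means
        -- the Maybe shows up in every signature on the call path, and every
        -- caller has to be rewritten to handle it.
        safeDiv :: Int -> Int -> Maybe Int
        safeDiv _ 0 = Nothing
        safeDiv x y = Just (x `div` y)

        average :: [Int] -> Maybe Int
        average xs = safeDiv (sum xs) (length xs)

        main :: IO ()
        main = print (evensSquared [1..6], average [3, 5, 7])   -- ([4,16,36],Just 5)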

      Thanks for chiming in. I really appreciate the info.
