General purpose programming languages' speed of light

  • This "Magellanic view" of programming language exploration doesn't seem quite right. Programming language design is mostly not about finding virgin territory by inventing brand new ideas (which was admittedly easier in the era when high-level languages were brand new – i.e. in the 1950s). Rather, it's mostly about finding unexplored folds hidden nearby in the vast, combinatorial manifold of ways to combine existing ideas in a single, coherent language. In my experience, people who have spent a lot of time designing languages are the most sympathetic to people trying new permutations – precisely because they are so painfully aware of all the awful compromises they were forced to make in their own designs and their understanding that much better ways of combining those features might be so tantalizingly close.

    Scala is a great example: the innovation of the language is not so much in new language features, but rather in its ingenious combination of so many powerful features into a single, coherent system. Of course, some may argue that Scala has too many features (I'm a bit terrified of it), but it's indisputable that putting all those pieces together in a way that works is a tour de force of language design.

  • Am I the only one who sees no clothes on this article?

    I mean, it seems that the person who wrote it spends a lot of time thinking about programming and much less time programming; that's how he ends up with meta-ideas that are interesting but also mostly wrong.

    Languages are substantially different. You don't (and can't) understand every concept under the hood to drive the thing. You can't just add and subtract features; they're interdependent.

  • A bit off topic, but I've been learning about languages like ML and Scala, and what does everybody have against static typing? I feel like if we used type systems better, we'd have a lot fewer problems. You can prove whole classes of bugs out of your programs! That's much stronger than unit testing.

    When people think 'static types', do they just think C/C++/Java? Is it the upfront cost? For my first ML program, it took me half an hour to write a function that outputs all the words in a trie (a sketch of roughly that exercise appears below). It gets easier, and more interesting, afterward, but I might have given up had it not been for a class. Are static types too rigid for prototyping? Scala, Haskell and OCaml all have REPLs.

    Some of the "experimental features of static type systems" like dependent types are really powerful; you can make some strong proofs about the logic of your program. I'd be willing to give up (or at least try going without) duck typing for that.

  • Interesting concept, but the flaw at its heart is the presupposition that a lack of progress right now means we are at the ultimate limit of what can be accomplished. Imagine if cavemen learning to paint on walls had said: well, we haven't improved in a few millennia, so this is probably the most complex thing that can be represented by drawings. Or what about math stopping with Euclid? Much of the progress came thousands of years later.

    Technology comes in fits and starts. A lot of new things happened in the 50s and 60s, and we are still trying to figure out ways to use and apply them. Just because someone thought about and prototyped something then doesn't mean it's not new when that feature goes mainstream (e.g. garbage collection in Java, channels for concurrency in Go, etc.).

    For a long time we couldn't break the sound barrier; that was a limit, but not a speed-of-light limit. The fact that progress is stalled now doesn't mean there will never be progress in the future.

  • But when our hypothetical Blub programmer looks in the other direction, up the power continuum, he doesn't realize he's looking up. What he sees are merely weird languages. He probably considers them about equivalent in power to Blub, but with all this other hairy stuff thrown in as well. Blub is good enough for him, because he thinks in Blub.

    http://www.paulgraham.com/avg.html

  • We went backwards in terms of development tools when we moved to the web. It was necessary, but RAD tools (Delphi, Visual Basic, yes, that VB) were far ahead of where we are today in the ability to put basic pieces together to make an application. I haven't used Visual Studio in a long time, but I would hope they retained their core philosophy of reuse.

  • I've always had a kind of feeling that there is an upper bound on the rate at which a human can articulate an (original) idea/program/function in explicit enough terms for a computer to then run it.

    Even to do this in the first place will always require a bare-minimum understanding of the language of logic: basic control-flow statements, variables, and so on.

    This is why I feel things like Bret Victor's idea of the 'MathKiller' (which as I understand the idea, is his general term for a hypothetical universally-intuitive computing environment which can model anything) are goals which we can only ever approach asymptotically - there will always be some uncharted waters where the only option available to those who want to explore further is simply to straight up write some code.

    I guess the point I'm trying to make is that I feel improving the sophistication of the programming languages we use or changing the core paradigms upon which they are based will not help the situation; improving the sophistication of the tools we use to write them with will.

  • I agree in the sense that I don't think there's going to be a killer new programming language or language paradigm that's going to overturn the existing languages within the problem domains they're suited for.

    Since the current high-level languages are so extensible, the new paradigm seems to be moving beyond programming languages to a higher level of abstraction based on frameworks and DSLs. We're not just "writing code"; we're writing code that writes code (which writes code, and so on, until it's a stream of 0's and 1's). We do this because working at a higher level of abstraction is usually more productive. The recent crop of programming frameworks is just another layer on top of this. Next, I guess, we will have some sort of meta-frameworks on top of those. So the specific language choice will only matter insofar as it's part of the framework stack.
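
    As a loose illustration of that layering (a minimal sketch with hypothetical names, not anyone's actual framework): in Haskell, an embedded DSL is just a data type plus interpreters, and one of those interpreters can emit code in a lower-level notation instead of running the program directly.

      -- A tiny embedded DSL for arithmetic, layered on the host language.
      data Calc = Lit Int | Add Calc Calc | Mul Calc Calc

      -- Interpreter 1: run the DSL program directly.
      run :: Calc -> Int
      run (Lit n)   = n
      run (Add a b) = run a + run b
      run (Mul a b) = run a * run b

      -- Interpreter 2: "code that writes code", emitting the same program
      -- in a lower-level, fully parenthesized notation.
      emit :: Calc -> String
      emit (Lit n)   = show n
      emit (Add a b) = "(" ++ emit a ++ " + " ++ emit b ++ ")"
      emit (Mul a b) = "(" ++ emit a ++ " * " ++ emit b ++ ")"

      main :: IO ()
      main = do
        let program = Mul (Add (Lit 1) (Lit 2)) (Lit 10)
        print (run program)        -- 30
        putStrLn (emit program)    -- ((1 + 2) * 10)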

  • I like to envision something like 'augmented coding', where the concepts are driven by the programmer and some friendly bot fills in the nitty-gritty. I'm not talking about something like a GUI-driven language; more like autocomplete - just that the autocomplete is doing far more than calling up a list of method names.

  • A post on the future of programming languages without mentioning Agda, Coq, or OMeta?

    I think we are just now getting beyond the "low-hanging fruit" era of programming languages.

  • Haskell's type system can be unwieldy (and at the experimental edge it often is), but that's because it's not so different from a programming language in its own right. That said, both the type system and the language itself are built from incredibly simple pieces.
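
    As a rough sketch of what those "simple pieces" are (ordinary Haskell, no extensions; the names below are made up for the example): algebraic data types, parametric polymorphism, and type classes are each small ideas, and they compose without any extra machinery.

      -- Piece 1: an algebraic data type, just a choice between shapes of data.
      data Shape = Circle Double | Rect Double Double

      -- Piece 2: parametric polymorphism, one definition for every element type.
      data Pair a = Pair a a

      -- Piece 3: a type class, an interface that types opt into.
      class Area t where
        area :: t -> Double

      instance Area Shape where
        area (Circle r) = pi * r * r
        area (Rect w h) = w * h

      -- The pieces combine directly: a pair of measurable things is measurable.
      instance Area t => Area (Pair t) where
        area (Pair x y) = area x + area y

      main :: IO ()
      main = print (area (Pair (Circle 1.0) (Rect 2.0 3.0)))  -- pi + 6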

  • How would the invention of scissors fit into that reasoning?

    I vote for the tooling option or, to be more precise, for a different combination of tooling and features.

  • I hope the author spends some time learning about programming languages and programming language research. The core point is simply false, and the evidence to support it ranges from flimsy to nonsense.

    "The plain truth about programming languages is that while there have been many small gains in the last 20 years, there have been no major advances"

    There have been many major advances. The fact that they were not included in Java does not mean they do not exist.

    >There have been no new paradigms

    Arguably there have only ever been two paradigms: imperative and functional (object-oriented and procedural being minor variations of the imperative paradigm). It is not reasonable to expect entirely new paradigms to be discovered more than very rarely.

    >I'm not even aware of major new language features

    The first thing this should do is trigger your "I should research new language features" instinct, not your "I should assume there are none" instinct.

    >beyond some aspects of static type systems

    So he does know of some, but chooses to ignore them. Why, exactly?

    >The core of virtually every extant programming language is largely similar.

    No. And it is entirely possible for new things to replace old things; programming languages are not required to take the C++ approach of accumulating every feature that has ever existed.

    >Some of the things Haskell and Scala's type systems can express are astonishing; but I have also seen each type system baffle world-renowned experts

    That is an awfully bold claim to pull out of nowhere with nothing to back it up. Who are these experts, and what exactly baffled them? Haskell has been a hotbed of programming language research over the last decade, with a large number of advances made, put into actual use, and then built upon further. Dismissing the entire concept of type systems based on an unnamed "expert" who was somehow "baffled" by some unmentioned aspect of the language is insane.

    >Verification techniques and tools have made major advances

    Yeah, like those crazy type-system things you just dismissed as being unconvincing, oversold, and baffling to experts. Go learn Agda and then tell me nothing new has happened in 20 years.
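
    To make that concrete, here is one small example of the kind of post-1990s feature being pointed at (a GADT, enabled by GHC's GADTs extension; the example is mine, not from the article): the expression type below cannot even represent an ill-typed program, so the evaluator needs no runtime tag checks.

      {-# LANGUAGE GADTs #-}

      -- A GADT: each constructor records the type of expression it builds,
      -- so something like Add (BoolL True) (IntL 1) is rejected at compile time.
      data Expr a where
        IntL  :: Int  -> Expr Int
        BoolL :: Bool -> Expr Bool
        Add   :: Expr Int  -> Expr Int -> Expr Int
        If    :: Expr Bool -> Expr a   -> Expr a -> Expr a

      -- The evaluator is total and tag-free: the types guarantee that
      -- each branch produces the right kind of value.
      eval :: Expr a -> a
      eval (IntL n)   = n
      eval (BoolL b)  = b
      eval (Add x y)  = eval x + eval y
      eval (If c t e) = if eval c then eval t else eval e

      main :: IO ()
      main = print (eval (If (BoolL True) (Add (IntL 1) (IntL 2)) (IntL 0)))  -- 3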