Ask HN: Which people and groups are researching new approaches to programming?
I'm interested in groups who are thinking about the next thirty years, rather than the next five years.
Some projects I find interesting:
* Subtext by Jonathan Edwards: https://vimeo.com/106073134
* Apparatus by Toby Schachman: http://aprt.us
* Bret Victor's explorable explanations and introspectable programming demos: http://worrydream.com
* Eve by Kodowa: https://www.youtube.com/watch?v=VZQoAKJPbh8
* Scratch by the MIT Media Lab: https://scratch.mit.edu
* Mathematica by Wolfram Research: https://www.wolfram.com/mathematica
Why am I interested? I'm working on Code Lauren, a game programming environment for beginners (http://codelauren.com). I want to learn as much as possible about work done by others in the same area.
I think you'd probably be interested in Viewpoints Research (and Alan Kay):
...and STEPS:
http://www.vpri.org/pdf/tr2011004_steps11.pdf
"We set a limit of 20,000 lines of code to express all of the “runnable meaning” of personal computing (“from the end‑user down to the metal”) where “runnable meaning” means that the system will run with just this code (but could have added optimizations to make it run faster). One measure will be what did get accomplished by the end of the project with the 20,000 lines budget. Another measure will be typical lines of code ratios compared to existing systems. We aim for large factors of 100, 1000, and more. How understandable is it? Are the designs and their code clear as well as small? Can the system be used as a live example of how to do this art? Is it clear enough to evoke other, better, approaches?"
Like most things, the kernel of tomorrow's ideas is already here. Within the next five years, these ideas will shape what the future of programming looks like:
* Refinement types
Liquid Haskell: https://ucsd-progsys.github.io/liquidhaskell-tutorial/02-log...
* SMT Solver Language Integration
Cryptol: https://github.com/GaloisInc/cryptol
* Session Types
Scribble: http://www.scribble.org/
* Dependent Types
Agda: https://en.wikipedia.org/wiki/Agda_(programming_language)
Idris: http://www.idris-lang.org/
* Effect typing
Koka: https://research.microsoft.com/en-us/um/people/daan/madoko/d...
* Formal verification
Coq: https://www.cis.upenn.edu/~bcpierce/sf/current/index.html
TLA+: http://research.microsoft.com/en-us/um/people/lamport/tla/tl...
The general trend is toward more composable abstractions and smarter compilers and languages that can reason about more of our programs for us.
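To give a concrete flavor of the first item: refinement types attach a logical predicate to a base type, e.g. {v:Int | v >= 0}. Liquid Haskell discharges such predicates statically with an SMT solver; as a rough, hypothetical illustration only, here is the same idea approximated as runtime contracts in Python:

```python
# Hypothetical illustration: a "refinement" pairs a base type with a predicate.
# Liquid Haskell checks predicates like this statically via an SMT solver;
# a runtime contract is only a dynamic approximation of the idea.

class Refined:
    def __init__(self, base, predicate, name):
        self.base, self.predicate, self.name = base, predicate, name

    def check(self, value):
        if not isinstance(value, self.base) or not self.predicate(value):
            raise TypeError(f"{value!r} is not a {self.name}")
        return value

# The refinement type {v:Int | v >= 0}, i.e. the natural numbers
Nat = Refined(int, lambda v: v >= 0, "Nat")

def safe_sub(x, y):
    """Subtraction contracted to never produce a negative result."""
    return Nat.check(Nat.check(x) - Nat.check(y))

print(safe_sub(5, 3))   # 2
```

The static version rejects `safe_sub 3 5` at compile time; the runtime sketch can only raise at the call site.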
Some more projects:
* Unison by Paul Chiusano: http://unisonweb.org/2015-05-07/about.html
* APX by Sean McDirmid: https://www.youtube.com/watch?v=YLrdhFEAiqo
* Awelon by David Barbour: https://github.com/dmbarbour/awelon/blob/master/AwelonProjec...
And perhaps more in the "next five years" category:
* Om Next by David Nolen: https://www.youtube.com/watch?v=MDZpSIngwm4
* Elm by Evan Czaplicki: http://elm-lang.org/
Me.
I'm working on Full Metal Jacket, a strongly-typed, visual, pure dataflow language (http://web.onetel.com/~hibou/fmj/FMJ.html) with its own IDE.
Things have advanced a fair bit since I wrote those pages and published the recent paper, so I'll add to the tutorials very soon and announce this on Hacker News. Type definitions, macros, and a few other things have been added to the language.
The .303 shared-source release is approaching, but I don't do deadlines.
No battle plan survives contact with the enemy, but I have some ideas for future directions, including adding dependent types, running on a multi-core machine with speculative execution, and automatic programming (i.e. the user supplies just inputs and outputs). Very long-term ideas involve developing a variant of the language which enables programs to run backwards, to enable execution on a gated quantum computer.
Sean McDirmid's work on Glitch is an interesting (and distinctly contra the current "FP all the things!" zeitgeist) approach to live programming: http://research.microsoft.com/en-us/people/smcdirm/
Conal Elliott's work on Tangible FP was an interesting attempt to unify functional and "visual" programming that has been mostly abandoned: http://conal.net/papers/Eros/ Hopefully some of its ideas may yet survive in other projects.
The Berkeley Orders of Magnitude project is somewhere at the intersection of database and PL research, aimed at handling orders of magnitude more data with orders of magnitude less code: http://boom.cs.berkeley.edu/ The Dedalus language in particular is interesting, as it integrates distributed- and logic-programming: http://www.eecs.berkeley.edu/Pubs/TechRpts/2009/EECS-2009-17...
Joe Armstrong's thoughts on a module- or namespace-less programming environment are interesting: http://erlang.org/pipermail/erlang-questions/2011-May/058768...
I've been meaning to write a blog post about the convergence of various ideas of what the future of programming might look like for a while now, so I have a bunch of notes on this topic. The OP & other folks have already mentioned most of the other projects in my notes - in particular Unison, Subtext, Eve, & Bret Victor's work.
My current line of work is on tackling a tiny little corner of what I see as the future's problems - trying to find a better way to combine database/relational programming and functional programming. My work is here (but the docs are almost entirely in type-theory-jargon at the moment, sorry! feel free to shoot me an email if you have questions): https://github.com/rntz/datafun
We're working on http://tonicdev.com . You can read a bit about it here http://blog.tonicdev.com/2015/09/10/time-traveling-in-node.j... or just try it yourself.
Some guiding principles:
1. So much of what hinders programmers is that the friction of using existing solutions is so high that they choose to redo the work themselves. In Tonic we've made every package for JavaScript immediately available (over 200,000), so that you can focus on putting existing pieces together. We add new packages immediately as they come in, and make notebooks accessible to each other. It's like having access to the global library.
2. There shouldn't be any snippet of code online that isn't runnable. Every example you read on the internet should be immediately runnable (regardless of whether it requires binary packages or what have you). The difference in understanding when you can tweak an example and see what happens, vs. just cursorily reading over it and thinking you get it, is huge. Tonic embeds allow this: https://tonicdev.com/docs/embed
My primary concern is with personal knowledge bases [1]. This intersects with new approaches to programming, because programs and algorithms are an aspect of and a way of interacting with knowledge.
It's mostly conceptual right now, but my idea is to represent knowledge as a hypergraph with spatial and temporal dimensions on both edges and vertices. This, I hope, could represent every kind of knowledge I can imagine. The hypergraph would function as a filesystem and database, and you could query/program the system. It would be Emacs for all media, and not just text.
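As a rough sketch of the data model described above (the names and structure here are my own illustration, not the author's design), a hypergraph whose vertices and hyperedges both carry attributes such as time and place, and which can be queried like a database, might look like:

```python
from dataclasses import dataclass, field

@dataclass
class Vertex:
    label: str
    attrs: dict = field(default_factory=dict)   # e.g. spatial/temporal data

@dataclass
class Hyperedge:
    label: str
    members: tuple                              # any number of vertices
    attrs: dict = field(default_factory=dict)   # edges carry attributes too

class Hypergraph:
    def __init__(self):
        self.vertices, self.edges = [], []

    def add_vertex(self, v):
        self.vertices.append(v)
        return v

    def connect(self, e):
        self.edges.append(e)
        return e

    def query(self, pred):
        """Database-style query over the edges of the knowledge base."""
        return [e for e in self.edges if pred(e)]

g = Hypergraph()
a = g.add_vertex(Vertex("Engelbart"))
b = g.add_vertex(Vertex("NLS"))
c = g.add_vertex(Vertex("demo"))
g.connect(Hyperedge("presented", (a, b, c), {"year": 1968}))

hits = g.query(lambda e: e.attrs.get("year") == 1968)
print([e.label for e in hits])   # ['presented']
```

The point of the hyperedge (versus an ordinary graph edge) is that one relation can tie together any number of vertices at once.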
I want to augment the inherent power of the hypergraph with a mashup of the OpenEndedGroup's Field, Xerox PARC's Smalltalk, Doug Engelbart's NLS, Symbolics' Open Genera, org-mode, Vim, the Wolfram Language, Bell Labs' Plan 9, and Ted Nelson's ZigZag and Xanadu projects.
If anyone finds this interesting and wants to chat about this stuff, please email me at the email address in my profile.
I'm working on a relational language like http://www.try-alf.org/blog/2013-10-21-relations-as-first-cl....
However, I'm a noob at building a language ;)
I have learned a lot of stuff. For example, columnar stores provide some nice opportunities to manage data in memory and to compress values within columns.
The sorting, joining, and selecting on multiple arrays that OLAP queries require translate well to the needs of normal programming.
SQL is an overcomplication and a bad "programming" language. Unfortunately, it's the only practical way to interface with most databases.
If my plan works, it could be good for building a relational store that lets you say: the name is a string, and I need to search on it. A normal engine would need to store it again in an index; I will say instead that the name column is the index, and the values are stored once.
Or something like that.
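A toy illustration of the column-as-index idea (my own sketch, not the commenter's engine): keep each column's values once, sorted alongside their row ids, so equality lookups get binary search without any separate index structure.

```python
from bisect import bisect_left

class Column:
    """A toy column that IS its own index: values stored once, sorted,
    with row ids kept alongside, so no separate index is needed."""

    def __init__(self, values):
        # Sort (value, row_id) pairs; duplicate values sit adjacently,
        # which is also what makes columnar compression work well.
        self.sorted_pairs = sorted((v, i) for i, v in enumerate(values))
        self.keys = [v for v, _ in self.sorted_pairs]

    def rows_equal(self, value):
        """Binary search instead of a full scan: the 'index' for free."""
        i = bisect_left(self.keys, value)
        rows = []
        while i < len(self.keys) and self.keys[i] == value:
            rows.append(self.sorted_pairs[i][1])
            i += 1
        return rows

names = Column(["ana", "bob", "ana", "eve"])
print(names.rows_equal("ana"))   # [0, 2]
```

A real engine would also need updates and multi-column joins, but this shows why storing the column sorted can double as the search index.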
BTW: A relation has some symmetry with immutable functions, but I still don't know how to exploit this fact.
JetBrains MPS: https://www.jetbrains.com/mps/
I'm building a language for learning to program on smartphones. It's a stack-based language designed for interactive editing (since typing code on a smartphone is no fun).
It's in the very early stages at the moment.
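For flavor, the core of a stack-based language like the one described can be sketched in a few lines (a hypothetical toy, not the actual project). The appeal for smartphones is that each token is a single tap on an on-screen palette rather than typed text:

```python
def run(program, stack=None):
    """Tiny stack-machine evaluator: each token is one tap, no typing.
    Tokens are either integer literals or named operations."""
    stack = stack if stack is not None else []
    ops = {
        "+":    lambda s: s.append(s.pop() + s.pop()),
        "*":    lambda s: s.append(s.pop() * s.pop()),
        "dup":  lambda s: s.append(s[-1]),
        "swap": lambda s: s.__setitem__(slice(-2, None), [s[-1], s[-2]]),
    }
    for token in program:
        if token in ops:
            ops[token](stack)       # apply the operation to the stack
        else:
            stack.append(int(token))  # push a literal
    return stack

# (3 + 4) * (3 + 4), built by tapping tokens one at a time
print(run(["3", "4", "+", "dup", "*"]))   # [49]
```

Because the program is just a token list, interactive editing (insert/delete a tap, re-run, watch the stack) falls out naturally.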
Microsoft have done some great and totally ignored work with embedding visual languages into game engines for kids - http://research.microsoft.com/en-us/projects/kodu/ http://welcome.projectspark.com/
Wrt Code Lauren, you may also be interested in http://www.cs.cmu.edu/~NatProg/index.html - an HCI approach to tooling (the WhyLine is my favourite).
Dog hasn't released much info yet but it's an interesting concept - http://dl.acm.org/citation.cfm?id=2502026 https://www.media.mit.edu/research/groups/social-computing
Program synthesis is a really interesting area of research, e.g. http://research.microsoft.com/en-us/um/people/sumitg/
VPRI has already been mentioned but I'd like to highlight their work on Programming as Planning - http://www.vpri.org/pdf/m2009001_prog_as.pdf
From the last FPW, there were a couple of projects that really stood out for me: http://probcomp.csail.mit.edu/bayesdb/ http://2015.splashcon.org/event/fpw2015-an-end-user-programm...
Also not really a new language, but Sam Aarons work on Sonic Pi is pedagogically interesting - http://sonic-pi.net/
Unseen
A Functional and Logical Dataflow language.
http://www.reddit.com/r/unseen_programming
Status: under development.
Functions are components, which can be used recursively. Arrows are logical relations between these functions. Inspired by Scala and VHDL.
The logic and flow deal with the control and time aspect.
Testing and commenting is integrated in the graphical system as different layers. All graphical structures can be converted to (reasonably) simple text, resembling Scala.
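A minimal sketch of the functions-as-components idea (purely illustrative; Unseen itself is graphical, and this invents its own tiny API): components fire when all of their input ports are filled, and arrows carry results downstream.

```python
class Component:
    """A function as a dataflow component; arrows wire outputs to inputs."""

    def __init__(self, fn, n_inputs):
        self.fn, self.n_inputs = fn, n_inputs
        self.inputs = {}    # port number -> received value
        self.targets = []   # outgoing arrows: (component, port)

    def arrow_to(self, other, port):
        """Draw a logical arrow from this component's output to a port."""
        self.targets.append((other, port))

    def feed(self, port, value):
        self.inputs[port] = value
        if len(self.inputs) == self.n_inputs:   # fire when all ports filled
            result = self.fn(*(self.inputs[p] for p in range(self.n_inputs)))
            self.result = result
            for target, target_port in self.targets:
                target.feed(target_port, result)

add = Component(lambda a, b: a + b, 2)
double = Component(lambda x: x * 2, 1)
add.arrow_to(double, 0)   # arrow: add's output flows into double

add.feed(0, 3)
add.feed(1, 4)
print(double.result)   # 14
```

In the graphical system the boxes and arrows would be drawn, with test and comment layers over the same structure; the text form above is the kind of "reasonably simple text" a diagram could convert to.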
Looking at hardware trends on servers, in the next ten years I expect more pure functional programming on the CPU (implicitly concurrent, write-once data structures) and more data-parallel array operations being offloaded to GPUs. The CPU language of 2025 is probably something very much like Haskell with a cleaned-up base library, but I'm not sure what the GPU component will look like to the programmer.
For a meditation on which aspects of our text-based programming tools derive from merits of the medium versus historical accident, see: http://westoncb.blogspot.com/2015/06/why-programming-languag... —there's also a second part (linked to within) that describes an alternate, general purpose architecture for programming tools that lets us stay powerfully text-centric while moving away from operating on character sequences under the hood.
I wrote this non-traditional program editor: https://youtu.be/tztmgCcZaM4?t=1m32s
And a new kind of visual debugger: https://www.youtube.com/watch?v=NvfMthDInwE
> * Eve by Kodowa: https://www.youtube.com/watch?v=VZQoAKJPbh8
This guy seems kinda young. Does anyone think that the future of programming can come from people without extensive programming experience of current programming?
I am. I am working on a new programming language designed for the next generations. The language re-imagines the role of the compiler: from a monolithic static black box that converts source to executable, into an open dynamic system that manages the language syntax and compilation process but leaves the final syntax and actual compilation to hot-pluggable libraries. Why? To make the language dynamically upgradable, on a per-project basis. This is the only way to make a language future-proof. More details: http://alusus.net http://alusus.net/overview P.S. The project is still at a very early stage, but a proof of concept is there.
Everyone doing probabilistic programming. Can't really give a summary
https://en.wikipedia.org/wiki/Probabilistic_programming_lang...
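The core idea (ordinary-looking programs denote probability distributions, and inference runs them backwards from observations) can be shown with a toy rejection sampler; this is a generic illustration, not tied to any particular system:

```python
import random

def flip(p):
    """Bernoulli primitive: the basic random choice of the 'language'."""
    return random.random() < p

def model():
    """A probabilistic 'program': flip two coins, observe at least one heads."""
    a, b = flip(0.5), flip(0.5)
    consistent = a or b        # the observation we condition on
    return a, consistent

def infer(model, n=100_000):
    """Inference by rejection sampling: keep only runs consistent with
    the observation, then read off the quantity of interest."""
    accepted = [a for a, ok in (model() for _ in range(n)) if ok]
    return sum(accepted) / len(accepted)

random.seed(0)
# Exact answer: P(a | a or b) = (1/2) / (3/4) = 2/3
print(infer(model))
```

Real systems (Church, Stan, BayesDB, etc.) replace the brute-force sampler with far smarter inference, but the programming model is the same: write the generative story, state what you observed, ask for the posterior.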
Formal verification using ACL2 or Coq or other tools also.
I think "visual" programming is the incorrect approach, something I would like to see more of is "tactile-spatial" programming. Anybody have an example of these? Most work I've seen is visual/flowchart which is not optimal for touch devices or large projects.
Like many people, I think about this now and then, but I haven't done any work in that area. Ultimately, I picture a live environment, but not like Blueprint in UE4.
I expect we'll end up with a predominantly interactive approach to programming, most likely visual drag-and-drop style programming, in a live environment that knows common patterns, data structures, algorithms; giving real-time advice, showing native code output as you go, etc. Basically, you're molding the system while monitoring it under various contexts, and you're programming against data models.
Got a new Arduino board? Just drag and drop the data sheet model into your environment. It contains memory address information, read/write contexts, access protocols, contexts, etc for every component, and how they're connected. Now you design the rest of the logic.
"A programming language is a user interface as much as it is any other thing. Why have multiple ones? They are all Turing equivalent."—Alan C. Kay (see the talk for the context of the quote; starts at around 23:25)
"Rethinking CS Education | Alan Kay, CrossRoads 2015." https://www.youtube.com/watch?v=N9c7_8Gp7gI
"Most computer people went into computing because they were disturbed by other human beings."—Alan C. Kay
Lamdu: http://www.lamdu.org
I'm working on a new approach to programming in the direction of Bret Victor's Inventing on Principle (focusing on the environment; all open source).
Because we're pre-launch (and working hard on getting it to all of you!), I can't talk about it much yet, but we're funded and looking to work with great people.
Email me at cammarata.nick@gmail.com if you're interested in this space, would love to hear ideas and talk more.
Kayia was presented at the Future of Programming Workshop at Strange Loop a little over a year ago.
Viewpoints Research Institute, see e.g. http://vpri.org/html/writings.php which has papers like "Checks and Balances - Constraint Solving without Surprises in Object-Constraint Programming Languages" and Alessandro Warth's "Experimenting With Programming Languages" (which led to OMeta/JS, which is I think on GitHub), as well as a ton of Alan Kay talks on fundamental new computing technologies http://vpri.org/html/words_links/links_ifnct.htm .
The way that Datomic uses Datalog is really interesting from a perspective of "new approach to programming" (databases).
Erik Demaine's course on advanced data structures gives some interesting ideas for time-travel-based games: https://courses.csail.mit.edu/6.851/spring14/ . His work also has application to other fields like creating an efficient in-app version control system http://www.cs.utexas.edu/~ecprice/papers/confluent_swat.pdf .
Lots of cool stuff on HaskellWiki; for example https://wiki.haskell.org/Functional_Reactive_Programming .
If you really want to jump into the deep end, there's a whole blog called Lambda the Ultimate about new approaches to programming: http://lambda-the-ultimate.org/
Program synthesis: see work by Ras Bodik's team at Berkeley (now UW) and descendants in Armando's MIT team and Sumit Gulwani's MSR team.
As concrete examples making industry waves, see Excel's new Flash Fill and Trifacta's ETL wrangling product.
Underneath, these use search (ML, SMT, ...) to allow non-traditional and sloppy coding: fill in the blanks, programming by demonstration, etc.
Coming from Scratch, you should check out Snap!: snap.berkeley.edu
John Maloney (co-inventor of Scratch), Jens Moenig (who was on the Scratch Team and develops Snap!), along with Yoshiki Oshima (who may also have been on the Scratch team), are developing a new language, "GP" (for General Purpose), which is like "professional Scratch".
Here's a video of it: https://www.google.com/url?sa=t&rct=j&q=&esrc=s&source=web&c...
There are also lots of extensions to Scratch and Snap!, particularly around CS education, and I'd be happy to discuss those!
STEPS by Alan Kay (building a full OS, including applications, in 20K LOC): http://blog.regehr.org/archives/663
Spiral http://www.spiral.net/index.html and spiralgen - going straight from math to code optimized for multiple platforms: http://www.spiralgen.com/
Automatically fixing bugs using genetic algorithms: http://dijkstra.cs.virginia.edu/genprog/
Automatic bug repair(2015): http://news.mit.edu/2015/automatic-code-bug-repair-0629
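The GenProg-style approach (search over candidate patches, scored against a test suite) can be caricatured in a few lines. Here the "patch space" is just a single wrong constant, purely for illustration; real systems mutate ASTs of large C programs.

```python
import random

def make_program(k):
    """Candidate 'patched program': the bug is a wrong constant k.
    The intended behavior is x + 3."""
    return lambda x: x + k

TESTS = [(0, 3), (5, 8), (10, 13)]   # (input, expected output)

def error(program):
    """Fitness: total error over the test suite; 0 means all tests pass."""
    return sum(abs(program(x) - want) for x, want in TESTS)

def repair(seed_k=7, generations=50, pop_size=20):
    """Genetic repair loop: mutate candidates, select the fittest half,
    stop as soon as some candidate passes every test."""
    random.seed(0)
    population = [seed_k] * pop_size
    for _ in range(generations):
        population = [k + random.choice([-1, 0, 1]) for k in population]
        population.sort(key=lambda k: error(make_program(k)))
        population = population[: pop_size // 2] * 2   # select and clone
        if error(make_program(population[0])) == 0:
            return population[0]
    return population[0]

print(repair())   # converges to 3, the repaired constant
```

The essential ingredients are all visible: a test suite as the fitness function, mutation as the patch generator, and selection driving the population toward a passing program.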
YC company run by Mike McNeil. Visual backend programming. Built off Node, Sails and Machine Spec. The latter is particularly interesting:
I'm doing several myself at an abstract level given lack of time or resources. I feed what I learn in them to pro's for potentially seeing it built. Here's a few:
1. Modula-2-style systems language with macros, better typing, and typed assembler to replace C with minimal hit on performance or ease of use.
2. Relatively-simple, imperative language plus translators to equivalent functionality in mainstream languages to leverage their static analysis or testing tools if possible.
3. Unified architecture to natively represent and integrate both imperative (C or Modula style) and functional (Ocaml or Haskell style) languages. Extensions for safety/security at hardware level for both.
4. Hardware/software development system using Scheme or abstract, state machines that can turn a high-level description into software, hardware, or integrated combo of both.
5. Defining TCB's for type systems, OS architectures, etc that minimize what must be trusted (mathematically verified) to ensure invariants apply while maximizing expressiveness. crash-safe.org and CertiKOS are more concrete, useful, and professional versions of what I'm getting at with 5.
6. Building on 5, a series of consistent ways of expressing one program that capture sequential, concurrent, failure, integration, and covert channel analysis in a clean way for formal analysis. As in, an all-in-one high assurance toolkit that automates the common stuff a la static analysis or reusable, composable proofs. And makes hard stuff easier for pro's.
7. Occasional thoughts on automatic programming via heuristics, patterns, rules, human-guided transforms, and so on. Got me started in advanced SW and my mind occasionally goes back to possibilities for achieving it.
8. A way to do any of that with the rapid iteration, safety, performance, live updating, and debugging properties of Genera LISP machine. I mean, we still don't have all that in mainstream tooling? ;)
I am working on a new computer language called Beads, which is designed to replace the current development stack for personal computers, mobile devices, and the web. You can find out more about this project at e-dejong.com
The focus of my tool is creating graphical interactive software: iPhone and Android apps, desktop apps, and things that run in the browser. The notation is compact, readable, and straightforward. It has many innovative aspects: new data structures, physical units (something FORTRAN just got after 30 years in the in-basket), and deductive reasoning, which dramatically reduces the length of programs. It is not a design-by-graphics system, but a language. It isn't that abstract, and is far more straightforward than Haskell or PROLOG. It is not a LISP derivative.
There are many different projects; I feel, however, that Idris and Scala are the places where innovation should be done. Although there are ideas in Subtext etc., they should be implemented as libraries in Idris or Scala to gain the maximum usage from developers.
Jordan Pollack's group is interesting (genetic algorithms, etc.): http://www.cs.brandeis.edu/~pollack/ The web page is hilariously archaic, as a bonus.
The Augmented Programming group: https://groups.google.com/forum/#!forum/augmented-programmin...
I still think there is something to agent oriented programming. Yoav Shoham https://en.wikipedia.org/wiki/Yoav_Shoham http://www.infor.uva.es/~cllamas/MAS/AOP-Shoham.pdf http://robotics.stanford.edu/~shoham/
Relational XPath Map (rxm) provides a syntactically terse domain-specific language (DSL) to query relational databases and generate structured documents.
https://bitbucket.org/djarvis/rxm/
https://bitbucket.org/djarvis/rxm/wiki/Discussion
Currently generates SQL code from a DSL, which can be run against a database to produce an XML document.
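A toy sketch of the DSL-to-SQL idea (hypothetical syntax, not rxm's actual DSL): a terse query description in the host language compiles to parameterized SQL text plus its bound values.

```python
def select(table, columns, **where):
    """Tiny illustrative DSL: compile a query description into a
    parameterized SQL string plus its bind values (not rxm's syntax)."""
    cols = ", ".join(columns)
    sql = f"SELECT {cols} FROM {table}"
    if where:
        # Named placeholders keep the generated SQL injection-safe.
        conds = " AND ".join(f"{k} = :{k}" for k in sorted(where))
        sql += f" WHERE {conds}"
    return sql, where

query, params = select("users", ["id", "name"], country="CA")
print(query)    # SELECT id, name FROM users WHERE country = :country
print(params)   # {'country': 'CA'}
```

rxm goes further by mapping the relational result onto structured (XML) output, but the compile-a-description-to-SQL step is the heart of any such DSL.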
I really love Jonathan Edwards's chart where he shows the big stack of technologies one has to master to build apps today. The goal is to replace that entire stack with one language. That would be the 10:1 improvement that would really make a difference. The burning question, therefore, is what can replace that entire stack? Clearly you have to offer a way of storing and structuring data that improves upon the table; otherwise you are back in relational database hell.
I'm also interested in new ways of minimal coding & debugging.
I'm working on the Animation CPU platform and ACPUL, a declarative algorithmic language for the same purposes.
https://www.youtube.com/watch?v=ubNfWarTawI
Also LiveComment information tool:
Alan Kay’s Viewpoints Research Institute: http://vpri.org/ (with which Bret Victor is associated).
Probabilistic programming might interest you.
I've been toying around with the idea of SPL, Species Programming Language. See Brent Yorgey's thesis. Like APL, functions would be able to match where they bind within a data structure. Instead of just multidimensional arrays it would handle all combinatorial species; think lists, trees, graphs, cycles, ...
Still haven't found a syntax I like yet. Yorgey and Classen both have nice Haskell libraries as a springboard.
Wow, these comments mention lots of work I wasn't aware of! We've been building a community of people working in this area: the Future Programming Workshop. http://www.future-programming.org/ We will all do better if we get together to exchange ideas and criticism. Suggestions for improving FPW are welcome.
Personally, I think in the next 30 years we will be programming with the same languages that are popular today. They will evolve to handle multicore better, though, with tasks, async, coroutines, actors, etc. Before too long we will have a mix of thousands of cores: GPU, CPU, and remote cores. We will have to figure out how to spread our programs across all of them. We'll all be doing supercomputing.
I'm working on a live programming plugin for Visual Studio that supports C#: http://comealive.io
Previously we'd developed a code exploration tool: http://codeconnect.io
It might be worth checking out ethereum.org and the work they are doing there on their blockchain, as well as serpent / solidity programming languages. Also, check out ipfs. The distributed computation and storage model has broad applicability, and a good lens from which to view the world.
http://www.clafer.org/ Lightweight modeling
http://categoricaldata.net/fql.html FQL Functorial Query Languages
Intentional Programming (https://en.wikipedia.org/wiki/Intentional_programming), started by Charles Simonyi
Chris Granger seems to be doing some awesome work with 'Eve': https://www.youtube.com/watch?v=5V1ynVyud4M
I spent a day looking at these back in August and wrote up a summary.
Sylph is interesting: https://news.ycombinator.com/item?id=9126772
CDG (Communications Design Group) at SAP is where Bret Victor and a lot of other people work on research. I think a big focus is changing how we program in the future.
Social Machines by Mark Stahl ... https://github.com/socialmachines
I am working on FlowGrid (Visual dataflow on Android): http://flowgrid.org
Wolfram Mathematica is a pretty old project, did it change much over the last five years?
This may be a strange way of looking at it, but let's backtrack 30 years and see what made the biggest differences. As I've been in the industry for about thirty years, my impression is that not much has changed. That seems strange, but it's still the same job that I started with.
For me it is interesting that in my career, code base sizes grew to gigantic proportions -- there are many applications that are 10s of millions of lines of code. In the middle of my career I worked on a few. Interestingly, I'm doing web development now and a lot of my colleagues think that 5000 lines is unbearably big. I think the take-away here is that we have gotten slightly better at abstracting things out and using development libraries (and dare I say, frameworks).
OO was just becoming a big thing at the start of my career. Everybody did it incredibly badly. Then Kent Beck and Ward Cunningham came along and told people how to do it not-so-badly. I think the biggest thing that I saw in this time frame was the breaking of the myths of OO being about code re-use, and the movement away from huge brittle design structures. Good OO developers moved back to the really basic ideals of dealing with coupling and cohesion in design. We even started to have a language to be able to discuss design intelligently. Of course, quite a huge number of people were oblivious to this, but it always struck me how amazing it was that Beck and Cunningham were really 15 years ahead of most of the rest of us.
Lately, functional programming is coming into vogue. For the second time in my career I was surprised. People in the know are talking about idempotence, and immutable structures. This was the stuff that the crazy people were talking about in the 80's -- stuff that was "too slow", and "too memory intensive" to take seriously. But now it's pretty obvious this is the way to go.
I think the other big thing that blew me away in the last 30 years was testing. Probably some people will remember pre-conditions, post-conditions, and class invariants. This was unfortunately forgotten by most, but the most astonishing thing was unit testing. Especially the practice of Test Driven Development that not only allowed you to document your code with executable examples, but also forced you to decouple your objects/modules by the very behaviour that creates the documentation. Very few people do this well (just like most of the other things I've mentioned), but it is completely game changing.
As for the future, what is coming up? I suggest you look at what has gone before you for hints. In the last 30 years, apart from TDD (which came completely out of the blue as far as I could tell), the major advancements came from stuff we already knew about. It was the stuff that the "crazy" people advocated in the 70's and 80's, but that seemed impractical. If I were to guess, I suspect that we will see further progress on decoupling in design. Immutable data structures will not just be common; they will be how every professional designs code. As performance moves from single processing to multi-processing, this will be important. Look at innovative ways of processing like flow-based programming and other crazy ideas from bygone years.
My last piece of advice: Don't look for large revolutionary changes. I think those are unlikely. The programmer of 30 years from now will probably still be doing the same job as I am today. The changes will be much more qualitative. Also, expect that the vast majority of programmers will be as oblivious to the advancements as most programmers are today.
Thoughts on Julia?
IMHO machine learning will automate coding to such an extent that all but a few of us will be unemployed.
This is a really broad question, about on par with asking "which fashion houses are putting out daring material and what will Dior be making that's popular 30 years from now". Software is just like any other cargo-cult industry where trends rise and fall almost like clockwork. From Rails, to Angular, to React.
RE: people/groups who are researching 'new approaches to programming' - you have the typical universities putting out papers. Conferences like POPL and ICFP tend to be where most of the major academic work gets put out. From within the industry, commercial entities aren't really doing much, bar Microsoft Research Cambridge (UK, not MA). They're really pushing the envelope with regard to strict PL research. See www.rise4fun.com for the dozens of projects they're putting out. Oracle, too, is surprisingly doing some interesting work.
30 years is a hard guess, but in 5 years you'll certainly see: 1) A lot more emphasis on concurrency; at 14nm we're rapidly approaching the physical limits of transistor density (which is why you're seeing 18-core Xeons). Sharing memory is notoriously hard, so the move is towards immutability (whether it's pass-by-const-ref in 'traditional' languages like C++ or more esoteric languages like Haskell, and whether via the actor model, STM, etc.). 2) Cheaper remote memory, especially with Intel's Knights Landing. RDMA has been around for ages, but bringing it to Xeon means the common man doesn't have to pay Infiniband prices for HBAs. RAM has been getting cheaper: imagine being able to deck out a 42U rack filled to the brim with ~18 2U servers of half a TB of DDR4 RDIMM apiece that your main application can access. 3) Faster storage. Disks, which used to be a huge thrashing bottleneck (who here is old enough to remember thrashing on GCC 2.95 when trying to compile E17?), are now posting amazing numbers, even for SSDs.
Effectively every level of computing that used to be a barrier (minus SRAM and the CPU caches which seem to have been stuck at the same spot for a while in terms of capacity) has, or will within 6 months be consumer accessible. I couldn't guess what's going to happen in 5 years. I can't even guess what's going to happen in 5 months and I've been at this nearly 20 years.
Ramsey Nasser is developing a language written entirely in Arabic. http://nas.sr/%D9%82%D9%84%D8%A8/
My research is on hierarchies of composable domain-specific languages (see github, account 'combinatorylogic').