I've seen the future. The blurry outlines sketched by such brilliant audio-visual feasts as Terminator have come to fruition: in the future, the world is ruled by self-aware software.
That's the bad news. The good news is that we haven't noticed.
We're too busy because we're all playing the visually stunning "The Good Guy Kills the Bad Guys 2" on PlayStation 22. It really captured the essence of the first one. Theaters are ruled by the documentary "Animals - Did They Exist?", with some stunning CG of a horse, although I could have sworn horses had four legs, not three, but then again I'm no nature expert.
What everyone was wrong about, though, was which program became self-aware and exerted its iron-fisted dominance upon the unsuspecting humans and the last few, very suspicious cockroaches. It wasn't a military mistake. It was a 3D framework that evolved.
But let's start at the beginning. First there was Mesa, the Open Source implementation of the OpenGL specification. Then came Gallium3D, a new architecture for building 3D graphics drivers. Gallium3D modeled what modern graphics hardware was doing, which meant the framework was fully programmable and was actually code-generating its own pipeline. Every operation in Gallium3D was a combination of a vertex and a fragment shader. Internally, Gallium3D used a language called TGSI - a graphics-dedicated intermediate representation.
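To give you a taste, here is roughly what a tiny TGSI vertex shader looked like when dumped as text - this one just multiplies the incoming vertex by a 4x4 matrix kept in four constant registers. Treat it as a sketch, since the exact dump format varied between Mesa versions:

    VERT
    DCL IN[0]
    DCL OUT[0], POSITION
    DCL CONST[0..3]
    DCL TEMP[0]
    MUL TEMP[0], CONST[0], IN[0].xxxx
    MAD TEMP[0], CONST[1], IN[0].yyyy, TEMP[0]
    MAD TEMP[0], CONST[2], IN[0].zzzz, TEMP[0]
    MAD OUT[0], CONST[3], IN[0].wwww, TEMP[0]
    END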
Gallium3D was generating vertex and fragment shaders at run-time to describe what it was about to do. Once that system was working, some engineers decided it would make sense to teach Gallium3D to self-optimize the vertex/fragment programs that it, itself, was creating. LLVM was used for that purpose, because it was an incredible compiler framework with a wonderful community. The decision turned out to be the right one: Gallium3D and LLVM were a match made in heaven. It was pure love. I'm not talking about the "roll over onto your stomach, take a deep breath, relax and let's experiment" kind of love, just pure and beautiful love.
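If you want to see the shape of that idea without digging through Mesa, here is a minimal sketch in C against LLVM's C API - the legacy pass-manager flavor that older LLVM releases shipped. The LLVM calls are real, but the tiny "shader" being generated is just an illustrative stand-in, not anything Gallium3D actually emits: build a function at run-time, let LLVM optimize it, JIT it, and call it.

    /* A sketch of run-time codegen in the Gallium3D spirit, NOT Gallium3D's
     * actual code. Uses LLVM's C API (legacy pass manager, older releases).
     * Build roughly with: cc jit.c $(llvm-config --cflags --ldflags --libs) */
    #include <stdint.h>
    #include <llvm-c/Core.h>
    #include <llvm-c/ExecutionEngine.h>
    #include <llvm-c/Target.h>
    #include <llvm-c/Transforms/Scalar.h>

    int main(void)
    {
        LLVMInitializeNativeTarget();
        LLVMInitializeNativeAsmPrinter();
        LLVMLinkInMCJIT();

        /* Emit the IR for: float scale(float x) { return x * 2.0f; } */
        LLVMModuleRef mod = LLVMModuleCreateWithName("shader");
        LLVMTypeRef f32 = LLVMFloatType();
        LLVMTypeRef sig = LLVMFunctionType(f32, &f32, 1, 0);
        LLVMValueRef fn = LLVMAddFunction(mod, "scale", sig);
        LLVMBuilderRef b = LLVMCreateBuilder();
        LLVMPositionBuilderAtEnd(b, LLVMAppendBasicBlock(fn, "entry"));
        LLVMValueRef two = LLVMConstReal(f32, 2.0);
        LLVMBuildRet(b, LLVMBuildFMul(b, LLVMGetParam(fn, 0), two, "r"));

        /* Let LLVM optimize the code we just generated. */
        LLVMPassManagerRef pm = LLVMCreatePassManager();
        LLVMAddInstructionCombiningPass(pm);
        LLVMAddGVNPass(pm);
        LLVMAddCFGSimplificationPass(pm);
        LLVMRunPassManager(pm, mod);

        /* JIT-compile it and call it like any ordinary C function. */
        LLVMExecutionEngineRef ee;
        char *err = 0;
        if (LLVMCreateExecutionEngineForModule(&ee, mod, &err))
            return 1;
        float (*scale)(float) =
            (float (*)(float))(uintptr_t)LLVMGetFunctionAddress(ee, "scale");
        return scale(21.0f) == 42.0f ? 0 : 1;
    }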
So let's take a simple example to see what was happening. Let's deal with triangles, because they're magical.
Now, to produce this example, Gallium3D created two small programs. One ran for every vertex in the triangle and calculated its position - it was really just multiplying the vertex by the current modelview matrix - that was the vertex shader. The other ran on every fragment inside the triangle to produce the resulting pixels - that was the fragment shader. To execute these two programs, Gallium3D compiled them into LLVM IR, ran LLVM's optimization passes on them, and used LLVM's code generators to produce executable code. People working on Gallium3D quickly noticed that, even though their code wasn't optimized at all and was doing terribly expensive conversions all the time, it was up to 10x faster with LLVM on some demos. They knew it was good.
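In case the vertex-shader half sounds mysterious, here is that transform written as plain C - purely illustrative, not Gallium3D's actual code, with OpenGL's column-major matrix layout assumed:

    /* What the generated vertex shader boiled down to: transform each
     * vertex by the current 4x4 modelview matrix (column-major, as in
     * OpenGL). Purely illustrative, not Gallium3D's actual code. */
    typedef struct { float x, y, z, w; } vec4;

    static vec4 transform_vertex(const float m[16], vec4 v)
    {
        vec4 r;
        r.x = m[0]*v.x + m[4]*v.y + m[8]*v.z  + m[12]*v.w;
        r.y = m[1]*v.x + m[5]*v.y + m[9]*v.z  + m[13]*v.w;
        r.z = m[2]*v.x + m[6]*v.y + m[10]*v.z + m[14]*v.w;
        r.w = m[3]*v.x + m[7]*v.y + m[11]*v.z + m[15]*v.w;
        return r;
    }

That's all the generated program really did; the interesting part was that it was being written, optimized, and compiled while your application ran.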
So Gallium3D was, in essence, creating and optimizing itself at run-time, which led many Free Software enthusiasts to create, wear, and rarely even wash shirts with the slogan "We might not have a billion-dollar budget, but our graphics framework is smarter than all the people in your company put together".
Then, in the year 2113, Gallium3D got bored with just creating graphics and took control of the entire world - which, realistically speaking, wasn't hard to do, because we were willingly immersing ourselves in the worlds it was creating for us anyway.
But that's still many, many years away from our boring present. So for now, while you wait for sex robots, dinners in a tube, and a world without insects (or, for that matter, any animals at all), you can just go get Gallium3D and play with the parts where LLVM is used. At the moment that's only the software paths, but the fifth of November is going to mark the first day of work on code-generating directly for GPUs using LLVM.
Remember, remember the fifth of November... (oh, come on, that's one heck of an ending)