Sunday, May 11, 2008

Dynamic Languages Strike Back

Some guys at Stanford invited me to speak at their EE Computer Systems Colloquium last week. Pretty cool, eh? It was quite an honor. I wound up giving a talk on dynamic languages: the tools, the performance, the history, the religion, everything. It was a lot of fun, and it went over surprisingly well, all things considered.

They've uploaded the video of my talk, but since it's a full hour, I figured I'd transcribe it for those of you who want to just skim it.

This is the first time I've transcribed a talk. It's tricky to decide how faithful to be to my spoken wording and phrasing. I've opted to try to make it very faithful, with only minor smoothing.

Unfortunately I wound up using continuation-passing style for many of my arguments: I'd occasionally get started on some train of thought, get sidetracked, and return to it two or three times in the talk before I finally completed it. However, I've left my rambling as-is, modulo a few editor's notes, additions and corrections in [brackets].

I didn't transcribe Andy's introduction, as it seems immodest to do so. It was funny, though.

Technical corrections are welcome. I'm sure I misspoke, oversimplified, over-generalized and even got a few things flat-out wrong. I think the overall message will survive any technical errors on my part.

The talk...

Thank you everybody! So the sound guys told me that because of a sound glitch in the recording, my normally deep and manly voice, that you can all hear, is going to come through the recording as this sort of whiny, high-pitched geek, but I assure you that's not what I actually sound like.



So I'm going to be talking about dynamic languages. I assume that you're all dynamic language interest... that you've got an interest, because there's a dude down the hall talking about Scala, which is you know, this very strongly typed JVM language (a bunch of you get up and walk over there – exactly.) So you know, presumably all the people who are like really fanatical about strong typing, who would potentially maybe get a little offended about some of the sort of colorful comments I might inadvertently make during this talk — which, by the way, are my own opinions and not Google's — well, we'll assume they're all over there.

All right. I assume you all looked through the slides already, so I don't need to spend a whole lot of time with them. I'll go into major rant-mode here at the end. My goal is... for you guys to come away with, sort of a couple of new pictures in your mind, thinking about how languages have evolved over the last 20 years, where they're going, what we can do to fix them, that kind of thing.

Does anyone here know how to use a Mac? It's showing me this weird, uh... thing... OK. All right. Here goes.

So!



Popular opinion of dynamic languages: slooooow! They're always talking about how Python is really slow, right? Python is, what, like 10x to 100x slower? And they have bad tools.

And also there's this sort of, kind of difficult-to-refute one, that says at millions of lines of code, they're maintenance nightmares, right? Because they don't have static types. That one, uh, unfortunately we're not going to be able to talk much about, because not many people have millions-of-lines code bases for us to look at — because dynamic languages wind up with small code bases. But I'll talk a little bit about it.



So first of all, one of my compatriots here, who's an actual smart person, like probably everybody in this room, you're all waaay smarter than me — I got invited here for the booger jokes, all right? – he's a languages guy, and he said: "You know, you can't talk about dynamic languages without precisely defining what you mean."

So I'm going to precisely define it. Dynamic languages are, by definition... Perl, Python, Ruby, JavaScript, Lua, Tcl... all right? [(laughter)] It's the working set of languages that people dismiss today as "dynamic languages." I'll also include Smalltalk, Lisp, Self, Prolog, some of our stars, you know, from the 70s and 80s that, uh, well they'll come up here today too.

I'm deliberately not going down the path of "well, some static languages have dynamic features, and some dynamic languages have static types", because first of all it's this neverending pit of, you know, argument, and second of all, as you're going to see, it's completely irrelevant to my talk. The two... sort of qualities that people associate with "dynamic": one would be sort of... runtime features, starting with eval, and the other would be the lack of type tags, the lack of required type tags, or even just escapes in your type system. These things work together to produce the tools problems and the performance problems, ok? And I'll talk about them, and how they're going to be fixed.

All right!



I just talked about that [slide].



So! Uh... yeah, that's right, I'm at Stanford! Forgot about that. So I've been interviewing for about 20 years, at a whole bunch of companies, and yeah, Stan– every school has this sort of profile, right? You know, the candidates come out with these ideals that their profs have instilled in them. And Stanford has a really interesting one, by and large: that their undergrads and their grad students come out, and they believe that C and C++ are the fabric with which God wove the Universe. OK? And they truly [think]: what is it with all these other languages?

Whereas like MIT and Berkeley, they come out, and they're like "languages, languages, languages!" and you're like, uh, dude, you actually have to use C and C++, and they're like "oh." So it's funny, the kinds of profiles that come out. But this one [first slide bullet point], I mean, it's kind of a funny thing to say, because the guy's a Ph.D., and he's just discovered Turing's thesis. Of course all you need is C or C++. All you need is a Turing machine, right? You know?

What we're talking about here is fundamentally a very personal, a very political, sort of a, it's almost a fashion statement about who you are, what kind of language you pick. So, you know... unfortunately we could talk, I mean I've got 2 hours of ranting in me about this topic, but I'm gonna have to, like, kinda like narrow it down to... we're gonna talk about dynamic languages because people are out there today using them. They're getting stuff done, and it works. All right? And they really do have performance and tools issues.

But they're getting resolved in really interesting ways. And I'm hoping that those of you who are either going out into the industry to start making big things happen, OR, you're researchers, who are going to be publishing the next decade's worth of papers on programming language design, will take some interesting directional lessons out of this talk. We'll see.



All right. So why are dynamic languages slow? Uh, we all know they're slow because... they're dynamic! Because, ah, the dynamic features defeat the compiler. Compilers are this really well understood, you know, really really thoroughly researched... everybody knows THIS [brandish the Dragon Book], right?

Compilers! The Dragon Book! From your school! OK? It's a great book. Although interestingly, heh, it's funny: if you implement everything in this book, what you wind up with is a really naïve compiler. It's really advanced a long way since... [the book was written] and they know that.

Dynamic languages are slow because all the tricks that compilers can do to try to guess how to generate efficient machine code get completely thrown out the window. Here's one example. C is really fast, because among other things, the compiler can inline function calls. It's gonna use some heuristics, so it doesn't get too much code bloat, but if it sees a function call, it can inline it: it patches it right in, ok, because it knows the address at link time.

C++ — you've got your virtual method dispatch, which is what C++ you know, sort of evangelists, that's the first thing they go after, like in an interview, "tell me how a virtual method table works!" Right? Out of all the features in C++, they care a lot about that one, because it's the one they have to pay for at run time, and it drives them nuts! It drives them nuts because the compiler doesn't know, at run time, the receiver's type.

If you call foo.bar(), foo could be some class that C++ knows about, or it could be some class that got loaded in afterwards. And so it winds up — this polymorphism winds up meaning the compiler can compile both the caller and the callee, but it can't compile them together. So you get all the overhead of a function call. Plus, you know, the method lookup. Which is more than just the instructions involved. You're also blowing your instruction cache, and you're messing with all these, potentially, code optimizations that could be happening if it were one basic-block fall-through.

All right. Please – feel free to stop me or ask questions if I say something that's unclear. I know, just looking around the room, that most of you probably know this stuff better than I do.

So! The last [bullet point] is really interesting. Because nobody has tried, for this latest crop of languages, to optimize them. They're scripting languages, right? They were actually designed to either script some host environment like a browser, or to script Unix. I mean the goal was to perform these sort of I/O-bound computations; there was no point in making them fast. Except when people started trying to build larger and larger systems with them: that's when speed really started becoming an issue.



OK. So obviously there's a bunch of ways you can speed up a dynamic language. The number one thing you can do, is you can write a better program. The algorithm, you know, is gonna trump any of the stuff you're doing at the VM – you can optimize the hell out of Bubble Sort, but...
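[Editor's note: to make the algorithm-beats-VM point concrete, here's a throwaway JavaScript sketch; the absolute numbers depend on your machine and VM, but the gap doesn't.]

function bubbleSort(a) {                    // O(n^2), no matter how hard the VM tries
  for (var i = 0; i < a.length; i++) {
    for (var j = 0; j < a.length - 1 - i; j++) {
      if (a[j] > a[j + 1]) {
        var tmp = a[j]; a[j] = a[j + 1]; a[j + 1] = tmp;
      }
    }
  }
  return a;
}

var xs = [];
for (var i = 0; i < 10000; i++) xs.push(Math.random());

var t0 = new Date().getTime();
bubbleSort(xs.slice(0));
console.log('bubble sort: ' + (new Date().getTime() - t0) + ' ms');

var t1 = new Date().getTime();
xs.slice(0).sort(function (x, y) { return x - y; });   // the built-in O(n log n) sort
console.log('built-in sort: ' + (new Date().getTime() - t1) + ' ms');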

Native threads would be really nice. Perl, Python, Ruby, JavaScript, Lua... none of them has a usable concurrency option right now. None of them. I mean, they kinda have them, but they're like, Buyer Beware! Don't ever use this on a machine with more than one processor. Or more than one thread. And then you're OK. It's just, you know...

So actually, this is funny, because, all right, show of hands here. We've all heard this for fifteen years now – is it true? Is Java as fast as C++? Who says yes? All right... we've got a small number of hands... so I assume the rest of you are like, don't know, or it doesn't matter, or "No."

[Audience member: "We read your slides."] You read my slides. OK. I don't know... I can't remember what I put in my slides.

But it's interesting because C++ is obviously faster for, you know, the short-running [programs], but Java cheated very recently. With multicore! This is actually becoming a huge thorn in the side of all the C++ programmers, including my colleagues at Google, who've written vast amounts of C++ code that doesn't take advantage of multicore. And so the extent to which the cores, you know, the processors become parallel, C++ is gonna fall behind.

Now obviously threads don't scale that well either, right? So the Java people have got a leg up for a while, because you can use ten threads or a hundred threads, but you're not going to use a million threads! It's not going to be Erlang on you all of a sudden. So obviously a better concurrency option – and that's a huge rat's nest that I'm not going to go into right now – but it's gonna be the right way to go.

But for now, Java programs are getting amazing throughput because they can parallelize and they can take advantage of it. They cheated! Right? But threads aside, the JVM has gotten really really fast, and at Google it's now widely admitted on the Java side that Java's just as fast as C++. [(laughter)]

So! It's interesting, because every once in a while, a C++ programmer, you know, they flip: they go over to the Dark Side. I've seen it happen to some of the most brilliant C++ hackers, I mean they're computer scientists, but they're also C++ to the core. And all of a sudden they're stuck with some, you know, lame JavaScript they had to do as an adjunct to this beautiful backend system they wrote. And they futz around with it for a while, and then all of a sudden this sort of light bulb goes off, and they're like "Hey, what's up with this? This is way more productive, you know, and it doesn't seem to be as slow as I'd sort of envisioned it to be."

And then they maybe do some build scripting in Python, and then all of a sudden they come over to my desk and they ask: "Hey! Can any of these be fast?" Ha, ha, ha! I mean, these are the same people that, you know, a year ago I'd talk to them and I'd say "why not use... anything but C++? Why not use D? Why not use Objective-C? Why not use anything but C++?" Right?

Because we all know that C++ has some very serious problems, that organizations, you know, put hundreds of staff years into fixing. Portability across compiler upgrades, across platforms, I mean the list goes on and on and on. C++ is like an evolutionary sort of dead-end. But, you know, it's fast, right?

And so you ask them, why not use, like, D? Or Objective-C. And they say, "well, what if there's a garbage collection pause?"

Oooh! [I mock shudder] You know, garbage collection – first of all, generational garbage collectors don't have pauses anymore, but second of all, they're kind of missing the point that they're still running on an operating system that has to do things like process scheduling and memory management. There are pauses. It's not as if you're running DOS! I hope. OK?

And so, you know, their whole argument is based on these fallacious, you know, sort of almost pseudo-religious... and often it's the case that they're actually based on things that used to be true, but they're not really true anymore, and we're gonna get to some of the interesting ones here.

But mostly what we're going to be talking about today is the compilers themselves. Because they're getting really, really smart.



All right, so first of all I've gotta give a nod to these languages... which nobody uses. OK? Common Lisp has a bunch of really high-quality compilers. And when they say they achieve, you know, "C-like speed", you've gotta understand, you know, I mean, there's more to it than just "does this benchmark match this benchmark?"

Everybody knows it's an ROI [calculation]. It's a tradeoff where you're saying: is it sufficiently fast now that the extra hardware cost for it being 10 or 20 percent slower (or even 2x slower), you know, is outweighed by the productivity gains we get from having dynamic features and expressive languages. That's of course the rational approach that everyone takes, right?

No! Lisp has all those parentheses. Of course nobody's gonna look at it. I mean, it's ridiculous how people think about these things.

But with that said, these were actually very good languages. And let me tell you something that's NOT in the slides, for all those of you who read them in advance, OK? This is my probably completely wrong... it's certainly over-generalized, but it's a partly true take on what happened to languages and language research and language implementations over the last, say 30 years.

There was a period where they were kind of neck and neck, dynamic and static, you know, there were Fortran and Lisp, you know, and then there was a period where dynamic languages really flourished. They really took off. I mean, I'm talking about the research papers, you can look: there's paper after paper, proofs...

And implementations! StrongTalk was really interesting. They added a static type system, an optional static type system on top of Smalltalk that sped it up like 20x, or maybe it was 12x. But, you know, this is a prototype compiler that never even made it into production. You've gotta understand that when a researcher does a prototype, right, that comes within, you know, fifty percent of the speed gains you can achieve from a production compiler... because they haven't done a tenth, a hundredth of the optimizations that you could do if you were in the industry cranking interns through the problem, right?

I mean HotSpot's VM, it's got like ten years of Sun's implementation [effort] in it, spread across not one but two compilers, which is a problem they're trying to address. So we're talking about, you know, a 12x gain really translates to something a lot larger than that when you put it into practice.

In case I forget to mention it, all these compiler optimizations I'm talking about, I do mean all of them, are composable. Which is really important. It's not like you have to choose this way or you have to choose that way. They're composable, which means they actually reinforce each other. So God only knows how fast these things can get.



This is the only interesting... this is actually the only, I would say, probably original, sort of compelling thought for this talk today. I really – I started to believe this about a week ago. All right? Because it's an urban legend [that they change every decade]. You know how there's Moore's Law, and there are all these conjectures in our industry that involve, you know, how things work. And one of them is that languages get replaced every ten years.

Because that's what was happening up until like 1995. But the barriers to adoption are really high. One that I didn't put on the slide here, I mean obviously there's the marketing, you know, and there's the open-source code base, and there are legacy code bases.

There's also, there are also a lot more programmers, I mean many more, orders of magnitude more, around the world today than there were in 1995. Remember, the dot-com boom made everybody go: "Oooh, I wanna be in Computer Science, right? Or I just wanna learn Python and go hack." OK? Either way. (The Python hackers probably made a lot more money.)

But what we wound up with was a bunch of entry-level programmers all around the world who know one language, whichever one it is, and they don't want to switch. Switching languages: the second one is your hardest. Because the first one was hard, and you think the second one's going to be that bad, and that you wasted the entire investment you put into learning the first one.

So, by and large, programmers – you know, the rank-and-file – they pretty much pick a language and they stay with it for their entire career. And that is why we've got this situation where now, this... See, there's plenty of great languages out there today. OK?

I mean obviously you can start with Squeak, sort of the latest Smalltalk fork, and it's beautiful. Or you can talk about various Lisp implementations out there that are smokin' fast, or they're smokin' good. Or in one or two cases, both.

But also there's, like, the Boo language, the io language, there's the Scala language, you know, I mean there's Nice, and Pizza, have you guys heard about these ones? I mean there's a bunch of good languages out there, right? Some of them are really good dynamically typed languages. Some of them are, you know, strongly [statically] typed. And some are hybrids, which I personally really like.

And nobody's using any of them!

Now, I mean, Scala might have a chance. There's a guy giving a talk right down the hall about it, the inventor of – one of the inventors of Scala. And I think it's a great language and I wish him all the success in the world. Because it would be nice to have, you know, it would be nice to have that as an alternative to Java.

But when you're out in the industry, you can't. You get lynched for trying to use a language that the other engineers don't know. Trust me. I've tried it. I don't know how many of you guys here have actually been out in the industry, but I was talking about this with my intern. I was, and I think you [(point to audience member)] said this in the beginning: this is 80% politics and 20% technology, right? You know.

And [my intern] is, like, "well I understand the argument" and I'm like "No, no, no! You've never been in a company where there's an engineer with a Computer Science degree and ten years of experience, an architect, who's in your face screaming at you, with spittle flying on you, because you suggested using, you know... D. Or Haskell. Or Lisp, or Erlang, or take your pick."

In fact, I'll tell you a funny story. So this... at Google, when I first got there, I was all idealistic. I'm like, wow, well Google hires all these great computer scientists, and so they must all be completely language-agnostic, and ha, ha, little do I know... So I'm up there, and I'm like, we've got this product, this totally research-y prototype type thing, we don't know. We want to put some quick-turnaround kind of work into it.

But Google is really good at building infrastructure for scaling. And I mean scaling to, you know, how many gazillion transactions per second or queries per second, you know, whatever. They scale like nobody's business, but their "Hello, World" takes three days to get through. At least it did when I first got to Google. They were not built for rapid prototyping, OK?

So that means when you try to do what Eric Schmidt talks about and try to generate luck, by having a whole bunch of initiatives, some of which will get lucky, right? Everybody's stuck trying to scale it from the ground up. And that was unacceptable to me, so I tried to... I made the famously, horribly, career-shatteringly bad mistake of trying to use Ruby at Google, for this project.

And I became, very quickly, I mean almost overnight, the Most Hated Person At Google. And, uh, and I'd have arguments with people about it, and they'd be like Nooooooo, WHAT IF... And ultimately, you know, ultimately they actually convinced me that they were right, in the sense that there actually were a few things. There were some taxes that I was imposing on the systems people, where they were gonna have to have some maintenance issues that they wouldn't have [otherwise had]. Those reasons I thought were good ones.

But when I was going through this debate, I actually talked to our VP Alan Eustace, who came up to a visit to Kirkland. And I was like, "Alan!" (after his talk) "Let's say, hypothetically, we've got this team who are really smart people..."

And I point to my friend Barry [pretending it's him], and I'm like: "Let's say they want to do something in a programming language that's not one of the supported Google languages. You know, like what if they wanted to use, you know, Haskell?"

What I really wanted to do at the time was use Lisp, actually, but I didn't say it. And [Alan] goes, "Well!" He says, "Well... how would you feel if there was a team out there who said they were gonna use... LISP!" [(laughter)]

He'd pulled his ace out of his [sleeve], and brandished it at me, and I went: "that's what I wanted to use." And he goes, "Oh." [(turning away quickly)] And that was the end of the conversation. [(laughter)]

But you know, ultimately, and it comes up all the time, I mean we've got a bunch of famous Lisp people, and (obviously) famous Python people, and you know, famous language people inside of Google, and of course they'd like to do some experimentation. But, you know, Google's all about getting stuff done.

So that brings us full circle back to the point of this topic, which is: the languages we have today, sorted by popularity at this instant, are probably going to stay about that popular for the next ten years.

Sad, isn't it? Very, very sad. But that's the way it is.



So how do we fix them?



How – how am I doing for time? Probably done, huh? Fifteen minutes? [(audience member: no, more than that)] OK, good.




So! I'm gonna talk a little bit about tools, because one interesting thing I noticed when I was putting this thing together, right, was that the ways you solve tools problems for dynamic languages are very similar to the way you solve perf problems. OK? And I'm not going to try to keep you guessing or anything. I'll tell you what the sort of... kernel of the idea is here.

It's that... the notion of "static" versus "dynamic", where you kind of have to do all these optimizations and all these computations statically, on a language, is very old-fashioned. OK? And increasingly it's becoming obvious to everybody, you know, even the C++ crowd, that you get a lot better information at run-time. *Much* better information.

In particular, let me come back to my inlining example. Java inlines polymorphic methods! Now the simplest way to do it was actually invented here at Stanford by Googler Urs Hoelzle, who's, you know, like VP and Fellow there, and it's called, it's now called Polymorphic Inline Caching. He called it, uh, type-feedback compilation, I believe is what he called it. Great paper. And it scared everybody, apparently. The rumors on the mailing lists were that people were terrified of it, I mean it seems too hard. And if you look at it now, you're like, dang, that was a good idea.

All it is, I mean, I told you the compiler doesn't know the receiver type, right? But the thing is, in computing, I mean, heuristics work pretty well. The whole 80/20 rule and the Power Law apply pretty much unilaterally across the board. So you can make assumptions like: the first time through a loop, if a particular variable is a specific instance of a type, then it's probably going to be [the same type] on the remaining iterations of the loop. OK?

So what he [Urs] does, is he has these counters at hot spots in the code, in the VM. And they come in and they check the types of the arguments [or operands]. And they say, all right, it looks like a bunch of them appear to be class B, where we thought it might be class A.

So what we're gonna do is generate this fall-through code that says, all right, if it's a B – so they have to put the guard instruction in there; it has to be correct: it has to handle the case where they're wrong, OK? But they can make the guard instruction very, very fast, effectively one instruction, depending on how you do it. You can compare the address of the intended method, or you can maybe do a type-tag comparison. There are different ways to do it, but it's fast, and more importantly, if it's right, which it is 80-90% of the time, it falls through [i.e., inlines the method for that type - Ed.], which means you maintain your processor pipeline and all that stuff.

So it means they have predicted the type of the receiver. They've successfully inlined that. I mean, you can do a whole bunch of branching, and they actually found out through some experimentation that you only need to do 2 to 4 of these, right, before the gain completely tails off. So you don't have to generate too much of this. And they've expanded on this idea now, for the last ten years.
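[Editor's note: here's a toy sketch of the idea in JavaScript itself. This is not how any real VM implements it (that happens in generated machine code), and the names are made up. The cache remembers the last few receiver types seen at a call site; a cheap guard (a constructor comparison here, standing in for the one-instruction check) picks the previously-resolved method, and a miss falls back to the full lookup.]

function makeCallSite(methodName) {
  var cache = [];  // a handful of entries; 2 to 4 is the sweet spot, per the talk
  return function (receiver) {
    for (var i = 0; i < cache.length; i++) {
      if (cache[i].ctor === receiver.constructor) {   // the guard: one cheap comparison
        return cache[i].method.call(receiver);        // fast path: cached target
      }
    }
    var method = receiver[methodName];                // slow path: full method lookup
    if (cache.length < 4) cache.push({ ctor: receiver.constructor, method: method });
    return method.call(receiver);
  };
}

function A() {}  A.prototype.bar = function () { return 'A.bar'; };
function B() {}  B.prototype.bar = function () { return 'B.bar'; };

var callBar = makeCallSite('bar');
var objs = [new A(), new A(), new B()];
for (var i = 0; i < objs.length; i++) {
  console.log(callBar(objs[i]));  // the first call per type records it; later calls hit the guard
}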

Getting back to my point about what's happening [over the past 30 years], there was an AI winter. You all remember the AI winter, right? Where, like, investors were pumping millions of dollars into Smalltalk and Lisp companies who were promising they'd cure world hunger, cure cancer, and everything?

And unfortunately they were using determinism!

They're using heuristics, OK, but you know... before I came to Google, you know, I was really fascinated by something Peter Norvig was saying. He was saying that they don't do natural language processing deterministically any more. You know, like maybe, conceivably, speculating here, Microsoft Word's grammar checker does it, where you'd have a Chomsky grammar, right? And you're actually going in and you're doing something like a compiler does, trying to derive the sentence structure. And you know, whatever your output is, whether it's translation or grammar checking or whatever...

None of that worked! It all became way too computationally expensive, plus the languages kept changing, and the idioms and all that. Instead, [Peter was saying] they do it all probabilistically.

Now historically, every time you came along, and you just obsoleted a decade of research by saying, "Well, we're just gonna kind of wing it, probabilistically" — and you know, Peter Norvig was saying they get these big data sets of documents that have been translated, in a whole bunch of different languages, and they run a bunch of machine learning over it, and they can actually match your sentence in there to one with a high probability of it being this translation.

And it's usually right! It certainly works a lot better than deterministic methods, and it's computationally a lot cheaper.

OK, so whenever you do that, it makes people MAD.

Their first instinct is to say "nuh-UUUUUUH!!!!" Right? I'm serious! I'm serious. It happened when John von Neumann [and others] introduced Monte Carlo methods. Everyone was like "arrgggggh", but eventually they come around to it. They go "yeah, I guess you're right; I'll go back and hit the math books again."

It's happening in programming languages today. I mean, as we speak. I mean, there's a paper I'm gonna tell you about, from October, and it's basically coming along and... it's not really machine learning, but you're gonna see it's the same kind of [data-driven] thing, right? It's this "winging it" approach that's actually much cheaper to compute. And it has much better results, because the runtime has all the information.

So let me just finish the tools really quick.



And I'm not talking to you guys; I'm talking to the people in the screen [i.e. watching the recording] – all these conversations I've had with people who say: "No type tags means no information!" I mean, effectively that's what they're saying.

I mean...
function foo(a, b) { return a + b; }

var bar = 17.6;

var x = {a: "hi", b: "there"};
What's foo? It's a function. How did I know that? [(laughter)] What's bar? What's x? You know, it's a composite type. It's an Object. It has two fields that are strings. Call it a record, call it a tuple, call it whatever you want: we know what it is.

The syntax of a language, unless it's Scheme, gives you a lot of clues about the semantics, right? That's actually the one place, maybe, where lots of syntax actually wins out [over Scheme]. I just thought of that. Huh.



OK, so... then you get into dynamic languages. This [code] is all JavaScript. This is actually something I'm working on right now. I'm trying to build this JavaScript code graph, and you actually have to know all these tricks. And of course it's undecidable, right, I mean this is, you know, somebody could be defining a function at the console, and I'm not gonna be able to find that.

So at some point you've gotta kind of draw the line. What you do is, you look at your corpus, your code base, and see what are the common idioms that people are using. In JavaScript, you've got a couple of big standard libraries that everybody seems to be including these days, and they all have their slightly different ways of doing function definitions. Some of them use Object literals; some of them use the horrible with statement, you know, that JavaScript people hate.

But your compiler can figure all these out. And I was actually going through this Dragon Book, because they can even handle aliasing, right? Your IDE for JavaScript, if I say "var x = some object", and you know...

Did I handle this here [in the slides]?



Yeah, right here! And I say, foo is an object, x is foo, and I have an alias now. The algorithm for doing this is right here in the Dragon Book. It's data-flow analysis. Now they use it for compiler optimization to do, you know, live variable analysis, register allocation, dead-code elimination, you know, the list kind of goes on. It's a very useful technique. You build this big code graph of basic blocks...

So it's actually one of the few static-analysis techniques that's actually carrying over into this new dynamic world where we have all this extra information. But you can actually use it in JavaScript to figure out function declarations that didn't actually get declared until way later in the code.
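[Editor's note: a few of the idioms I'm talking about, with made-up names; a JavaScript code-graph builder has to recognize every one of these as a function definition, and the last one is where the Dragon Book's data-flow analysis earns its keep.]

function classic(a) { return a; }           // a plain declaration: the easy case

var viaVar = function (a) { return a; };    // assignment of a function expression

var api = {                                 // the object-literal "namespace" idiom
  helper: function (a) { return a; }
};

var alias = api;                            // aliasing: only data-flow analysis can
alias.added = function (a) { return a; };   // tell the IDE that api.added exists too

console.log(typeof api.added);              // "function"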



Another big point that people miss is that the Java IDEs, you know, that are supposedly always right? They're wrong. If you miss one time, you're wrong. Right? In Java Reflection, obviously, the IDE has no information about what's going on in that string, by definition. It's a string: it's quoted; it's opaque.

And so they always wave their hands and say "Ohhhhh, you can't do Rename Method!"

Even though Rename Method came from the Smalltalk environment, of course, right? And you say, "It came from the Smalltalk environment, so yes, you can do Rename Method in dynamic languages."

And they say "NO! Because it'll miss sometimes!"

To which, I say to you people in the screen, you'd be astonished at how often the Java IDEs miss. They miss every single instance of a method name that shows up in an XML configuration file, in a reflection layer, in a database persistence layer where you're matching column names to fields in your classes. Every time you've deployed some code to some people out in the field...

Rename Method only works in a small set of cases. These refactoring tools that, really, they treat like the Holy Grail: you can do ALL of that in dynamic languages. That's the proof, right? [I.e., static langs miss as often as dynamic – Ed.]

It's not even a very interesting topic, except that I just run across it all the time. Because you ask people, "hey, you say that you're ten times as productive in Python as in your other language... why aren't you using Python?"

Slow? Admittedly, well, we'll get to that.

And tools. Admittedly. But I think what's happened here is Java has kind of shown the new crop of programmers what Smalltalk showed us back in the 80s, which is that IDEs can work and they can be beautiful.

And more importantly – and this isn't in the slides either, for those of you who cheated – they have to be tied to the runtime. They complain, you know, the Java people are like "Well you have to have all the code loaded into the IDE. That's not scalable, it's not flexible, they can't simulate the program just to be able to get it correct."

And yet: any sufficiently large Java or C++ system has health checks, monitoring, it opens sockets with listeners so you can ping it programmatically; you can get, you know, debuggers, you can get remote debuggers attached to it; it's got logging, it's got profiling... it's got this long list of things that you need because the static type system failed.

OK... Why did we have the static type system in the first place?

Let me tell you guys a story that, even if you know all this stuff, is still going to shock you. I credit Bob Jervis for sharing this with me (the guy who wrote Turbo C.)

So javac, the Java compiler: what does it do? Well, it generates bytecode, does some optimizations presumably, and maybe tells you some errors. And then you ship it off to the JVM. And what happens to that bytecode? First thing that happens is they build a tree out of it, because the bytecode verifier has to go in and make sure you're not doing anything [illegal]. And of course you can't do it from a stream of bytes: it has to build a usable representation. So it effectively rebuilds the source code that you went to all that effort to put into bytecode.

But that's not the end of it, because maybe javac did some optimizations, using the old Dragon Book. Maybe it did some constant propagation, maybe it did some loop unrolling, whatever.

The next thing that happens in the JVM is the JIT undoes all the optimizations! Why? So it can do better ones because it has runtime information.

So it undoes all the work that javac did, except maybe tell you that you had a parse error.

And the weird thing is, Java keeps piling... I'm getting into rant-mode here, I can tell. We're never going to make it to the end of these slides. Java keeps piling syntax on, you know, but it's not making the language more expressive. What they're doing is they're adding red tape and bureaucracy for stuff you could do back in Java 1.0.

In Java 1.0, when you pulled a String out of a Hashtable you had to cast it as a String, which was really stupid because you said
String foo = (String) hash.get(...)
You know, it's like... if you had to pick a syntax [for casting], you should at least pick one that specifies what you think it's supposed to be, not what it's becoming – obviously becoming – on the left side, right?

And everybody was like, "I don't like casting! I don't like casting!" So what did they do? What they could have done is they could have said, "All right, you don't have to cast anymore. We know what kind of variable you're trying to put it in. We'll cast it, and [maybe] you'll get a ClassCastException."

Instead, they introduced generics, right, which is this huge, massive, category-theoretic type system that they brought in, where you have to under[stand] – to actually use it you have to know the difference between covariant and contravariant return [and argument] types, and you have to understand why every single mathematical... [I tail off in strangled frustration...]

And then what happens on mailing lists is users say: "So I'm trying to do X." And they say: "WELL, for the following category-theoretic reasons ...there's no way to do it." And they go: "Oh! Oh. Then I'm gonna go use JavaScript, then." Right?

I mean, it's like, what the hell did this type system do for Java? It introduced inertia and complexity to everybody who's writing tools, to everybody who's writing compilers, to everybody who's writing runtimes, and to everybody who's writing code. And it didn't make the language more expressive.

So what's happening? Java 7 is happening. And I encourage you all to go look at that train wreck, because oh my God. Oh, God. I didn't sleep last night. I'm all wired right now because I looked at Java 7 last night. And it was a mistake. [(laughter)] Ohhh...



OK. So! Moving right back along to our simple dynamic languages, the lesson is: it's not actually harder to build these tools [for dynamic languages]. It's different. And nobody's done the work yet, although people are starting to. And actually IntelliJ is a company with this IDEA [IDE], and they... my friends show off the JavaScript tool, you know, and it's like, man! They should do one for Python, and they should do one for every single dynamic language out there, because they kick butt at it. I'm sure they did all this stuff and more than I'm talking about here.



All right. Now we can talk about perf. This is the Crown Jewels of the talk. Yeah. So... unfortunately I have to make the disclaimer that everybody thinks about performance wrong, except for you guys 'cuz you all know, right? But seriously, I mean, you know, you understand, I started out of school... *sigh*

OK: I went to the University of Washington and [then] I got hired by this company called Geoworks, doing assembly-language programming, and I did it for five years. To us, the Geoworkers, we wrote a whole operating system, the libraries, drivers, apps, you know: a desktop operating system in assembly. 8086 assembly! It wasn't even good assembly! We had four registers! [Plus the] si [register] if you counted, you know, if you counted 386, right? It was horrible.

I mean, actually we kind of liked it. It was Object-Oriented Assembly. It's amazing what you can talk yourself into liking, which is the real irony of all this. And to us, C++ was the ultimate in Roman decadence. I mean, it was equivalent to going and vomiting so you could eat more. They had IF! We had jump CX zero! Right? They had "Objects". Well we did too, but I mean they had syntax for it, right? I mean it was all just such weeniness. And we knew that we could outperform any compiler out there because at the time, we could!

So what happened? Well, they went bankrupt. Why? Now I'm probably disagreeing – I know for a fact that I'm disagreeing with every Geoworker out there. I'm the only one that holds this belief. But it's because we wrote fifteen million lines of 8086 assembly language. We had really good tools, world class tools: trust me, you need 'em. But at some point, man...

The problem is, picture an ant walking across your garage floor, trying to make a straight line of it. It ain't gonna make a straight line. And you know this because you have perspective. You can see the ant walking around, going hee hee hee, look at him locally optimize for that rock, and now he's going off this way, right?

This is what we were, when we were writing this giant assembly-language system. Because what happened was, Microsoft eventually released a platform for mobile devices that was much faster than ours. OK? And I started going in with my debugger, going, what? What is up with this? This rendering is just really slow, it's like sluggish, you know. And I went in and found out that some title bar was getting rendered 140 times every time you refreshed the screen. It wasn't just the title bar. Everything was getting called multiple times.

Because we couldn't see how the system worked anymore!

Small systems are not only easier to optimize, they're possible to optimize. And I mean globally optimize.

So when we talk about performance, it's all crap. The most important thing is that you have a small system. And then the performance will just fall out of it naturally.

That said, all else being equal, let's just pretend that Java can make small systems. Heh, that's a real stretch, I know. Let's talk about actual optimization.



And by the way, here are some real examples, sort of like the Geoworks one, where a slower language wound up with a faster system. It's not just me. I've seen it all over the place. Do you know why this one happened? Why was the Ruby on Rails faster than Struts? This started one of the internet's largest flamewars since Richard Stallman dissed Tcl back in the 80s [1994, actually -Ed.], you know. You guys remember that? [(laughter)]

I mean, the Java people went nuts, I mean really really nuts, I mean like angry Orcs, they were just like AAAaaaaauuuugh, they did NOT want to hear it. OK? It was because they were serializing everything to and from XML because Java can't do declarations. That's why. That's the reason. I mean, stupid reasons, but performance comes from some strange places.

That said, OK, disclaimers out of the way...



Yeah yeah, people are using them.



Um, yeah. So JavaScript. JavaScript has been really interesting to me lately, because JavaScript actually does care about performance. It's the first of the modern dynamic languages where performance has become an issue not just for the industry at large, but also increasingly for academia.

Why JavaScript? Well, it was Ajax. See, what happened was... Lemme tell ya how it was supposed to be. JavaScript was going away. It doesn't matter whether you were Sun or Microsoft or anybody, right? JavaScript was going away, and it was gonna get replaced with... heh. Whatever your favorite language was.

I mean, it wasn't actually the same for everybody. It might have been C#, it might have been Java, it might have been some new language, but it was going to be a modern language. A fast language. It was gonna be a scalable language, in the sense of large-scale engineering. Building desktop apps. That's the way it was gonna be.

The way it's really gonna be, is JavaScript is gonna become one of the smokin'-est fast languages out there. And I mean smokin' fast.

Now it's not the only one that's making this claim. There's actually a lot of other... you guys know about PyPy? Python in Python? Those crack fiends say they can get C-like performance. Come on... COME ON! They... I mean, seriously! That's what they say.

Here's the deal, right? They're saying it because they're throwing all the old assumptions out. They can get this performance by using these techniques here, fundamentally. But if nobody believes them, then even when they achieve this performance it's not gonna matter because still nobody's gonna believe them, so all of this stuff we're talking about is a little bit moot.

Nevertheless, I'm going to tell you about some of the stuff that I know about that's going on in JavaScript.



So type inference. You can do type inference. Except that it's lame, because it doesn't handle weird dynamic features like upconverting integers to Doubles when they overflow. Which JavaScript does, interestingly enough, which is I guess better behavior than... I mean, it still overflows eventually, right?

We overflowed a long at Google once. Nobody thought that was possible, but it actually happened. I'll tell you about that later if you want to know.
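[Editor's note: concretely, every JavaScript number is semantically a double, but VMs like to keep small integers in a cheaper internal representation, so what the compiler has to preserve is this observable widening behavior.]

var n = 2147483647;                  // the largest value a 32-bit int representation holds
console.log(n + 1);                  // 2147483648: the VM silently widens past int range
console.log(Math.pow(2, 53) + 1 === Math.pow(2, 53));  // true: doubles stop representing
                                     // exact integers here, so it "still overflows eventually"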



So... oh yeah, I already talked about Polymorphic Inline Caches. Great! I already talked about a lot of this stuff.



This one's really cool. This is a trick that somebody came up with, that you can actually – there's a paper on it, where you can actually figure out the actual types of any data object in any dynamic language: figure it out the first time through by using this double virtual method lookup. They've boxed these things. And then you just expect it to be the same the rest of the time through [the loop], and so all this stuff about having a type-tag saying this is an int – which might not actually be technically correct, if you're going to overflow into a Double, right? Or maybe you're using an int but what you're really using is a byte's worth of it, you know. The runtime can actually figure things out around bounds that are undecidable at compile time.
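[Editor's note: I believe what's being described is the double-dispatch pattern; here's a rough JavaScript sketch of it with invented names (my illustration, not the paper's code). The first virtual lookup dispatches on the left operand, the second on the right, so by the time you land in a method body both concrete types are known and the body can be fully specialized, with no type-tag tests.]

function Int(v) { this.v = v; }
Int.prototype.add         = function (other) { return other.addToInt(this); };  // 1st lookup on the left, 2nd on the right
Int.prototype.addToInt    = function (i) { return new Int(i.v + this.v); };     // Int + Int, fully specialized
Int.prototype.addToDouble = function (d) { return new Dbl(d.v + this.v); };     // Double + Int widens

function Dbl(v) { this.v = v; }
Dbl.prototype.add         = function (other) { return other.addToDouble(this); };
Dbl.prototype.addToInt    = function (i) { return new Dbl(i.v + this.v); };     // Int + Double widens
Dbl.prototype.addToDouble = function (d) { return new Dbl(d.v + this.v); };

console.log(new Int(1).add(new Int(2)).v);    // 3, via Int.addToInt
console.log(new Int(1).add(new Dbl(2.5)).v);  // 3.5, via Dbl.addToInt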

So that's a cool one.



This is the really cool one. This is the really, really cool one. Trace trees. This is a paper that came out in October. This is the one, actually... I'll be honest with you, I actually have two optimizations that couldn't go into this talk that are even cooler than this because they haven't published yet. And I didn't want to let the cat out of the bag before they published. So this is actually just the tip of the iceberg.

But trace trees, it's a really simple idea. What you do is your runtime, your VM, you know, it's interpreting instructions and can count them. Well, it can also record them! So any time it hits, basically, a branch backwards, which usually means it's going to the beginning of a loop, which usually means it's going to be a hot spot, especially if you're putting a counter there... Obviously [in] the inner loops, the hot spots will get the highest counts, and they get triggered at a certain level.

It turns on a recorder. That's all it does. It starts recording instructions. It doesn't care about loop boundaries. It doesn't care about methods. It doesn't care about modules. It just cares about "What are you executing?"

And it records these tree – well actually, traces, until they get back to that point. And it uses some heuristics to throw stuff away if it goes too long or whatever. But it records right through methods. And instead of setting up the activation, it just inlines it as it goes. Inline, inline, inline, right? So they're big traces, but they're known to be hot spots.
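[Editor's note: a toy sketch of just the record-on-hot-backward-branch part, nothing like a production VM. A tiny interpreter counts backward branches; when a loop header gets hot it flips on the recorder, which logs every instruction executed until control comes back around to that header. A real recorder runs straight through method calls too, inlining as it goes; this made-up bytecode has no calls.]

var HOT = 2;  // invented threshold

function run(code) {
  var counters = {}, traces = {};
  var pc = 0, acc = 0, rec = null;
  while (pc < code.length) {
    var insn = code[pc], op = insn[0];
    if (rec) rec.ops.push(insn);               // the recorder ignores loop/method boundaries
    if (op === 'inc') { acc += insn[1]; pc++; }
    else if (op === 'jlt') {                   // jump backward if acc < limit
      if (acc < insn[1]) {
        var target = insn[2];
        if (rec && rec.header === target) {    // back at the header: trace complete
          traces[target] = rec.ops; rec = null;
        }
        counters[target] = (counters[target] || 0) + 1;
        if (!rec && !traces[target] && counters[target] === HOT) {
          rec = { header: target, ops: [] };   // hot spot: turn on the recorder
        }
        pc = target;
      } else pc++;
    }
    else break;                                // 'halt'
  }
  return traces;
}

// acc += 1 until acc reaches 5; instruction 1 is the backward branch.
var traces = run([['inc', 1], ['jlt', 5, 0], ['halt']]);
console.log(JSON.stringify(traces[0]));  // [["inc",1],["jlt",5,0]]: the recorded loop body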

And even here in the Dragon Book, Aho, Sethi and Ullman, they say, you know, one of the most important things a compiler can do is try to identify what the hot spots are going to be so it can make them efficient. Because who cares if you're optimizing the function that gets executed once at startup, right?

So these traces wind up being trees, because what can happen is, they branch any time an operand is a different type. That's how they handle the overflow to Double: there'll be a branch. They wind up with these trees. They've still got a few little technical issues like, for example, growing exponentially on the Game of Life. There's a blog about it, um... I'm sorry, I've completely forgotten his name [Andreas Gal], but I will blog this. And the guy that's doing these trace trees, he got feedback saying that they've got exponential growth.

So they came up with this novel way of folding the trace trees, right, so there are code paths that are almost identical and they can share, right?

It's all the same kind of stuff they were doing with these [Dragon Book] data structures back when they were building static compilers. We are at the very beginning of this research! What has happened is, we've gone from Dynamic [to] AI Winter... dynamic research stopped, and anybody who was doing it was sort of anathema in the whole academic [community]... worldwide across all the universities. There were a couple of holdouts. [Dan Friedman and] Matthias Felleisen, right, the Little Schemer guys, right? Holding out hope.

And everybody else went and chased static. And they've been doing it like crazy. And they've, in my opinion, reached the theoretical bounds of what they can deliver, and it has FAILED. These static type systems, they're WRONG. Wrong in the sense that when you try to do something, and they say: No, category theory doesn't allow that, because it's not elegant... Hey man: who's wrong? The person who's trying to write the program, or the type system?

And some of the type errors you see in these Hindley-Milner type [systems], or any type system, like "expected (int * int * int)", you know, a tuple, and "but got (int * int * int)", you know [(clapping my hands to my head)] it's pretty bad, right? I mean, they've, I think they've failed. Which is why they're not getting adopted.

Now of course that's really controversial. There are probably a bunch of type-systems researchers here who are really mad, but...

What's happening is: as of this Ajax revolution, the industry shifted to trying to optimize JavaScript. And that has triggered what is going to be a landslide of research in optimizing dynamic languages.

So these tricks I'm telling you about, they're just the beginning of it. And if we come out of this talk with one thing, it's that it's cool to optimize dynamic languages again! "Cool" in the sense of getting venture funding, right? You know, and research grants... "Cool" in the sense of making meaningful differences to all those people writing Super Mario clones in JavaScript.

You know. It's cool.

And so I encourage you, if you're a language-savvy kind of person, to jump in and try to help. Me, I'm probably going to be doing grunt implementations, since I'm not that smart.



And I don't even need to talk about this [last optimization — Escape Analysis], since you already knew it.



All right! So that's it. That's my talk. CPUs... you get all the good information about how a program is running at run time. And this has huge implications for the tools and for the performance. It's going to change the way we work. It's eventually – God, I hope sooner rather than later – going to obsolete C++ finally.



It's going to be a lot of work, right?



And then, when we finish, nobody's going to use it. [(laughter)] Because, you know. Because that's just how people are.



That's my talk! Thanks. [(applause)]



Questions? No questions? I think we're out of time, right? [(audience: no, we have half an hour)]

Q: What's your definition of marketing?

Hey man, I'm doing it right now. [(laughter)]

I am! In a sense, right? I mean, like, Perl was a marketing success, right? But it didn't have Sun or Microsoft or somebody hyping it. It had, you know, the guy in the cube next to you saying "Hey, check out this Perl. I know you're using Awk, but Perl's, like, weirder!"

The marketing can happen in any way that gets this message across, this meme out to everybody, in the Richard Dawkins sense. That's marketing. And it starts from just telling people: hey, it's out there.

Q: Do you see any of this stuff starting to move into microprocessors or instructions?

Ah! I knew somebody was going to ask that. So unfortunately, the JITs that are doing all these cool code optimizations could potentially be running into these weird impedance mismatches with microprocessors that are doing their own sets of optimizations. I know nothing about this except that it's... probably gonna happen. And, uh, God I hope they talk to each other. [Editor's note: after the talk, I heard that trace trees started life in hardware, at HP.]

Q: You could imagine CMS (?) pulling all these stunts and looking at stuff and saying, "Oh, I know that this is just machine language... oh, look! That's an int, and..."

Yes. I do know... that there's a compiler now that compiles [machine code] into microcode, a JIT, you know, I was reading about it.

Q: So one problem with performance is that it's not just fast performance vs. slow performance. What they're having a lot of trouble with is that a function one time takes a millisecond or a microsecond, and another time it takes 300 or 500 or 1000 times longer. [part of question muted] Any thoughts on how to improve the performance predictability of dynamic languages?

Yeah... *sigh*. Well, I think for the foreseeable future, I mean honestly having talked to several of the VM implementers, they're not making any claims that JavaScript's going to be as fast as C any time soon. Not for the foreseeable future. It's going to be very fast, right, but it's not going to be quite... they're not going to make the crazy promises that Sun did.

Which means that these dynamic speedups are primarily going to be useful in long-running distributed processes, for which a little glitch now and then isn't going to matter in the grand scheme of the computation. Or, they're going to be, you know, the harder one is in clients, where you've got a browser app, and you're hoping that the glitch you're talking about isn't on the order of hundreds of milliseconds.

Generational garbage collectors are the best answer I've got for that, because they reduce the pauses, and frankly, the garbage collectors for all the [new] dynamic languages today are crap. They're mark-and-sweep, or they're reference counted. They've got to fix that. Right out of the starting gate, that's gonna nullify 80 percent of that argument.

For the rest, I don't know. It's up to you guys.

Q: You also have to look at your storage management; you have to understand your storage in your program, and have some sort of better control over that...

That's right. As you point out, it's domain-specific. If your bottleneck is your database, all bets are off.

Q: You seem to be equating dynamic language with dynamic encoding, that you have "dynamic language equals JIT".

For this talk, yes. [(laughter)]

Q: ...but the same thing can be done for static languages.

Yeah, absolutely!

Q: ...and as soon as the marketing starts getting some market penetration, the C++ people will just simply come around and say, "You can have maintainability and performance".

Yep! They can, actually. That's what they'll say. And I'll say: "All right. I'll give you a thousand dollars when you're done." OK? Because the C++ folks have actually shot themselves in the foot. By adding so many performance hacks into the language, and also actual features into the language for performance, like pointer manipulation, the language itself is large enough that it's very difficult. It's much more difficult to envision doing a JIT that can handle pointers properly, for example, right? You can do it! It's just a lot more work than it is for these simple languages. [In retrospect, I'm not so sure about this claim. Trace trees may not care about pointers, so maybe it wouldn't be that hard? Of course, they'd have to move to a JIT first, requiring an initial slowdown, so it'd probably never happen. -Ed.]

So they're winding up... they're winding up in a situation where they're gonna have to weigh it carefully, and say, "OK: when all is said and done, is my language actually gonna be faster than these other languages that have gone through this process already?" Because now we're on a more level playing field.

Especially as it's getting increasingly difficult to predict exactly what the hardware architecture is going to be, and those mismatches tend to have a huge impact on what the JIT actually can do. I mean, hardware's getting really out there now, and the compiler writers are still trying to figure out what to do about it. I mean, even the stuff they're doing in the JITs today might not apply tomorrow.

So I realize it's a weak answer, but I'm saying, you know, it's a hard proposition for me to imagine them doing. They'll try! Maybe they'll succeed.

Q: The other side of dynamic languages is second-order systems: the ability to do an eval. And the difficulty with that is intellectual tractability. Most people use second-order languages to write first-order programs. Is there any real reason to even have a second-order language for writing Cobol?

Can they hear these questions in the audio? Because this is a really good point.

So this is, I mean, I don't know the answer to this. This is a hard question, OK?

Java has kind of gotten there without even having eval. They've tiered themselves into sort of second-order people who know how to manipulate the type-theory stuff, you know, right? People go off to them with batch requests: "Please write me a type expression that meets my needs". And it comes back. So we're already in sort of the same situation we were in with hygienic Scheme macros, you know, or with any sort of macro system, or any eval system. Which is that really only a few people can be trusted to do it well, and everybody else kind of has to... right?

So I don't know, maybe it's just a cultural... maybe it's already solved, and we just have to live with the fact that some programming languages are going to have dark corners that only a few people know. It's unfortunate. It's ugly.

[My PhD languages intern, Roshan James, replies to the questioner: Your usage of the phrase 'second-order', where does that come from? A comment as to what you're telling us, which is that some sort of phased evaluation, specific to Scheme at least, we're saying... some would say the complexity of writing Scheme macros is roughly on the order of writing a complex higher-order procedure. It's not much harder. A well thought out macro system is not a hard thing to use.]

...says the Ph.D. languages intern! He's representing the majority viewpoint, of course. [(laughter)] I'll tell you what: I'll let you two guys duke it out after the talk, because I want to make sure we get through anybody else's questions.

Q: You're assuming you have a garbage-collected environment. What do you think about reference counting, with appropriate optimization?

Ummmm... No. [(laughter)] I mean, come on. Garbage collection... you guys know that, like, it's faster in Java to allocate an object than it is in C++? They've got it down to, like, three instructions on some devices, is that right? And the way the generational garbage collector works is that 90% of the objects get reused. Plus there's fine-grained interleaving with the way the memory model of the operating system works, to make sure they're dealing with issues with, whaddaya call it, holes in the heap, where you can't allocate, I mean there's a whole bunch of stuff going on. [This was me doing my best moron impersonation. Sigh. -Ed.]

So, like, it works. So why throw the extra burden on the programmer? Even [in] C++, by the way, Stroustrup wants to add garbage collection!
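
[Editor's note: to make that "extra burden" concrete, here's a minimal sketch of manual reference counting in Java terms. The class and method names are hypothetical, not anything from the talk; the point is that every retain/release is an atomic read-modify-write on shared state, which is exactly the per-operation cost a tracing collector spares you:]

    import java.util.concurrent.atomic.AtomicInteger;

    // Hypothetical sketch: what manual reference counting asks of the
    // programmer. Every retain()/release() is an atomic read-modify-write.
    final class RefCounted {
        private final AtomicInteger refs = new AtomicInteger(1);

        void retain() {
            refs.incrementAndGet();              // atomic op on every share
        }

        void release() {
            if (refs.decrementAndGet() == 0) {   // atomic op on every drop
                free();                          // deterministic reclamation
            }
        }

        private void free() { /* release buffers, handles, etc. */ }
    }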

Q: If you believe your other arguments, you can do the reference counting or local pooling, and point out when it's actually wrong.

Right. The philosophical answer to you guys is: compilers will eventually get smart enough to deal with these problems better than a programmer can. This has happened time and again. [For instance] compilers generate better assembly code [than programmers do].

All the "tricks" that you learned to optimize your Java code, like marking everything final, so that the compiler can inline it – the VM does that for you now! And it puts [in] some ClassLoader hooks to see if you load a class that makes it non-final, and then [if the assumption is invalidated later] it undoes the optimization and pulls the inlining out.

That's how smart the VM is right now. OK? You only need a few compiler writers to go out there and obsolete all the tricks that you learned. All the memory-pooling tricks...

Hell, you guys remember StringBuffer and StringBuilder in Java? They introduced StringBuilder recently, which is an unsynchronized version, so they didn't have to have a lock? Guess what? Java 6 optimizes those locks away. Any time you can see that the lock isn't needed, they can see it. [Editor's Note: See "Biased locking" in the linked perf whitepaper. It's an astoundingly cool example of the predictive-heuristic class of techniques I've talked about today.]
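[Editor's note: a tiny illustration, assuming a VM with lock elision via escape analysis, or biased locking as in the whitepaper above. The StringBuffer below never escapes the method, so the JIT can prove no other thread can ever contend for its lock and strip the synchronization, giving you StringBuilder speed out of StringBuffer:]

    class Joiner {
        // Hypothetical sketch: sb is method-local and never escapes, so the
        // VM can elide the lock taken by each synchronized append() call.
        static String join(String[] parts) {
            StringBuffer sb = new StringBuffer();   // synchronized methods
            for (String p : parts) {
                sb.append(p).append(',');           // locks provably uncontended
            }
            return sb.toString();
        }
    }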

So now all these tricks, all this common wisdom that programmers share with each other, saying "I heard that this hashtable is 1.75 times faster than blah, so therefore you should...", all the micro-optimizations they're doing – are going to become obsolete! Because compilers are smart enough to deal with that.

Q: You didn't mention APL, which is a very nice dynamic language...

I didn't mention APL!? Oh my. Well, I'm sorry. [(laughter)]

Q: The thing is, well – several of the APL systems, they incorporated most of the list of program transformations you're talking about. And they did it several decades ago.

Yeah, so... so you could've said Lisp. You could've said Smalltalk. "We did it before!" And that was kind of, that was one of the important points of the talk, right? It has been done before. But I'm gonna stand by my – especially with APL – I'm going to stand by my assertion that the language popularity ranking is going to stay pretty consistent. I don't see APL moving 50 slots up on the [list].

I'm sorry, actually. Well not for that case. [(laughter)] But I'm sorry that in general, the languages that got optimized really well, and were really elegant, arguably more so than the languages today, you know, in a lot of ways, but they're not being used.

I tried! I mean, I tried. But I couldn't get anybody to use them. I got lynched, time and again.

Q: So what's the light at the end of the tunnel for multithreading?

Oh, God! You guys want to be here for another 2 hours? [(laughter)] I read the scariest article that I've read in the last 2 years: an interview with, I guess his name was Cliff Click, which I think is a cool name. He's like the HotSpot -server VM dude, and somebody, Kirk Pepperdine was interviewing him on The Server Side. I just found this randomly.

And they started getting down into the threading, you know, the Java memory model and how it doesn't work well with the actual memory models, the hardware, and he started going through, again and again, problems that every Java programmer – like, nobody knows when the hell to use volatile, and so all of their reads are unsynchronized and they're getting stale copies...
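[Editor's note: the canonical form of the stale-read problem he's describing, sketched with hypothetical names. Remove the "volatile" and the Java memory model permits the worker thread to keep reading a cached copy of the flag and spin forever:]

    class Worker implements Runnable {
        // Without volatile, the loop below may never observe shutdown()'s
        // write -- an unsynchronized read is allowed to see a stale copy.
        private volatile boolean running = true;

        public void run() {
            while (running) {
                // ... do work ...
            }
        }

        void shutdown() { running = false; }   // called from another thread
    }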

And he went through – went through problems to which he does not know the answer. I mean, to where I came away going Oh My God, threads are irreparably busted. I don't know what to do about it. I really don't know.

I do know that I did write a half a million lines of Java code for this game, this multi-threaded game I wrote. And a lot of weird stuff would happen. You'd get NullPointerExceptions in situations where, you know, you thought you had gone through and done a more or less rigorous proof that it shouldn't have happened, right?

And so you throw in an "if null", right? And I've got "if null"s all over. I've got error recovery threaded through this half-million line code base. It's contributing to the half million lines, I tell ya. But it's a very robust system.

You can actually engineer these things, as long as you engineer them with the certain knowledge that you're using threads wrong, and they're going to bite you. And even if you're using them right, the implementation probably got it wrong somewhere.

It's really scary, man. I don't... I can't talk about it anymore. I'll start crying.

Q: These great things that IDEs have, what's gonna change there, like what's gonna really help?

Well, I think the biggest thing about IDEs is... first of all, dynamic languages will catch up, in terms of sort of having feature parity. The other thing is that IDEs are increasingly going to tie themselves to the running program. Right? Because they're already kind of doing it, but it's kind of half-assed, and it's because they still have this notion of static vs. dynamic, compile-time vs. run-time, and these are... really, it's a continuum. It really is. You know, I mean, because you can invoke the compiler at run time.
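[Editor's note: "invoke the compiler at run time" is literal in Java. A minimal sketch using the standard javax.tools API that shipped with Java 6; it assumes a file named Example.java in the working directory, and a JDK rather than a bare JRE:]

    import javax.tools.JavaCompiler;
    import javax.tools.ToolProvider;

    public class CompileAtRuntime {
        public static void main(String[] args) {
            // Grab the same compiler javac wraps, and run it in-process.
            JavaCompiler javac = ToolProvider.getSystemJavaCompiler();
            int status = javac.run(null, null, null, "Example.java");
            System.out.println(status == 0 ? "compiled" : "failed");
        }
    }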

Q: Is it allowed at Google to use Lisp and other languages?

No. No, it's not OK. At Google you can use C++, Java, Python, JavaScript... I actually found a legal loophole and used server-side JavaScript for a project. Or some of our proprietary internal languages.

That's for production stuff. That's for stuff that armies of people are going to have to maintain. It has to be high-availability, etc. I actually wrote a long blog about this that I'll point you to that actually... Like, I actually came around to their way of seeing it. I did. Painful as it was. But they're right.

Q: [question is hard to hear]

[me paraphrasing] Are we going to have something Lisp Machines didn't?

Q: Yes.

Well... no. [(loud laughter)]

I say that in all seriousness, actually, even though it sounds funny. I, you know, I live in Emacs. And Emacs is the world's last Lisp Machine. All the rest of them are at garage sales. But Emacs is a Lisp Machine. It may not be the best Lisp, but it is one.

And you know, T.V. Raman, you know, research scientist at Google, who, he doesn't have the use of his sight... he's a completely fully productive programmer, more so than I am, because Emacs is his window to the world. It's his remote control. Emacspeak is his thesis. It's amazing to watch him work.

Emacs, as a Lisp Machine, is capable of doing anything that these other things can. The problem is, nobody wants to learn Lisp.

Q: And it doesn't have closures.

And it doesn't have closures, although you can fake them with macros.

I'm actually having lunch with an [ex-]Emacs maintainer tomorrow. We're going to talk about how to introduce concurrency, a better rendering engine, and maybe some Emacs Lisp language improvements. You know, even Emacs has to evolve.

But the general answer to your question is No. Lisp Machines pretty much had it nailed, as far as I'm concerned. [(shrugging)] Object-oriented programming, maybe? Scripting? I dunno.

Q: Many years ago, I made the great transition to a fully type-checked system. And it was wonderful. And I remember that in the beginning I didn't understand it, and I just did what I had to do. And one dark night, the compiler gave me this error message, and it was right, and I thought "Oh wow, thank you!" I'd suddenly figured it out.

Yes! "Thank you." Yes.

Although it's very interesting that it took a long time before it actually told you something useful. I remember my first experience with a C++ compiler was, it would tell me "blublblbuh!!", except it wouldn't stop there. It would vomit for screen after screen because it was Cfront, right?

And the weird thing is, I realized early in my career that I would actually rather have a runtime error than a compile error. [(some laughs)] Because at that time... now this is way contrary to popular opinion. Everybody wants early error detection. Oh God, not a runtime error, right? But the debugger gives you this ability to start poking and prodding, especially in a more dynamic language, where you can start simulating things, you can back it up... You've got your time-machine debuggers like the OCaml one, that can actually save the states and back up.

You've got amazing tools at your disposal. You've got your print, your console printer, you've got your logging, right? [Ummm... and eval. Oops. -Ed.] You've got all these assertions available. Whereas if the compiler gives you an error that says "expected expression angle-bracket", you don't have a "compiler-debugger" that you can shell into, where you're trying to, like – you could fire up a debugger on the compiler, but I don't recommend it.

So, you know, in some sense, your runtime errors are actually kind of nicer. When I started with Perl, which was pretty cryptic, you know, and I totally see where you're coming from, because every once in a while the compiler catches an error. But the argument that I'm making is NOT that compilers don't occasionally help you catch errors. The argument that I'm making is that you're gonna catch the errors one way or the other. Especially if you've got unit tests, or QA or whatever.

And the problem is that the type systems, in programming in the large, wind up getting in your way... way too often. Because the larger the system gets, the more desperate the need becomes for these dynamic features, to try to factor out patterns that weren't evident when the code base was smaller. And the type system just winds up getting in your way again and again.

Yeah, sure, it catches a few trivial errors, but what happens is, when you go from Java to JavaScript or Python, you switch into a different mode of programming, where you look a lot more carefully at your code. And I would argue that a compiler can actually get you into a mode where you just submit this batch job to your compiler, and it comes back and says "Oh, no, you forgot a semicolon", and you're like, "Yeah, yeah, yeah." And you're not even really thinking about it anymore.

Which, unfortunately, means you're not thinking very carefully about the algorithms either. I would argue that you actually craft better code as a dynamic language programmer in part because you're forced to. But it winds up being a good thing.

But again, I – this is all very minority opinion; it's certainly not majority opinion at Google. All right? So this is just my own personal bias.

Q: [question too hard to hear over audio, something about is it possible for the compiler at least to offer some help]

You know, that's an interesting question. Why do compiler errors have to be errors? Why couldn't you have a compiler that just goes and gives you some advice? Actually, this is what IDEs are excelling at today. Right? At warnings. It's like, "ah, I see what you're doing here, and you don't really need to. You probably shouldn't."

It's weird, because Eclipse's compiler is probably a lot better than javac. Javac doesn't need to be good for the reasons I described earlier, right? It all gets torn down by the JIT. But Eclipse's compiler needs to give you that exact help. The programmer help, the day-to-day help, I missed a semicolon, I missed this, right? And Eclipse and IntelliJ, these editors, their compilers are very very good at error recovery, which in a static batch compiler usually just needs to be: BLAP, got an error!

OK? So to an extent I think we are getting tools that come along and act like the little paper-clip in Microsoft Office. You know. Maybe not quite like that.

Q: The only thing I worry about is that there's a chunk of code that you really want to work sometimes, but the error-recovery parts are hard to test.

That's the part you have to do at runtime, right? Well, I mean, when you get into concurrency you're just screwed, but if you're talking about situations where it's very difficult to... I mean, it's computationally impossible to figure out whether all paths through a code graph are taken. I mean, it's NP-complete [strictly speaking, it's undecidable in general. -Ed.], you can't do this, right? But the VM can tell you which code paths got taken, and if it doesn't [get taken], you can change your unit test to force those code paths to go through, at which point you've now exercised all of your code. Right? That's kind of the way, you know, they do it these days.
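[Editor's note: a toy version of that workflow, with hypothetical names. A coverage tool reports the branch the existing test never takes, and you add a second test that forces it through:]

    public class ClampTest {
        static int clamp(int v, int max) {
            if (v > max) return max;  // covered by testHigh
            return v;                 // reported untaken until testLow exists
        }

        @org.junit.Test
        public void testHigh() { org.junit.Assert.assertEquals(5, clamp(9, 5)); }

        @org.junit.Test
        public void testLow()  { org.junit.Assert.assertEquals(3, clamp(3, 5)); }
    }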

And I would say it's a pain in the butt, but I mean... it's a pain in the butt because... a static type-systems researcher will tell you that unit tests are a poor man's type system. The compiler ought to be able to predict these errors and tell you the errors, way in advance of you ever running the program. And for the type systems they've constructed, this is actually true, by and large, modulo assertion errors and all these weird new runtime errors they actually have to, heh, inject into your system, because of type-system problems.

But by and large, I think what happens is unless the type system actually delivers on its promise, of always being right and always allowing you to model any lambda-calculus computation your little heart desires, OK? Unless it can do that, it's gonna get in your way at some point.

Now again, this is all personal choice, personal preference. I think, you know, static compilers and error checkers, they have a lot of value and they're going to be around for a long time. But dynamic languages could get a lot better about it.

I'm not trying to refute your point. I'm just saying that... there are tradeoffs, when you go to a dynamic language. I have come around... I've gone from dynamic to static and back to dynamic again, so I've done the whole gamut. And I've decided that for very large systems, I prefer the dynamic ones, in spite of trade-offs like the one you bring up.

Q: Actually you said, for hybrid systems, where pieces of it are dynamic and pieces are static...

I think some of it has been pretty... there's a great paper from Adobe about it, right? Evolutionary programming? Talks about how you prototype your system up, and then you want to lock it down for production, so you find the spots where there are API boundaries that people are going to be using. You start putting in contracts, in the way of type signatures and type assertions.

Why not build that functionality into the language? Then you don't have to build a prototype and re-implement it in C++. That's the thinking, anyway. It seems like a great idea to me, but it hasn't caught on too much yet. We'll see. [Editor's note: The main hybrid examples I'm aware of are StrongTalk, Common Lisp, Groovy and EcmaScript Edition 4. Any others?]

All right, well, now we're definitely over time, and I'm sure some of you guys want to go. So thank you very much!




At this point I got mobbed by a bunch of grad students, profs, and various interested people with other questions. They dumped an absolutely astounding amount of information on me, too – papers to chase down, articles to follow up on, people to meet, books to read. There was lots of enthusiasm. Glad they liked it!

84 Comments:

Blogger Charles Oliver Nutter said...

That HotRuby Ruby-in-JS thing is fast because it's a *toy*. It removes a large number of features and guesses at the implementations of a number of methods, so it's really doing a lot less work.

I can turn on unsafe optz in JRuby and almost get it as fast as equivalent Java code (sans primitives) but I'd never claim that's actual performance.

12:29 AM, May 12, 2008  
Blogger Unknown said...

Did you use Comic Sans in your slides to irritate people? Because it worked. I hate that it worked, but it did.

12:50 AM, May 12, 2008  
Blogger humblefool said...

Now, of course, you have to share the story about the time the Google People(tm) managed to overflow a long.

12:51 AM, May 12, 2008  
Blogger Steve Yegge said...

Sounds like you've got your work cut out, eh Charles? ;-)

Don't get me wrong: I'm confident that JRuby can eventually be as fast as Java, given (a) a lot of work and (b) a little help from the JVM.

You can do it!

12:52 AM, May 12, 2008  
Blogger Steve Yegge said...

I dunno, I used the default font that Keynote picked for the Chalkboard theme.

It was more readable in the actual-size slides. Blogger gives you limited sizing options and does its own scaling, which doesn't help.

Sorry!

12:53 AM, May 12, 2008  
Blogger Samuel A. Falvo II said...

#$@$ what people say about what font you use. It's more than legible, even on a screen of my resolution (1600x1200). They should be glad you even bothered with the transcription and slides at all.

I, for one, am very thankful for your efforts, even if I do not always agree with your material.

12:58 AM, May 12, 2008  
Blogger Alan Keefer said...

Great talk . . . thanks for posting the transcript and linking to all the various JIT optimization papers.

You talk about hybrids in the sense of "languages you can optionally add types to," but I honestly think that fully statically-typed languages can still be almost as friendly as dynamic languages provided they have type inference such that the types don't get in your way and that they provide the same level of dynamism, interactivity, and metaprogramming as traditionally dynamic languages. Of course, I'm biased, because I'm helping develop such a language on the JVM . . .

The topic of metaprogramming is an interesting one, because it completely screws up tooling: in Java that means reflective calls, but my experience is that a higher percentage of code that you use in a dynamic language has some metaprogramming component (which is one reason it's more compact). So I think the upper bound on how good the refactorings in a JS IDE can be will be lower than the bound for Java just because the Java code will invariably have fewer things the IDE can't handle.

Of course, the total inability to do metaprogramming in Java is the primary reason (in my opinion) why all Java frameworks inherently suck, so it's worth the tradeoff. But for what it's worth, our language (GScript) has what we call an open type system, where the metaprogramming bits are essentially sectioned off from the runtime bits, so you can have type-safe metaprogramming (though more limited in scope) that the IDE can refactor. So you get more metaprogramming abilities than Java has *and* a more intelligent IDE/safer refactorings.

Lastly, I'll say that while I generally agree with your "no new language in 10 years" rule I can see it being violated by a JVM language; the barrier to entry is much lower there because it can interact with the huge base of existing Java code (and, often, tools) instead of starting from scratch. It also allows for a gradual transformation of an existing codebase, which could also be a huge selling point. Though perhaps that new JVM language will just end up being JRuby.

2:40 AM, May 12, 2008  
Blogger dh said...

Minor issue: There's a broken link immediately after slide "JIT compilation (4 of 5)". The paper can be found at http://www.ics.uci.edu/%7Efranz/Site/pubs-pdf/C44Prepub.pdf

Great Posting! Thanks for the transcription of your talk!

3:28 AM, May 12, 2008  
Blogger Tony Morris said...

"...really fanatical about strong typing, who would potentially maybe get a little offended about some of the sort of colorful comments I might inadvertently make during this talk."

Steve,
Some of these people are more likely to be offended by your compulsion to pass severely under-qualified comment on the topic; something you have done more than once before. The offense comes about because it is almost deliberately misleading to others who might have the desire to learn and are not in a position to know any better and may mistake your pseudo-scientific nonsense with warranted factual claims.

I say "almost deliberate" because I am more inclined to believe your desire to continue doing this is a result of your ignorance rather than malice.

3:32 AM, May 12, 2008  
Blogger JeanHuguesRobert said...

Congratulations. I will refer to your post, it states what I think much better than I ever will. Thanks.

4:21 AM, May 12, 2008  
Blogger lmeyerov said...

While a good deal of HPC folk still use c/fortran/etc, I'm seeing a surprising amount of scripting languages being used to glue BLAS etc libraries together. For example, right now, I'm prodding at python/mpi for EC2. My point is that we won't see a new general programming language mostly because it's an antiquated notion - general systems languages and what have you, maybe, but that's already pigeon-holed.

Sad I missed the talk, you should give an encore at Berkeley :)

4:26 AM, May 12, 2008  
Blogger Carl Friedrich Bolz-Tereick said...

Charles: That's a common pattern with new Python implementations too: They claim to be fast (at a stage where they support only the simple stuff) and then you don't really hear anything from them afterwards.

Steve: Really interesting talk, I could connect with tons of things you were saying.

5:24 AM, May 12, 2008  
Blogger Ian J Cottee said...

How the hell do you find time to prepare a talk, give the talk and then type up the whole bloody transcript? I am in awe.

5:34 AM, May 12, 2008  
Anonymous Anonymous said...

Having played with perl threads recently, I just wanted to note that they are actual "real" threads, unlike the "global interpreter lock" of most other dynamic languages.

The price for that (and it's a big one) is that when you create a new thread, the interpreter actually makes a copy of all non-shared variable data! That's right, it iterates over all data and copies all data structures that weren't explicitly shared with threads::shared.

The real problem is that dynamic languages store their data in complex (at the C level) structures that could be left in an indeterminate state by threads running over the same data at the same time. That leaves you with only a few main options: have a lock on each structure (no one does this - requiring locking on every access slows things enormously), have a global lock (most do this - but you lose all concurrency), or copy all unshared data (perl threads). But with that last one, then you might say "hey, why don't we just fork and use the OS COW semantics and use some shared memory for the explicit shared data?"

In fact, there's a drop-in replacement for perl threads called forks that does just that.

http://search.cpan.org/~rybskej/forks-0.27/

Personally, I buy into the unix fork model, where nothing is shared, and all sharing/communication is explicit.

5:37 AM, May 12, 2008  
Blogger Unknown said...

Fun talk, but I'm still looking for solid data on the relative productivity of static and dynamic languages for large-scale engineering (something more than anecdotes); without that, the whole debate feels very tomato-toMAHto. (I'd like solid data on the productivity of functional languages, too, for the same reason.)

5:39 AM, May 12, 2008  
Blogger Alexander Konovalenko said...

The Amazon links from the programming language names haven't been expanded with href attributes so they kind of point to nowhere.

5:54 AM, May 12, 2008  
Blogger Unknown said...

> Personally, I buy into the unix fork model, where nothing is shared, and all sharing/communication is explicit.

Which is why Erlang's one of the most enjoyable languages as far as concurrency goes: it's split between "sequential" erlang (a functional language based on immutability, very deterministic) and "concurrent" erlang based on message-passing and a pair of primitives (send a message, receive messages in process mailbox).

All of the communications between processes (erlang has no programmer-visible threads as it implies shared memory) are done via shared-nothing message passing, so the non-deterministic parts of Erlang are very well defined: at process boundaries and nowhere else.

Experiencing that and then going back to concurrency in Java (even with java.util.concurrent) makes you want to rip out your eyes, and realize that, really, shared-memory concurrency should die, it only exists because it was easier to implement and more efficient at machine-level. But it doesn't actually work. And it doesn't scale for any value of "scale" worth talking about.

6:06 AM, May 12, 2008  
Blogger Unknown said...

I just wanted to say thanks for the transcription. I'm one of the rare people (presumably) that hates watching videos because I'm impatient and read much more quickly than I listen.

All too often I have to skip these things that I'm interested in hearing.

6:17 AM, May 12, 2008  
Blogger Chris Ryland said...

The Java polymorphic inline caching you describe was done a long, long time ago in the ParcPlace Smalltalk JITting VM (by Peter Deutsch and Allan Schiffman).

As you point out elsewhere, it's all been done before. It's a shame it all has to be re-discovered in new contexts, and in less dynamic languages.

Great rant!

6:24 AM, May 12, 2008  
Blogger Dan Weinreb said...

We may not have many million line code bases, but we do have at least one half-million line code base in Common Lisp. I work on it every day. www.itasoftware.com. For those reading, I was a co-founder of Symbolics, which made Lisp machines, and I'm one of the co-designers of Common Lisp. I find myself again using Common Lisp these days. I speak here for myself, not for ITA Software.

Actually we do have some type checking, although it's not static. We have this macro called define-strict-function, in which you specify types for arguments, returned values, and conditions, and it checks it all automatically at runtime. That said, I have used Java a lot and sometimes I do wish we had typed variables, since there are definitely bugs that they would have caught for me, saving me debugging time. On the other hand, advocates of typed-variable languages have a tendency to talk as if once all the types match, the program must be bug-free, which is rather silly.

You know, the Dylan people didn't even have a definition for "dynamic language". And at the "Lightweight Language Workshop" series at MIT, they similarly had no definition for "lightweight language". They really do define it by a list of languages, exactly as in your ironic joke.

About the fabric with which God wrote the Universe, that's been well-established: http://www.xkcd.com/224/

I haven't looked into it as much as I should, but I think the worst slowdowns that you usually see are things like: we're trying to add a and b, but we have to examine their datatypes and dispatch off to the various kinds of addition like rational and floating point and so on. (This dispatch was done in microcode in the original MIT Lisp machine architecture, with a very fast (fall-through, as you'd guess) optimization for the fixnum-fixnum case.) In ordinary Lisp implementations these days, you can declare the types of the variables. It's more work to do that, but you only end up doing it inside inner loops and other such critical areas; in most places it doesn't matter. You probably already know this.

You're right about how you can write a better program. You can especially do this if your language system makes it easier for you to be more productive! That's a key point, in my opinion.

I'm not sure if you know that many current Common Lisp implementations do support native threads. See my survey paper at

http://common-lisp.net/~dlw/LispSurvey.html

You know about IronPython and Microsoft's DLR ("Dynamic Language Runtime") extension to the CLR ("Common Language Runtime")? The CLR is looking promising, although time will tell. Have you seen Jim Hugunin's talks about this? If not, see my notes in:

http://dlweinreb.wordpress.com/2007/10/28/3/

Another thing about C++ is that the major compilers are full of bugs. This is from our Object Design experience. The guys maintaining that stuff are still running into this. Dave Moon told me recently that it appears that the last person at IBM able to write a compiler must have walked out the door, in reference to the IBM AIX C++ implementation.

Regarding GC latency issues: Do you know about InspireData? It's a very popular (in schools) shrink-wrap application for simple data understanding. Nobody would ever know that it is written in Common Lisp (the LispWorks implementation). Nobody has ever perceived any GC effects. This is a major recent Common Lisp success story.

Not to mention that most C/C++ programmers have no idea just how bad their "malloc/free" implementation is. It's actually pretty hard to write a good one. They can get damned slow as memory gets fragmented.

Re HotSpot: take a look at JRockit, from BEA. Well, BEA just got absorbed into the Oracle borg, but presumably you can still get JRockit. It's a great Java implementation. I was once doing some CPU-heavy (overnight) simulation runs, and they were TWICE as fast in JRockit.

Amazingly, Richard Gabriel, in "Patterns of Software", written in 1998, said at one point that there was now only one computer language: C++. Gabriel of course is a major Lisp guy. The pendulum swings, and now there is again a proliferation of languages (but not operating systems!) being used in real applications. You just never know what's going to happen five years out in this business.

"A lot more programmers": And it depends on whom you count as programmers. Linden Labs says that 15% of the Second Life users are doing programming. That's a lot of people. (Some may be copying code from others, I suppose.) There are 2.5 billion lines of user code written in Second Life in 30 million scripts, of which 15,000 or so are usually running at a time. (They describe their language as "terrible" from a language-design point of view, and they plan to fix that by bringing up the CLR.)

I have not heard of Boo or any of those others except Scala. I did a blog entry on Scala, which I think looks very promising. See http://dlweinreb.wordpress.com/2007/12/25/the-scala-programming-language-my-first-impressions/.

Of course you're totally right about doing natural language stuff probabilistically. I was recently talking to my co-worker, Bill Woods, who is one of the very first natural language guys; Peter Norvig used to work for him. He agrees that probabilistic is what everyone is doing now, and he's sort of peeved that this has postponed work on serious natural language. But I think we all know that real parsing will only come back when we reach the limits of the probabilistic stuff, and someone sees it as a commercially viable proposition to do all that work, seeing a benefit that makes it all worth it. (Or, if government funding for basic research ever comes back; maybe when the faith-based administration is gone...) These days Peter probably knows as much about this as anyone in the world.

The Steel Bank Common Lisp compiler does a lot of type-propagation and uses it to generate better code. The code isn't all that much better, and the compiler ends up being slow. Scala looks like it offers more opportunities for type inference. Having a Common Lisp built on a virtual machine with a great JIT compiler is what I'd like to see. We've got Armed Bear Common Lisp, written on the JVM, but apparently it's not very fast; I have not looked at it. Someone started writing a Common Lisp for the CLR, but it was too big a job and they punted. I'd like to see Microsoft's DLR ideas added to the JVM. Anyway, I'm not actually a compiler person, and there's so much to say about this.

I haven't used the refactoring stuff (like Rename Method) in Eclipse or IntelliJ. Some people have told me that they really like it and it works well. Maybe they didn't encounter those "method name in an XML file" issues in their own use. I think all the major things you're saying about IDEs are right on the money.

Yes, adding the generic types to Java was absolutely necessary -- those casts sucked -- and my own Java code was much nicer because of it. But the more powerful you make a type system, the more you run into hard-to-understand stuff at the edges, and if you make it even more powerful, the edges start moving in toward the center. That, it has always seemed to me, is the big problem. Not that I am in any way an expert on this.

Yes, lots of controversy about Java 7. You've seen what Josh Bloch has to say about how they intend to put in lexical scoping? Uh-oh. I hope the Java designers are very careful here.

Yeah, the IntelliJ guys do kick butt. So much that they compete effectively against free software. It's amazing. I met a couple of them at OOPSLA. Basically, they just have to keep working their asses off to keep their stuff significantly better than Eclipse, enough to make it worth the money. How many people manage to make money selling development environments these days? I'm very impressed by them.

I like the story about refreshing the title bar 140 times. It's an example of the principle I always tell everyone: your performance bottlenecks are usually something you would never think of in a million years. You absolutely must use performance tools rather than trying to reason from first principles.

The greater point about how complexity kills you is really the profound thing, and I have found that it takes programmers a long, long time to truly appreciate this. They usually overestimate the amount of complexity that can be tolerated. You can't teach this to people: they won't believe it. They have to learn the hard way.

Yes, the pricing and shopping engine used by Orbitz (and many airlines), which we call "QPX", is really in Common Lisp (plus a few optimizations involving mapping stuff into main memory; it's all written up somewhere, which I could find for you if you care). I'm working on a new product, an airline reservation system (initially for Air Canada), which uses QPX as a module. 500KLOC of in-house Common Lisp plus maybe 10KLOC of Common Lisp open-source libraries are used in the main business-layer module. Presentation layer in Java (it generates a lot of JavaScript like any modern web site with nice interactive behavior), back-end Oracle RAC, many other, smaller pieces in Python. At ITA you just use whatever language you feel like, which we probably are too lax about. One guy was going to use Prolog for an access control subsystem, but not for any really good reason, so we squelched that.

You overflowed a "long"? And they don't appreciate Lisp?? :) Not to mention going past array bounds and so on. We actually do use a bit of C at ITA, for places where we need very fast bit manipulation, or if we need to write an Apache extension or something, but we use it as little as possible.

I've never heard of this double-dispatch type inference; I'll have to read that paper when I have some time. Thanks for the pointer. Trace trees likewise. Doing a whole lot of inlining may have instruction-cache problems in some of the upcoming architectures, so this will require some careful benchmarking. Yes, sharing code paths obviously helps this.

Someone mentioned performance predictability. Butler Lampson made this point in a recent talk I went to at MIT. A language being 20% slower just does not matter, since Moore's Law takes care of it. But unpredictable performance can be a headache for a long time. Now, he was saying this regarding GC, and I think he was way overstating the problem there. But the overall point has merit. Try selling software to stock traders, where a small realtime pause can mean losing very big bucks. Not that a C program can't take page faults. However, we have stringent SLA's in our airline reservation system and we do not feel we have to worry at all about the GC issue.

JIT compilers for C++: Well, why not take a look at the existing "Managed C/C++" that Microsoft has had for some time now? I know next to nothing about this, except that I hear rumors it's not widely used since the people who are still on C/C++ want to be down there in the bits. I'm also not sure to what extent it conforms to the standard for C++, or the de facto behaviors that people might depend on. I'm sure there are people who can tell you all about how this is working out.

We never, ever use "eval". What we use is macros. A lot.

About "language popularity ranking is going to stay pretty consistent", well, that's a very complicated topic. Sometimes the order does change, e.g. Java. But it's very hard; technically there are all these bootstrap issues (who writes libraries for a language that isn't used yet, and who uses a language for which there aren't libraries yet; this can be finessed sometimes by letting the new language call the old languages's libraries), and non-technically there are all kinds of marketing and perception issues. I do not expect to see Common Lisp move way up there in my lifetime, but I do think we can move it up significantly.

Threads: boy, is that controversial. Does using threads doom you to disaster automatically? People have widely varying opinions about that. This is too big of a subject for me to get into here. But it's very interesting.

"Emacs is a Lisp machine". I think you mean using Emacs's built-in Lisp, but do you know about using a real Common Lisp with SLIME? The latter is where we're trying to re-create the most important Lisp machines features. (By "we" I don't actually mean me, since I'm working on the airline reservation system, but Marco Baringer is doing some great stuff.)

"Lisp machines pretty much had it nailed." Thank you. I have run into smart people who are huge Lisp machine fans who are so young that they have never even seen one!

In all fairness, there are compilers out there that deliver far better error messages than others. The PL/C compiler at Cornell, and the Multics PL/I compiler, are great examples, and I wish more people had learned from them. It takes work to make the compiler issue good error messages and I bet that usually ends up far down in the to-do list so it doesn't happen.

6:48 AM, May 12, 2008  
Blogger Mike Hales said...

One small issue about the Strongtalk system. The static typing didn't have anything to do with the performance gains, it was completely optional.

6:59 AM, May 12, 2008  
Blogger RH Trial said...

Clojure supports optional type hints that eliminate reflective calls. It also greatly reduces the parens-overload of Lisp, adding vector and map literals for much more accessible syntax. And, unlike many dynamic languages, it has a strong concurrency story.

7:03 AM, May 12, 2008  
Blogger Jesse A. Tov said...

This comment has been removed by the author.

9:25 AM, May 12, 2008  
Blogger Alexandre Richer said...

Alice ML is based on Standard ML, but it also has dynamic typing (and tons of other features):

http://www.ps.uni-sb.de/alice/manual/packages.html

10:01 AM, May 12, 2008  
Blogger Gregory said...

I'm curious about your opinion of Objective-C as a "dynamic" language. It was originally an attempt to maintain backward compatibility with C while adding Smalltalk-like OO extensions, but it's grown into something more since then. It is now garbage collected, for one thing. Its IDE support is pretty good, as is its performance. Its type system is largely optional, and seems to be mostly useful for code clarity and to provide hints to the IDE. Do you consider it a "dynamic" language?

10:45 AM, May 12, 2008  
Anonymous Anonymous said...

Great article...and thanks for all the links.

10:59 AM, May 12, 2008  
Blogger Charles Oliver Nutter said...

steve: My point is that it's easy to make a Ruby implementation that's fast and does nothing. It's much, much harder to make one that's fast and runs everything. That's what we've done already, and we're still improving.

11:17 AM, May 12, 2008  
Anonymous Anonymous said...

Re: Comic Sans --

That's not Comic Sans in those slides. You can tell pretty easily; look at the uppercase B and compare to the uppercase B in Comic Sans. The Wikipedia article on Comic Sans has the latter if you need a reference.

Apple used Comic Sans for the Chalkboard theme in the first release of Keynote, I think, but they're using Chalkboard now. Chalkboard does not suffer from the poor spacing and general sloppiness of Comic Sans, and it's a fine choice for this sort of thing.

http://www.identifont.com/show?G47
http://www.identifont.com/show?1MH

Flip back and forth between 'em; you can feel the quality difference, IMHO.

This has been probably more typeface discussion (of the non-monospaced version) than expected, huh?

11:45 AM, May 12, 2008  
Blogger John Millikin said...

So I'm going to be talking about dynamic languages. I assume [...] that you've got an interest, because there's a dude down the hall talking about Scala, which is you know, this very strongly typed JVM language [...] So you know, presumably all the people who are like really fanatical about strong typing [...] — well, we'll assume they're all over there.

I'm sure this is not the first time you've heard this, nor will it be the last, but weak/strong and static/dynamic typing are orthogonal. Many dynamic languages, such as Python, Ruby, Lisp, and Smalltalk, are strongly typed. Some statically typed languages, such as C and C++, are weakly typed.

Also, requiring the user to explicitly tag types is not a requirement of static typing. Haskell, OCaml, and Scala will automatically infer types when possible.

12:26 PM, May 12, 2008  
Blogger astrange said...

> Trace trees likewise. Doing a whole lot of inlining may have instruction-cache problems in some of the upcoming architectures, so this will require some careful benchmarking.

Also in all the current architectures! Inlining small stuff almost always lets you end up with a smaller result, so it's still beneficial of course.

(gcc can do trace formation with -ftracer, but it doesn't work that well)

12:28 PM, May 12, 2008  
Blogger Unknown said...

Garbage collectors have been improving for the newer dynamic languages, too... Lua's latest version has an incremental garbage collector, and PyPy's garbage collector is also improving. Of course implementations on top of the JVM/CLR automatically benefit from the good garbage collectors of these platforms. But I agree that the garbage collector is a very important (and often overlooked) item in overall performance.

12:33 PM, May 12, 2008  
Blogger Steve Yegge said...

Great comments, everyone!

Dan Weinreb: a virtuosic comment if ever there was one. I really enjoyed reading it all. Thanks! I've got nothing to disagree with there, and you've given me some good material to chase down. My secret agenda for the talk, of course, is to get more people to try Lisp. I'm completely agnostic about which Lisp, as long as they try it.

Dibblego: you'll have to be more specific, so that others may refute you or back you up. Otherwise your argument is not really any better than the point it's trying to make about my writing. Try again! I will listen with an open mind.

Rob Mueller and masklinn: yeah, I'm really coming around to the nothing-shared models. I think things wind up fundamentally more scalable (across machines) if you begin with that model.

Alexandre: thanks for the link to Alice ML! I'll check it out.

gregory: Objective-C seems decent at first glance. I read the language spec through about 3 weeks ago. The message sending is, of course, quite dynamic. I'll need to get more experience with it (but I'm 100% on Macs now, so it'll happen.)

For Charles and the other folks who jumped on the HotRuby thing: sure, it's a toy. I didn't mention it in the talk; decided it wasn't worth it. I hope you're not saying that the mention on the slide invalidates my message, which is that dynamic language performance is experiencing a renaissance of sorts, and that we should encourage it.

As for the meta-comments: preparing this transcript was a lot of work! It took basically a full Saturday. I'm glad some people found it worth reading (even those who disagree with it.)

And I like the Chalkboard font. ;-)

1:42 PM, May 12, 2008  
Blogger Levi said...

In case you haven't seen it, here's a link to an implementation of runtime call-trace hotspot optimization for arbitrary native binaries. It's an HP research project called Dynamo.

1:49 PM, May 12, 2008  
Blogger Unknown said...

The video is an ASX; WTF?! I guess I'll have to settle for your transcript. Maybe next time you'll also promote YouTube, not just Dynamic Languages :-)

2:07 PM, May 12, 2008  
Blogger Ron said...

Your story about wanting to use Lisp at Google really resonated with me. For what it's worth, I tried the same thing back in 2000 when Google only had 100 employees and also failed. So the deck was really stacked hard against you.

We should get together some time to share parenthetical comments. :-)

2:14 PM, May 12, 2008  
Blogger Unknown said...

> Also, requiring the user to explicitly tag types is not a requirement of static typing. Haskell, OCaml, and Scala will automatically infer types when possible.

As always, this issue is not so much who has to provide the type tags, it's that you can't do anything with the program until someone does.

Moreover, you do have to provide dummy procedure definitions almost as soon as you start writing code that refers to an unwritten procedure, even if the reference is irrelevant to the work that you're doing and is just a placeholder for "do later".

2:14 PM, May 12, 2008  
Blogger Unknown said...

Dan Weinreb wrote: "Having a Common Lisp built on a virtual machine with a great JIT compiler is what I'd like to see. [...]"

Amen, I couldn't agree more.


Dan also mentioned: "Someone started writing a Common Lisp for the CLR, but it was too big a job and they punted."

In fact, there have been a number of attempts and projects to build a Lisp implementation on top of the CLR. A while ago, I did some research and listed them all (well, all projects I could somehow get hold of). One of those projects (IronLisp) even used the DLR.

2:25 PM, May 12, 2008  
Anonymous Anonymous said...

dan weinreb: "Amazingly, Richard Gabriel, in "Patterns of Software", written in 1998, said at one point that there was now only one computer language: C++."

Gabriel didn't say C++, just C.

2:33 PM, May 12, 2008  
Blogger Stephen said...

Some say the Lisp Machine died because when you build a special-purpose machine, even if you have a 10x performance boost for your niche, it's still tough to compete with commodity Moore's-law-driven CPUs. It appears to me that the speed for a single CPU has stalled. So my 5-year-old box is still 'modern', more or less. (I've been wanting a multi-CPU box since I wrote my first OS - and now that they're cheap, well, my old box still works... but I'll actually NEED 64 bits soon.) Perhaps the "JVM in hardware" idea can still float. Especially if it can use the huge # of devices per chip to do some of this analysis in real time while executing at full bore.

Don't take the 'java byte code machine' found on many ARM chips as an example. That's in an embedded (limited power, therefore device and cycle speed) niche. So, it turns out that many of these ARM processors don't even use their Java engines. I've got one in my pocket, for example.

Of course, I have no experience with either lisp machines or building such hardware. So take this with a grain of salt.

C is still the gold standard for speed, right? Does Google say C/C++ is OK? Or just C++? In the end, it's all down to speed. Even for 20%, if C is better than D, people will take C. It's about what's easy to measure, and what that easy to measure thing means. Have a huge project and want it to be maintainable? Keep the original development staff. How hard is that? For management - impossible.

And after writing half a million lines of C, it streams from my fingers at high rates. I've been looking into Python and Scheme of late, and of course, haven't seen any evidence for improved development speed. And yet, C wasn't my first language. Uhm, it was something like 37th. Still, Scheme is proving very difficult to pick up, despite prior Lisp exposure (it might have been #8).

How about Forth? It seems easier to write than read, and easy enough to write. It doesn't seem to use garbage collection. Even simple implementations are pretty fast.

For me, a language is easy to learn if you can build a mental model for its execution. For Forth, the model is really simple. For C, well, I got to learn it with the Ritchie compiler on the PDP-11. I asked the compiler for assembler output, and found out exactly what he had in mind. Much better than the ANSI documents. So I code thinking what a PDP-11 might do with it, and trust that the compiler will do something similar. The trouble with LISP is that my professor talked for an hour about the switchboard going on inside. I think that was about memory management, and how totally non-linear the program is represented. Absolutely no clue about what the thing is doing. And, it was 30 years ago.

OK, so object-oriented stuff is cool. I'm trying to learn closures. As near as I can tell, it's things that I've been doing in C for some time, though perhaps with less language support. But what people like about them isn't all that clear. I've done structured code in assembler too. I even implemented longjump for error recovery (really try/catch) in assembler. That doesn't mean I like try/catch very much.

From the reading, it sounds like it must have been an awesome talk. Giving such a talk to a bunch of experts is something like talking about science. It's extraordinarily easy to say something that's not just controversial, it's just plain stupidly wrong. It scares the willies out of me. And if you talk about things that are potentially controversial, there's always going to be someone out there who thinks that it's a closed issue. There are no closed issues. There are no absolute truths. Even that one.

2:44 PM, May 12, 2008  
Blogger Chris Smoak said...

I suggest taking a look at the Tamarin project. It draws from Adobe's experience with ActionScript (both people involved and extensions to the ECMA spec, which has some of the optimizations for dynamic languages you mentioned, like optional type annotation) to build a JIT compiler for the next generation JavaScript. cool stuff.

4:58 PM, May 12, 2008  
Blogger Mark Lee Smith said...

Hi Steve, Objective-C & Dylan both fit as hybrid systems. Both have optional typing :).

5:03 PM, May 12, 2008  
Blogger Unknown said...

Great entry, Steve!

Coming from the Python side of things, I have long hoped to try out some of the Self ideas in Python. After trying type inference in Python and finding it lacking without changing the language, I have always hoped to see more tracing information propagating through a long-running Python process and see where it leads.

But that won't happen until I have the free time, which might not be until pigs fly or I actually finish my PhD, whichever comes first. =)

5:06 PM, May 12, 2008  
Blogger James Iry said...

If eval is what it takes to make a dynamic language then every Turing complete language is dynamic. Even if eval isn't built into the language spec you can always write it.

There is definitely a real distinction to be made about dynamic metaprogramming and its power. But eval isn't it.

8:11 PM, May 12, 2008  
Blogger Neville Ridley-Smith said...

IntelliJ will do intelligent renaming across all filetypes. If you reference a method name in a spring xml config file for example, it will rename it. If you mention the method in some javadoc, it will rename it. IntelliJ is just plain awesome.

8:42 PM, May 12, 2008  
Blogger Lucas Richter said...

One more of these: thanks very much for the transcript. This is one of those cases where I wouldn't have watched a video, but I'm really glad I read this.

9:05 PM, May 12, 2008  
Blogger Unknown said...

Steve, If you are going to eventually play with Objective-C on your mac, you might want to have a look at the Nu language ( http://programming.nu/ ). It's a fairly young, lisp and ruby inspired language on top of the objC runtime.

9:28 PM, May 12, 2008  
Blogger brett said...

My totally unscientific take on the question of high-performance reference counting in lieu of true garbage collection, which I guess is more of a response to one of the questions at the end of the talk than to the talk itself:

http://brett-hall.blogspot.com/2008/05/blame-management.html

In a nutshell: using atomic ops to manage the reference count just kills your performance, but what else are you going to do unless you're running single-threaded?

10:36 PM, May 12, 2008  
Blogger John Connors said...

"So! It's interesting, because every once in a while, a C++ programmer, you know, they flip.."


Hmm. That's exactly what happened to me. After 7+ years of C/C++ being the "only" way, someone introduced me to Python, and thence on to Common Lisp.

I should have killed them before they opened their mouth. It's driven a complete coach and horses through my career.

3:34 AM, May 13, 2008  
Blogger wgarvin said...

Interesting presentation, thanks for blogging and transcribing it!

I like to read stuff like this, but...

I'm a game engine developer, with an interest in programming languages. The sad truth is that except for C++, there is nothing out there which can do the things we need to do, with an acceptable level of compiler/debugger/IDE support. I hate C++ because of its long compile times, lousy compiling/linking story, difficulty of parsing and manipulating it with tools, and lack of sane metaprogramming functionality (template metaprogramming is the spawn of the devil, heavily-templated code in a large complex system can be nigh undebuggable).

The problem is we absolutely need statically-compiled native code, manual memory management, and the ability to do low-level bit twiddling, muck about with pointers, etc. We can't tolerate the unpredictability of a garbage collector (1 millisecond is an eternity) nor the storage overhead (anything that saves 1 megabyte of memory is a giant win). We're always trying to push the performance limits of the target hardware (Xbox360, PS3 etc), and every 1% matters. We also want to cram as much data as possible into the console's RAM; again, every 1% matters (sometimes I spend an entire day doing an optimization that saves 100 KB of memory, or a million cycles per frame).

So basically, what do I want?

1) A language much simpler than C++, but with the same basic low-level performance and bit-twiddling abilities as C.
2) Module-based compilation. Strong IDE-based support for incremental compilation. It should inline aggressively, except maybe in areas of the program that have been edited and recompiled recently.
3) Can't be JIT-based (must be native compiled), and can't rely on a garbage collector (I know it rules out nifty features like D's array slicing, but that can't be helped). Remember, we have to cross-compile our code for several platforms and deploy compiled binaries over a network or dedicated link. Consoles are kind of like embedded systems scaled up 100x.
4) Compile-time metaprogramming support. I basically want the power of Lisp macros with all the syntactic comforts of curly-brace languages. I want to write imperative metaprograms, in the same language--or a similar language--as the target program. They should be able to inspect AST declarations and statements, generate new ones at will, etc.
5) Staged compilation, where the final output of the 2nd-last stage is human-readable source code in the same language as the code I wrote by hand. The last stage would just "compile" in the traditional sense... the stages before would be doing things like: evaluating metaprograms and template expansions, evaluating conditional compilation (when compiling for PS3 I want the human-readable output to include PS3-specific code and not include code specific to some other platform, etc.).
6) In the IDE and debugger, I want to be able to switch back and forth at will between my hand-authored source code and the human-readable output code produced by stage N-1.

We find that we don't need much run-time dynamicity for game engines (no more than C++ provides). For example, many game engines don't use compiler-provided RTTI, but we often roll our own RTTI and reflection facilities (and building those with a compile-time metaprogramming facility would be almost ideal for us). Remapping strings to integer IDs is another task I'd love to be able to do with compile-time metaprogramming (instead of at run-time, or with the ugly hacks we end up doing now). Very useful for efficient printf-style logging (i.e. replace all the format strings with IDs and offload the actual string manipulation to a log-viewing tool). I also want to generate initialized data structures to act as helpers for each instantiation of a certain template, or translate "simple" data structure descriptions into complex run-time representations which take less space. Compile-time metaprogramming is ideal for this, and has some big advantages over using some external tool to generate the code. I really want to be able to see and debug the generated code in source form, though.
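
[Editor's note: the string-to-ID remapping wished for here became expressible in later C++ with constexpr. A minimal sketch using a compile-time FNV-1a hash; the names are illustrative, not from this comment:]

    #include <cstdint>
    #include <cstdio>

    // Compile-time FNV-1a: the compiler folds the whole recursion
    // into a single integer constant.
    constexpr std::uint32_t string_id(const char* s,
                                      std::uint32_t h = 2166136261u) {
        return *s == '\0'
            ? h
            : string_id(s + 1,
                        (h ^ static_cast<std::uint32_t>(*s)) * 16777619u);
    }

    int main() {
        // The format string never needs to ship in the binary; a
        // log-viewing tool can map the ID back to it offline.
        constexpr std::uint32_t LOG_ID = string_id("player %d spawned");
        std::printf("%u\n", LOG_ID);
    }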

Anyway, there is no language I know of which is "excellent all around" for modern game engine programming. C++ is sufficient in the performance department, but in many other areas it is somewhere between barely-adequate and sorely-lacking. Tools for C++ (e.g. Developer Studio) are as good as they will ever get, and they still suck. Building our executables for one target platform can take over half an hour, and produce hundreds of megabytes of debug information and hundreds of megabytes of precompiled header files for an executable which is maybe 20 megabytes. Small changes often require recompiling dozens or hundreds of source files (especially if you touch anything included by a precompiled header, yuck).

In order to do better than this, we need to start over with a vastly simpler and cleaner language, and design it so it is easy to parse correctly and provide strong tool support for. I just don't know how to motivate anybody to work on statically-compiled languages instead of JITs for JavaScript-or-some-other-dynamic-language.

3:45 AM, May 13, 2008  
Blogger Dan Weinreb said...

gvwilson: Solid data on relative productivity of two languages is very hard to get. You need to put two equivalent programmers in a room, and give them the same amount of time to do the same problem in each language, or something like that. I actually know someone who intends to do this: he has two programmers for whom he can make a very good argument that they are of the same quality, and who are willing to participate in the experiment. I'm looking forward to hearing from him. Otherwise, you just have to decide whether you believe us, or you have to try it yourself.

Jesse Tov: Is a generational collector good enough for real-time code? It depends on what you mean, quantitatively, by real-time. It depends on the numerical value of the latency variation that the application allows. InspireData is an interactive application, for which GC is fine. If you make the real-time requirements stringent enough, then yes, even the best modern GCs would be a problem.

Marklinn: Shared-nothing concurrency a la Erlang is an important idea, and I hope to see it tried out in other languages (since not everything else in Erlang is what I want). I'm particularly interested in the isolation it provides between the threads, in case one of them detects a software bug (violates an assertion). You can kill one off, leaving the others undisturbed. Designing software in the face of knowledge that it still has bugs is something I've always been interested in.

Clausb: Thank you very much for the reference! I'll take a careful look at that.

Stephen: Yes, that's one of the main reasons that the Lisp machines became obsolete. (See also http://dlweinreb.wordpress.com/2007/11/16/why-did-symbolics-fail/) Regarding the "JVM in hardware", have you seen the www.azulsystems.com Java accelerator hardware? See http://www.pbs.org/cringely/pulpit/2008/pulpit_20080229_004404.html for example. As for learning Lisp, I suggest you ignore the professor and take a look at "Practical Common Lisp" by Peter Seibel, which is the new gold standard for learning Common Lisp. You'll like it.

4:15 AM, May 13, 2008  
Blogger Tim Kerchmar said...

To Wylie:

I respectfully disagree on your point about absolutely requiring manual memory management. I work on the Gamebryo engine, easily one of the top 5 3D game engines in the world, and have been writing a Common Lisp wrapper for it in my spare time for a game project. The garbage collector can be tuned to have a small first generation and many quick GC passes. For the casual-games sector, the performance has been very acceptable. Keep in mind that, like other software, games are still going to use third-party libraries for everything. I'm using Flash, fmod, RakNet, and Gamebryo for the game, and all of those libraries are native C++. There is no need for the game itself to be much faster. The only things that are very important for a Win32 implementation of Lisp are native threads (not POSIX) and some way to predictably use finalization to deallocate DX9 resources. With these two requirements fulfilled, I don't feel that Lisp is incapable of working for a real commercial game product.

6:27 AM, May 13, 2008  
Blogger Jesse A. Tov said...

Dan Weinreb: I'm just responding to the misconception that a generational collector eliminates pause times. It eliminates some pause times, but unless you're doing something incremental, occasionally you'll have an O(L)-time pause, where L is the size of the live data. I'm not saying this makes it unsuitable for this or that application. I'm just stating an algorithmic fact.

If you need pauses to be bounded by something less than the size of the live data, you need something complicated. There are a whole bunch to choose from, and they have different algorithmic properties, different constant factors, etc. If you need bounds on pauses or minimum mutator utilization, you're going to be implementing something terribly complicated.

My concern is that if you tell someone, "Language X has generational collection, so you won't get pauses," they might be disappointed. That's not what plain-old generational collection is about.

7:14 AM, May 13, 2008  
Blogger ombzzz said...

Thank you for the transcript!

Great talk.

I printed the papers and will have a look at them.

Greetings from Argentina,

Orlando

7:29 AM, May 13, 2008  
Blogger Knotty said...

Great post!

It really emphasizes the fact that it's always worth investing the time and effort in learning a new language. Thanks for the positive vibes!

12:16 PM, May 13, 2008  
Blogger Coderboi said...

The answer to Wylie's (much justified) lament increasingly seems to be the programming language D.

4:01 PM, May 13, 2008  
Blogger Jack Palevich said...

Managed C++: The Microsoft CLR version of C/C++ works great. I was able to compile Quake by just adding the /clr command line switch. It ran with a 50% performance hit compared to native code. But it's kind of pointless: if you're willing to take the perf hit of using the CLR, you might as well switch to C#.

Another language besides C++ for games: You can use Lisp for games. Naughty Dog shipped many AAA console games using Lisp. The trick is to avoid consing at runtime, which eliminates garbage collection pauses. But once Naughty Dog was bought by Sony, they were forced to switch to C/C++ because they needed to share code with other Sony game teams.

You could of course apply the same trick to any other GC'd language such as Java or D or JavaScript, though some languages make it very difficult to write code without consing.
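
[Editor's note: the "avoid consing" trick translates to any language as "allocate everything up front, recycle per frame." A minimal sketch of the idea in C++, with a hypothetical particle pool - not Naughty Dog's actual code:]

    #include <array>

    struct Particle { float x, y, vx, vy; bool alive; };

    // All storage is reserved before the game loop starts, so the
    // steady state performs zero allocations -- and in a GC'd
    // language, zero allocations means zero collection pauses.
    class ParticlePool {
        std::array<Particle, 4096> slots{};   // fixed, preallocated
    public:
        Particle* spawn() {
            for (auto& p : slots)
                if (!p.alive) { p.alive = true; return &p; }
            return nullptr;   // pool exhausted: degrade, don't allocate
        }
        void kill(Particle& p) { p.alive = false; }
    };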

7:42 PM, May 13, 2008  
Blogger Ben said...

@coderboi: Seconded with enthusiasm. I worked for a gaming company in a previous life, so I know somewhat whereof I speak.

When the game industry discovers D, they're going to wonder why the hell they didn't do it years ago.

9:50 PM, May 13, 2008  
Blogger zwetan said...

Great post. I think I agree with almost everything.

Really, thank you for sharing all that.

Now for some more questions :)

When I read another of your posts, "The Next Big Language", I thought: he's talking about ECMAScript 4.

When I see what's going on in ES4, and in parallel in the Tamarin project (especially Tamarin TT),

I can't help thinking: hey, ES4+Tamarin really are leading in those directions.
On almost every one of your slides I could say "hey, ES4 does that" or "hey, Tamarin does that", etc.

So here are the questions:
Steve, what's your take on ES4+Tamarin?
Do you see them as an ideal combination?
Can you see some areas where they could fail?

And now the tricky bonus questions =)

Google now has just a few officially supported languages, JavaScript among them;
in the foreseeable future, could you see ECMAScript 4 becoming a new official language at Google?

Could you imagine something like Rhino on Rails evolving into ES4 on Rails?

12:27 AM, May 14, 2008  
Blogger slickrockPete said...

This all reminds me of so many hallway conversations I had in 1990.
Some of the compiler/JIT technology has come further, but the underlying concepts of using inference and runtime behavior to influence the shape of the application were all there. Back then we were more concerned with memory, since it was expensive and the biggest problem was paging, so we had tree shakers and reorganizers to optimize footprint and paging behavior, respectively.

What often happens now is that the dynamic language nerds, and others who find that's what they need, implement whatever features they need in C or Java or whatever they have been forced to use.

Wylie:
The embedded/game world has some interesting constraints.
There was a garbage-collected Lisp implementation by Harlequin for a telco switch. They got around the GC delay by using an object table and incremental GC; the overhead was spread out over every allocation and object reference.

Back in the day we spent a lot of time trying to make all of the arithmetic and bit-twiddling operations in CL as fast as C.
I do think it's theoretically possible to have a game development system based on Lisp (or whatever dynamic language you prefer; I'm not hung up on syntax), but getting anyone to start using it is probably an insurmountable problem.

1:01 PM, May 14, 2008  
Blogger wgarvin said...

To pTymN:

I did not mean to imply that Lisp or other dynamic languages cannot be used for games--I wouldn't try to write a game *engine* in one, though! I wonder how fast Gamebryo would be (and what its memory footprint would be like) if it were rewritten in Lisp with a suitable optimizing compiler?

I do vaguely recall that Naughty Dog used their own custom Lisp for Jak and Daxter (on the PS2). I'm not sure if they're still doing that for newer titles.

Anyway, at my workplace (Ubisoft) we have several different in-house engines of varying size and complexity, and all of them are written in C++. For higher-level code, most of them embed either a home-grown scripting language, or an off-the-shelf scripting language such as Lua. But the engines themselves are entirely C++.

If you weren't trying to push the limits technically, you could write an engine in something garbage-collected and maybe get away with it. But I think a modern engine for any ambitious AAA game currently needs to be native compiled code and no GC.

10:57 PM, May 14, 2008  
Blogger Unknown said...

Naughty Dog uses Lisp again. There was recently a presentation about what they are doing now.

The GC problem comes up in several domains, and people who still want to use such languages face it. The ATM switch mentioned above used a custom LispWorks version; before that, the same group used an embedded Lisp machine with a special real-time OS. There are other applications with similar problems. Years ago an expert system was developed for real-time applications (like controlling chemical plants, etc.): G2 from Gensym was the product, and it is still available. G2 was developed in a special Lisp with no GC.

Games and simulation systems are areas where much of the standard software is C++. Anybody who wanted to use something like Lisp would have lots of work to adapt it to that domain; very few did. I wonder if there will be an affordable GC that could be used in these domains. Choosing a language other than C++ also has the additional drawbacks that the usual game developers don't know how to use it and that no domain-specific libraries are available. For G2 this was not that huge a problem, since it was developed as a highly graphical expert system - something that's quite possible to do in a Lisp dialect - though they had to solve the real-time problem.

7:58 AM, May 15, 2008  
Blogger Sony Mathew said...

I feel Yegge is grasping at straws for arguments. Take any argument - say, optimization or refactoring: any improvements there are going to apply proportionally to dynamic and static languages, keeping static at pace or better. The same goes for arguments about productivity as measured in keystrokes, or language features like closures - static languages can keep pace with their own syntactic sugar (though it's not fair to always compare against Java, which is, after all, an old language trying to catch up).

Say dynamic languages do catch up in performance. Static types still form a fundamental set of pre- and post-assertions that are always going to be valuable in a programmer's toolset - not to mention valuable for compiler optimizations, refactorings, tools, clarity across large teams, etc., for which more type information is always going to yield better results, no matter how much you try to dismiss it. It's like being completely self-sufficient with a car, a house, and clothes: having more money is "always" going to provide further benefits.

Take your humorous cartoon depicting a dog, a house, etc. with labels - isn't the reality more like having indistinguishable boxes? Say you had to make a pie and you had a set of indistinguishable boxes; also assume some of those boxes are bombs... hahah. It would be tough to make a pie without setting up the right assertions first - and static types provide some of those basic assertions upfront. I am a very defensive, assertive programmer, and static types form a valuable part of my defensive programming style; along with dynamically coded assertions, they allow me to programmatically prove my execution paths correct, not to mention the clarity they provide when I or anyone else reads the code a year later. I almost always write programs correctly the first time - I mean, once they compile, they almost always execute correctly as expected the first time. I find it freaky sometimes.

Your arguments regarding highly distributed applications (e.g. apps that use the web as a platform) or systems that live forever are definitely better arguments. In these environments, independent modules "assert" inputs dynamically before proceeding. The point is that assertion is a valuable tool, and when it is feasible to get some basic assertions for free with static typing, it is always going to yield better results (again the analogy: a self-sufficient guy can always get more benefits with more money). There is no reason these modules can't be written in statically typed languages and reloaded dynamically into the application to sustain longevity.

12:29 PM, May 15, 2008  
Blogger Unknown said...

I also want to express gratitude for putting up a transcript, as I think the current trend of publishing by podcasts is a disease. People that read as slowly as they talk should take a class. Really.

I think the blog post can be summarized as: the problems with dynamic languages are in the implementations, which can be fixed, whereas the problems with statically typed languages are in the designs themselves, for which there is no fix.

However, what I miss in dynamic languages is the context that typing provides. Of course, your point is that the same context can also be constraining.

1:15 PM, May 15, 2008  
Blogger Jack Palevich said...

> Rainer said... "Naughty Dog uses Lisp again"

Unfortunately, it doesn't appear that Naughty Dog is using Lisp in any significant way. If you read their GDC presentation, it says they use Scheme as a scripting language for their game data pipeline. They gave zero examples of why Scheme would be any better for this application than Lua or Python or even a shell scripting language.

Scripting a data pipeline is just writing glue code -- a much wimpier use of Lisp than in the old days, when they used Lisp for their core game engine.

4:58 PM, May 15, 2008  
Blogger Antonio Marquez said...

Wylie:

Here's a quote from a discussion I found regarding GOAL (Naughty Dog's lisp dialect):

"Well, fast iteration times weren't merely due to the the listener - that was a nice touch, but only the tip of the iceberg. We could basically dynamically link per-function or variable. Effectively, you could hit a key while working in the IDE, and whatever function the cursor was on would instantly get compiled, sent across the network to the TOOL, linked and dropped into the game while it was running. The whole process took a fraction of a second. You could also do the same per-file. This feature was sort of like Edit and Continue, but you didn't have to broken in the debugger - it could be done while the game was running. This was insanely useful for programming gameplay, physics, and fx, as well as prototyping, visual debugging (just drop in some debug spheres or prints while you have the game in some interesting state), etc. We also used it for dynamic code streaming - so only a fraction of the executable code was loaded at any given time (to conserve memory)." -- Scott Shumaker

From: http://web.archive.org/web/20070315153349/lists.midnightryder.com/pipermail/sweng-gamedev-midnightryder.com/2005-August/003798.html

5:28 AM, May 17, 2008  
Blogger Unknown said...

> Static Types form a valuable part of my defensive programming style, and along with dynamically coded assertions, allows me to programmatically prove my execution paths as correct

No, they don't. Type correctness merely guarantees that you're not asking something from a representation that it can't do.

That's a long way from program correctness.

And, since static languages tend to have something other than "duck typing", your representations are more brittle than they could be.

9:11 PM, May 19, 2008  
Blogger Jesse A. Tov said...

Andy Freeman says:

> Type correctness merely guarantees that you're not asking something from a representation that it can't do.

That's a very limited notion of type correctness. When I do typeful programming, I may have a variety of types that all have the same representation, but correspond to semantic notions that shouldn't mix. The abstract type exported by my library may be represented as a regular old string (for now), but as a client you'll never know. With phantom types, there may be type parameters that are completely unrelated to representation, which are used to keep track of other sorts of static information about programs.

> And, since static languages tend to have something other than "duck typing", your representations are more brittle than they could be.

Conversely, when you reuse polymorphic code, parametricity gives you an iron-clad guarantee that it won't get into certain sorts of mischief. If a function's type is polymorphic in a particular parameter, you know that its behavior can't depend on inspecting that parameter.

(If "duck types" are all you're after, OCaml's row-typed objects will give you that, though word on the street is that OCaml's object system is more trouble than it's worth.)

9:26 PM, May 19, 2008  
Blogger Unknown said...

Komodo from ActiveState is an IDE for Perl, PHP, Python, Ruby, Tcl, HTML, CSS, JavaScript, XML, and more. I use it for Perl - it's not perfect but it sure beats not using an IDE!

7:54 AM, May 26, 2008  
Blogger John Zabroski said...

(I originally sent the below to Steve by e-mail, not realizing he enabled comments.)

Steve,

In your transcription of your Stanford talk, you were asked about C++ programmers striving for maintainability and performance. Noticeably absent from your answer was 'generic programming'. The two most generic operators in programming, equality and copy (and, therefore, assignment), require explicit semantics - not just to keep programmers sane when first learning a language by conquering built-in types, but also for creating compatible user-defined types. Compilers can take advantage of generic properties of code to perform optimizations like common subexpression elimination, constant propagation, code hoisting and sinking, etc. Really, the biggest challenges for C++0x, as far as Google should be concerned, are defining a memory model and some concurrency model.
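
[Editor's note: a minimal sketch of the equality-and-copy point, using a hypothetical user-defined type; giving both operators semantics consistent with the built-ins is what lets generic code treat the type like a built-in:]

    #include <algorithm>
    #include <vector>

    struct Point {
        int x, y;
        // Copy and assignment are implicit and memberwise; the
        // "explicit semantics" are that a copy must compare equal
        // to its original.
    };

    // Equality defined consistently with copy.
    bool operator==(const Point& a, const Point& b) {
        return a.x == b.x && a.y == b.y;
    }

    bool contains(const std::vector<Point>& v, Point p) {
        // Generic algorithms like std::find rely on exactly
        // these two operators behaving like the built-ins'.
        return std::find(v.begin(), v.end(), p) != v.end();
    }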

Your talk was entertaining. The most valuable point you made was about probability, but you blew a good opportunity to talk about probability versus accuracy (computer scientists at Google seem to have a hard time comprehending that fuzzy sets are a generalization of crisp sets, which precludes them from thinking about 'state zero' at time 't0' as being fuzzy). Note that generic programming turns the 'definiteness' of traditional concrete algorithms on its head by replacing a partial instruction stream with axioms that describe general facts about those streams, so it is not just 'dynamic encoding' research that is changing the pantomime of the perf landscape. However, your biggest misfire was not explaining the evolution of scheduling priorities in various OS models, e.g. the unholy mess that is Windows NT, and the impact it has on writing portable code with perf guarantees, such as background worker threads.

Z-Bo

12:11 PM, May 29, 2008  
Blogger Laurent GUERBY said...

To Wylie: try Ada :). It even has compilation to CLR and Java bytecodes, native threading, distributed systems support, and very low-level bit twiddling.

Stevey, many thanks for this wonderful transcript!

4:07 AM, May 31, 2008  
Blogger Eric said...

Hi Stevey. Great talk! Thanks for that!

I just wonder how you came up with "The next thing that happens in the JVM is the JIT undoes all the optimizations!". This seems highly inaccurate.

Firstly, javac does not actually do any optimizations, not even constant propagation. It does not have to, because - as you say - the JIT does it right away.

Therefore the JIT does not actually undo anything. Why would it? In fact, it cannot even undo optimizations: how would you undo a constant propagation? That's obviously impossible. Maybe I misunderstand what you meant to say, but as it stands it does not really make sense.

10:12 AM, June 07, 2008  
Blogger kerecsen said...

I'm really curious about what ticked you off about Java 7 :) Please give us another rant on the topic...

1:15 PM, June 09, 2008  
Anonymous Anonymous said...

I don't see why C++ will be left behind with multicore. I worked at national labs with large-scale, massively parallel C++ codes; it works great!

9:50 AM, June 11, 2008  
Blogger Jeff Jacobs said...

So you've left untold the stories of overflowing a long, and, if I'm reading between the lines right, the story of how you used a legal loophole to use server-side JavaScript.

I'm guessing that when you told the higher-ups the project was using JavaScript, you didn't tell them it was server-side?

10:24 AM, July 02, 2008  
Blogger ed said...

I really liked GeoWorks on the desktop. Please tell anyone you still know from there that people loved it.

7:39 PM, July 06, 2008  
Blogger Darren New said...

Is there something wrong with the share-nothing native threads of Tcl, that you don't count it as a dynamic language with native thread support?

Indeed, switching from event-driven to threaded is something I've repeatedly done in a fairly trivial way when Tcl programs have grown to need such concurrency.

6:23 AM, September 14, 2008  
Anonymous Anonymous said...

Now, of course, you have to share the story about the time

1:27 AM, September 18, 2008  
Anonymous Anonymous said...

Don't get me wrong: I'm confident that JRuby can eventually

1:30 AM, September 18, 2008  
Blogger Quintin said...

Brilliant! I'm dumbstruck.

From a young age I experimented with many languages. Then I was "all C++". Then I discovered dynamic languages and professionally did Bash/PHP/JavaScript. After discovering Java, I have to admit I've been getting "black-holed" into it and having it become my religion, moving away from all other languages and *especially* all dynamic languages - mostly because of slow performance and the lack of strong typing and compile-time errors *shy*.

As with all your posts I always feel I come out a "wiser" person ;> Thanks! I'll certainly be keeping my eyes on this topic.

What do you think of the V8 VM? Are they making headway towards "truly" better performance?

11:27 AM, November 14, 2008  
Blogger Tartley said...

Thanks for the post. I just ended up doing a talk about dynamic languages on .NET,
http://www.tartley.com/?p=456
which, now that I come to hit [publish] and read it through, is very clearly in the shadow of this awesome post by Steve. Thanks for the information and the good influence, Steve - hooray for the de facto Creative Commons. \o/ I assume that's cool.

4:36 PM, November 24, 2008  
Blogger Unknown said...

I'm in neither the static camp nor the dynamic camp. Python and OCaml are my favorite languages. I understand being fed up with C++ and Java's type systems that give you relatively little compared with the verbosity and hassle they add. Most of the people who are big static typing evangelists also strongly dislike the Java and C++ type systems.

I think we can agree that the "gradual typing" (optional static typing) allowed by ECMAScript 4 (R.I.P.) and by languages like PLT Scheme/Racket is an exciting direction for programming languages. Ideally I'd like to see a language with a statically typed dialect and a dynamically typed dialect that target the same virtual machine. (Perhaps something like IronPython and Boo is what I'm looking for.)

Another exciting idea somewhere between static and dynamic typing is "soft typing".

As far as I can tell, the aside about multithreading in Java vs. C++ is about the relative ease of making non-threadsafe legacy code threadsafe. Am I understanding you correctly? I don't see a big threading advantage in Java vs. C++ apart from the truly amazing way monitors are implemented in HotSpot/OpenJDK. Two different types of ultra lightweight locks are used, and in some cases, a thread trying to acquire a monitor will go and modify the stack of the thread holding the monitor in order to replace a lightweight lock with a mutex on the fly. Now, I think Erlang's shared-nothing approach is better, but the JVM's locking mechanisms are pretty slick.

As far as the paper on double dispatch goes, the trick relies on two things: (1) the dynamic language interpreter is implemented in a statically typed language, and (2) this statically typed language runs on a tracing virtual machine. Because these two conditions are met, in the common case you get two highly predictable branches (checking the vptrs of both objects) followed by inlined code for the method body. By the way, it's not that you call a.add(b) and b.add(a); you call a.add(b), which is always implemented as "return b.radd(this);". In order to pull the same trick if your interpreter is written in a dynamically typed language, you need to do your own manual name mangling on your radd methods.
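
[Editor's note: the add/radd shape described above, sketched in C++ with a hypothetical boxed integer type; a real VM would have one such class per value type and would check the operand's type before the cast:]

    struct Value {
        virtual ~Value() {}
        // First virtual dispatch: on the left operand's type.
        virtual Value* add(Value* right) { return right->radd(this); }
        // Second virtual dispatch: on the right operand's type.
        virtual Value* radd(Value* left) = 0;
    };

    struct IntVal : Value {
        int n;
        explicit IntVal(int n) : n(n) {}
        Value* radd(Value* left) override {
            // Sketch assumes int+int; this is where a tracing JIT
            // sees two predictable vptr checks and inlines the body.
            return new IntVal(static_cast<IntVal*>(left)->n + n);
        }
    };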

I think language-specific CPU instructions are generally a bad idea, but there are a handful of instructions that would help any JIT implementation. Branch-on-random would be useful for detecting hot spots and traces. Bounds-checking array load and store instructions that assumed array sizes are stored at index -1 would be nice for all bounds-checking languages. A branch-if-either-even instruction (as in BREVEN %r3, %r5, slow_path) would be useful for all of the virtual machines that use the 31-bit or 63-bit tagged-int trick, such as V8, SpiderMonkey, Ruby, OCaml, etc. Branch-if-either-even would allow a tracing (or traditional) JIT to replace two conditional branches in the fast path with a single conditional branch. Instead of the VNZ flags on x86, a small number of boolean registers like ia64, along with a predicated push instruction (using a different register than sp as the stack pointer), would make it much easier to generate native code that could be efficiently traced, as Mozilla's JagerMonkey project would eventually like to do. None of these instructions are language-specific, and some of them would be widely useful even outside of JITs.
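
[Editor's note: to make the branch-if-either-even point concrete, here's the folded type check in C++, assuming the low-bit-set-means-small-integer tagging scheme mentioned above:]

    #include <cstdint>

    using Word = std::uintptr_t;  // tagged VM word: low bit 1 = small int

    bool both_small_ints(Word a, Word b) {
        // Naive form: two tests, two branches.
        //   if ((a & 1) == 0) goto slow_path;
        //   if ((b & 1) == 0) goto slow_path;
        // Folded form: the low bit of (a & b) is 1 only when both
        // low bits are 1, so one branch suffices -- which is what a
        // branch-if-either-even instruction would give the JIT directly.
        return ((a & b) & 1) != 0;
    }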

11:32 PM, June 25, 2010  
Blogger Adam Paynter said...

Wow, thanks for the transcript of your presentation! It was really interesting! However, I am curious - what were the optimizations you spoke of when you said:

"I'll be honest with you, I actually have two optimizations that couldn't go into this talk that are even cooler than this because they haven't published yet. And I didn't want to let the cat out of the bag before they published. So this is actually just the tip of the iceberg."

Thanks!

1:20 AM, September 16, 2010  
