Sunday, September 14, 2008

Various software licenses in a single line

Back in 2003, I posted this handy cheat-sheet for various software and source code licenses. Several recent events have prompted me to repost this, as a Public Service to the Internet. Apologies in advance for the painfully old meme, but good ideas transcend time.

HOW ARE YOU GENTLEMEN. Here are a number of popular software and source-code licenses, expressed as one-liners in an easy-to-understand format:
  • GNU General Public License (GPL):
    ALL YOUR SOURCE ARE BELONG TO EVERYONE
  • GNU Lesser General Public License (LGPL):
    ALL OUR SOURCE ARE BELONG TO EVERYONE
  • Netscape Public License:
    ALL OUR SOURCE ARE BELONG TO US
  • BSD-Family licenses (New BSD, Apache 1.0, Apache 2.0):
    ALL OUR SOURCE ARE BELONG TO YOU
  • SCO software license:
    ALL YOUR SOURCE ARE BELONG TO US
  • Typical monopoly-era Microsoft End User Licensing Agreement:
    YOU HAVE NO CHANCE TO DECIDE MAKE YOUR PAYMENT

Sometimes life proves your point... or does it?

A few weeks ago, I made the point that Apple treats the iPhone as a gadgets-style platform. Then last week, lots of people reported the news that Apple had "banned" an application because it reproduced functionality in iTunes. Cries of anti-competitiveness began almost immediately. Based on my earlier post, you might suppose that I'd agree with that -- but I don't.

I don't think this is anti-competitive per se. They're certainly no more anti-competitive than anyone else, right now. The App Store Ts&Cs make it perfectly clear that Apple views the iPhone as their platform, and that developers are welcome only insofar as they push the platform forward. This is simply a concrete illustration of Apple's determination to own the direction of their platform, and not implicitly cede it to third-party developers. They're not necessarily wrong in that.

Of course, it's an illustration that has come as a surprise to developers. But while it's unwelcome, I don't really think it's anti-competitive. After all, there are plenty of opportunities to build and market apps for other platforms and devices. It may be much harder to actually deploy such an app for, say, Symbian and acquire the same reach as you can with iPhone, but doesn't that say more about the state of the industry than about Apple?

Who's more "anti-competitive" -- Apple, or the status quo?

Even with this issue, Apple's platform is still a step forward for the industry. Of course, it would be nice to take a few more strides than just the one, which is why I'm still pleased as punch to be working on Android. But I think flaming Apple over this is the wrong response; you're not going to change their minds, because I guarantee you that they expected to get flamed for this. They're smart folks, and they knew very well that this policy would cause ripples the first time it came to a head.

That said, in this specific instance, their conclusion seems kind of specious; does that app really duplicate iTunes functionality, and even if it kinda does, does it matter? So I'm positive that their developer relations and PR teams are (re)considering this even now, and it wouldn't surprise me at all to see them recant and perform one of those homey mea culpa reversals that only Steve Jobs can pull off. It would defuse the criticism in this case with little actual risk to the platform, while still making the point they want to make: "Welcome to our platform. Enjoy the ride, but remember that we're the ones driving."

Last year, it seemed like deploying a successful mobile app was as much luck as anything else. This year, at least you can easily deploy one, if only on someone else's terms. That's definitely forward progress -- and momentum is building. Just think where we'll be next year.

Sunday, August 10, 2008

There are gadgets, and then there are Gadgets...

A while back one of my colleagues -- Charles Wiles -- asked me a very interesting question:
"What is an example of an application you build with the Android APIs that you couldn't build with browser Ajax?"
This is a deceptively simple question. There are easy answers to it, but it turns out that those lack a certain... je ne sais quoi. The easy answers just aren't satisfying, and I eventually realized that there's a deeper question that people are really asking, when they ask about Android "vs." some other platform, like J2ME or the iPhone or Symbian.

So I spent some time thinking about it. One problem is that there's a lot of grey area here. Many applications could be done using either technique, or on several of the other platforms out there, and still be great user experiences. In other words, it's not really about features, and it's not really about capabilities. It's not about having GPS where someone else doesn't, since the other guy will add it in his next release along with a feature you don't have.

I needed a deeper, better answer. I eventually realized that what developers are really getting at when they ask questions like this is:
"What is the difference between all these damned platforms, and why should I care?"

Every developer has a different career, skill background, and set of interests. Everyone also tries to understand new things by applying knowledge they already have. So, developers who understand J2ME ask about Android vs. J2ME; developers who understand Macs or iPhone come from that direction; and so on. But they're really asking the same thing: how should I be thinking about Android?

Let's take the iPhone, for instance. With the iPhone Apple has a very solid, very successful platform -- no doubt about it. However, much has been made about the restrictions on applications, such as no background processing, no interpreted language runtimes, and so on. Opinions vary as to whether this is good, bad, user-friendly, developer-unfriendly, or what.

But that kind of scrutiny is like looking through a microscope: you can't see the big picture. What Apple has with the iPhone really just boils down (in concept, not in the details) to a gadgets platform, like OpenSocial, iGoogle, or Facebook.

Apple owns their platform and sees themselves as responsible for the user experience, period. They restrict what applications can do and what they can access accordingly, so as to maintain the quality of that user experience according to their standards. They acknowledge the benefit that third-party developers can bring and will let you build apps to plug in to their user experience, but make no mistake: this is a private party at Apple's house. They can be as picky as they want about who they let in, and they can ask you to leave at any time. But once you get in, as long as you behave yourself Apple wants you to have a good time.

Now, this isn't criticism, just an observation. As Google, Facebook, and the various sites that have deployed OpenSocial demonstrate, this is a perfectly legitimate platform model. Like those other services, Apple is providing the platform and a very clear, specific execution environment. Developers are expected to plug into that cleanly and respect the borders that Apple has laid out.

The benefit to developers is that they get to ride the coattails of iPhone's success. The potential downside is that you just can't break out of that sandbox. For instance, consider Facebook. Facebook has a really nice iPhone application, and I've even heard it said that it could "almost" replace the built-in iPhone Contacts app outright.

Well, why doesn't it? The answer is that Apple doesn't want it to, because that Contacts user experience is a key part of their platform. It would be like Facebook itself letting a third-party developer build an app to totally reskin the site and insert its own branding -- it's just not going to happen.

However, that kind of thing is exactly what Android aims to enable. If an end user buys a phone that comes with the standard Google look-&-feel and application suite on it, we see no reason why that user shouldn't be able to choose to replace it with Facebook. Android is intentionally designed to let third parties such as Facebook build an application that replaces the Contacts book, Home screen, and Dialer, or nearly anything else. It will quite literally be possible to completely reskin an Android device into a Facebook phone. (What would that be -- an fPhone?)
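To make that concrete, here's a sketch of how a home-screen replacement gets declared on Android: an application marks one of its activities as handling the HOME intent, and the system can then launch it in place of the stock Home screen. The activity name below is hypothetical; the action and category names are the actual Android intent constants.

```xml
<!-- Hypothetical manifest entry for a replacement Home screen -->
<activity android:name=".FacebookHome">
    <intent-filter>
        <!-- MAIN + HOME is what marks an activity as a Home screen -->
        <action android:name="android.intent.action.MAIN" />
        <category android:name="android.intent.category.HOME" />
        <category android:name="android.intent.category.DEFAULT" />
    </intent-filter>
</activity>
```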

You might even say that Android and iPhone are sort of inverses of each other. iPhone is a Gadgets-style platform where the user experience is largely fixed and an execution context is carved out for each third-party application. Android provides a framework for managing the execution contexts of third-party applications, which are permitted (and expected) to augment the user experience in innovative ways.

Neither approach is inherently "better", and when it comes down to the daily business of writing code and building UIs, the experience probably isn't that different. But in terms of the big picture, that is the satisfying meaty answer that I've been looking for.

At least, for "Android vs. iPhone." There are quite a few other platforms out there, though, and each one has a slightly different answer. (Perhaps the subjects of future posts...) But now that I have the right angle to view this question from, it makes it a lot easier to talk about. At least with geeks -- I don't know that my family appreciated it that much last week, when I was home on vacation.

So it goes. ;)

Monday, July 14, 2008

My Obligatory Foo Post

(...but one I am excited to write.)

This weekend I attended Foo Camp. It was a crazy, exhausting, exhilarating, fascinating experience. I wasn't quite sure what to make of everything beforehand, but now I can say that Foo was quite possibly the best event I've ever attended. The thing is, I'm not sure I could say exactly how or why.

Foo is a mostly technical event. I've been to some really rotten (though well-intentioned) events that tried to wed technical and non-technical content, usually by putting technology and humanities people in the same room and expecting magic to happen. In contrast, through what I can only assume was painstaking care, Sara Winge and the other Foo organizers put together a remarkable guest list, and the format (or perhaps, lack thereof) is structured to make with the mingling.

But I shouldn't be trying to do an expository description. You can't accurately describe Foo that way. The only way I can think of to really capture Foo is to just describe some of the things I saw and did there.

I saw and met a lot of people -- many of whom I never thought I'd have the opportunity to meet. For instance, I played Werewolf with Jimmy Wales, among others. At one point, Jimmy exclaimed "No no -- it's me who smells like pickles!" At least, I think it was Jimmy -- it's hard to remember clearly when you play Werewolf until 6am. (And don't ask -- I can't remember how we got on the topic of pickles anyway.) The whole time, I just couldn't figure out how I got to meet Jimmy Wales, let alone play Werewolf with him, let alone hear him say a thing like that.

At least as interesting were the folks I met whose names I didn't recognize (but probably should have, and am pleased I do now). I was going to list a few, but decided that would be name-dropping, and there's no way I could even remember everyone I met anyway. Instead I'll say that a big part of Foo for me was being in the presence of people who seemed larger than life, and realizing that they are all just people -- albeit more talented and more interesting people than me. I know that probably sounds gratuitous and cliche, but it's the truth: there's something profound in the simultaneous revelations that celebrities are just people, but many of them really do deserve to be celebrities.

The sessions were great too. I went to one session where the first half was about electric cars, and the second half was about baking killer bread. And both were really great conversations! Where else can you find a session like that followed by one on parallel programming trends, and then co-lead one yourself on Android?

I actually co-led two sessions, and tried for a third on designing a card game, but the aforementioned 6am Werewolf and my co-lead's unexpected cold spelled doom for that one. I'm bummed about that, because the card game session was to my mind the one that would have counted. I do Android stuff all the time, so answering a few questions with a colleague for an hour is no big deal. It did go well (actually, very well -- the questions were quite good, and we had pretty good turnout at around 30 or so people) but I can't claim it was very original or insightful, as content goes.

What I still am not quite sure about is how I got invited. I'm curious, but I guess I don't need to know. That's not really the point of Foo -- you're just supposed to go, and be a part of what's going on.

And that, I now realize, is what Foo is all about: it's about people and ideas, not technology. Tim actually says as much when he describes the event, but sometimes it's hard to parse the forest for the trees.

Who knows if I'll get invited back next year (I doubt I was all that interesting to the other folks there) but it was definitely a pleasure and a privilege to go. I owe a huge thank-you to Tim O'Reilly and his awesome staff for a wonderful event.

Tuesday, July 8, 2008

I think I've learned a Genuine Life Lesson

I used to be quite the armchair analyst. Well, probably no more so than any other software engineer, but as a group we do love to speculate. There's just so much drama out there, with the jockeying and jostling of companies and technologies.

But for the most part we are also naturally skeptical, so I never got that carried away. Indeed, I was quite proud of how much I didn't get carried away.

At least, before I started working on a high-profile project that routinely gets press coverage. If I took things with a grain of salt before, I take them with a 5-pound bag now.

It's odd though: I haven't learned anything I didn't already know (remember that reporters always have angles, and so on), but now I don't just know it, I know it all the way down to my marrow.

I went through phases, too. At first I was indignant ("How dare they misconstrue us so grotesquely?") or amazed ("Well, I'm glad they like us, but that's a bit over the top.") On my way to where I am now (mostly indifferent, with a mix of amused and irritated), I passed through a few other minor phases, of which my favorite was a sort of cognitive vertigo: "I know it's not true, but... everybody's saying it. Maybe... maybe I'm wrong and I don't know it?"

We've had everything from cleverness to malice to incompetence attributed to us. There have been reports of Android devices launching ridiculously early (February? seriously?), as well as implausibly late (2009?!). People "familiar with the matter" have been quoted saying things that sound like they came from a different universe. Based on just the press, you could be excused for thinking that.... well, that there's drama.

What this has taught me is that there's just no point in judging anyone -- especially companies -- by anything other than their actions. But I've also learned to count what they say among those actions: if Nokia/Symbian says they want to open up, then I assume that's their plan, until they do something that says otherwise. If Apple says they don't have background processes because they want to protect the user experience, then I don't automatically assume that's just cover for wanting to hamstring third-party developers.

I no longer scrutinize companies for drama, until they actually do something dramatic. Conspiracy theories and armchair punditry are fun in their way, but for me they've become an indulgence I try to avoid.

Monday, July 7, 2008

Oops

Heh, looks like my new Blogger template is not without bugs -- on the comments page in this case. :) I'll have to fix that ASAP.

Friday, July 4, 2008

Blogging is hard.

I've had blogs before -- the kind where I imagined people wanted to read about me doing my laundry, or the cat's latest biological nonconformity. When I started this one, I decided I'd try to keep the value a bit higher, and only post things that I'd thought through, and proofread a couple times. So far I've done that.

But damn, it turns out blogging is harder than I thought. I just re-read a couple of my most recent posts, and while I'm not unhappy with them, I am not really satisfied with them, either.

I think they're still a bit too long and rambling. The message is in there, but you have to work to find it. This is after what I thought of as an aggressive editing cycle. I think I need to amp it up a notch, and go from aggressive to brutal.

Thursday, July 3, 2008

Why I don't like JavaScript 2

I don't like JavaScript 2, ECMAScript 4, or whatever it's called. That doesn't mean I'm taking sides in the ongoing controversy. To the contrary -- I think the controversy is boring and lame. I don't like JS2 because I just don't care.

Actually, I'll even go beyond that. If we are still using JavaScript or JS2 to write Ajax applications in, say, 7 years, I'll be very sad. I don't like JavaScript as a language. I don't hate it, but I don't like it, as a matter of taste. It doesn't excite me, I don't think it's particularly powerful, it doesn't change the way I think about programming, etc. JavaScript is okay, but it's not great.

So in a way I guess I should be interested in JS2. And to whatever extent JS2 is better than JS1, then the web benefits. I guess. I don't really know though, because I haven't even so much as glanced at a spec. People whose opinion I respect think it's an improvement, but like I said, I don't care. Why?

Because I think JS2 has something in common with XSLT: they're both painstakingly-crafted, elegant solutions to completely the wrong problem.

Why, exactly, are we betting the future of the web on just another programming language? If the past 40 years of computer science have taught us anything, it's that the industry never -- never -- agrees on a single programming language. Fortran, COBOL, C, Java... things change. So why does anyone think that this time around it's going to be any different?

Worse, designing a successor to JavaScript implies that you know what's best for all developers everywhere, and have the required skills to design it. That's a hell of a presumption, even granting that it's the work of a team of talented experts and not a solo thing. I really like Python; who are you to tell me that your shiny new language is better for the browser than a language with years of history?

This is the same issue I have with XSLT. The need there is for a standardized way to transform XML from one format to another -- okay, fine, that's definitely a real need. But what is XSLT, if not machine readable instructions for performing a specific task? And hey guess what -- we have lots of different things we can use to encode machine-readable instructions; we call them programming languages, and they have a rich and varied history. In the silly days of my professional youth, I once devised a system where you basically wrote little scripts in an XML syntax, and I got rightly bonked on the head for it. It's not any better an idea for XSLT, either.

That said, for what it is, XSLT is pretty well done; it's just a well-done solution to the wrong problem. XSLT is a tightly-coupled solution; a loosely-coupled one that focused on defining a sort of "CGI for XML transforms" would -- in my opinion -- have been a better idea. Specify the boundaries, and let existing languages do what they're good at.
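To illustrate what I mean by letting existing languages do the work, here's a toy XML-to-XML transform written in an ordinary general-purpose language instead of XSLT syntax. The input format and output shape are invented for the example; it just uses Python's standard library XML module.

```python
# A toy transform: turn a <catalog> of <book> elements into an HTML <ul>,
# using a general-purpose language rather than XSLT. The input format
# here is invented for illustration.
import xml.etree.ElementTree as ET

SOURCE = """
<catalog>
  <book title="The Mythical Man-Month" year="1975"/>
  <book title="Design Patterns" year="1994"/>
</catalog>
"""

def to_html_list(xml_text):
    """Transform a <catalog> of <book> elements into an HTML <ul>."""
    root = ET.fromstring(xml_text)
    ul = ET.Element("ul")
    for book in root.findall("book"):
        li = ET.SubElement(ul, "li")
        li.text = "{} ({})".format(book.get("title"), book.get("year"))
    return ET.tostring(ul, encoding="unicode")

print(to_html_list(SOURCE))
```

The point isn't that this is better engineered than the equivalent XSLT stylesheet -- it's that the transform logic lives in a language you already know, with its whole ecosystem behind it.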

It's definitely possible to nit-pick my argument there, but I'm bringing it up because I think it's even more true of JavaScript today. I'm sure that JS2 is a great, elegant language. But the problem is, it's just not what's needed.

The web works as a platform because it embodies an effective application model implemented via open standards. But it's getting long in the tooth, which is why we have at least 3 different platforms aiming to replace it: AIR, Silverlight, and JavaFX. Each of these is basically some permutation in the Cartesian product of (vector renderer, DOM renderer) * (custom VM, custom language). The thing is, the browser of today is itself one permutation, and JS2 boils down to yet another.

We've already proven the model, folks. We don't need to prove it all over again.

The browser isn't broken, and doesn't need to be "fixed", either by a proprietary platform, or by a shiny new language. Instead it needs to be extended, and functionality gaps need to be plugged. That's why I don't care about JS2: its fundamental message is "the Browser of Tomorrow is the Browser of Today, just with different syntax." This is spinning our wheels, and won't do anything to advance the web.

So what is the right solution? Well, in my opinion what we need is to formalize and standardize the DOM API, agree on a single vector model and API, and specify an intermediate compilation format. That is, specify the boundaries, and let existing languages do what they're good at.

That intermediate format could be bytecode like the JVM's or the .NET CLR's, plain old JavaScript (which GWT has proven can be used for this purpose), or even "JS plus compilation-friendly extensions". Between Tamarin, SquirrelFish, and ten years of the JVM, I think it's pretty well established that this is a solvable problem. For that matter, ditch the idea of an integrated VM entirely, and establish a standard API where an external process can attach to your browser DOM, without the browser needing to run code at all.

With one of these approaches, the browser -- through incremental steps -- becomes just as "innovative" as SilvAIRLaslavaFX, while still based on the open standards that have worked so well so far. This is basically the same pragmatic gap-filling philosophy that Gears and HTML5 have, though taken a step farther.

The problem is that this is messy work, and it's far less sexy than designing your own language or rolling your own runtime out of whole cloth. It feels less innovative, because it's less of a dramatic change. But as DOM, CSS, and XMLHttpRequest have proven repeatedly, it's the modest, targeted innovations in the web that have caused the biggest revolutions.

Sunday, June 1, 2008

Google I/O has come and gone

The Google I/O event has come and gone, now, and it was a pretty awesome event. We laughed, we cried (oops), we demoed a new build. I'm pretty pleased with how it turned out -- beyond that even; I was amazed by the energy level. I think the 2-day format sort of provided a sense of urgency, and it kept people excited. I remember sitting around at about 4:30 on the second day, feeling like I was waiting for something, only to suddenly realize that it was all over and there was nothing left to wait for.

Anyway we were very busy preparing for that event, and before that we were extremely busy handling the Android Developer Challenge. I feel suddenly light as a feather! It's almost like I have nothing to do next week. Heh heh... except that I definitely do. Lots of stuff got put aside in favor of ADC and GIO that needs to get picked up again.

One of them is my blog here. I think it might be high time to finish up a post I've had sitting in draft form for entirely too long.

Friday, May 16, 2008

I still function!

No, I'm not dead and gone. Buried maybe, but not dead.

The Android Developer Challenge just wrapped up, and now it's time for Google I/O. I've been on the hook for both, and haven't had time to do much besides eat, sleep, and administrate. Fortunately it looks like the worst is over, so I can shift 'er down a notch.

In the interval between this post and its predecessor, I started on Twitter and got invited to Foo Camp. It's been quite a year for me; the Valley is indeed a small place, and experiencing all these hacker tropes first-hand is a bit of a weird experience.

Saturday, March 8, 2008

What do developers have in common with movie-goers?

Answer: Suspension of disbelief.

To explain, let's start here: Developers write code.

In so doing, we are by necessity required to use tools -- nearly always, tools written by other people. Those people may or may not share the specific philosophy we have about the art and science of software development. When they do share our views, we get tools that we really like; when they don't, we get tools we hate. Eventually, we develop a sort of self-preservation mentality that is best summarized as pragmatism: you'll forgive a tool vendor a rather shocking number of sins, if at the end of the day you can still get your work done. (You will, however, bitch and moan about it all the way -- and as for me, guilty as charged.)

There's a flip-side to that, though: if we CAN'T get our work done, we will hate you with the burning passion of a thousand suns.

This mentality quickly takes over your professional life, which for a developer like me becomes a pragmatic cynicism: we take a live-and-let-live approach and will tolerate a lot of abuse -- but only up to a point. Cross a certain line, and you will lose us forever. For instance, we know that when a tool or platform vendor puts on a trade show, it's in their own shameless self-interest. But we also know that our interest is to a large degree the vendor's interest too, so we're willing to suspend our disbelief and attend anyway.

I say "suspension of disbelief", but I might more accurately say "suspension of fiery cynicism." Our "disbelief" is still lurking behind the scenes, ready to jump out and bite everyone in a fit of pique -- and woe be to him who lets it out.

Here's an example from my own experience. I used to own a Windows Mobile phone. I'm not normally a Windows kind of guy (quite to the contrary, in fact) but I suspended my disbelief and went to see what I could do. I quickly determined that there was a free version of DevStudio; hey, that's cool. However, I just as quickly learned that the WinMo development tools are not compatible with the free version of DevStudio. As far as I was concerned, I was being asked to pay for the privilege of developing (not even releasing!) an application for Microsoft's platform. Talk about an abrupt way to shatter my suspension of disbelief. I still have a bad taste in my mouth over that -- and I never even got a chance to try out the tools!

It's probably evident that I've got my share (perhaps more than my due share) of the developer's skepticism, so I know all too well that no matter what I do in my day job, I'd better be authentic. I truly understand what it means for a developer to suspend her disbelief on my behalf. Just the act of reading our docs is really quite a big deal: every developer I know always has something else she could be doing. Nothing I do -- docs, sample code, technical support, consulting -- will ever be perfect; but it had better, at the end of the day, help developers get the job done.

Once developers really buy in to a product, though, it's more than just a time commitment. In some lesser or greater way, they're putting their professional reputations on the line, if only to themselves. If I fail to honor that suspension of disbelief, the developer ends up a little disappointed in himself -- even if no one else is.

It never feels good to be disappointed, so I take pride in helping to avoid that. Fortunately it's usually pretty easy (at least in theory): be open, be up-front, and be helpful.

Friday, March 7, 2008

Liberté, égalité, fraternité

From a certain perspective, there's nothing wrong with the mobile world today. Phones are getting more powerful, and can do lots of neat things. Sure, maybe developers have to jump through some hoops to get applications in the hands of users, but that shouldn't matter. A developer who knows deep down that he has a great idea will go the extra mile to get that application out there. Developers who aren't so sure won't bother, but that's okay since their apps are probably poor anyway. So these are all good things.

Right?

Obviously there are quite a few things wrong with that, and I don't really think anyone (or at least, anyone who's rational) would really say that; it's a straw-man argument I set up. However, I think there is a kernel of truth in there: a lot of people do think that some apps are more important than others. I'm referring to the fallacy of the "killer app".

The quest for the killer app can blind you to the real world, if you let it. True killer apps are exceptionally rare. For instance, what was the killer app for the web? Was it email? Was it mapping? Search maybe? How about social networking? With the possible (though arguable) exception of search, none of those are true killer apps. They were either "ports" of popular apps to the web, or else popular incremental additions. In fact, I assert that there's no clear killer app for the web[1] at all, yet few people would today argue that the web is not a viable application platform.

I really believe that the history of the web backs me up when I say that this quest for "special" apps is misguided. Trying to predict in advance what the best and most disruptive applications are is a chump's game. And if you're trying to predict (let alone encourage) disruption, the people who you least want to put in charge of that are by definition the people who stand to be disrupted.

That's why I think the current state of mobile isn't working. There is absolutely no reason to believe that the next big thing is more likely to come from a well-heeled, committed, industry insider than from a hobbyist with a spouse and two kids. Yet the current mobile world, with all its restrictions and certifications and fees, favors the former.

That's why I really love what we are trying to accomplish with Android. When I read posts by people working on the Android Developer Challenge, I get much more excited by the guy who says he works on his app after he puts his kids to bed than by the guy who says he's just porting something for the Challenge. (And yes, I have seen both posts -- that's not just hyperbole.) The guy with the kids is, in my opinion, far more dedicated and more likely to come up with something clever and practical. Yet he's also the guy who is least likely to want -- or be able -- to jump through the hoops necessary to get a signed application available for users to download, $99 annual fee or not.

In other words, the best apps are the best apps, not the biggest or most "important". I look forward to an open mobile world where apps of any size and scope -- not just the ones with big money, biz-dev teams, and CxOs smarming it up on the golf course -- can land on my phone.


[1] Some people I know would argue that the dark secret killer app of the web is porn. I guess I can't really disagree, although I'd argue that porn is the first-follower of new media types, rather than the driver.

Saturday, February 23, 2008

More on Internet-Fame -- Seriously!

Recently I talked about the subtle strangeness of working on a high-visibility project. My take on it was a little silly, but David Welton has a much more thoughtful analysis.

I think he's spot-on, and makes several great points. And he even gets comments! ;)

Sunday, February 17, 2008

N-ary the k-th

I'm n-ary the tree, I am
N-ary the tree I am, I am
I got traversed by the Iterator next door
She's iterated several things before
And every one was an n-ary (n-ary!)
She couldn't touch a stack, queue, or a DAG (or a DAG!)
I'm her k-th parse-ee, I'm n-ary
N-ary the tree I am!

Next pass -- same as the last!

I'm n-ary the tree, I am
N-ary the tree I am, I am
I got traversed by the Iterator next door
She's iterated several things before
And every one was an n-ary (n-ary!)
She couldn't touch a stack, queue, or a DAG (or a DAG!)
I'm her k-th parse-ee, I'm n-ary
N-ary the tree I am!

Parsed
Re-
Curse-ive-ly!

N-ary! (N-ary!)
N-ary! (N-ary!)
N-ary the tree I am, I am
N-ary the tree I am!
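For the curious, here's roughly how the Iterator next door might do her traversing -- a minimal sketch in Python (a hypothetical `Node` class of my own devising, not from any particular library), using a recursive, parent-before-children walk:

```python
class Node:
    """A node in an n-ary tree: one value, any number of children."""
    def __init__(self, value, children=None):
        self.value = value
        self.children = children or []

    def __iter__(self):
        """Traverse recursively: yield this node's value, then each subtree."""
        yield self.value
        for child in self.children:
            yield from child

# Build a small n-ary tree and iterate it.
tree = Node("root", [
    Node("a", [Node("a1"), Node("a2")]),
    Node("b"),
    Node("c", [Node("c1")]),
])
print(list(tree))  # ['root', 'a', 'a1', 'a2', 'b', 'c', 'c1']
```

No stacks, queues, or DAGs were touched in the making of this example.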

Saturday, February 16, 2008

On Internet-Fame

Android is the first project I've worked on that gets regular press coverage. This has been a wild experience so far, to say the least. It's at once weird, fascinating, amusing, and dismaying.

It's weird because occasionally I'll see stories or commentaries about Android that catch me off guard. For instance, when I did the Android introduction video that's on YouTube, I knew what to expect from the viewer comments. This is the Internet after all, and I'm a USENET veteran. What I didn't expect was to be compared to a Muppet.

Other times, I'll find myself fascinated by the latest detailed analysis of our "Android strategy." It's fascinating to read about your own plans and goals -- especially when they're wrong. The resulting cognitive dissonance ("is THAT what we're doing?") is not to be missed. My favorites are the advertising-related conspiracies. We've said repeatedly that Android is not about advertising, but I guess some people either haven't heard or don't believe us.

In fact, I've had to get quite used to not being believed. It's amazing how many times we'll say one thing and then see someone else repeat it, but with a preface like "Google claims that..." I think it's just that people are well-trained to mistrust companies. Sometimes, no matter how flatly I state something, people assume I'm just a soulless spokesweasel. On occasion I take it as a sort of personal rhetorical challenge to see how many slightly different ways I can rephrase the same point-blank statement until someone actually believes me.

Ahh, good times. I think next week I need to write a lot of code to counteract all this. Hacking on some software is the universal antidote to the silly squishiness of the world of evangelism.

Sunday, February 10, 2008

Dear Munich: May I come back, please?

When I was a kid, my family moved around a lot. I saw (and lived in) well more than my fair share of the United States, and maybe that burned me out on travel. I certainly never had the wanderlust that many people I know have. Sure, I thought it might be neat to see London or maybe Paris once, but you know, I hate planes, and travel means a lot of hassle and time away from home...

So that's how it came to pass that I spent thirty years on this spinning blue ball and only ever visited the North American land mass. But boy howdy, did THAT change in a hurry. It never rains but it pours, and a couple weeks ago I had occasion to collect a couple new continents.

I visited Munich, Tel Aviv, and London. It was a whirlwind tour, and I only really had any time to look around in Munich. That time, though, was well spent. I loved Munich, and I can't wait to go back.

My friend and colleague Jason Chen traveled with me. (I should say I traveled with him, because he's the experienced globetrotter. If you ever need to travel internationally, you could do a lot worse than picking Jason as your companion.) I mention that mostly to say that if you want to see some photos, you'll have to visit his site, since I am "between cameras" at the moment.

So as I said, I loved Munich. But there are so many questions! Like these:
  • Remember, it's spelt M-U-N-I-C-H, but it's pronounced "München". I guess? I need someone to explain this to me.
  • The Marienplatz is pretty awesome, but why is that building called the Glockenspiel? Was it named after a person? I need someone to explain this to me.
  • Some parts of the autobahn appeared to have speed limits, but I was unable to figure out what the pattern was. I need someone to explain this to me.
  • What is the name of that character that looks like a Greek beta with a long tail, and what is its phoneme? I need someone to explain this to me.
I realize that I could answer these questions via Wikipedia, but that would mean I'd have less reason to go back. I'd rather let them remain a mystery until I can take Aimee and go back, and have some of the amazingly friendly locals explain them to me.

Oh, and our Android session went really really well, too.

Friday, February 8, 2008

Shopkeeper to the Pioneers

Shortly after I graduated from undergrad, my mom asked me why I liked computer science so much. I told her that writing a piece of software is as near to the act of pure creation as you can get: you start with nothing but an idea, but you end up with something tangible. Or well, at least something to show for your work, if not something "tangible" per se.

Later in my career I refined that answer a little. I realized that for me, there's more to the joy of programming than the simple purity of writing lines of code. I like to do new things, but I'm impatient. I like to stand on the shoulders of giants, making connections between grand ideas that others have come up with.

If I were a pioneer in colonial America, I wouldn't be one of the folks beating the earth into submission with nothing but sweat and a plowshare. I'm not the kind of person who goes out to blaze new trails and explore uncharted territories. Those kinds of pioneers get the thrill of doing something new, but usually the only scenery they get to see is the scenery they're personally exploring.

Instead, I'd be an enabler: the general store-keeper. All those pioneers out there making history have to get their supplies from somewhere, and when they do, I'd get to hear their stories. I'd get to see their homesteads, and pass along advice and lore among visitors to my store. I probably wouldn't get to make history on my own, but I'd get to be a small part of a lot of great histories.

That's why I love my current job. Google has a lot of developer tools, like GWT, Gears, and Android, and a lot of pioneers are out there on the web doing cool new things with them. I'm not out there building the sites; instead, I'm tending my shop, getting them the tools they need, spreading their lore, and being a small part of a lot of great sites. Sometimes I even devise a new tool of my own, to help them out.

I'm excited by constantly being exposed to neat new ideas, and I love learning. I can't imagine a job more fun than where I am now, sitting in the middle of this crazy, furious flurry of new ideas that is the open web and mobile.