Maybe as a designer I don’t “get” the concept entirely, but hearing that as an argument in favor of a specific development technology seems fundamentally flawed. Shouldn’t the most appropriate technologies be used to solve the specific problem at hand? I don’t believe I can design once and have it work everywhere, so why should development be any different?

Jon is absolutely right. There are a couple of things that drive this idea, though. The first is management: it’s cheaper to hire one developer who knows one technology than multiple people who know multiple technologies and platforms.

I think you could also argue the web is partly to blame. For a lot of companies, their first encounter with software development was building a website. They built it once, then tweaked it for each OS/browser combination. Now those companies want native apps, and they don’t understand why that should be any different.

There’s also simple laziness, something I’ve been guilty of myself (waves to Adobe AIR). You want to release on every platform, but that means learning each platform’s technologies and spending time writing essentially the same code in different languages. In that situation, “Build Once, Run Everywhere” is very tempting.

The underlying assumption in all of these situations is that the resulting product will be “good enough.” It may not look quite like a Mac app, but it’ll be good enough. The UI may not be designed for touch interaction, but it’s good enough. In my experience, once developers care enough that “good enough” no longer satisfies them, they stop looking for shortcuts.