Saturday, May 27, 2006

Next in a series of critical somethings written from a background of almost total ignorance on my part: Why there are no indie video games.

This essay seems to have a few oddities. To start with the nitpicking:
Thanks to powerful new consoles, their graphics are approaching CGI quality.

Movie-quality CGI, he presumably means. The better observation is that games look more and more like their own cutscenes.
Richard Garriott peddled Ultima, the first major role-playing title, in plastic bags. Sid Meier's Civilization and Westwood's Dune II cracked open the strategy genre. Id Software's John Carmack and John Romero created the pioneering first-person shooter Doom. Will Wright gave us SimCity and open-ended "sandbox" simulations.

What happened to these pioneers? Garriott never produced another breakthrough like Ultima; he now works for online multiplayer giant NCsoft. Meier has spent most of the last decade updating his previous hits at a company owned by Grand Theft Auto publisher Take-Two Interactive. Id Software has clung to its independence but produced nothing further in the way of milestone games. Perhaps the lone indie superstar to retain his auteur status is Will Wright, who now has his own "studio" within Electronic Arts.

Well. Isn't the rise and fall of Richard Garriott a little more convoluted than that? And as for id, it seems to me that has less to do with a market and an industry that dislike innovation than with the fact that Carmack likes to build 3D engines (and rockets) and only 3D engines (and rockets). There's not much else to the company. It seems like the larger story is that video game creators really only have one or two big ideas over the course of their careers.

Having said that, O'Brien is absolutely right about there being nothing but sequels and little hope of anything different ever. Except that there is an independent video game movement, of sorts. And some of those updates of classic games seem more like fairly radical reinterpretations (Ocarina of Time, Prince of Persia).

He concludes:
The video game now holds much promise as a cultural mover. If the big studios stay in charge, it may return to its former status: the pastime of teenage boys and middle-aged nerds at gaming conventions.

But isn't it just that course that led to the vast video game market of today? Wouldn't pursuing edgy neatness scare off the casual masses? Are games like Katamari Damacy expanding the appeal of games, or does an expanded video game market simply make for more room for such games?

Which is to say, I grant the diagnosis, but what are the reasons for it?

Tuesday, May 23, 2006

Here's something interesting I read at Wired today.
Microsoft, chipmakers and PC firms aim to increase PC usage in the developing world with a new flexible payment program to lower the initial costs of buying a computer, the companies said on Sunday.

From a comfortable position of knowing absolutely nothing about the economics of the developing world, I can comfortably postulate that there are probably large numbers of people who could get a lot of use out of a computer (which is to say, they'd be able to plug it into grids electrical and, uh, informatic, something this sort of program can't take for granted in the regions we're talking about) but can't spare the cash for one.
Using Microsoft's FlexGo software technology, a customer can buy a computer loaded with the Windows operating system then purchase prepaid cards or pay a monthly subscription fee at a cost similar to using a computer at a local internet cafe, Microsoft said. When the usage time ticks down, a customer can go online or to a local retailer to buy more minutes.

This, however, raises a question, perhaps more theoretical than practical: say I create a document on my FlexGo-powered PC, or record a song or a bit of video onto it, and then don't pay up at the end of the month. It would appear that that data becomes inaccessible. Would I be violating my license agreement by bypassing the OS's sleep mode, as it were, to pull that data out? Are the hardware components designed to act in a similar fashion? Imagine if you had a notebook with a coin slot in the spine, and every so often the pages went blank and asked for a new coin, and stayed blank if you didn't provide one.
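The story doesn't explain how the metering actually works, so purely as a guess, here's a toy sketch of the arrangement I'm picturing: a prepaid minute counter where user-created files sit behind the same lock as the rest of the system. Everything here (the class, the method names, the lockout behavior) is invented for illustration and has nothing to do with the real FlexGo internals.

```python
class MeteredSession:
    """Hypothetical prepaid-metering gate, loosely modeled on the
    Wired description above. Not a real FlexGo API."""

    def __init__(self, prepaid_minutes):
        self.remaining = prepaid_minutes
        self.locked = prepaid_minutes <= 0

    def tick(self, minutes_used):
        """Burn down the balance; lock the machine when it hits zero."""
        self.remaining = max(0, self.remaining - minutes_used)
        if self.remaining == 0:
            self.locked = True  # the OS drops into its expired mode

    def top_up(self, minutes):
        """Buying more minutes online or from a local retailer."""
        self.remaining += minutes
        if self.remaining > 0:
            self.locked = False

    def read_file(self, path):
        """The troubling part: the user's own data is gated too."""
        if self.locked:
            raise PermissionError("out of minutes; data inaccessible until top-up")
        return "contents of " + path
```

The point of the sketch is the last method: under this (imagined) design, running out of minutes doesn't just stop new work, it cuts off access to the document or song you already made until you pay again.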

What troubles me, I suppose, isn't the OS having a built-in Logan's Run mode per se. After all, as strange as it might seem at first glance, we've all pretty much accepted that we're only buying a license when we buy software. It's the way those terms would, in this case, seem to extend to the things users create with those tools.

Anyway, the story is light on the details, so hopefully this is something that's been addressed. Still, wasn't Microsoft making noises about switching Windows to a subscription/automatic update model for everyone some time ago?