The process of testing Halo 3

I came across a fascinating article on Wired which looks into some of the processes that Bungie, the developers of the Halo game series for the Xbox and Xbox 360, have been using to test the latest iteration, Halo 3.

Thousands of hours of testing with ordinary game players have been recorded, and the footage synchronised so the developers can replay what was on the screen, what the player’s face looked like and what buttons were being pressed at any point. They even record details of where and how quickly players are getting killed, so they can figure out whether some bits of the game are just too hard.

There have been several waves of Halo 3 testing, some of which were for MS and Bungie employees only, and one public beta phase. The public beta was itself unprecedented in scale – over 800,000 users, racking up 12 million hours of online play. Do the maths, and that’s about 1,400 years of continuous play…
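
If you want to check my arithmetic, it’s a one-liner – here it is as a quick Python sketch (the 12 million hours figure is from the article; the rest is just unit conversion):

    hours_played = 12_000_000       # total online play during the public beta
    hours_per_year = 24 * 365       # one year of round-the-clock play
    print(hours_played / hours_per_year)  # ~1,370 – call it 1,400 years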

The internal-only tests have been kept necessarily confidential (“It is not cool to talk about the alpha” was a common warning to anyone thinking about leaking screenshots or info). The final phase of testing is underway, and the game is said to be very nearly complete.

I’m not going to mention any more about how the game looks, sounds or plays – except to say that you’ll all be able to find out just how awesome it is on the 26th of September (in Europe – the US gets it on the 25th). Might as well book the 27th and 28th off as holidays already 🙂

BBC iPlayer kicks up a stink

It’s been interesting reading various news articles about the fact that the soon-to-be-released BBC iPlayer application will initially be available only to Internet Explorer and Windows XP users. The Register reports that a group called the Open Source Consortium is due to meet with the BBC Trust, since the service will not be available at all to users of, for example, Firefox or Linux.

The Guardian’s coverage points out that the same issues behind the iPlayer are shared with the commercial broadcasters’ services (ie Channel 4 and Sky). Channel 4 says:

Will I be able to access 4oD on my Mac?

Unfortunately not at the launch of 4oD.
This is an industry-wide issue caused because the accepted Digital Rights Management (DRM) system used to protect online video content, which is required by our content owners, is not compatible with Apple Mac hardware and software. The closed DRM system used by Apple is not currently available for licence by third parties and there is no other Mac-compatible DRM solution which meets the protection requirements of content owners. Unfortunately, we are therefore unable to offer 4oD content to Mac users at this stage.

The fact is, all of these services are required to use DRM since they don’t own much of the content they’re “broadcasting”, and the content owners will only allow it to be broadcast if it can be protected. And nobody has (yet) built a DRM system for the other platforms in question that is up to the job of securing the content (with the exception of FairPlay, which Apple won’t license).

Someone from the BBC comments on the fact that the Windows DRM is likely to be a target for hackers…

“We expect it to get broken. When it gets broken, Microsoft releases a new version [of DRM] and the application gets updated. It’s an imperfect solution. But it’s the least imperfect solution of them all.”

So, it’s interesting that the Open Source Consortium is threatening to take this whole thing to the European Union under an anti-trust banner. Which is better – providing an innovative service to 70-85% of the market, or providing no service to anyone because the content providers won’t allow it? Sure, the latter option is “fairer” since it doesn’t favour one platform over another, but is it really in the best interests of the end users…?

Technology changes during the Blair era

So Tony Blair stepped down as the UK’s Prime Minister this week, just over 10 years after he ascended to the position. Funnily enough, I got my “10 year” service award at Microsoft recently (a fetching crystal sculpture and a note from Bill ‘n’ Steve thanking me for the last decade’s commitment), which got me all misty-eyed, thinking about just how far the whole technology landscape has evolved in that time. I also did a presentation the other day to a customer’s gathering of IT people from across the world, who wanted to hear about future directions in Microsoft products. I figured it would be worth taking a retrospective look before talking about how things are envisaged to change in the next few years.

When I joined Microsoft in June 1997, my first laptop was a Toshiba T4900CT – resplendent with 24MB of RAM and a Pentium 75 processor. My current phone has three times as much internal storage (forgetting about the 1GB microSD card), a CPU that’s probably five times as powerful, and a brighter LCD display which may have only a quarter of the resolution but displays 16 times as many colours.

In 1997, there was no such thing as broadband (unless you fancied paying for a KiloStream or even MegaStream fixed line), and mobile data was something that could be sent over the RAM Mobile Data network at speeds of maybe 9kbps. I do remember playing with an Ericsson wireless adapter which allowed a PC to get onto the RAM network – it was a Type III PCMCIA card (meaning it took up two slots) with a long retractable antenna, and if you used it anywhere near the CRT monitor found on the average desk of the day, you’d see massive picture distortion (pulses and spikes that would drag everything on the screen over to one side) – enough to make anyone think twice about sitting too close to the adapter…

The standard-issue mobile phone was the Nokia 2110, a brick by modern standards – twice as thick and twice as heavy as my Orange SPV E600 – though the Nokia’s battery, with only half the capacity, was said to last almost as long as the SPV’s. Don’t even think about wireless data, a colour screen, downloadable content or even synchronisation with other data sources like email.

People didn’t buy stuff on the internet in 1997 – in fact, a pioneering initiative called “e-Christmas” was set up at the end of that year to encourage electronic commerce. I recall being able to order goods from as many as a handful of retailers, across as many as a few dozen product lines!

One could go on and on – at the risk of sounding like an old buffer. If Ray Kurzweil is right, and the pace of change is far from constant but is in fact accelerating – and has been since the industrial revolution – then we’ll see the same order-of-magnitude change in technology as we’ve had in the last ten years within the next three.

In tech terms, there was no such thing as the good old days: it’s never been better than it is now, and it’s going to keep getting better at a faster rate for as long as anyone can guess.

“Success kills” – Marc Andreessen on Facebook

Like so many other people in the last few weeks, I started using Facebook. They’re growing at a ridiculous rate, adding 100,000 new users every day, and it’s reckoned that 50% of the millions of active users return to the site every day.

Following a link from a post by Steve, I read Marc Andreessen’s opinions on why Facebook is so successful – what it’s done spectacularly right and what, in his opinion, its shortcomings are. One particular section shocked me the most, after the discussion of the viral spread of Facebook applications, with iLike picked out as probably the most successful. Facebook app developers need to host their own servers (and bandwidth) to provide the services that Facebook acts as the gateway to. When iLike launched, take-up of their application was near-exponential, which completely hammered the servers they had access to. Here’s what Andreessen says next:

Yesterday, about two weeks later, ILike announced that they have passed 3 million users on Facebook and are still growing — at a rate of 300,000 users per day.

They didn’t say how many servers they’re running, but if you do the math, it has to be in the hundreds and heading into the thousands.

Translation: unless you already have, or are prepared to quickly procure, a 100-500+ server infrastructure and everything associated with it — networking gear, storage gear, ISP interconnections, monitoring systems, firewalls, load balancers, provisioning systems, etc. — and a killer operations team, launching a successful Facebook application may well be a self-defeating proposition.

This is a “success kills” scenario — the good news is you’re successful, the bad news is you’re flat on your back from what amounts to a self-inflicted denial of service attack, unless you have the money and time and knowledge to tackle the resulting scale challenges.

I love that analogy – a self-inflicted DoS 🙂 But what a scary situation to be in – suddenly having to provide real-time, world-class infrastructure, or risk losing the goodwill of everyone accessing the service if it fails or is too slow.
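
Andreessen’s “hundreds of servers” figure is easy enough to sanity-check with some rough Python – the user numbers are his, but the users-per-server figure is pure guesswork on my part:

    users = 3_000_000          # iLike's Facebook user base, per Andreessen
    growth_per_day = 300_000   # new users per day, ditto
    users_per_server = 10_000  # my guess at what one app server might handle

    print(users / users_per_server)                # ~300 servers needed today...
    print(7 * growth_per_day / users_per_server)   # ...plus ~210 more every week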

All of which makes me think – where on earth does the revenue to pay for all this stuff come from?

iPhone gets a date

Well I’m sure it’ll have plenty of people courting it when it comes out, but Apple have announced that the iPhone will be available in the US at the end of June.

At least when the phone actually ships, everyone will know what it’s really like, rather than relying on the seemingly endless speculation (of which this post is yet another part…). The reason for my adding to the noise concerns the commentary that the iPhone will run a version of OS X… that may be so, but the CPU in the phone is a different architecture from either the older PowerPC or the newer Intel cores that the Mac has used – though yet to be officially confirmed, it’s said to be using the XScale family (whose lineage goes back to ARM Holdings and Intel, but which has since been sold off to Marvell).

I used to get irked by people describing Windows CE (the core OS underneath Windows Mobile devices, which has been around for more than a decade) as “a cut-down version of Windows”: it isn’t, and never has been. WinCE may share some of the same APIs as Windows (which can make it more straightforward for developers to write software for it, since the same calls do the same or similar things on both platforms), but it’s a long way from being a “cut-down” version.

So even if you’ll be able to buy 3rd-party applications for the iPhone (something Steve Jobs alluded to at last week’s D conference), there’s no way you’re going to take an existing app from OS X and just push it down to the phone. According to All Things D, again:

Whoa. Jobs says iPhone runs “real OS X, real Safari, real desktop email.”
Walt follows up: If that’s true, could other OS X applications run on the iPhone? Jobs says no. They’re not designed to.

I’d venture to suggest that “not designed to” in this case doesn’t mean “not optimised for”, but rather “not able to run, because the underlying OS and CPU architecture are completely different”. Windows developers can build applications targeted at Windows Mobile using the same tools and many of the same techniques they use to build Windows applications (Visual Studio, in other words), but the apps have to be specifically developed for the platform, given that the screen size, orientation and UI model are so different – and iPhone developers (assuming there will be some besides Apple and Google) will need to do the same.

Surface, er, surfaces

The secret project codenamed Milan was announced today as “Microsoft Surface”: there have been a few videos floating around from Microsoft Research open days and the like, but the announced technology has had a good deal of marketing gloss applied, and it really looks fantastic.

Check out the videos on http://www.microsoft.com/surface – it’s interesting to note that this is technology that’s been in development for years, not just some great idea that’s been announced before it has any legs. The actual device is going to be relatively expensive ($1000s) and won’t be available until late this year, but it has some interesting possible applications – particularly in face-to-face scenarios where having a screen/keyboard between two people could act as a barrier.

The scenarios in the videos might seem a bit far-fetched right now, but who knows where this technology could be in 5 or 10 years’ time?

Is that a Zune in your pocket…?

(or are you just pleased to see me)

OK, after talking about media players just over a month ago, I couldn’t stop myself from buying a Zune when I was in the US last month (no willpower).

At $249 + tax, with the exchange rate currently so favourable, it worked out at about £130, which is a pretty good price. I had to go for the brown one, since it’s a little different to the predictable shiny white or black that every other gadget seems to be. The edges of the device even have a distinctive green hue which appears to catch the light nicely (click on the pic and zoom in on the resulting image, to get an idea).

I think a lot of the reviews of the Zune have expressed disappointment that it’s not really all that revolutionary – apart from the wireless sharing piece – but I personally think that’s one of its strengths. It’s packed full of nice touches which aren’t immediately obvious – like the fact that it pauses whatever’s playing if you pull the headphones out.

Another nice usability feature – when the device is in the “sock” that it comes with, you can feel the dished 4-way control below the screen, so it’s easy to change volume or skip back and forward without needing to see the Zune… something I didn’t really appreciate until I started using it.

Now I’m looking forward to the launch in the UK!

Virtual Earth Mobile – mapping on the move

Microsoft’s Virtual Earth technology continues to take strides forward – not just in the inevitable mash-ups, but in new ways of accessing the maps (as well as from http://local.live.com, which I keep on trying to access as live.local.com… d’oh). There are 3D maps in beta, as well as a cool add-in for Outlook 2000/3 (though yet to be updated for Outlook 2007).

I installed a newer version of Virtual Earth Mobile on my Pocket PC the other day… on searching for a business called Microsoft in Reading, here’s what I was offered as an initial map…

Switching to road + aerial, zooming in a bit and sliding the keyboard out to rotate the screen gives us…

… and it can still zoom in two more levels, so you can make out specific details like the parasols outside the restaurant!

I actually used this to get to a customer today – arrived at Waterloo station and realised that I didn’t know which immediate streets I needed to follow to get to the address I’d been given. I just searched for the street name, showed aerial view, walked past the London Eye and found it with no hassle … be careful though: prolonged use could lead to very large data bills 🙂

Have a look for yourself on the Windows Mobile Blog.

Vista Aero Glass – performance hit (or not)

Just read an interesting analysis at http://firingsquad.com/hardware/windows_vista_aero_glass_performance/ where they tested a couple of different systems running Windows Vista with Aero Glass switched on and off. (Windows Aero – if you’re not aware of it by name – is the new user interface functionality, with transparent windows and the swish new effects present all through Vista.)

The cynic in most techies would assume that flashy graphics mean hammering system performance; I’ve known plenty of people who switched off all the fancy UI features on the basis that the machine would be a few per cent more responsive… remember the old advice on Windows 3.1 or 95 not to use a graphical desktop backdrop, since that put an overhead on system performance?

Anyway, the FiringSquad results are predictably games-focused, but draw an interesting conclusion – graphical performance is, in some cases, marginally better with Aero switched on, and even where it isn’t, it’s only fractionally worse.

“Quite frankly, we were shocked by these results.”

So, the moral of the story is… switch on all the bells and whistles if you can 🙂

Sansa e280 – I took the plunge

After my post last month about getting a new MP3 player, I went ahead and bought a Sansa e280 8GB device from Amazon UK. In general I’m pretty pleased with it – battery life looks good, sound quality is good, and it supports direct sync from Windows Media Player, etc.

There are a couple of grumbles, though – the touted ability to display album art only works if you manually copy a file called “Album Art.jpg” into the folder on the device where the music lives… which is a fairly tedious process to go through, IMHO. Come on, Sansa… Windows Media Player already stores AlbumArt<somenumber>.jpg files in the same folder – any chance you could come up with a sync utility which automates the copy process?
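
In the meantime, here’s the sort of thing I mean – a rough Python sketch that walks the music folder on the device and copies the first AlbumArt*.jpg it finds in each album folder to the “Album Art.jpg” name the Sansa looks for. The E:\Music path is just an example (wherever the Sansa mounts as a drive), and I haven’t tested this against every corner case:

    import glob
    import os
    import shutil

    MUSIC_ROOT = r"E:\Music"  # example: the Sansa's music folder when mounted as a drive

    for folder, _subdirs, _files in os.walk(MUSIC_ROOT):
        target = os.path.join(folder, "Album Art.jpg")
        if os.path.exists(target):
            continue  # this album is already sorted
        # pick up the AlbumArt<number>.jpg files that Windows Media Player leaves behind
        art = glob.glob(os.path.join(folder, "AlbumArt*.jpg"))
        if art:
            shutil.copy(art[0], target)  # copy under the name the player expects
            print("Added album art to", folder)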

Oh, and the European models don’t have an FM radio… something I’d missed in the specs (I’d assumed it was the ability to record FM that was missing from the European models, not the entire FM tuner). Ho hum – not a big deal, but a minor niggle nevertheless.