The biggest file I’ve ever seen – 3TB PUB.EDB

Well, I haven’t seen this for myself, but I was sent a screenshot of it. Actually, it was 3 different Exchange public folder servers, each of which had ~3TB of public folder data…

image

That’s scary and impressive in equal measure.

Reminds me of some of the stories people posted in response to my How does your Exchange garden grow? post nearly 3 years ago, on the Exchange Team blog…

Tips for optimizing Vista on new hardware

Ed Bott over at ZDNet posted a really interesting article yesterday, detailing the journey he had making his friend’s brand new Sony Vaio laptop work properly with Windows Vista Business. In short, his friend upgraded from a trusty old XP Vaio to a new machine which came with Vista, but had a terrible experience of crashes, slow start-up, bogged-down performance and so on.

In a nutshell, the advice is pretty straightforward, at least for technically minded folk, and backs up the experience of some of us who’ve been using Vista all through the beta program:

  • Start with Vista-capable hardware. It’s almost a waste of money trying to upgrade old PCs to run Vista. New machines – (supposedly) designed for Vista, with modern architectures, devices that stand a good chance of having decent Vista drivers, and enough horsepower to do it justice – are so cheap now that it’s hardly worth trying to get Vista working well on anything more than a couple of years old.
  • Use the latest, best-quality drivers you can. It still amazes me how many manufacturers ship machines pre-loaded with years-old device drivers, or (conversely) how many update drivers and BIOSes frequently but with poor attention to quality. The device driver certification program is there for a reason: if a piece of hardware comes with a non-certified driver, you have to ask – if the manufacturer cut corners in getting it certified, where else did they trim savings?

    I got a new Lenovo Thinkpad tablet a few months ago, and it was (and still is) a brilliant piece of kit. Lenovo have done a class-leading job of making it easy to keep everything up to date – including the system BIOS – in a single application, the ThinkVantage System Update. Think of that as a single app which already knows exactly what hardware you have, and checks the Lenovo site to see if there’s anything to update.

    I’ve had so many PCs where the vendor’s driver download page needs you to know everything about the internal bits of the hardware (Dell, stand up and be counted) – after choosing the machine type, why do I need to know which iteration of network controllers it has, or whether it’s got the optional super-dee-dooper graphics card or bog standard one? Can’t the manufacturer figure that out, especially if they ask for a serial number to help identify what the machine is?

  • Don’t put any unnecessary crapware on it. This starts off as a fault of the OEM who supplied the machine (sorry Dell, I have to single you out again, but you’re far from unique). It’s worth making sure you don’t install any old junk from the internet and leave it lying around on your machine. Ed Bott even suggests doing some basic installs (like Acrobat, Flash etc) then taking a full machine backup, so you can always revert to a nice starting point. Combine that with the Really Rather Good backup software in Vista (or even the Windows Easy Transfer software) which can make sure your data is safe, and it’s not unthinkable that every six or twelve months a savvy user could easily blow away the machine and recover the starting image & last data backup to be in a good state again.

    Most people accept that they need to service a car regularly to keep it running well – a modern PC is a good bit more complicated than a car (albeit with generally less terrible consequences if it all goes boom).

Part of Ed’s summary neatly encapsulates his thinking…

Well, for starters, Vista doesn’t suck. And neither does Sony’s hardware. That four-pound machine with the carbon-fiber case is practically irresistible, as my wife continues to remind me.

But when you shovel Windows Vista and a mountain of poorly chosen drivers, utilities, and trial programs onto that beautiful hardware without thinking of the customer, the results can be downright ugly. That was certainly the case with the early-2007 vintage Vaio, and it’s still true today, with too much crapware and not enough attention to quality or the user experience.

Tip for finding when an appointment was created

Here’s a tip for when you suspect someone has magicked up an appointment to coincidentally collide with an Outlook meeting request you sent them…

In your own calendar (and other people’s), you can see when a meeting was scheduled (ie request was sent or created), as well as other facts (like when you accepted it) – eg:

image

If a blocked out time in the calendar is just an appointment (ie something that was just put there by the owner of the calendar), you don’t see the date it was added…

image

Remember, they’re all just forms in the end 

Way back when Exchange was young (it started at version 4.0), the design was that emails, meeting requests and so on were just an "item" – a collection of fields, differing depending on the type of item – plus a "form" associated with a particular kind of item, with the Message Class tying the two together.

In other words, an email message would have fields like sender, date, recipients, subject, and so on. When you went to open a message, the Exchange client (later, Outlook) would look at the class on the item (IPM.Note, for a message) and find the appropriate form to open it with. Clear? If you really want examples of lots of different Outlook item types, see MSDN.
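That class-to-form dispatch can be pictured as a simple lookup – here’s a toy Python sketch of the idea (the message class names are real; the "forms" are obviously just stand-ins, and the fallback-up-the-hierarchy behaviour is how Outlook copes with custom classes like IPM.Note.Something):

```python
# Toy model of Exchange/Outlook form dispatch: an item is a bag of
# fields plus a Message Class; the client picks a form by class.

FORM_REGISTRY = {
    "IPM.Note": "Mail message form",
    "IPM.Appointment": "Appointment form",
    "IPM.Schedule.Meeting.Request": "Meeting request form",
    "IPM.Contact": "Contact form",
}

def resolve_form(message_class: str) -> str:
    """Pick the form for an item, falling back up the class hierarchy
    (IPM.Note.Custom -> IPM.Note -> IPM) when no exact match exists."""
    parts = message_class.split(".")
    while parts:
        candidate = ".".join(parts)
        if candidate in FORM_REGISTRY:
            return FORM_REGISTRY[candidate]
        parts.pop()
    raise KeyError("no form registered for " + message_class)

item = {"class": "IPM.Note", "subject": "Hello", "sender": "ewan"}
print(resolve_form(item["class"]))  # Mail message form
```

A custom item classed as, say, IPM.Note.Expenses would fall back to the standard mail form if the custom one weren’t available – same lookup, one step up the hierarchy.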

Anyway. If I’m looking at an appointment which wasn’t a "meeting" (ie it was just put into my or someone else’s calendar, not created via a meeting request/acceptance), I might not be able to see the date it was created, but the underlying item definitely does have that property. Displaying it in Outlook is pretty straightforward, if a little contrived. Here’s one quick and dirty method (I may post a more elegant solution if there’s interest)…

image

1. Get to "Design this form"

Older versions of Outlook had a Developer item on the menu structure which allowed you to select (via several pop-outs if I recall) to design the current form. Outlook 2007 simplified the menus (now using the Ribbon) and no longer shows that Developer menu. One quick way of putting it back is to add that specific command to the "Quick Access Toolbar"…

Click on the little down-arrow just to the right of the Quick Access Toolbar on the top left of a form (eg the form of the appointment you’re looking at), then choose "More Commands"…

On the resulting dialogue, select the Developer tab in the "Choose commands from:" drop-down list box, then scroll down to find "Design This Form" (note "This Form", not "Design a Form…"). Select that command, click on Add, then OK out of the customize dialogue.

image

Now you have a little icon in your toolbar – the one with the pencil, ruler and set square, supposedly representing design actions:

image

Click on the icon and you get into the form designer, with the current item being loaded. You’ll see a bunch of tabs – these correspond to "pages" within the form, and any in brackets are hidden. Select the "All fields" tab, choose Date/Time fields from the drop-down (or try "All Appointment fields").

image

You should now see just the date fields, including the original creation date…

image

This might seem a real palaver, but once you have the icon on the QAT, it’s a five-second action to show the dates… and can be very handy 🙂
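For the curious: what that form page is surfacing is a MAPI property on the item – the creation date is PR_CREATION_TIME, a PT_SYSTIME value, which is stored as a Windows FILETIME (100-nanosecond ticks since 1 January 1601). Decoding one is trivial; a quick Python sketch:

```python
from datetime import datetime, timedelta

def filetime_to_datetime(filetime: int) -> datetime:
    """Convert a Windows FILETIME (100ns ticks since 1601-01-01) to a
    datetime. PT_SYSTIME MAPI properties such as PR_CREATION_TIME are
    stored in this format under the covers."""
    return datetime(1601, 1, 1) + timedelta(microseconds=filetime // 10)

# 116444736000000000 ticks happens to be the Unix epoch:
print(filetime_to_datetime(116444736000000000))  # 1970-01-01 00:00:00
```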

Imperialism, Metric-centricity and Live Search

I’m a child of a mixed up time when it comes to measures and the likes. I am feet and inches tall, stones and pounds heavy, when it’s cold outside, it’s below zero degrees, but when it’s hot, it’s in the 80s.

I learned small measurements in mm and cm, so have no real idea of how big an inch is, but long distances are thought of in miles (and petrol is bought in litres to go into a car which reports how many miles per gallon it’s getting).

Now and again, I’ll need to try & recall how many chains there are in a fathom, or ounces per metric tonne, and typically call on the services of a search engine. That used to be searching for something like:

image

… where we’d normally get taken to a site in the results which has a wizard of its own to do the calculation. Often, the reason I want to convert something is that I’m already doing a calculation and I just need to know the ratios involved…

Which is why I love the little innovation that Live Search introduced:

image

Right at the top of the results list, there you have it – dead right this is useful 🙂
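Under the covers, a conversion like this is just a ratio lookup – express everything in a base unit and divide. A minimal sketch (the factors below are the exact definitions for these length units; pleasingly, a fathom turns out to be exactly 1/11 of a chain):

```python
# Minimal unit converter: reduce everything to metres, then divide.
METRES = {
    "inch": 0.0254,
    "foot": 0.3048,
    "fathom": 1.8288,   # 6 feet
    "chain": 20.1168,   # 66 feet
    "mile": 1609.344,
}

def convert(value: float, frm: str, to: str) -> float:
    return value * METRES[frm] / METRES[to]

print(round(convert(1, "fathom", "chain"), 4))  # 0.0909, ie 1/11
```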

Exchange 2007 clustering advice

I appreciate it’s been a while since I blogged last – a combination of "not much to talk about, really" with even more "no time to talk about it"… 🙁

Anyway, a few questions came in the other day from a reader:

SCR and CCR seem to work with SAN and DAS. When DAS (direct attached or local storage) is used – and most probably it’s attached to the active node – how does the passive node function if it hasn’t got a connection to the DAS/local storage of the active node?

In CCR, it’s important to realise that the passive node has its *own* set of disks, containing its *own* copy of the data – it doesn’t really matter whether they are DAS or SAN disks (at least not conceptually). So in a CCR failover scenario, the previously passive node switches to being active and uses its own copy of the database (which now becomes the main one). SCR differs in the way failover happens, but in principle it’s similar – the secondary copy of the data is brought online and takes over servicing the clients, using its own copy of the database.
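The mechanics – each node keeping its own copy, kept in sync by shipping and replaying closed log files – can be pictured with a toy sketch (nothing like the real implementation, just the shape of the idea):

```python
# Toy model of continuous replication: the active node writes changes
# to a log, closed logs are shipped to the passive node, which replays
# them into its *own* copy of the database.

class Node:
    def __init__(self):
        self.database = {}     # this node's own copy of the data
        self.current_log = []  # log records not yet closed and shipped

    def write(self, key, value):
        self.current_log.append((key, value))
        self.database[key] = value

    def close_and_ship_log(self, passive):
        """Close the current log and replay it on the passive node."""
        for key, value in self.current_log:
            passive.database[key] = value
        self.current_log = []

active, passive = Node(), Node()
active.write("mailbox1", "hello")
active.close_and_ship_log(passive)  # passive now holds its own copy

# Failover: the passive node becomes active, using its own database.
active, passive = passive, active
print(active.database)  # {'mailbox1': 'hello'}
```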

– Some clients are indicating that with CCR or SCR in place, one wouldn’t need backups of mailbox servers. Do you have any comments?

Absolutely not. That’s like saying, because my car has an airbag, I don’t need to wear a seatbelt. Check out the High Availability Strategies section of the Exchange documentation for more detail on the options.

Having CCR gives you the ability to fail over in effectively real time, for the purposes of planned maintenance or after an unexpected failure. SCR adds the possibility of having another replica of the data, potentially in a different location, which can be brought online through a manual recovery process (whereas CCR will bring the data back automatically, since it’s part of a cluster).

Backup is still important (what happens if you lose all servers? What about long-term archival of data?). There’s always the possibility that databases could be corrupted or infected in some way – and if that happened, the replica(s) of the databases would likely suffer the same fate – so taking regular backups gives you the ability to roll back to earlier versions of the database.

There’s always the scenario where users delete some information that needs to be brought back sometime in the future – there are various options around item recovery with Exchange 2007, but if it was deleted (say) a year ago, then you’d be looking at a backup as the means of recovery.

Data Protection Manager would be worth looking into, to help with backup requirements – it allows you to take regular snapshots of a running server, which can later be spooled out to offline storage.

– In SCR, is there a bandwidth utilization estimate used for replicating the Active and Standby/passive node? I understand that in CCR and SCR the log sizes are reduced to 1MB from standard 5MB though.

The log files in Exchange 2007 are reduced from 5MB to 1MB anyway – partly because of CCR and LCR (and later SCR), but even if you don’t configure any of the replication technology, you’ll still be on 1MB logs.

As far as how much bandwidth you’re going to need between nodes for the purposes of replication, well that depends – if your servers are very busy, then they’ll obviously need to shift more data, and latency will come into play.

There is a detailed section in the Exchange TechCenter online documentation which covers planning for replication at a hardware, software configuration and network level.
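As a starting point for sizing, a back-of-an-envelope calculation works: count the 1MB log files the server generates in its busiest hour and turn that into a sustained rate. A sketch (the 1,800-logs figure is purely my own illustrative number, not guidance):

```python
def replication_bandwidth_mbps(logs_per_hour: int, log_size_mb: float = 1.0) -> float:
    """Rough sustained bandwidth (megabits/sec) needed to ship logs as
    fast as they are generated. Real planning must also allow for
    latency, initial seeding, and catch-up after an outage."""
    mb_per_second = logs_per_hour * log_size_mb / 3600
    return mb_per_second * 8  # megabytes -> megabits

# eg a busy server cutting 1,800 x 1MB logs in its peak hour:
print(replication_bandwidth_mbps(1800))  # 4.0 (Mbit/s sustained)
```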

Seadragon begets Silverlight “Deep Zoom”

There’s a headline that might baffle…

Seadragon Inc was a Seattle-based software company which had done a load of work on handling vast quantities of imagery and manipulating the data in real time, on screen. Microsoft acquired Seadragon and has been beavering away behind the scenes to finesse the technology further and to integrate it into other means of delivery – if you haven’t seen it, check out the awesome demo done by Blaise Aguera y Arcas at last year’s TED conference:

Using photos of oft-snapped subjects (like Notre Dame) scraped from around the Web, Photosynth (based on Seadragon technology) creates breathtaking multidimensional spaces with zoom and navigation features that outstrip all expectation. Its architect, Blaise Aguera y Arcas, shows it off in this standing-ovation demo. Curious about that speck in corner? Dive into a freefall and watch as the speck becomes a gargoyle. With an unpleasant grimace. And an ant-sized chip in its lower left molar. "Perhaps the most amazing demo I’ve seen this year," wrote Ethan Zuckerman, after TED2007. Indeed, Photosynth might utterly transform the way we manipulate and experience digital images.

Well, the Seadragon technology gets closer to being available as part of Silverlight 2.0 Beta 1, now referred to as "Deep Zoom". It was announced recently at Mix08, and I must have missed the significance of this piece, but when I saw the first Deep Zoom demo site, I thought "Wow".

One of the demos at the Mix08 conference in Vegas last week, was of a pretty amazing site put up by Hard Rock Cafe, showcasing some of the rock memorabilia they have – mosey over to http://memorabilia.hardrock.com and you’ll get prompted to install Silverlight 2.0 beta 1 if you want.

image

The Hard Rock site was built from the ground up in one month, and contains many gigabytes of visual imagery. Not that you’d notice when you visit for the first time having installed Silverlight 2.0…

The back end of the Memorabilia site uses SharePoint for its content management, although the front end is all custom Silverlight. There was a parallel announcement at MIX about the Silverlight Blueprint for SharePoint; more details here.

No more to say about this other than it’s really, really cool. Combine the early delivery of sites like the Hard Rock Cafe demo with Blaise’s idea in the TED video of presenting information in a non-linear way – imagine being able to zoom into the full stop at the end of a sentence to find pages and pages more detail about what the sentence contained – and the way web pages are delivered to us in future could be very different from the linear, monolithic way a lot of information is presented today.
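The trick that makes all this feel instant is pre-computing a pyramid of progressively halved copies of the image, so the viewer only ever fetches tiles at roughly screen resolution. A sketch of the level arithmetic (my own illustration of the idea, not the actual Deep Zoom file format):

```python
import math

def pyramid_levels(width: int, height: int):
    """Yield (level, w, h) from the full image down to 1x1, halving
    (and rounding up) the dimensions at each step down the pyramid."""
    top = math.ceil(math.log2(max(width, height)))
    for level in range(top, -1, -1):
        scale = 2 ** (top - level)
        yield level, math.ceil(width / scale), math.ceil(height / scale)

for level, w, h in pyramid_levels(1024, 768):
    print(level, w, h)
# Level 10 is 1024x768, level 9 is 512x384 ... level 0 is 1x1.
```

However deep you zoom, the viewer can always pick the level whose resolution best matches the screen – which is why "many gigabytes of visual imagery" doesn’t mean many gigabytes of download.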

Exciting, isn’t it?

More info on "Deep Zoom":

http://blogs.msdn.com/usisvde/archive/2008/03/09/silverlight-deep-zoom-goodies.aspx

http://labs.live.com/Silverlight+2+Deep+Zoom.aspx

http://joestegman.members.winisp.net/DeepZoom/

image

http://www.vertigo.com/DeepZoom.aspx

Deep Zoom composer tool preview

Windows Media Center query-based recording

Here’s a tip for anyone running Windows Vista Home Premium or Ultimate editions (the ones with Media Center functionality), if you have a suitable tuner set up and configured. I mentioned this in passing to someone who uses Media Center as their primary TV tuner, and they didn’t know it was possible – largely because it’s a bit obscure and not exactly easy to find.

I don’t use Media Center as my primary TV – we have a Sky HD box to do that, and although I’m generally happy with the functionality and reliability of the Sky box, its UI isn’t anywhere near as flexible as MC’s. The Guide is one example of that – Sky lets you browse the guide but the options to search it are a bit thin, so it’s OK if you know there’s something you want to record. MC allows you to query the schedules (including all the obscure channels you might never watch) to find specific named programs, or even ones where the metadata matches your search.

My PC in the study has a cheap Hauppauge USB Freeview tuner installed, and an XBox 360 in the living room allows us to watch stuff that gets recorded on the PC.

If you go to Recorded TV on the main MC menu, and select to Add a Recording, you get:

image

… meaning, you can record something based on searching the Guide. If you choose the "Create a custom recording" feature, however, you can have MC automatically record a programme that isn’t scheduled yet, on the off-chance that it will be shown again at some point. Useful for catching up with old films that appear every few months.

In this example, maybe I want to record Ghostbusters. Select Keyword from the custom list:

image

Now, selecting any of the first 4 options will search against the current guide, and if there’s nothing scheduled, you won’t be able to select it. If you pick Generic keyword, however, you get a slightly different UI:

image

Media Center will allow you to save your query, and will record anything featuring the word you just entered that shows up in the guide at some future date…

image

If you want to check what custom recordings you have scheduled, start again from "Recorded TV", and select "View scheduled" – you’ll see a list of anything that’s set to record, but only if it exists already in the guide.

image

To see what you have set to record on schedule, choose the "Series" option on the left – anything that shows up as "ANY CH" will record whenever the guide matches your query.

image

As I said, not exactly obvious… but very cool!
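In effect, the generic keyword option saves a standing query that gets re-run against the guide whenever new listings arrive. A toy sketch of the behaviour (entirely my own illustration, not how Media Center is actually built):

```python
# Toy model of query-based recording: saved keyword queries are
# matched against programme metadata as new guide data arrives.

saved_queries = ["ghostbusters"]

def matches(query: str, programme: dict) -> bool:
    """Match the keyword against the title and description metadata."""
    text = (programme["title"] + " " + programme["description"]).lower()
    return query.lower() in text

def new_guide_data(programmes):
    """Return the programmes that should be scheduled to record."""
    return [p for p in programmes
            if any(matches(q, p) for q in saved_queries)]

guide = [
    {"title": "Ghostbusters", "description": "1984 comedy"},
    {"title": "News at Ten", "description": "Headlines"},
]
for p in new_guide_data(guide):
    print("Recording:", p["title"])  # Recording: Ghostbusters
```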

Virtuali(z)sation & datacenter power

It’s been very quiet here on the Electric Wand for the last month or so:

  • I took a new job in December, which means things have been pretty hectic at work – I’m now managing a new group, and spending lots of time building up a great team.
  • Just back from a week in Seattle at a Microsoft internal technical conference.
  • To be honest, I haven’t had much to talk about on the blog 🙂

The TechReady conference I went to in Seattle had a few interesting themes, but much of the technical stuff presented is still internal only so can’t be discussed (yet) online. A good chunk is probably "subject to change" anyway …

A few themes either came up in a number of different sessions or really made me think hard about the way IT is going – amongst them virtualisation (I do hate using the "z", even though it’s technically OK – it just seems so un-British), the march towards multi-core parallelism (instead of the clock-speed race), and the whole Green IT agenda of power usage.

I’m planning to write a bit more about these topics in coming weeks, along with the business case for Office Communications Server, but here’s some food for thought:

A major enterprise datacenter could well be consuming tens of megawatts of power – comparable to many, many flights or other so-called demons of carbon emissions. A back-of-an-envelope calculation of all Microsoft’s own datacenter power usage (including all the online services) would equate to over 100 Jumbo Jet flights from London to Seattle every day. That’s 100 planes, not 100 passengers…

This power usage topic is one which is going to grow in importance – not just because power prices are rising (eg a sustained 100MW draw for a large internet datacenter could easily cost more than £15m per annum in power costs alone). One project inside Microsoft is looking at the actual power usage and the equivalent tonnes of CO2 emissions of all of its datacenters – a concept that’s surely set to become more mainstream in the future.
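The arithmetic behind that kind of figure is simple enough: a sustained load in megawatts, times the hours in a year, times a wholesale price per megawatt-hour (the £50/MWh below is my own illustrative assumption, not a quoted rate):

```python
def annual_power_cost(load_mw: float, price_per_mwh: float = 50.0) -> float:
    """Annual electricity cost for a constant load, in whatever
    currency the price is quoted in. 8,760 hours in a non-leap year."""
    return load_mw * 8760 * price_per_mwh

# A 100MW datacenter estate at an assumed £50/MWh:
print(f"£{annual_power_cost(100):,.0f} per year")  # £43,800,000 per year
```

Even at far lower assumed prices, a sustained 100MW draw comfortably clears an eight-figure annual bill.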

Siting datacenters by renewable energy sources (such as Google’s massive datacenter on the Columbia River in Oregon) makes the power usage more palatable, but it doesn’t remove the need to reduce heat (and air conditioning requirements) and overall power usage – even if it means employing people to physically go round pulling the plugs on the myriad rack servers at night-time.

Anyway, as I said, more on this topic in coming weeks – in the meantime, I’ve not gone away … just waiting for the right time to pipe up 🙂

The Firmware of Everyday Things

(with apologies to Donald Norman, for paraphrasing his excellent book title)


As an IT person, I’m pretty used to the idea that I need to update software now and again. Sometimes it’s to make it more secure (closing down vulnerabilities that afflict any software exposed to the outside world), sometimes to fix bugs (which affect all software, period), sometimes to add new features and functionality (which maybe the designers didn’t think of before, or just didn’t have time to implement). Maybe the update is in the form of a patch; maybe it’s a whole new version that I will choose to buy.


As the reach of software gets more and more pervasive, it’s interesting to note the difference between what people will do in an IT world, and what they expect from the rest of the world around them.


Now, I spent a good chunk of the last weekend updating the firmware in my car, specifically the software which controls the entertainment systems, the Sat-Nav etc – in my case, it’s Audi’s excellent MMI system, but many other manufacturers are moving to some kind of multi-modal, software-based control mechanism for the myriad systems in the car.



BMW “popularised” such a system with its iDrive control technology, which seemingly took several revisions to become usable in the eyes of any car reviewer, even though it made perfect sense to me.


Getting software upgrades for these things is far from easy: arguably, nor should it be. What business do ordinary consumers have in getting hold of low-cost or free updates, which they apply themselves, to make their ownership experience better?


As it happens, I heard about a series of updates which would make the navigation system built in to my car quite a bit better – and on asking, my dealer was more than happy to supply the software update for me and install it, for only £100+VAT to cover the labour involved.


I managed to find far more than I ever dreamt I needed to know through various online forums – http://www.navplus.us/ in particular – which opened up a whole series of secret key-press combinations to bring up hidden menus, the part numbers of the CDs I’d need to order from the dealer (at about £1.50 each), and the procedures to upgrade the whole thing myself.


Here’s an example of just one such hidden menu…



So many other devices which previously would have been considered an appliance now have the capability to be upgraded, if only the suppliers embrace the idea and maybe even make it easy. Examples abound – the Philips Pronto universal remote control has spawned a huge user community dedicated to modifying the way it works, precisely because Philips made the software available to do so easily, and regularly updates the device’s capability based on user feedback.


I upgrade my mobile phone with beta software regularly, my Zune music player got a whole new look and feel courtesy of some free software and firmware upgrades. There are secret menus on my TV that show the software version, the satellite receiver downloads firmware updates automatically (even though sometimes it manages to crash when it tries to install them). Even the DVD drive that I fitted to my home PC has a little bit of software that checks online for updates to its firmware, and the PC into which I fitted it was having all sorts of trouble with its memory until I applied a BIOS update from the manufacturer.


Much of this stuff is very much beyond the ken of the man on the Clapham Omnibus, but as IT hardware awareness spreads out to the general public in time, maybe it’s not going to be too far in the future when people routinely expect improvements to come to any piece of electronic equipment through periodic updates.


Of course, it can all go horribly wrong – twice on Saturday, I got myself into a situation with the car where none of the MMI system worked, meaning I had no radio, no navigation, no GUI to any of the other systems like parking sensors or suspension settings etc… and it took a good deal of fuse-pulling and rebooting to get it all working again.


Maybe it’s better to just rely on someone to do it all for you …