How to cook the perfect fillet steak

OK, this is pretty far removed from the norm of an IT blog, but it is the weekend so I feel it’s allowed. The topic has some technical (practical) aspects, and is something I’ve been talking with a few people about lately.

I’ve seen various techniques on how to cook steak properly, but a few years ago I came across one individual’s website (which, frustratingly, I can no longer find) which summarised everything beautifully and set me trying out a few departures from the norm. In a nutshell: cook the steak from room temperature, oil the meat and not the pan, use a pan as hot as you can get it, and let the steak rest for at least as long as you cooked it.

It’s all about heat

I used to work in a professional kitchen. Well, I was a waiter in a nice restaurant, which meant I spent a bit of time in a pro kitchen (generally on the “other” side of the hot plate). Professional chefs seemingly have a duty to verbally abuse their waiting staff, which mine did with gusto if not aplomb.

Several years later, I was being shown round a call centre (as The Client) when I recognised that one of the chefs who’d been giving me verbal abuse was now trying to sell software over the phone. Presumably, the world of cooking hadn’t worked out for him quite as he’d hoped.

Anyway, one thing I learned about cooking steak back then was, it’s all about heat. Now the trick to cooking a good fillet steak (and that’s pronounced fill-it, not fill-eh, unless you currently live in France), is to try to get close to restaurant kitchen heat levels in a domestic kitchen. It can make for a lot of smoke, but it’s very effective. Here’s the deal…

    • Take your fillet steak out of the fridge at least 20 minutes before you want to cook it, to allow it to get to room temperature. Taking a cold steak and throwing it straight onto hot metal won’t do anything for the tenderness of the end result.
    • Pat the steak with kitchen roll to remove any excess moisture (if the meat is wet, the water will vaporise when it hits the heat and form a barrier between the steak and the heat source).
    • Once patted dry, rub a little sunflower or vegetable oil into the steak, with your fingers (don’t use olive oil – it burns at too low a temperature), and leave to sit for a few minutes. Season with salt & pepper if you like.
    • Put a small, dry, frying pan on maximum heat on the biggest ring/burner on your hob. Leave it there for at least a couple of minutes. In the meantime, go and open some windows. Things are likely to get smoky. I’ve heard of some people leaving the pan on heat for as much as 10 minutes, but you might struggle to see the cooker by the time you’re ready to put the fillet on.
    • When the pan is as hot as you can suffer, gently place the steak onto the surface. After 30 seconds or so, move it so it doesn’t stick and burn. How long to leave it there is now a straight function of how hot your pan is, how big the steak is, and how you like it cooked. I tend to find 2-3 minutes each side will give a nice medium-rare on a decent-sized fillet in a pan that’s been on heat for a few minutes.

There’s a trick to being able to tell when the steak is properly cooked, and it involves prodding your own hand. Pressing on the surface of the steak with your finger, you’ll feel the flesh give way a little; it should feel about as firm as the fleshy part of your hand at the base of the thumb when you pinch thumb and finger together. It’s easier to show than describe:

Rare / Medium / Well-done

So if you touch thumb and index finger, the firmness of your hand will be about the same as a rare steak, while thumb and little finger will be more like well done. Experience and practice will help you out here, and don’t be scared of cutting the steak to check it’s cooked as you’d like – better a well-cooked dish with a cut in the middle, than an undercooked but nicely presented one.

I can’t really understand wanting to cook a lovely fillet of beef to “well done”. You might as well save a bit of cash and buy a cheaper cut. In fact, according to Anthony Bourdain (and I’ve heard this of other chefs too), the skankier bits of beef get set aside for serving to restaurant customers who like their meat good ‘n burnt – with a label on the meat saying “SFWD”, or “Save For Well Done”.

Finally, take the steak off the heat and put on a warmed plate and just let it rest for 10 minutes or so, before serving. You might want to deglaze the pan with a little red wine and maybe a knob of butter, to make a nice sauce. Mmmmmmmmm.

Custom presence states in Office Communicator

I just discovered how to modify presence states in Office Communicator 2007: it’s documented in the Office Communicator Deployment Guide (page 21, if you’re interested), and allows for either the managed deployment of Communicator with additional corporate-set presence states, or, if a user is savvy enough to do it themselves, they could have some fun…

The custom states appear as shown in this screenshot (the one in the deployment guide seems to be in error – it doesn’t actually show any custom states); you can have up to 4 of them, and set which of the coloured statuses applies to each of your defined presence states.
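For reference, the custom states live in a small XML file which Communicator is pointed at via policy. This sketch is from memory, so treat the element names and schema URI as approximate (the deployment guide has the authoritative version), and the two state labels here are just made-up examples:

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- Approximate shape of the custom presence file; Communicator is
     pointed at it via the CustomStateURL policy/registry value. -->
<customStates xmlns="http://schemas.microsoft.com/09/2006/communicator/customStates">
  <!-- up to 4 custom states; "availability" picks which coloured status applies -->
  <customState ID="1" availability="online">
    <activity LCID="1033">Writing the blog</activity>
  </customState>
  <customState ID="2" availability="busy">
    <activity LCID="1033">Delivering a demo</activity>
  </customState>
</customStates>
```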

I’d originally noticed this was possible when I glanced down at the beautiful screen on my newly-acquired “Tanjay” phone (as shown on Gurdeep’s desk here, along with a bunch of other UC devices, and akin to the LG-Nortel 8540), and I saw Adrian’s status was “Delivering …”

img014

… which set me off to find out how he’d done it. Note my own status is also displayed on the Tanjay, and updates in real time…

The business case for Exchange 2007 – part IV

Another instalment in a series of posts outlining the case for moving to Exchange 2007. Previous articles can be found here.

GOAL: Make flexible working easier

“Flexible Working” might mean different things to different organisations – some might think of mobile staff who turn up at any office with a laptop, sit at any free desk and start working – others might imagine groups of workers who work from home part- or even full-time. Whatever your definition, there’s no doubt that the technology which can enable these scenarios has evolved in great strides in recent years.

RPC Over HTTP – magic technology, even if the name isn’t

The “Wave 2003” of Exchange Server 2003/Outlook 2003/Windows XP SP2/Windows Server 2003 brought to the fore a technology which wasn’t really new, but which needed the coordination of server OS, server application, client OS and client applications to make it available. If you’ve been using or deploying RPC/HTTP, you’ll know exactly what it does and why it’s cool; if you haven’t, the name might mean nothing to you. In short, the way in which Outlook talks to Exchange Server on the internal network can be wrapped up within a secure channel that is more friendly to firewalls – “tunnelling” that protocol (RPC) inside a stream of data which your firewall can receive (HTTP, or more correctly, HTTPS).

What this means in practice is that your users can connect in to your environment using a widely-supported network mechanism (ie HTTPS), and without requiring a Virtual Private Network connection to be established in the first place. This manifests itself in the fact that as soon as a user’s PC finds a connection to the internet, Outlook will attempt to connect to your network using HTTPS, and if it succeeds, will become “online” with Exchange and (if they’re using the default “cached mode” of Outlook) will synchronise changes between Outlook and Exchange since the client was last online.

image

A sometimes-overlooked benefit of using regular internet protocols to connect the client and servers together is that the communication can leave one protected network, traverse the unprotected internet within a secure channel, then enter a second protected network. This means that (for example) your users could be connected to a customer or partner’s own internal network, but be able to go through that network’s firewall to reach your Exchange server. If you required a VPN to be established to connect Outlook and Exchange, it would almost certainly not be possible to use a protected network as your starting point, since the owners of that network will typically not allow the outbound connections that VPN clients use, but will allow outbound connections on HTTPS.

Now, RPC/HTTP was part of Outlook and Exchange 2003, however it’s been improved in Exchange 2007 and is easier to get up and running. If you’re also using Outlook 2007, the client configuration is a whole lot simpler – even if it’s the first time a user has ever connected to Exchange, all they may need to know is their email address and password, and Outlook will be able to find the Exchange server and configure itself using whatever default you’ve set. The technology behind the ease of configuration is called the Autodiscover Service, and the whole area of “connecting over the internet” functionality has also been given a more descriptive (to the non-techies, anyway) term: Outlook Anywhere.
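As a rough illustration of why an email address is all a user may need: the Autodiscover client derives a couple of well-known candidate URLs from the domain part of the address and probes them over HTTPS. Here’s a minimal sketch in Python – the function name and example address are mine, and the real client has further lookup steps beyond these two URLs:

```python
def autodiscover_candidates(email):
    """Derive the well-known Autodiscover endpoints a client probes,
    using nothing more than the domain part of the user's address."""
    domain = email.split("@", 1)[1].lower()
    return [
        "https://%s/autodiscover/autodiscover.xml" % domain,
        "https://autodiscover.%s/autodiscover/autodiscover.xml" % domain,
    ]

# e.g. for a user of the (hypothetical) domain example.com:
for url in autodiscover_candidates("anna@example.com"):
    print(url)
```

The point is that nothing client-side needs to be pre-configured with a server name – the server’s identity falls out of the email address itself.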

From an end-user point of view, this technology is almost silent – for remote laptop users working at home, they often just start up their laptop, which connects automatically to a home wireless network and out to the internet, then Outlook just goes straight to Exchange and they’re online. Deploying this technology in Microsoft saw the volume of VPN traffic reduce dramatically, and the calls to the help desk concerning remote access dropped significantly too.

NET: Using Outlook 2007 and Exchange 2007 together simplifies the provision of remote access to remote users, particularly when using Outlook in “cached mode”. This configuration reduces, or even removes, the need to provide Virtual Private Network access, which could make the user experience better and save management overhead and expense.

Web client access instead of Outlook

Another element of flexible or remote working might be to use the web to get to email – maybe your remote users just want to quickly check email or calendar on their home PC, rather than using a laptop. Maybe there are workers who want to keep abreast of things when they’re on holiday, and have access to a kiosk or internet-cafe type PC. Or perhaps your users are in their normal place of work, but don’t use email much, or don’t log in to their own PC?

Outlook Web Access has been around for a number of versions of Exchange, and just gets better with every release. The 2007 version has added large areas of functionality (like support for the Unified Messaging functionality in Exchange, or huge improvements in handling the address book), meaning that for a good number of users, it’s as functional as they’d need Outlook to be. It’s increasingly feasible to have users accessing OWA as their primary means of getting to Exchange. One possible side benefit here is a licensing one – although you’d still be required to buy an Exchange Client Access License (which gives the user or the device the rights to connect to the server), you won’t need to buy Outlook or the Microsoft Office suite.

Outlook Web Access not only gives the web user the ability to use email, calendar etc, but it can also provide access to internal file shares and/or SharePoint document libraries – the Exchange server will fetch data from internal sources and display it to the reader within their browser. It can also take Office documents and render them in HTML – so a spreadsheet or document can be read on a PC with no copy of Office available, without needing to download a copy of the document for client-side rendering in an application.

It’s possible to control what happens to attachments within OWA – some organisations don’t want people to be able to download attached files, in case they leave copies of them on public PCs like internet cafes – how many users would just save the document to the desktop, and maybe forget to delete it? Using server-side rendering of documents, all traces of the document will be removed when the user logs out or has their connection timed out.

Even for predominantly office-based users, OWA can provide a good way of getting to mail from some other PC, without needing to configure anything or log in to the machine – in that respect, it’s just like Hotmail, where you go to a machine and enter your username and password to access the mail, rather than having to log in to the whole PC as a given user.

If you deploy Outlook Anywhere (aka RPC/HTTP), you’ll already have all the infrastructure you need to enable Outlook Web Access – it uses the same Exchange Client Access server role (in fact, in Microsoft’s own deployment, “Outlook Anywhere” accounts for about 3/4 of all the remote traffic, with the rest being made up of OWA and Exchange ActiveSync).

NET: Outlook Web Access gives a functionally rich yet easy-to-use means of getting to data held on Exchange (and possibly elsewhere on the internal network), over a secure connection to an external web browser. OWA 2007 has replicated more of Outlook’s functionality (such as great improvements to accessing address books), such that users familiar with Outlook will need little or no training, and users who don’t have Outlook may be able to rely on OWA as their primary means of accessing mail.

Mobile mail with ActiveSync

Exchange 2003 SP2 and an update to Windows Mobile 5 introduced the first out-of-the-box “push mail” capability for Exchange, which forms part of the Microsoft Exchange ActiveSync protocol that’s also licensed to a number of other mobile device vendors. This allows Exchange to use the same infrastructure that’s already in place for web access and Outlook Anywhere to push mail to mobile devices and to synchronise other content with them (like calendar updates or contact information). The Exchange ActiveSync capability in Exchange 2007 has been enhanced further, along with parallel improvements in the new Windows Mobile 6 client software for mobile devices.

Now it’s possible to flag messages for follow-up, read email in HTML format, set Out of Office status, and a whole ton of other functional enhancements which build on the same infrastructure described above. There’s no subscription to an external service required, and no additional servers or other software – reducing the cost of acquisition and deployment, and (potentially) the TCO. Analyst firm Wipro published some research, updated in June 2007, looking into TCO for mobile device platforms, in which they conclude that Windows Mobile 5 and Exchange ActiveSync would be 20-28% lower in cost (over 3 years) than an equivalent BlackBerry infrastructure.

NET: Continuing improvements in Exchange 2007 and Windows Mobile 6 will further enhance the user experience of mobile access to mail, calendar, contacts & tasks. Overall costs of ownership may be significantly lower than alternative mobile infrastructures, especially since the Microsoft server requirements may already be in place to service Outlook Anywhere and Outlook Web Access.

A last word on security

Of course, if you’re going to publish an Exchange server – which sits on your internal network, and has access to your internal Active Directory – to the outside world, you’ll need to make sure you follow good security practice. You probably don’t want inbound connections from what are (at the outset) anonymous clients coming through your firewall and connecting straight to Exchange – for one thing, they’ll have come through the firewall within an encrypted SSL session (the S part of HTTPS), and since you don’t yet know who the end user is, an outsider could be using that connection to mount a denial of service attack or similar.

Microsoft’s ISA Server is a certified firewall which can act as the end-point for the inbound SSL session (so it decrypts that connection), can challenge the client to authenticate, and can inspect that what is going on in that session is a legitimate protocol (and not an attacker trying to flood your server with traffic). The “client” could be a PC running Outlook, a mobile device using ActiveSync, or a web browser trying to access Outlook Web Access. See this whitepaper for more information on publishing Exchange 2007 onto the internet using ISA.

The Wal-Mart Effect

Here’s an interesting book on a business force which is changing the way the US economy works, if you believe what the author is saying. Wal-Mart (which owns ASDA in the UK) has been growing like crazy in recent years, to the point where it’s big enough, supposedly, to have a direct impact on the inflation rate in an economy the size of the US.


One startling aspect of this exposé is the effect that a company as powerful as Wal-Mart can have on its suppliers… normally reported as a bad thing, but there are good things too. An example of the latter was one company which was shipping goods into the US; the goods were then taken to its own distribution centres, repackaged, and sent out into Wal-Mart’s distribution chain, and then on down to the stores.

Once the two companies started sharing more detailed information with each other, Wal-Mart revealed that it was sending empty trucks back to its regional centres from stores all over the country – trucks which could be used by this supplier. So the supplier started importing its Wal-Mart-bound goods into Florida, and using Wal-Mart’s own trucks to ship the merchandise straight to the distribution centres, thereby cutting out waste and expense.

It’s an interesting read – there may even be some parallels between Wal-Mart and Microsoft, some positive and others not. Microsoft’s Chief Operating Officer used to be Wal-Mart’s CIO, responsible for (among other things) one of the largest databases in the world, where Wal-Mart’s suppliers could see into the sales of their products across the entire distribution chain, as they happened… Quite some system…

Identity & presence: the key to anyone’s Unified Communications strategy

 I spend a lot of time talking with customers about what Microsoft is doing with various new technologies, mostly involving or revolving around the Unified Communications stuff with OCS and Exchange. It’s really interesting to see how many people just “get” the point of UC technology, whereas others are either blind to its potential, or even doing the fingers-in-ears, shut-eyes, repeating “no, no, no” denial that a lot of this stuff is coming whether they like it or not.

I don’t mean that software companies are somehow going to compel everyone to adopt it, more that end-users themselves will be expecting to use technology at work which they have grown used to at home. For several years now, it’s been typical that people have better IT at home than they’d have in the office – from faster PCs, bigger flat screens, to the software they use – it’s exactly this kind of user who has driven the growth of services like Skype, and possibly helped shape the way enterprises will look at telecoms & communications in the future.

Various pieces of research, such as Forrester Group’s 2006 paper on “Generation Y” types (as reported at TMC.Net), predict that people born in the 1980s and beyond are adopting technologies into their lives faster than previous generations did… and as those same “Millennials” make their way into the workforce, they’re bringing their expectations with them, and possibly facing the “Computer says no” attitude that some, er, older, IT staff might still be harbouring.

Instant Messaging concerns

It’s already been reported that teens use IM more than email so it seems inevitable that IM will come to the enterprise one way or another. Some enterprises have turned something of a blind eye to “in the cloud” IM services such as Windows Live/MSN Messenger, AOL, Yahoo, Google Talk etc. Others have actively shut down access to these services by blocking firewall ports. Both of these approaches will need, at some point, to be re-evaluated or formalised through acceptable use policies etc – just as businesses in the past didn’t give users internet access or even email, due to concerns that they’d just waste all their time chatting, or the threat to security of opening up to the world.

In reality, users will waste time on IM initially, just like they’ll possibly spend worktime surfing the web or playing Solitaire on their PC, but sooner or later they’ll get over the novelty and start using the technology to be productive, and even if they still “play” during working hours, the net effect will be positive.

IM as email reduction strategy

Many people agree that they get too much email, and that, culturally, email is used when it would be better to pick up the phone or talk to someone face-to-face. IM can reduce the volume of email sent, not just for the disposable communication (the “have you got a minute?” type) but because people who are not online at the time don’t tend to get IM at all. It’s all too easy to blast an email out to a group asking for help – when the people in that group who’ve been out of the office next log in, they’ll get your request… even though your problem may well have been solved by then. That just doesn’t happen with IM, and some customers I’ve talked with estimate that adoption of enterprise IM sees a >50% drop in internal email volumes.

Presence is the magic ingredient

What makes IM useful is the “presence”: the knowledge of who, in the company (even, possibly, people you haven’t ever added to a contact list, as you’d need to do in the public services), is available and in a position to respond to you. Cliff Saran of Computer Weekly wrote a blog post recently which was scathing about presence, but which illustrates a fundamental lack of understanding of what it “is”:

Yes it’s fine to be able to know that someone is free, but it relies on the user having to update their Presence each time they walk over to the coffee machine, have a chat and a laugh with a colleague, go to the toilet, leave for the train, get home, go to the pub, have dinner, watch TV and go to bed.

— “Microsoft’s unified productivity killer”, Cliff Saran, 28th August 2007

Sorry Cliff, but you’re about as far wrong as it’s possible to get without changing the subject entirely. The whole point of presence is that it’s something the user shouldn’t have to worry about – though if they want to take control of it, they can. Culturally, some people won’t want to use the technology at all, which is fine… though sooner or later they may realise they’re losing out, and come back to the party.

image

I start my PC up, and if it finds a network, Office Communicator logs in and sets me to be online. When my Outlook calendar says I’m busy, my presence changes to “In a meeting”. When I pick up the phone, it’s “In a call”, all done automatically.

When I lock my screen (as I do – WindowsKey+L – any time I’m away from my desk for more than a few seconds), my status goes to “Away”, and restores when I log back in. If I just walked away without locking, after 5 minutes I’d be “Inactive”, then 10 minutes later it would be “Away” (at least, those are the default timeouts and behaviour… they can be tweaked). And all the while, by clicking that big coloured button in the top left, I can override the automatically set presence and do it myself. Or even sign out.

image
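Those rules boil down to a tiny decision function. This is just my sketch of the default behaviour described above (before any tweaking of the timeouts), not how Communicator actually implements it:

```python
def auto_presence(idle_minutes, locked=False, manual=None):
    """Sketch of the default automatic presence rules described above.
    A manually-chosen state always wins; locking the screen means Away;
    otherwise 5 idle minutes -> Inactive, 10 more -> Away."""
    if manual is not None:          # user clicked the big coloured button
        return manual
    if locked:                      # WindowsKey+L goes straight to Away
        return "Away"
    if idle_minutes >= 15:          # 5 minutes to Inactive + 10 more
        return "Away"
    if idle_minutes >= 5:
        return "Inactive"
    return "Available"

print(auto_presence(7))                  # Inactive
print(auto_presence(3, locked=True))     # Away
print(auto_presence(20, manual="Busy"))  # Busy
```

The key design point is in the first branch: automation supplies the defaults, but a manual choice always takes precedence, which is exactly what makes presence something the user doesn’t have to think about.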

As well as controlling what my own status is (and by extension, how phone calls will be routed to me and when), I can also set what level of information I’m prepared to share with others – from allowing select people to interrupt me even when I’ve set “Do not disturb”, to blocking people from even seeing that I’m online at all.

Presence and UC telephony

Look at the strategies of any IT or telecoms company who’s involved in this space: finding a user (based on some identity, probably not just their phone number) and seeing their presence is a key part of the value of UC. Making it integrated into other applications and devices the user is working with, and giving the user the choice to use it or not use it as they see fit, is vital to the success of presence being adopted and embraced (rather than rejected by users as big brother-ism or invasion of privacy).

The Return of Exchange Unplugged

In late 2005, to prepare for Exchange 5.5 going out of support (and to help customers understand what was involved in moving up to Exchange 2003), we did a really well-received tour of the country arranged around the theme of “Exchange Unplugged”.

We all wore “tour T-shirts” (in fact, every attendee got one), and keeping with the theme, I even carried my acoustic guitar and provided musical accompaniment at the start of each session. The nearest I’ll ever get to being paid to play music, I don’t doubt.

Anyway: we’re doing it all again! With 8 “gigs” around the country, and session topics titled:

  • Warm up act & welcome
  • Architecture Acapella
  • Migration Medley
  • Email & Voicemail Duet
  • Mobility Manoeuvres in the Dark
  • Y.O.C.S. (that’s about Office Communication Server).

… it’s clearly no ordinary event. Come along and see Jason try to squeeze into the tour shirt without looking like Right Said Fred, or find out if the YOCS session is presented wearing a stick-on handlebar moustache and leather hat.

Dates:

The Joy of Mapping

We all tend to take maps for granted. In the 17th/18th centuries and even beyond, there were decent-sized areas of the world which were only just being explored and mapped for the first time. Now, the ease of access to cartographical data means we don’t much give them a second thought.

I bought a couple of Ordnance Survey Explorer maps the other day, and was quite surprised at how expensive they are – £7.99 each – and started wondering if they were worth the money, when I could just go ahead and get data online for free. There’s something unique about poring over a real map, though: not necessarily looking for anything, just finding out what’s there. A neighbour came round at one point when I was looking through my new maps, and said that (like I did), he used to sit in the car as a passenger and study the maps around the places they were driving through. He even used to take the Atlas of the World to bed and just look at it, which I figured was a bit weird and best not discussed any further.

Thinking about how accessible mapping information has become brings up a few interesting points, though: Ordnance Survey maps are actually pretty good value, given that they must cost a fair bit to print and distribute; and if you’re out on a walk or cycle in the middle of the country, knowing that you could get a decent aerial view from Google Earth or Windows Live Local isn’t much use, whereas a good map in your pocket makes all the difference.

image

Meanwhile, I’ve become a big fan of Windows Live Mobile, especially after bonding my CoPilot bluetooth GPS receiver with the Smartphone (tip: it’s a BTGPS3 unit, and the passkey is unfathomably set to 0183 by default).

I’ve also used CoPilot for Smartphone as an in-car GPS/navigation aid, and it works really well (even if you don’t have a place to mount the phone properly, it can bark instructions from the passenger seat, just like a real navigator or navigatrix would). There are also lots of other fun apps (like Jason’s favourite, SportsDo) which can use GPS to record where your device has been – for later analysis on your PC. Or here, a developer at MS has built a real-time GPS locator which sends his coordinates back to a web service on his PC, so his family can see where he is all the time. Spooky, maybe…

Autoroute vs online maps

I remember when the application Autoroute first came out, in the early 1990s: it was a DOS application which shipped on floppy disks, and cost hundreds of pounds at the time. The target audience was fleet delivery managers and the like, who would generate route plans for the drivers rather than have the trucks wandering their own routes and taking longer/using more fuel than optimal. So even though Autoroute cost a lot of money, it could save a lot of money, and was considered funds well spent.

Microsoft bought the company which made Autoroute, and released the by-now-Windows application for a much more reasonable price. Autoroute 2007 retails today for about £50, or £85 with a USB GPS receiver.

image

It’s quite interesting now that Autoroute 2007 has direct integration with Windows Live Local – so you can find somewhere in Autoroute, then search the web for information about local businesses, or view the aerial/hybrid views from that point. It’s obvious to think that future evolutions of Windows Live Local might offer more of the route planning stuff that Autoroute is so good at, though UI-wise it could be more of a challenge…

Currently, Windows Live Local doesn’t offer the ability to do more than a simple “drive from here/to here” route – there are no waypoints, no “avoid this area” type functionality. Google Maps does offer some of these things, but it’s not quite as slick as Autoroute for now.

Rather than loading up Autoroute, though, it’s often quicker to go straight to the likes of Windows Live Local and zoom to a place you’re looking at – maybe a house you’re thinking of buying, for example (the single most useful aspect of this technology, if my experience of house-hunting last year is at all typical) – so the usage patterns of all these applications are changing as the technology gets better.

One cool and current use of mapping technology is Bikely.com, which uses Google Maps to create routes that a user can draw or import from GPS devices, then share with others. It still has a long way to go functionality-wise when it comes to smart route planning, but it’s easy to use for the basics, and is a good portent of things to come.

The process of testing Halo 3

I came across a fascinating article on Wired which looks into some of the processes that Bungie, the developers of the Halo game series for Xbox and Xbox360, have been using to test the latest iteration, Halo 3.

image

Thousands of hours of testing with ordinary game players has been recorded, and the footage synchronised so the developers can replay what was on the screen, what the player’s face looked like, and what buttons were being pressed, at any point. They even record details of where and how quickly the players are getting killed, so they can figure out if some bits of the game are just too hard.

There have been a series of test waves of Halo 3, some of which were for MS and Bungie employees only, and one public beta phase. The public beta was itself unprecedented in scale – over 800,000 users, racking up 12 million hours of online play. Do the maths, and that’s about 1,400 years of continuous play…
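The arithmetic checks out, give or take the rounding:

```python
# 12 million hours of beta play, expressed as years of continuous play
hours = 12_000_000
years = hours / (24 * 365)
print(round(years))   # 1370 -> "about 1,400 years"
```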

The internal only tests have been kept necessarily confidential (“It is not cool to talk about the alpha” was a common warning to anyone who thought about leaking screenshots or info). The final phase of testing is underway and the game is said to be very nearly complete.

I’m not going to mention any more about how the game looks, sounds, plays – except to say that you’ll all be able to find out just how awesome it is, on the 26th September (in Europe – the US gets it on the 25th). Might as well book the 27th and 28th off as holidays already 🙂 

Playing with Roundtable prototype

I’ve been looking forward to Roundtable coming out… it’s a very interesting type of hybrid between a standard conference speakerphone and a series of web-cams, all tied together by plugging it into a PC and running the new LiveMeeting 2007 client software.

The concept of Roundtable is quite simple really – put it in a room with a round table in the middle, and people who join the meeting online will see a panoramic view of what’s going on in the room, with the “active speaker” being identified in software based on where the sound is originating from. Other participants not in the room can be the active speaker too, if they have a webcam attached.

I got my hands on a prototype device the other day to have a play with (so I could figure out how to talk to my customers about it), and gathered a bunch of others in the same room…

image

We messed about for half an hour or so, and recorded the whole meeting – resulting in a series of files at about 2MB per minute, including the surprisingly high-quality video. The first picture above shows me stretching my arm around the device, which caused great hilarity, as if some kind of freaky Mr Tickle were sitting in the room.
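At that rate, rough storage estimates for recorded meetings are easy to work out (the 2MB-per-minute figure is just what we observed in our session; your mileage may vary with content and settings):

```python
# Rough storage estimate for Roundtable recordings at the observed
# rate of ~2 MB per minute (surprisingly high-quality video included).
MB_PER_MINUTE = 2

print(30 * MB_PER_MINUTE, "MB for our half-hour test")  # 60 MB
print(60 * MB_PER_MINUTE, "MB for a one-hour meeting")  # 120 MB
```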

Mark Deakin (UC product manager in Microsoft UK, also featured as the active speaker, on the left, in the picture above) was trying to emulate “Brother Lee Love” from the Kenny Everett TV show from the 80s…

image

The quality was very good, and once we start using these things in anger, the novelty of the camera will soon wear off and it’ll be useful for real business purposes… 🙂

I have to say, I was very prepared to be underwhelmed (ie the risk of over-promising and under-delivering seemed on the high side), but instead I was blown away by the Roundtable (even though the device itself could probably benefit from a number of physical improvements…)

I can’t wait for them to be deployed around our campus now!

The Roundtable user guide & quick reference card have already been published, and the device should be available through your local Microsoft subsidiary, in the next few months.

The business case for Exchange 2007 – part III

This is a continuation of an occasional series of articles about how specific capabilities of Exchange 2007 can be mapped to business challenges. The other parts, and other related topics, can be found here.

GOAL: Lower the risk of being non-compliant

Now here’s a can of worms. What is “compliance”?

There are all sorts of industry- or geography-specific rules around both data retention and data destruction, and knowing which ones apply to you and what you should do about them is pretty much a black art for many organisations.

The US Sarbanes-Oxley Act of 2002 was enacted to make corporate governance and accounting information more robust, in the wake of various financial scandals (such as the collapse of Enron). Although SOX is a piece of US legislation, it applies not just to American companies, but to any foreign company which has a US stock market listing or is a subsidiary of a US parent.

The Securities and Exchange Commission defines a 7-year retention period for financial information, and for other associated information which forms part of the audit or review of that financial information. Arguably, any email or document which discusses a major issue for the company, even if it doesn’t make specific reference to the impact on corporate finances, could be required to be retained.

Understandably, these requirements can cause IT managers and CIOs to worry that they might not be compliant with whatever rules they are expected to follow – especially since those rules vary hugely in different parts of the world and, for any global company, can be highly confusing.

So, for anyone worried about being non-compliant, the first thing they’ll need to do is figure out what it would take for them to be compliant, and how they can measure up to that. This is far from an easy task, and a whole industry has sprung up to try to reassure the frazzled executive that if they buy this product/engage these consultants, then all will be well.

NET: Nobody can sell you out-of-the-box compliance solutions. They will sell you tools which can be used to implement a regime of compliance, but the trick is knowing what that looks like.

Now, Exchange can be used as part of the compliance toolset, in conjunction with whatever policies and processes the business has in place, to ensure appropriate data retention and a proper discovery process that can prove that something either exists or does not.

There are a few things to look out for, though…

Keeping “everything” just delays the problem; it doesn’t solve it

I’ve seen so many companies implement archiving solutions where they just keep every document or every email message. I think this is storing up big trouble for the future: it might solve the immediate problem of ticking the box to say everything is archived, but management of that archive is going to become a problem further down the line.

Any reasonable retention policy will specify that documents or other pieces of information of a particular type or topic need to be kept for a period of time. They don’t say that every single piece of paper or electronic information must be kept.

NET: Keep everything you need to keep, and decide (if you can) what is not required to be kept, and throw it away. See a previous post on using Managed Folders & policy to implement this on Exchange.
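As a language-agnostic sketch of what a retention policy encodes – note the classifications and periods below are purely illustrative, not Exchange’s API and certainly not legal advice; a real policy comes from your compliance team, and in Exchange would be enforced by Managed Folders:

```python
from datetime import date, timedelta

# Illustrative retention rules, keyed by message classification.
RETENTION_DAYS = {
    "financial": 7 * 365,  # e.g. an SEC-style 7-year retention period
    "contract": 6 * 365,
    "general": 365,        # keep for a year, then expire
}

def should_retain(classification: str, received: date, today: date) -> bool:
    """True while the message is still inside its retention window."""
    days = RETENTION_DAYS.get(classification, 0)
    return today - received <= timedelta(days=days)

# A two-year-old "general" message is past its window and can be expired;
# a "financial" one of the same age must be kept.
print(should_retain("general", date(2005, 6, 1), date(2007, 6, 1)))    # False
print(should_retain("financial", date(2005, 6, 1), date(2007, 6, 1)))  # True
```

The point of the sketch is the shape of the decision: retention is driven by what the information *is*, not by a blanket “keep it all” rule.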

Knowing where the data is kept is the only way you’ll be able to find it again

It seems obvious, but if you’re going to get to the point where you need to retain information, you’d better know where it’s kept otherwise you’ll never be able to prove that the information was indeed retained (or, sometimes even more importantly, prove that the information doesn’t exist… even if it maybe did at one time).

From an email perspective, this means not keeping data squirreled away on the hard disks of users’ PCs, or in the form of email archives which can only be opened via a laborious and time-consuming process.

NET: PST files on users’ PCs or on network shares, are bad news for any compliance regime. See my previous related post on the mailbox quota paradox of thrift.

Exchange 2007 introduced a powerful search capability which allows end users to run searches against everything in their mailbox, whether from Outlook, a web client or even a mobile device. The search technology makes it so easy for an individual to find emails and other content that a lot of people have pretty much stopped filing emails and just let them pile up, knowing they can find the content again, quickly.

The same search technology offers an administrator (and this would likely not be the email admins: more likely a security officer or director of compliance) the ability to search across mailboxes for specific content, carrying out a discovery process.

Outsourcing the problem could be a solution

Here’s something that might be of interest, even if you’re not running Exchange 2007 – having someone else store your compliance archive for you. Microsoft’s Exchange Hosted Services came about as part of the company’s acquisition of FrontBridge a few years ago.

Much attention has been paid to the Hosted Filtering service, where all inbound mail for your organisation is delivered first to the EHS datacentre and scanned for potentially malicious content, with the clean stuff then delivered down to your own mail systems.

Hosted Archive is a companion technology which runs on top of the filtering: since all inbound (and outbound) email is routed through the EHS datacentre, it’s a good place to keep a long-term archive of it. And if you add journaling into the mix (where every message internal to your Exchange world is also copied up to the EHS datacentre), then you could tick the box of having kept a copy of all your mail, without really having to do much. Once you’ve got the filtering up & running anyway, enabling archiving is a phone call away and all you need to know at your end is how to enable journaling.

NET: Using hosted filtering reduces the risk of inbound malicious email infecting your systems, and of you spreading infected email to other external parties. Hosting your archive in the same place makes a lot of sense, and is a snap to set up.

Exchange 2007 does add a little to this mix though, in the shape of per-user journaling. In this instance, you could decide you don’t need to archive every email from every user, but only certain roles or levels of employee (eg HR and legal departments, plus board members & executives).
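In Exchange terms this is configured with journal rules scoped to particular recipients or distribution groups; as a simple illustration of the selection idea (the names and roles below are entirely made up):

```python
# Sketch of per-user journaling: only mail for certain roles is copied
# to the journal/archive, rather than every message from every user.
JOURNALED_ROLES = {"hr", "legal", "executive", "board"}

users = [
    {"name": "anna", "role": "executive"},
    {"name": "bob", "role": "sales"},
    {"name": "carol", "role": "legal"},
]

to_journal = [u["name"] for u in users if u["role"] in JOURNALED_ROLES]
print(to_journal)  # ['anna', 'carol']
```

Scoping the journal this way keeps the archive focused on the mailboxes most likely to matter in a discovery exercise, and keeps its growth manageable.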

Now, using Hosted Archive does go against what I said earlier about keeping everything – except that in this instance, you don’t need to worry about how to do the keeping… that’s someone else’s problem…

Further information on using Exchange in a compliance regime can be seen in a series of video demos, whitepapers and case studies at the Compliance with Exchange 2007 page on Microsoft.com.