Tip o’ the Week 460 – AI, AI, Oh…

Artificial Intelligence has been dreamt of for decades – the idea that machines will one day be as smart as, or maybe smarter than, humans. AI in popular consciousness is not just a rubbish film; if you’re a brainless tabloid journalist, it means Siri and Alexa (assuming you have connectivity, obvs… and hope there’s no Human Stupidity that forgot to renew a certificate or anything), and AI is also about the robots that are coming to kill us all.

Of course, many of us know AI as a term used to refer to a host of related technologies, such as speech and natural language recognition, visual identification and machine learning. For a great example of the practical and potentially revolutionary uses of AI, see Dr Chris Bishop’s talk at Future Decoded 2018 – watch the day 1 highlights from 1:39, or jump to 1:50 for the example of the company using machine learning to make some world-changing medical advances.

Back in the mundane world for most of us, AI technologies are becoming more visible and more useful day to day – as in OneDrive, where a number of improvements, including various AI investments, are starting to show up.

One simple example is image searching – if you upload photos to consumer OneDrive (directly from your phone, perhaps), the OneDrive service will now scan the images for any text that can be recognised… so if you took a photo of a receipt for your expenses, OneDrive may be able to find it later, even if all you can remember is what kind of food it was.
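
OneDrive doesn’t expose its own OCR pipeline, but the general idea is easy to sketch with Azure’s Computer Vision Read API standing in for whatever the service actually runs under the covers – the endpoint, key and file name below are placeholders, not anything OneDrive-specific:

```python
# Minimal sketch of pulling text out of a photo, using the Azure
# Computer Vision Read API as a stand-in for OneDrive's internal OCR.
import time
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "<your-key>"  # placeholder

def extract_text(image_path):
    # Submit the image for analysis; the Read API works asynchronously.
    with open(image_path, "rb") as f:
        resp = requests.post(
            f"{ENDPOINT}/vision/v3.2/read/analyze",
            headers={"Ocp-Apim-Subscription-Key": KEY,
                     "Content-Type": "application/octet-stream"},
            data=f.read(),
        )
    resp.raise_for_status()
    operation_url = resp.headers["Operation-Location"]

    # Poll until the analysis finishes, then collect the recognised lines.
    while True:
        result = requests.get(
            operation_url,
            headers={"Ocp-Apim-Subscription-Key": KEY},
        ).json()
        if result["status"] in ("succeeded", "failed"):
            break
        time.sleep(1)

    lines = []
    for page in result.get("analyzeResult", {}).get("readResults", []):
        lines.extend(line["text"] for line in page.get("lines", []))
    return lines

# e.g. extract_text("receipt.jpg") might return ["Pizza Margherita", "£8.50"]
```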

There’s also a neat capability where OneDrive will try to tag your photos automatically – just go to www.onedrive.com and look under Photos, where you’ll see a grid of thumbnails of your pictures arranged by date, along with the ability to view them by album, by place (from the geo-location data of your camera phone) or by tag. You can edit the tags and add your own, but it’s an interesting start to see what the visual search technology has decided your photos are about… not always 100% accurately, admittedly…
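
OneDrive hasn’t said exactly what powers the tagging, but the Azure Computer Vision “tag” operation does much the same job and makes for a reasonable illustration – a minimal sketch, with a placeholder endpoint and key:

```python
# Rough sketch of automatic photo tagging, using the Azure Computer
# Vision tag operation as a stand-in for OneDrive's own tagger.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "<your-key>"  # placeholder

def suggest_tags(image_path, min_confidence=0.7):
    with open(image_path, "rb") as f:
        resp = requests.post(
            f"{ENDPOINT}/vision/v3.2/tag",
            headers={"Ocp-Apim-Subscription-Key": KEY,
                     "Content-Type": "application/octet-stream"},
            data=f.read(),
        )
    resp.raise_for_status()
    # Keep only the tags the service is reasonably confident about.
    return [t["name"] for t in resp.json()["tags"]
            if t["confidence"] >= min_confidence]

# e.g. suggest_tags("holiday.jpg") might return ["beach", "sky", "outdoor"]
```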

More AI goodness is coming to Office 365 and OneDrive users in the near future – from automatic transcription of videos stored online (using the same technology as Azure Video Indexer and Microsoft Stream) to real-time captions in PowerPoint. Watch this space… and mind the robots.
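
Neither feature is something you call directly, but the underlying speech-to-text capability is available to anyone via the Azure Speech service. Here’s a minimal sketch using its Python SDK – the key, region and audio file are placeholders, and this only illustrates the kind of technology involved, not the Stream or PowerPoint implementations:

```python
# Minimal speech-to-text sketch with the Azure Speech SDK, illustrative
# of the technology behind video transcription and live captions.
import azure.cognitiveservices.speech as speechsdk

speech_config = speechsdk.SpeechConfig(subscription="<your-key>",
                                       region="<your-region>")
audio_config = speechsdk.audio.AudioConfig(filename="meeting_clip.wav")
recognizer = speechsdk.SpeechRecognizer(speech_config=speech_config,
                                        audio_config=audio_config)

# Recognise a single utterance from the audio file and print the transcript.
result = recognizer.recognize_once()
if result.reason == speechsdk.ResultReason.RecognizedSpeech:
    print(result.text)
```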

Tip o’ the Week 430 – developers, developers, developers

This week has seen the Microsoft developer conference, called //build/ in its current guise, take place in “Cloud City”, Seattle (not so-called because it rains all the time – in fact, it rains less than in Miami. Yeah, right). Every major tech company has a developer conference, usually a sold-out nerdfest where the (mostly) faithful gather to hear what’s coming down the line, so they know what to go and build themselves.

Apple has its WWDC in California every year (for a long time it was held in San Francisco), and at its peak it was a quasi-religious experience for the faithful. Other similar keynotes have sometimes caused deep soul-searching and gnashing of teeth.

The Microsoft one used to be the PDC, until the upcoming launch of Windows 8 meant it was time to try to win the hearts & minds of app developers, so //build/ became rooted in California in the hope that the groovy kids would build their apps on Windows and Windows Phone. Now that that ship has largely sailed, the event has gone back up to the Pacific Northwest, with the focus more on other areas.

Moving on from the device-and-app-centric view that prevailed a few years back (whilst announcing a new way of bridging the user experience across multiple device platforms), Build has embraced the cloud & intelligent edge vision, which cleverly repositions a lot of the enabling technologies behind services like Cortana (speech recognition, cognitive/natural language understanding and so on) and vision-based products such as Kinect, HoloLens and the mixed reality investments in Windows. AI took centre stage; for a summary of the main event, see here.

The cloud platform in Azure can take data from devices on the edge and process it on their behalf, or – with smarter devices – some of the processing can be done locally, perhaps using machine learning models that have been trained in the cloud but are executed at the edge.
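
As a toy illustration of that train-in-the-cloud, run-at-the-edge pattern (entirely my own sketch – the model, data and file names are arbitrary), a small model can be trained with scikit-learn, exported to ONNX, and then scored locally with the lightweight ONNX Runtime:

```python
# Toy "train in the cloud, execute at the edge" sketch: train a small
# scikit-learn model (the cloud side), export it to ONNX, then run
# inference with onnxruntime (the edge side). Purely illustrative.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from skl2onnx import convert_sklearn
from skl2onnx.common.data_types import FloatTensorType
import onnxruntime as ort

# "Cloud" side: train the model and export it in a portable format.
X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=500).fit(X, y)
onnx_model = convert_sklearn(
    model, initial_types=[("input", FloatTensorType([None, X.shape[1]]))])
with open("model.onnx", "wb") as f:
    f.write(onnx_model.SerializeToString())

# "Edge" side: load the exported model and score new readings locally.
session = ort.InferenceSession("model.onnx",
                               providers=["CPUExecutionProvider"])
sample = np.array([[5.1, 3.5, 1.4, 0.2]], dtype=np.float32)
predictions = session.run(None, {"input": sample})[0]
print(predictions)
```

The point is simply that the heavy lifting (training) happens centrally, while the exported model is small enough to evaluate on the device itself.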

With Azure Sphere, there’s a way for developers to build secure, highly functional devices that process data on board and communicate with other devices and services, so they can concentrate more on what their apps do – and on the data – and less on managing the “things” which generate it.
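
Azure Sphere applications themselves are written in C against its own SDK, so purely as a rough Python sketch of the general device-to-cloud pattern it enables, here’s the equivalent telemetry send using the Azure IoT Hub device SDK (the connection string and payload are placeholders):

```python
# Rough sketch of device-to-cloud telemetry using the Azure IoT Hub
# device SDK for Python - illustrative of the pattern, not Azure
# Sphere's own (C-based) application model.
import json
from azure.iot.device import IoTHubDeviceClient, Message

CONNECTION_STRING = "<device-connection-string>"  # placeholder

client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)
client.connect()

# Send one telemetry reading; the app focuses on what the data means,
# while the platform handles connectivity and security.
reading = {"deviceId": "sensor-01", "temperature": 21.7}
client.send_message(Message(json.dumps(reading)))

client.disconnect()
```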

For all of the breakouts at Build and the keynotes on-demand, see here.

Back in the non-cloud city, Google has adopted a similar developer rah-rah approach with its Google I/O conference, held in the Bay Area and – like WWDC and Build – formerly at Moscone in San Francisco. It happened this past week, too.

As with everyone else’s events, some major announcements and some knock-’em-dead demos are reserved for attendees to get buzzed about, generating plenty of external coverage and crafting an image of how innovative and forward-thinking the company is.

Google Duplex, shown this week to gasps from the crowd, looks like a great way of avoiding having to deal with ordinary people any more – a point picked up by one writer who called it “selfish”.

Does a reliance on barking orders at robot assistants, plus the increasing sophistication of AI in bots and so on, mean the beginning of the end for politeness and for the service industry? A topic for further consideration, surely.