But Leaves Many Unanswered Questions About “Your Co-pilot for Work”
I spent an interesting afternoon watching the launch of Microsoft 365 Copilot (the recording is available here) and reading the associated blog, announcement, and message center notification (MC528800). Copilot is all about leveraging the power of AI technology based on GPT-4 to reuse information stored in Microsoft 365 to perform tasks. The idea is that the AI can take care of creating basic documents and leave users to refine and complete the content. In effect, Copilot is a kind of personal assistant on steroids.
Microsoft’s demo was compelling, but I still came away with a nagging doubt about whether I will use Copilot when it’s available. For instance, the OWA demonstration of Copilot deciding how to prioritize email in a busy inbox reminded me of previous Microsoft attempts to help users impose order on their inboxes with the Clutter feature and then Focused Inbox. Maybe it’ll be a case of third time lucky.
On the surface, I am a classic example of a person who will benefit from Copilot. My data is stored in Microsoft 365. I use apps like OWA, Teams, SharePoint Online, and OneDrive for Business. I work on Word documents, Excel spreadsheets, and PowerPoint presentations daily. The large language model that Microsoft will deploy in Azure to service the queries generated by Copilot and respond to apps with suggestions should cater for my every need. And yet, I still have doubts.
Perhaps it’s just the nature of demos. I reminded myself midway through the magic that what I was watching was a carefully produced show put on by Microsoft executives rehearsed to within an inch of their lives using content crafted to generate the best possible results. The demonstration exhibited perfection of a kind seldom found in real life. I wondered if the way I work and the content I generate will result in such wonderful documents, spreadsheets, and presentations. And even more pertinently, how will the AI make sense of the information stored by the average Office user?
After all, GPT-4 depends on the content it knows about and the context delivered by Copilot. If someone asks Copilot to generate a project report about Contoso and not much can be found about Contoso in Exchange Online, SharePoint Online, OneDrive for Business, or Teams, then GPT-4 will probably generate something that’s banal and uninteresting, not to say possibly wrong (a danger acknowledged by Microsoft).
Encouraging More Use of Office Features
The best thing about the magic was the promise to liberate people from their lack of knowledge about the potential of applications. One of the speakers said that most people know how to use only 10% of PowerPoint’s features. The same is probably true of Word and Excel, and it could be construed as a reflection on the complexity of the software rather than the inability of its human users. But the magic is that Copilot knows how to generate snazzy documents using features that I certainly don’t know how to use – or at least need to look up each time I attempt to use them.
Encouraging more use of Office features could be considered valuable if it doesn’t hide the actual content among the bells and whistles conjured up by the Copilot AI. Clarity and conciseness often serve business communications better.
The thing about automatically generated content is that it can be wrong, so it was good to see that Microsoft built in features to tell Copilot to redo something after the human (in charge) made a change, or to do things like make text clearer. That was a nice touch.
No Technical Details Available Yet
From a technical perspective, there’s much that we don’t know about the “sophisticated processing and orchestration engine” that Copilot uses to link apps, the Graph, and GPT-4’s large language models. The Microsoft Technical Community post is full of aspirations and low on detail. On the upside, Jared Spataro gave a brief insight into the technology when he said that the interaction between Microsoft 365 apps and the AI goes like this:
- Tell the AI what the user needs.
- Use the Graph to extract information from the user’s data to add context. I imagine that this involves Graph queries against repositories like SharePoint Online and OneDrive for Business to find documents, and Teams and Exchange Online to find messages. This process is called “grounding.”
- Pass the refined query to the AI/Large Language Model for processing.
- Post processing to make sure that the AI response is appropriate and meets security and compliance requirements.
- Respond to the app, which inserts the text and images.
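To make the flow concrete, here’s a minimal sketch of those steps in Python. Everything in it is invented for illustration – Microsoft hasn’t published the real interfaces for Copilot’s orchestration engine, and the “grounding” and policy checks here are toy stand-ins (a real GPT-4 call is deliberately omitted).

```python
# Hypothetical sketch of the Copilot orchestration flow described above.
# All function names, data shapes, and rules are invented for illustration;
# the real engine's interfaces are not public.

def ground(prompt: str, graph_items: list) -> list:
    """Step 2: 'grounding' - find items in the user's Graph data
    (documents, messages) that relate to the prompt, to add context."""
    terms = set(prompt.lower().split())
    return [item for item in graph_items
            if terms & set(item["text"].lower().split())]

def build_grounded_query(prompt: str, context: list) -> str:
    """Step 3: combine the user's request with the grounding context
    before passing it to the large language model."""
    snippets = "\n".join(f"- {item['text']}" for item in context)
    return f"User request: {prompt}\nRelevant content:\n{snippets}"

def post_process(response: str, banned_phrases: list) -> str:
    """Step 4: check the model's output against policy (here, a crude
    redaction pass) before returning it to the app."""
    for phrase in banned_phrases:
        if phrase in response:
            response = response.replace(phrase, "[redacted]")
    return response

# Simulated walk-through with made-up Graph content (no real model involved):
graph_items = [
    {"source": "SharePoint", "text": "Contoso project kickoff notes"},
    {"source": "Exchange", "text": "Lunch menu for Friday"},
]
context = ground("project report about Contoso", graph_items)
query = build_grounded_query("project report about Contoso", context)
```

The point of the sketch is the shape of the pipeline, not the details: the user’s request is enriched with their own data before the model ever sees it, and the output is checked before the app inserts it.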
There are many open questions arising from this brief description. How is security enforced? Presumably the information surfaced by the AI is trimmed in much the same way that Microsoft Search trims results, to make sure that no one sees information that they’re not allowed to see. How is compliance handled, and what does compliance even mean when chatting with an AI bot? What auditing is performed so that organizations know who is using Copilot? Can use be restricted to certain users (perhaps by licensing, but maybe also to handle the requirements of unions or works councils that aren’t happy about AI usage)? To be fair, Microsoft is still working out the administrative framework for Copilot.
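If trimming does work the way Microsoft Search trimming does, the idea might look something like this sketch: grounding results are filtered against the requesting user’s existing permissions before the model ever sees them. The ACL model and every name here are invented; nothing about Copilot’s actual access checks is public.

```python
# Purely illustrative guess at security trimming, in the spirit of how
# Microsoft Search trims results. The ACL representation is invented.

def trim_results(user: str, results: list, acl: dict) -> list:
    """Drop any grounding result the requesting user cannot already read,
    so the AI never sees (or surfaces) content the user lacks access to."""
    return [r for r in results if user in acl.get(r["doc_id"], set())]

# Made-up permissions: alice can read doc-1, only bob can read doc-2.
acl = {"doc-1": {"alice", "bob"}, "doc-2": {"bob"}}
results = [
    {"doc_id": "doc-1", "text": "Contoso plan"},
    {"doc_id": "doc-2", "text": "HR salary review"},
]
visible = trim_results("alice", results, acl)
```

The important design choice in this kind of scheme is that trimming happens before generation, not after: redacting a finished answer is much harder than never feeding the restricted content to the model in the first place.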
Microsoft 365 Copilot Licensing and Cost
Other things that we don’t know about Copilot include licensing, availability in languages other than English, and implementation dates for different Microsoft 365 apps (online, desktop, and mobile). Microsoft says that they “will share more about pricing and details in the coming months.” Given the early stage of the software, it’s reasonable for Microsoft not to want to make a definitive statement at this point. We know from products like Loop (due to reach public preview on March 22) that it can take much longer than anticipated to bring software from concept to shipping product. My best guess is that we’ll see some Copilot features light up in Microsoft 365 apps later in 2023.
As to cost, Copilot looks like it is a premium feature instead of something that every Microsoft 365 subscriber will get. My hope is that Microsoft will include Copilot in products like Office 365 E5 and Microsoft 365 E5 instead of requiring customers to buy yet another add-on license. Of course, it would be nice if Microsoft made add-on licenses available for organizations that don’t have high-end licenses. However, recent examples like Teams Premium show that Microsoft likes to carve AI-based features out into separate licenses, possibly to drive as much revenue as possible.
Don’t get me wrong. Despite my doubts, it’s clear that bringing Copilot forward so quickly to a point where it can be demonstrated across a range of Microsoft 365 apps is a significant technical achievement. The hard work of making the implementation fit for purpose in corporate environments usually takes a lot of effort, and that’s the phase we are now in. I look forward to seeing how much time Copilot can save me, if it can make sense of what I store in the Graph and doesn’t cost too much.
we should call this “f*ck”. Because, after we have used this for a while, users will no longer know the underlying content; if anyone asks, “f*ck knows”. [yes, there are references you can follow; but people barely read a brief email!]
more seriously; this sounds like recycling old content. Is there not a danger that old thinking and habits are perpetuated, and, in simple terms, creating something from scratch becomes a lost skill and frowned upon? “eugh! he wrote that *himself* – what a DINOSAUR! Everyone knows you should f*ck this up!”
will novel thought and creativity, and the associated risk taking, become a lost skill? an unacceptable practice? something awkward that challenges group think? Once an organisation is f*cked up, how will they break the mould and adapt?
I’m wondering whether I’ll trust the AI to summarize a 20-email thread to the extent that I mark the thread as read. Also, will the generated content across all apps/services result in bland output? I’ll be very interested to try this out; the key, though, is whether or not clients will buy into it.
Just a glorified, new, Clippy
First, I applaud MS for this innovative move. Like many innovative initiatives, both MS and potential users will undergo a learning curve. The need for patience, error mitigation and updates cannot be adequately emphasized.
As highlighted in the article, many questions remain unanswered, and my feeling is that some of these answers will continue to be elusive and, of course, some users will remain distrustful. In summary, the jury at this point is still out.
I agree with you Tony, there is an element of mistrust in all of this, and also a stigma that will build for people who use it. Some employers will be suspicious of anything that is AI-generated / end-user refined, which shouldn’t be the case.
As for licensing, I think Microsoft will give customers just enough access for free to get them hooked on it and eager to buy the premium or pay-as-you-go usage for their orgs. Whether that is a limited number of actions or a monthly allowance, we shall wait and see. However, given that Bing AI is currently free, it is probably subsidized by ads, or will be later on. I hate to think what they will do with that. Maybe make you sit through an ad before you get your response – kind of like YouTube, then – possibly with a paid version to remove the ads.
As for how well it will perform: it will certainly be a lot slower than the demos, for sure, so people will need to be patient. Also, refining and verifying need to be a slick experience.
I am currently using the Bing AI and find it really useful. They appear to be letting more people in now at random.
The compose option in there has so far been excellent. So simple to use, and the results are not half bad, including blog writing.