What AI providers can learn from Microsoft
Microsoft has long dominated the digital workspace in large organisations: 93% of Fortune 100 companies rely on Teams for communications (source: DemandSage).
Users, however, especially developers, have long complained about the quality of the product compared to the competition. Good alternatives exist, both paid-for and open-source, that offer simpler integrations and/or greater configurability with the external tools we often rely on (cloud providers, GitHub, CI/CD providers).
The standard start-up tooling is a Notion-Slack pairing, likely with a Google Workspace bolt-on for occasionally critical work like slides and spreadsheets.
A European-built open-source digital workspace, La Suite numérique, is a clone of the most commonly used Microsoft services, designed to provide improved data residency in the wake of the CLOUD Act, which would likely leave GDPR toothless.
Why, then, do large organisations keep choosing to implement the service, and what could this teach us about which AI companies are likely to succeed?
Firstly, Microsoft products are extremely low risk. They cover almost every use case a standard large organisation is likely to have, and new joiners will be familiar with the products. I would expect any analyst to know how to create a pivot table in Excel, but would be understanding if they needed guidance with Sheets.
The “core” services provided (document editing with Word, presentations with PowerPoint, analysis with Excel) are standard knowledge across most industries, and you can trust they will continue to work in the future.
As developers, we find frustrating edge cases as we try to integrate increasingly complex workflows, and these drive us to alternatives. But it is important to remember that most people using these services in a large organisation are not developers but users with other priorities. My experience trying to reliably query a SharePoint API that Microsoft themselves didn’t seem to understand was a headache, but this is not convincing to other users who just need SharePoint to share files with access controls.
When making an organisation-wide decision to move to a new system, lots of existing working patterns and knowledge are lost; there has to be a really clear reason to move people away from the standard. This brings me on to my second reason…
Microsoft hosts the data and makes egress a real pain. If my organisation is already on Microsoft, moving years of files to a new service is a huge piece of work. It’s a headache for the engineers moving the data, for the users trying to replicate pre-existing workflows, and for the leaders demonstrating that the immense pain is worth the payoff. It’s just not worth the fuss.
These are concepts that a successful, Microsoft-esque AI provider would do well to replicate:
- Make your tools serve the “core” needs well
- Make these tools the standard implementation wherever possible
- Make it easier for organisations to stick with you over switching to a competitor
The first point is really the key one. It’s still unclear what the “core” services required by an organisation are. There’s been a top-down push on AI adoption across the market, which has muddied the answer to this question.
I suspect this will come down to: reliable workflow automation (with manual or computer-inferred setup); improved search across internal and external sources to keep information flowing, enabled by much better automated data standardisation; and automated on-the-fly tool generation, such as code agents for developers and apps for stakeholders.
A provider that can crack getting its tools to work reliably across an essential core offering will do well. At present, AI tools are either external plugins or simply don’t seem to work.
Microsoft’s own Copilot is a good example of a service that could be fantastic, but the tool isn’t “good enough” to be a standard AI integration. The model is somehow worse than the GPT model that powers it as accessed via ChatGPT, and it fails to search and filter context reliably. My only current use case for it is generating images when I hit rate limits on external tools.
There is a world where Copilot lets me authorise access to my GitHub data, extracts up-to-date context from across the Microsoft suite, and provides me with a timely and accurate summary an hour ahead of a planning meeting, all without my intervention. This is possible with current tooling; it just needs someone to build it.