A new report states that the US House of Representatives has banned its staff members from using Microsoft's Copilot AI assistant due to possible leaks "to non-House approved cloud services."
I'm the lead 365 admin for a major corporation and have been working with MS to determine whether Copilot would be beneficial and secure for my org. Some major takeaways from my recent meetings with them:
There are two parts to Copilot: 1. Copilot, and 2. Copilot for 365.

The first is basically ChatGPT. It reaches out to the web to get info and essentially works as a search engine.

The second is internal only. It can do things like summarize meetings, compare documents, and search your emails, and it abides by the same security, compliance, encryption, and DLP policies as the rest of your tenant (see the sketch below).

You can open up access to one or both.
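To make that "internal only" point concrete, here's a minimal conceptual sketch of how a tenant-scoped assistant can ground its answers only in content the requesting user is already permitted to read. This is purely illustrative; every name in it is hypothetical, and it is not Microsoft's actual implementation.

```python
# Conceptual sketch only -- all names here are hypothetical, not a real
# Microsoft API. It illustrates the "internal only" model: the assistant
# grounds answers solely in content the requesting user can already read.
from dataclasses import dataclass, field


@dataclass
class Document:
    title: str
    body: str
    allowed_users: set[str] = field(default_factory=set)  # existing tenant ACL


def grounding_context(user: str, query: str, corpus: list[Document]) -> list[Document]:
    """Return only matching documents the user is already authorized to see.

    The assistant never widens access: if a user can't open a file in the
    tenant, it can't show up in that user's answer either.
    """
    visible = [d for d in corpus if user in d.allowed_users]
    return [d for d in visible if query.lower() in d.body.lower()]


corpus = [
    Document("Q3 board deck", "revenue forecast ...", {"cfo@contoso.com"}),
    Document("Team notes", "revenue forecast draft ...", {"cfo@contoso.com", "pm@contoso.com"}),
]

# The PM only gets back the document they could already open.
print([d.title for d in grounding_context("pm@contoso.com", "revenue", corpus)])
# -> ['Team notes']
```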
Government tenants are a unique case. There's a specific 365 license for government entities, and their offerings differ from other organizations'. This news article isn't surprising - all new 365 offerings take a while before they're available on government licenses. It will get there eventually.
Few questions about that, unless they’re literally taking their model and putting it into your own box using it’s own compute power, I don’t see how that’s possible. They can call it “your” copilot all they want but if they’re reading your data and prompts and computing that on their own box then they’re using your data, right?
Major organizations use encryption where they hold the keys, so Microsoft is unable to read their data. They can have thousands of servers running on Microsoft's Azure stack, and yet Microsoft cannot read the data being processed.
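A rough sketch of the customer-managed-key idea (envelope encryption). This is a generic illustration using the third-party Python `cryptography` package, not Azure Key Vault or M365 Customer Key itself:

```python
# Generic envelope-encryption sketch of customer-managed keys, using the
# `cryptography` package (pip install cryptography). Illustration of the
# concept only, not Microsoft's actual implementation.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# 1. The CUSTOMER generates and keeps the key-encryption key (KEK).
kek = AESGCM.generate_key(bit_length=256)

# 2. A fresh data-encryption key (DEK) encrypts the actual document.
dek = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(dek).encrypt(nonce, b"confidential tenant data", None)

# 3. The DEK itself is wrapped with the customer's KEK before storage.
wrap_nonce = os.urandom(12)
wrapped_dek = AESGCM(kek).encrypt(wrap_nonce, dek, None)

# The cloud provider stores (ciphertext, nonce, wrapped_dek, wrap_nonce).
# Without the customer-held KEK it cannot unwrap the DEK, so the stored
# blobs are opaque to it. Only the key holder can reverse the process:
recovered_dek = AESGCM(kek).decrypt(wrap_nonce, wrapped_dek, None)
plaintext = AESGCM(recovered_dek).decrypt(nonce, ciphertext, None)
assert plaintext == b"confidential tenant data"
```

One honest caveat: this protects data at rest. For a service to actually process content (e.g., summarize a document), it has to be decrypted somewhere inside the service boundary, which is exactly where the auditing argument below comes in.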
Only if all auditors are uncorrupted, highly competent, and have full oversight. Boeing was able to corrupt its government auditors to save some money on redundant sensors. With Microsoft pushing big on gathering and selling data, I wouldn't trust a byte that passes through their servers.
You clearly do not understand encryption or corporate auditing.
Microsoft has to compete with other cloud providers on security, unlike Boeing, which has no domestic competition. Google, Amazon, or Oracle would love to find out that Microsoft is decrypting user data to sell to partners, because they would be screaming to the high heavens that O365/Azure is insecure and enterprises must switch to their solutions. SaaS/IaaS subscriptions are much more profitable than selling user data; there is a near-zero chance that Microsoft is improperly handling enterprise data (on purpose).
Microsoft cannot decrypt your data when you hold the keys.
Thanks for the explanation
I'm not an admin, but I do provision MS cloud licensing and have run across this question more than a few times. At the enterprise level, I'm told the Copilot data is "walled off," secure, and not harvested by MS. I have nothing to back that up, but that's what I'm told. I'm certain that if it weren't true, I would have heard about it by now.