AI & Data Security: Understanding Copilot’s Potential

In an ever-evolving digital landscape where AI and machine learning are revolutionizing the way we work, the insights of thought leaders like Mike Hughes are invaluable. Mike’s knowledge of cloud architecture and digital transformation is unsurpassed, and his deep dive into the intricacies of Microsoft 365’s Copilot, covering its benefits, security protocols, and governance structures, offers a rare glimpse into the future of workplace productivity tools.

This interview illuminates how AI technologies can transform organizational efficiency and data management practices when integrated with sound data security and governance. Mike’s experiences and anecdotes are always accessible, offering wit and wisdom that help organizations navigate the complex yet promising world of AI-driven digital transformation.

Can you explain what Copilot is and how it benefits Microsoft 365 users?

Mike: You would be amazed at how many people I speak to who know what ChatGPT is but have not heard of Copilot. Copilot is a Microsoft brand name used for various Generative AI products developed by Microsoft based on OpenAI/ChatGPT technologies.

Copilot is offered in three basic forms and many specialized forms beyond these:

  • Copilot – replicates ChatGPT chatbot functionality
  • Copilot Pro – for individuals licensed for Microsoft 365 Personal or Family (formerly Home)
  • Copilot for Microsoft 365 – for Microsoft 365 work and school accounts

Copilot Pro and Copilot for Microsoft 365 act as your personal productivity assistant, integrated into the Microsoft apps and services you use daily. Unlike ChatGPT, Copilot has live access to the web for more up-to-date responses and can securely search the Microsoft 365 data you have access to.

Some of the other Copilot apps are:

  • Copilot in Bing – built into the Bing search engine, which anyone can use without a license
  • GitHub Copilot
  • Copilot for Security
  • Copilot for Azure
  • Copilot Studio
  • Copilot for Intune

How does Copilot work under the hood?

Mike: At the core of Copilot’s functionality are Large Language Models (LLMs) based on generative AI technology developed by OpenAI in combination with Microsoft’s in-house development. Then you have what is called the Semantic Index, an index of your Microsoft 365 data that organizes it for contextual searches, queries, and prompts. Because it encompasses the relationships between various data points within the Microsoft 365 environment, Copilot can provide contextually relevant answers, a significant advancement over traditional search tools. The Semantic Index makes Copilot’s interactions more intuitive and its results more accurate.
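To make that idea concrete, here is a minimal Python sketch of how embedding-based semantic search works in general. The file names, vectors, and scoring below are invented for illustration; this is not Microsoft’s implementation of the Semantic Index.

```python
from math import sqrt

# Toy "document embeddings": in a real semantic index these vectors would
# come from a neural embedding model; here they are hand-assigned so the
# example is self-contained.
documents = {
    "Q3 budget review.docx":   [0.9, 0.1, 0.3],
    "Team offsite notes.docx": [0.2, 0.8, 0.1],
    "Vendor contract.pdf":     [0.7, 0.2, 0.9],
}

def cosine(a, b):
    # Similarity of two vectors, independent of their magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    norms = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norms

def semantic_search(query_vector, top_k=2):
    # Rank documents by vector similarity to the query rather than by
    # keyword overlap, so conceptually related items surface even when
    # they share no words with the query.
    ranked = sorted(documents.items(),
                    key=lambda item: cosine(query_vector, item[1]),
                    reverse=True)
    return [name for name, _ in ranked[:top_k]]

# A query like "what did we agree to spend?" might embed near the budget
# and contract documents rather than the offsite notes.
print(semantic_search([0.8, 0.15, 0.5]))
```

The key design point is that ranking is driven by vector similarity rather than keyword matching, which is what lets a semantic index surface related content even when the query uses different words than the documents.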

Can you share an example of how Copilot has been used effectively?

Mike: Sure. I see three common personas for individuals. The first is someone who sits in lots of meetings and receives lots of email. Copilot does a fantastic job of summarizing meetings and long email threads, and it saves a lot of time by helping you quickly catch up at the end of a busy day.

The second persona is a content creator, for example, a marketing person: someone who needs to create content for others to digest or needs to respond to requests for information. I know of one individual who responds to RFPs for her job. With the help of Copilot, what used to take her an hour and a half has been cut to around fifteen minutes.

The third persona is the individual who is learning or searching for information, which is almost everyone. Copilot for Microsoft 365 can not only search the live Internet and consolidate information into meaningful content, but also securely search the data you have access to within your Microsoft 365 account and provide the same type of context-based Generative AI response.

Beyond the individual, you get into specialized departmental and organizational use cases.

What are the Copilot security features that protect my company data?

Mike: Security is a cornerstone of Copilot. The first thing to understand is that the Copilot Large Language Models (LLMs) are never trained on any company data. Like ChatGPT, the Microsoft LLMs are trained on public internet data. The LLM training process is designed to ensure the models are not only powerful in their generative capabilities but also aligned with Microsoft’s responsible AI principles to maintain privacy and security.

Depending on which application you use to interact with Copilot, the response will draw on either public Internet information or data from within your Microsoft 365 account that you have permission to see. Copilot will never respond with information a user cannot access.
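The mechanism at work here is often called security trimming: results are filtered against the caller’s permissions before any content can be used to ground a response. The Python sketch below is a hypothetical illustration of the concept; the users, documents, and access lists are invented, and Microsoft 365 enforces this with its own permission model, not code like this.

```python
# Hypothetical illustration of security trimming. Every name and ACL
# below is invented for this example.
documents = [
    {"name": "All-hands deck.pptx", "allowed": {"alice", "bob", "hr-team"}},
    {"name": "Salary bands.xlsx",   "allowed": {"hr-team"}},
    {"name": "Project plan.docx",   "allowed": {"alice"}},
]

def trim_for_user(results, identities):
    # Keep only items whose access list intersects the user's identities.
    return [doc["name"] for doc in results if doc["allowed"] & identities]

# Alice is not in hr-team, so the salary spreadsheet never reaches her
# results and therefore can never appear in a generated answer.
print(trim_for_user(documents, {"alice"}))
```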

What can companies do to protect their data?

Mike: Before the cloud, most data was stored on on-premises file servers, which the IT department controlled. Users were placed into security groups, and these groups were assigned access to network file shares and folders.

With the advent of cloud storage and collaboration services, some responsibility for file security shifted from the IT department to the user. Users can now create and store files in many cloud locations, some of which the IT department does not even know about. This is what we call shadow IT.

A Microsoft 365 account has many tools at its disposal to secure and protect data. The challenge is maintaining a balance between data security and usability. If a company overly restricts what users can do, they will find their own way around the restrictions without IT’s knowledge.

Microsoft provides over 300 Sensitive Information Types (SITs) to identify and classify sensitive items within your organization’s data. These SITs are pattern-based classifiers that can detect sensitive information such as social security numbers, credit card numbers, bank account numbers, and more. You can also create your own custom SITs to find and classify sensitive data. Files are automatically classified into these types as they are uploaded and stored in the Microsoft Cloud or on on-premises file servers.
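To show what pattern-based classification means in practice, here is a simplified Python sketch in the spirit of a SIT. Real Microsoft SITs combine patterns with keywords, checksums, and confidence levels; the two rules below are illustrative stand-ins, not Microsoft’s actual definitions.

```python
import re

# Simplified pattern-based classifiers in the spirit of SITs.
SIT_PATTERNS = {
    "U.S. Social Security Number": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "Credit Card Number":          re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def luhn_ok(candidate: str) -> bool:
    # Checksum used by real card-number classifiers to cut false positives.
    digits = [int(d) for d in re.sub(r"\D", "", candidate)][::-1]
    total = sum(d if i % 2 == 0 else (d * 2 - 9 if d * 2 > 9 else d * 2)
                for i, d in enumerate(digits))
    return total % 10 == 0

def classify(text: str) -> list[str]:
    # Return the name of every sensitive type detected in the text.
    found = []
    for name, pattern in SIT_PATTERNS.items():
        for match in pattern.findall(text):
            if name == "Credit Card Number" and not luhn_ok(match):
                continue  # pattern matched, but the checksum ruled it out
            found.append(name)
    return found

print(classify("Card 4111 1111 1111 1111 on file, SSN 123-45-6789."))
# -> ['U.S. Social Security Number', 'Credit Card Number']
```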

Data encryption with sensitivity labels and data loss prevention (DLP) are an organization’s primary tools to secure data. With sensitivity labels, you can encrypt data either manually or automatically, regardless of where it is stored; with DLP, you can automatically prevent sensitive data from leaking out when a user tries to share it.
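The following hypothetical sketch shows the shape of a DLP decision: classification feeds an allow-or-block rule at the moment of sharing. The domains, pattern, and actions are invented for illustration; actual DLP policies are configured within Microsoft 365, not written as application code.

```python
import re

# Hypothetical DLP-style rule: block a share when content matching a
# sensitive pattern is headed outside the organization. The domain names
# and the single pattern here are invented for this illustration.
SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def evaluate_share(text: str, recipient_domain: str,
                   org_domain: str = "contoso.com") -> str:
    external = recipient_domain.lower() != org_domain
    if external and SSN.search(text):
        return "block"   # sensitive data leaving the org: prevent the share
    return "allow"       # internal recipient or non-sensitive content

print(evaluate_share("SSN 123-45-6789 attached.", "gmail.com"))  # -> block
print(evaluate_share("Lunch menu attached.", "gmail.com"))       # -> allow
```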

While this may seem like a daunting task, remember that Copilot does not expose data to users who do not already have access to it. Copilot has brought the data governance conversation to the forefront. Most organizations I work with are deploying Copilot licenses and having data security and exposure assessments done simultaneously, and they ask users to report anything unusual, or anything they should not have access to, to their manager or the IT department.

Why is a Copilot readiness assessment crucial for organizations?

Mike: A readiness assessment is vital because it helps organizations understand their preparedness for deploying Copilot and ensures their data governance and security frameworks are robust. Microsoft has already done the heavy lifting for us by creating Copilot Adoption and Success kits with all the information an organization needs to successfully deploy Copilot. They even provide email templates and handouts.

Microsoft does provide some visibility into where users have placed sensitive information, but we have found that third-party tools provide clearer visibility. With these tools, we help organizations gain a better understanding of data exposure within their Microsoft 365 environment, which helps find holes and weaknesses. It also helps you understand what your users are doing so you can better govern data.

How does the integration of Copilot impact collaboration and data management?

Mike: Put simply, Copilot promotes a more efficient and collaborative workplace. In the brief time I have been using it, I cannot imagine not having it. It is a fantastic tool that saves me time every day, and I would not want to give it up. The same goes for almost everyone I interact with who uses Copilot.

How would you summarize the best way for organizations to approach the integration of Copilot?

Mike: Again, Copilot’s integration shows the importance of data governance and security practices, but do not let that stop you from purchasing licenses. Deploy Copilot to a few key individuals to develop internal knowledge, then start a readiness and data security assessment.

Embracing Copilot represents a significant step forward in productivity and efficiency, provided it is done thoughtfully and strategically.

Book a session with a Covenant Global Cloud Specialist.

As the digital workspace continues to evolve, embracing AI tools like Copilot necessitates a nuanced understanding of security, governance, and how to maximize productivity. Mike Hughes and his team of cloud architects at Covenant Global are uniquely equipped to guide customers through this AI journey. With a wealth of experience deploying AI solutions within the Microsoft 365 ecosystem, they provide the expertise to navigate these transformations effectively. By prioritizing data security, enhancing governance frameworks, and leveraging the full potential of AI, Mike Hughes and Covenant Global are ideal partners for organizations aiming to make a seamless transition into a more efficient, secure, and AI-integrated future.

Book a session with one of our Cloud Specialists to assess your position on the AI track and discover strategies to enhance your productivity while maintaining security.
