AI Strategy, Cybersecurity, Compliance Automation & Microsoft 365 Managed IT for Security-First Financial Institutions | ABT Blog

Copilot Governance Dashboard: How to Monitor AI Usage in Your M365 Tenant

Written by Justin Kirsch | Wed, Apr 08, 2026

Your examiner asks: "How do you monitor what Copilot sees and does?" If your answer involves a pause, a glance at your IT director, or the phrase "we're working on that," you have a governance gap. That gap has a specific shape, and Microsoft Purview now has specific tools to fill it.

Forty-one percent of security teams express strong interest in identifying risky users based on their AI queries, and security teams consistently report low confidence in managing the data that flows into AI applications. These are not hypothetical concerns. Copilot reads every document, email, and Teams message the user can access. If your permission model is loose, Copilot's access is loose. The governance dashboard is how you prove to your examiner, your board, and yourself that you know what is happening. If you have not yet addressed the broader AI governance gap across banks, that gap becomes your examiner's first question.

Here is what the Purview AI governance tools show, what the March 2026 credential scanning upgrade adds, how to configure DLP policies that actually restrict Copilot's reach, and how Agent 365 extends the monitoring into territory Purview does not cover.

84%
of security professionals say their organization needs to do more to protect against risky employee use of AI tools, up from the prior year
Source: Microsoft security research

What the Purview AI Governance Dashboard Shows

Microsoft Purview's Data Security Posture Management (DSPM) for AI is the central command for AI governance in your M365 tenant. You access it through the Purview portal at purview.microsoft.com under Solutions, then DSPM for AI. From the Overview page, you get a Get Started section with three immediate actions: turn on Microsoft Purview Audit (on by default for new tenants), install the Purview browser extension (required for monitoring third-party AI sites), and onboard devices to Microsoft Purview (also required for third-party AI monitoring).

The dashboard organizes AI governance into four sections, each serving a different audience in your organization.

What Purview Monitors

Copilot usage patterns across M365 apps (Word, Excel, Teams, Outlook, PowerPoint)

Sensitivity label enforcement on AI-accessed content

DLP policy violations triggered by Copilot interactions

Insider Risk Management signals from AI usage anomalies

Audit logs for every Copilot interaction (query + response metadata)

Third-party AI site visits via browser extension and device onboarding

What Purview Does Not Monitor

Third-party AI tools used on unmanaged devices without Purview browser extension

AI agent autonomous actions across multiple apps (CoWork workflows)

Cross-tenant data exposure from multi-model AI processing

Shadow AI tool adoption on personal devices outside M365 ecosystem

Governance policy compliance for custom Copilot Studio agents at scale

The Recommendations panel is where compliance officers should start. Purview evaluates your existing policies (DLP, Communication Compliance, Insider Risk) against current AI regulations, including ISO 42001 and the NIST AI Risk Management Framework. Each recommendation links to a guided walkthrough. If you have not enabled audit logging for AI interactions, the panel walks you through it. If you lack a Communication Compliance policy for Copilot, it offers a one-click creation path that puts the policy in simulation mode first, so you can review matches before enforcement.

The Reports section breaks AI usage into two categories: Microsoft Copilot Experiences and Enterprise AI Apps. For each category, you see total interactions over time, sensitive interactions (where Copilot accessed labeled content), and total visits to third-party generative AI sites. These reports populate after audit logging is active and typically take 24 hours for initial data to appear. For a credit union with 150 employees using Copilot, this section answers a basic question your examiner will ask: how many AI interactions happen per day, and how many touch sensitive data?

The Activity Explorer gives your compliance team granular visibility into individual interactions. Each audit record includes the user identity, timestamp, application host (Word, Excel, Teams, Outlook, or BizChat), and a list of every resource Copilot accessed to generate its response. That resource list includes file names, SharePoint site URLs, and sensitivity label IDs. If a user asked Copilot to summarize a folder of loan documents, Activity Explorer shows you exactly which files Copilot read and whether any carried a sensitivity label.
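To make the shape of those audit records concrete, here is a minimal sketch that parses one and flags labeled resources. The field names (`UserId`, `CopilotEventData`, `AccessedResources`, `SensitivityLabelId`) follow the general shape of Purview audit exports but are simplified for illustration, not an exact schema, and the record itself is fabricated.

```python
import json

# Illustrative CopilotInteraction audit record. Field names follow the general
# shape of Purview audit exports but are simplified, not an exact schema.
record_json = """
{
  "UserId": "lofficer@examplecu.org",
  "CreationTime": "2026-04-01T14:22:05Z",
  "Operation": "CopilotInteraction",
  "CopilotEventData": {
    "AppHost": "Word",
    "AccessedResources": [
      {"Name": "Q1-loan-pipeline.docx",
       "SiteUrl": "https://examplecu.sharepoint.com/sites/lending",
       "SensitivityLabelId": "lbl-confidential-001"},
      {"Name": "branch-setup-guide.docx",
       "SiteUrl": "https://examplecu.sharepoint.com/sites/it",
       "SensitivityLabelId": null}
    ]
  }
}
"""

record = json.loads(record_json)
event = record["CopilotEventData"]

# Flag every resource Copilot read that carried a sensitivity label.
labeled = [r["Name"] for r in event["AccessedResources"] if r["SensitivityLabelId"]]
print(f"{record['UserId']} via {event['AppHost']} touched "
      f"{len(labeled)} labeled file(s): {labeled}")
```

A compliance reviewer running this kind of pass over an exported day of records gets the answer Activity Explorer shows in the portal: which files Copilot read, and which of them were labeled.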

The Data Assessments panel addresses oversharing before it becomes an AI problem. Purview now runs a weekly automatic assessment of all SharePoint sites used by Copilot, flagging sites where permissions are broader than they should be. You can also create custom assessments targeting specific users or site collections. Each flagged site includes a Protect tab with one-click remediation: restrict the site from Copilot using SharePoint Restricted Content Discoverability, or create an auto-labeling policy for sensitive content found on the site.

1. Enable Audit: Verify Purview audit logging is active (default for new tenants). This powers all AI interaction tracking.

2. Onboard Devices: Install the Purview browser extension and onboard endpoints. Required for third-party AI site monitoring.

3. Review Reports: Wait 24 hours, then review Copilot interaction counts, sensitive data access, and third-party AI visits.

4. Run Data Assessment: Scan SharePoint sites for oversharing. Remediate with restricted discoverability or auto-labeling.

5. Create Policies: Use one-click policy creation for DLP, Insider Risk, and Communication Compliance for Copilot.

The five-step setup sequence every compliance team should follow for Microsoft Copilot oversight.

The Credential Scanning Feature Your DLP Just Got

In March 2026, Microsoft expanded Purview's DLP capabilities to include credential scanning. This is separate from Entra ID Protection's leaked credential detection. Purview credential scanning looks for credentials, API keys, connection strings, and authentication tokens stored in documents, emails, and SharePoint sites across your tenant.

Why Credential Scanning Matters for AI Governance

When Copilot summarizes a document that contains an embedded API key or database connection string, it can surface that credential in a chat response visible to anyone in the conversation. Purview's credential scanning identifies these exposures before Copilot finds them. Combined with sensitivity labels, this closes one of the most practical data leakage pathways in AI-enabled environments. The same risk class is what makes prompt injection so dangerous when combined with oversharing.

The credential scanning feature works through Purview's existing DLP policy framework. You create a policy targeting Exchange, SharePoint, and OneDrive locations, then add conditions that detect credential patterns. Purview ships with built-in sensitive information types for common credential formats: API keys with standard prefixes, Azure Storage connection strings, SQL Server connection strings, and generic authentication tokens. When a match triggers, the alert flows into the same Purview compliance dashboard your team already monitors for other DLP violations.
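Those built-in sensitive information types are, at their core, pattern detectors. As a rough illustration of the kind of matching involved, here is a sketch with simplified stand-in regexes; Microsoft's actual SIT definitions are more elaborate, layering corroborating keywords and confidence levels on top of the base patterns.

```python
import re

# Simplified stand-ins for credential patterns. Microsoft's built-in SITs use
# more elaborate definitions with corroborating evidence and confidence levels.
CREDENTIAL_PATTERNS = {
    "Azure Storage connection string": re.compile(
        r"DefaultEndpointsProtocol=https?;AccountName=[^;]+;"
        r"AccountKey=[A-Za-z0-9+/=]{20,}"),
    "SQL Server connection string": re.compile(
        r"Server=[^;]+;Database=[^;]+;User Id=[^;]+;Password=[^;]+",
        re.IGNORECASE),
    "Generic API key assignment": re.compile(
        r"(?i)\b(api[_-]?key|secret|token)\b\s*[:=]\s*['\"]?[A-Za-z0-9_\-]{16,}"),
}

def scan_for_credentials(text: str) -> list[str]:
    """Return the names of credential patterns found in a document body."""
    return [name for name, pattern in CREDENTIAL_PATTERNS.items()
            if pattern.search(text)]

doc = ("To mount the share, use DefaultEndpointsProtocol=https;"
       "AccountName=branchfiles;AccountKey=abc123DEF456ghi789JKL012mno345PQ== "
       "as shown above.")
print(scan_for_credentials(doc))  # ['Azure Storage connection string']
```

The point of the sketch is the workflow, not the regexes: a match anywhere in a document body is enough to route the file into the same DLP alert queue as any other violation.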

For financial institutions, credential exposure is not theoretical. Development teams embed database connection strings in SharePoint-hosted documentation. IT administrators paste Azure AD app registration secrets into Teams channels for quick sharing. Mortgage workflow configuration files contain LOS integration credentials. Copilot treats all of this content as fair game when a user asks it to summarize or search across their accessible files. Credential scanning catches these before Copilot can surface them.

Without Credential Scanning

A loan officer asks Copilot to summarize the IT setup documentation for a new branch office. The summary includes an Azure Storage connection string embedded in paragraph four of the setup guide. That connection string is now in a Teams chat visible to the entire branch setup team.

With Credential Scanning Active

Purview DLP flags the setup guide as containing credential data. The sensitivity label prevents Copilot from processing the document. The IT admin receives an alert to remove or vault the connection string. The loan officer's summary pulls from clean documentation only.

Configuring DLP Policies for Copilot

Purview offers three distinct DLP policy types, two generally available and one in preview, that control what Copilot can access and process. All three use the Microsoft 365 Copilot and Copilot Chat policy location, which became its own dedicated scope in 2025. When you select this location, all other locations (Exchange, SharePoint, Devices) are disabled for that policy, so you need separate policies for Copilot-specific and traditional DLP.

Sensitivity label-based
What it blocks: Copilot cannot process files or emails with specified sensitivity labels. Items may still appear in citations, but content is excluded from responses.
Status: GA
License requirement: M365 E5, Business Premium, or Purview add-on

Sensitive info type (SIT) in prompts
What it blocks: Copilot refuses to respond when a user types a prompt containing specified SITs (SSN, credit card, bank account numbers). Blocks internal Graph grounding and external web search.
Status: GA (March 2026)
License requirement: Any M365 tenant with Copilot access (E1, E3, E5)

Web search restriction for SITs
What it blocks: Prevents Copilot from sending prompts containing SITs to external web search. Copilot still responds using internal Graph data if licensed.
Status: Preview (March 2026)
License requirement: Any M365 tenant with Copilot access

The sensitivity label-based policy is the foundation. You create a custom DLP policy, select the Microsoft 365 Copilot and Copilot Chat location, add a Content Contains condition with your sensitivity labels (typically Highly Confidential and Confidential), and set the action to Prevent Copilot from processing content. Deploy to a pilot group first. Microsoft notes that policy changes can take a few hours to propagate across all Copilot endpoints.

The SIT-based prompt policy is the newer capability. It scans what users type into Copilot in real time. If someone pastes a Social Security number, credit card number, or bank account number into a Copilot prompt, the policy blocks the response entirely. This works with all 300+ Microsoft built-in sensitive information types and custom SITs you create, with one exception: SITs created through document fingerprinting are not supported in Copilot DLP policies.

For GLBA-regulated financial institutions, the minimum DLP configuration for Copilot should include four sensitive information types: Social Security numbers, bank account numbers, credit card numbers, and Individual Taxpayer Identification Numbers (ITINs). These are the same four GLBA types that Guardian's DLP stack monitors. Adding them to a Copilot DLP policy ensures that if a user types member data into a Copilot prompt, the system stops the response before Copilot processes it.
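To see what detecting those four types involves, here is a simplified sketch. The patterns are illustrative stand-ins: Microsoft's built-in SITs add keyword corroboration and confidence scoring, and a production detector would need far more care (note, for instance, that an ITIN also matches the naive SSN pattern, since ITINs use the same 3-2-4 format).

```python
import re

def luhn_valid(number: str) -> bool:
    """Luhn checksum, the standard validity test for credit card numbers."""
    digits = [int(d) for d in number if d.isdigit()]
    checksum = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:        # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0

# Simplified detectors for GLBA sensitive info types. An ITIN also matches the
# SSN pattern below; real SITs use additional evidence to disambiguate.
SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
ITIN_RE = re.compile(r"\b9\d{2}-\d{2}-\d{4}\b")   # ITINs always begin with 9
CARD_RE = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def classify_prompt(prompt: str) -> set[str]:
    hits = set()
    if SSN_RE.search(prompt):
        hits.add("SSN")
    if ITIN_RE.search(prompt):
        hits.add("ITIN")
    for m in CARD_RE.finditer(prompt):
        if luhn_valid(m.group()):
            hits.add("Credit card")
    return hits

print(classify_prompt("Summarize the dispute for card 4111 1111 1111 1111"))
```

The Luhn check is why a SIT-based policy can block a real card number while ignoring a 16-digit loan reference: a random digit string fails the checksum roughly nine times out of ten.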

One detail compliance teams miss: DLP policies for Copilot are separate from your existing DLP policies. Creating a new Exchange + SharePoint DLP policy does not automatically apply to Copilot. You need a dedicated policy with the Copilot location selected. The two types of Copilot DLP policies (label-based and SIT-based) also require separate policies because they use different detection mechanisms.

The Agent 365 Governance Layer

Purview handles the Microsoft-native monitoring. Agent 365 extends governance into the territory Purview does not cover: third-party AI tools, autonomous agent actions, multi-model data processing, and the cross-application workflows that Copilot CoWork introduces.

Agent 365 reaches general availability in May 2026 at $15 per user per month (included in the E7 Frontier Suite at $99 per user per month). It adds three governance capabilities that Purview lacks.

Manage: Agent Inventory

Every AI agent in your tenant gets a built-in identity, registered in Entra ID. Agent 365 maintains a real-time inventory of all agents: Copilot Studio custom agents, third-party integrations, and CoWork autonomous workflows. Each agent inherits your existing Conditional Access and DLP policies through its Entra identity.

Govern: Lifecycle Controls

Every agent requires a human sponsor accountable for its actions. Agent 365 tracks agent lifecycle from creation through retirement, detects orphaned agents whose sponsors have left the organization, and enforces access packages that limit what data each agent can reach. This is the governance gap examiners will ask about when CoWork agents start acting autonomously.

Secure: Behavioral Detection

Conditional Access policies now apply to agents, not just users. Agent 365 monitors for risky agent behavior: unusual data access patterns, cross-application actions that exceed the agent's defined scope, and network requests to unauthorized endpoints. Continuous risk assessment scores each agent the same way Entra ID scores user sign-in risk.

For financial institutions, the governance gap is not "can we monitor Copilot?" Purview answers that. The gap is "can we monitor everything AI touches in our environment?" Agent 365 plus Purview is the complete answer. Guardian's Productivity Insights layer tracks all AI tool usage, including Copilot, providing the full monitoring evidence that examiners at FFIEC-regulated institutions expect.

The three-stage governance pipeline that turns Copilot audit data into examiner-ready evidence.

The practical question for most ABT clients: do you need Agent 365 today? If your institution uses only Copilot within M365, Purview covers your monitoring needs. If you are deploying custom Copilot Studio agents, connecting third-party AI tools like Claude or ChatGPT Enterprise to your tenant, or planning to use CoWork for autonomous workflows, Agent 365 fills the governance gap that Purview cannot. ABT's Copilot readiness assessment maps your current license to the governance features available at each tier.

2024
Purview DLP for M365 Copilot (GA)

Block Copilot from processing files with specific sensitivity labels. Basic usage audit logging. Label-based enforcement only.

2025
Insider Risk Management + AI Detectors

Anomalous AI usage patterns trigger risk scoring. DSPM for AI dashboard launched. One-click policy creation. Activity Explorer for Copilot interactions.

March 2026
SIT-Based Prompt DLP + Credential Scanning

Real-time scanning of Copilot prompts for sensitive info types. Credential detection in documents. Web search restriction for SIT-containing prompts (preview). Palo Alto Prisma and Island Browser integrations.

May 2026
E7 Frontier Suite with Agent 365

Full governance stack: Purview + Agent 365 + CoWork monitoring. Agent inventory, lifecycle management, behavioral risk scoring. Multi-model AI governance for Claude and GPT within tenant.

The Examiner Conversation You Need to Prepare For

FFIEC examiners are increasingly asking about AI governance. The questions follow a pattern: What AI tools have you deployed? What data can they access? How do you monitor their usage? What controls prevent data leakage? How do you document AI-related incidents? The same FFIEC examination framework community banks already know now covers AI oversight, so these questions land in every IT exam that touches Copilot.

Purview audit logs answer questions two through four. Agent 365 covers question one (complete inventory) and question five (incident documentation for autonomous agent actions). The combination gives your compliance team evidence rather than explanations.

The audit log is the single most important piece of evidence. Every Copilot interaction generates a record with the operation type (CopilotInteraction for Microsoft Copilot, ConnectedAIAppInteraction for custom agents, AIAppInteraction for third-party AI), the user principal name, the timestamp, the application host (Word, Excel, Teams, Outlook, or BizChat), and a full list of every resource Copilot accessed. That resource list includes SharePoint site URLs, file names, file types, and sensitivity label IDs.

Audit Log Export

Export 90 days of CopilotInteraction records from Purview Audit. Filter by RecordType = CopilotInteraction. Show interaction volume by user, app, and department.

Sensitive Data Access Report

Pull the DSPM for AI Reports showing how many Copilot interactions touched files with sensitivity labels. Break down by label type: Confidential vs. Internal vs. Highly Confidential.

DLP Policy Summary

Document every active DLP policy scoped to the Copilot location. List the sensitivity labels and SITs each policy enforces. Include the policy creation date and last match count.

Data Assessment Results

Run a fresh DSPM data assessment for all SharePoint sites. Print the oversharing report showing sites flagged, remediation actions taken, and remaining exposure.

Third-Party AI Inventory

Pull the Enterprise AI Apps report from DSPM for AI. List every third-party AI application detected in your tenant, the number of users, and the volume of interactions.

Incident Response Documentation

Document your process for handling AI-related DLP violations. Include escalation steps, who reviews Communication Compliance alerts, and how Insider Risk Management AI signals are triaged.

To search for Copilot interactions in the Purview audit log, navigate to purview.microsoft.com, select Audit, and configure your search. Under Activities, search for "Copilot" and select "Interacted with Copilot." For precise filtering, use the Activities - operation names field and enter CopilotInteraction as the operation. Set your RecordType to CopilotInteraction. You can scope the search to specific users or leave it open for all users across the tenant.

Standard audit log retention is 180 days. Audit Premium (E5 Compliance licensing) extends retention to one year, with optional audit log retention policies for longer periods. For examiner preparation, export your audit log search results to CSV and format them into a report showing daily interaction volume, top users by interaction count, applications used, and any interactions that triggered DLP or sensitivity label events.
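Once you have the CSV export, the report itself is a small aggregation job. A minimal sketch, assuming an export with `CreationDate`, `UserIds`, and `Operations` columns (real Purview exports carry more columns and the sample rows below are fabricated):

```python
import csv
import io
from collections import Counter
from datetime import datetime

# Illustrative three-row export; a real Purview audit CSV has more columns,
# but these three fields are all this summary needs.
export = io.StringIO("""CreationDate,UserIds,Operations
2026-04-01T09:14:00Z,lofficer@examplecu.org,CopilotInteraction
2026-04-01T10:02:00Z,tmanager@examplecu.org,CopilotInteraction
2026-04-02T08:45:00Z,lofficer@examplecu.org,CopilotInteraction
""")

daily = Counter()    # interactions per calendar day
by_user = Counter()  # interactions per user

for row in csv.DictReader(export):
    day = datetime.fromisoformat(row["CreationDate"].replace("Z", "+00:00")).date()
    daily[day] += 1
    by_user[row["UserIds"]] += 1

print("Daily interaction volume:", dict(daily))
print("Top users:", by_user.most_common(3))
```

The same two counters, pointed at a 90-day export, produce the daily-volume and top-users tables an examiner expects to see in the evidence packet.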

If your examiner asks for the actual content of Copilot prompts and responses (not just metadata), you need eDiscovery. Copilot interactions are stored like Teams messages in user mailboxes. Using Content Search or eDiscovery, search Exchange mailboxes with the condition type set to Copilot interactions. This generates a KQL query that returns the actual prompt and response text, which you can then review, collect, and export.

Partner Intelligence: AI Security Incidents Are Doubling

AI-related data security incidents are increasing as more organizations deploy generative AI tools. For the 750+ financial institutions ABT serves, these numbers represent the regulatory scrutiny trajectory: examiners follow the incident data.

Source: Microsoft security research

The Three-Step Baseline

If you do nothing else, do these three things before your next exam: (1) verify Purview audit logging is active for Copilot, (2) apply sensitivity labels to your most sensitive data classifications, and (3) create a DLP policy scoped to the Copilot location that prevents processing of labeled content. These three steps take a few hours and give you the baseline governance evidence your examiner expects. Everything else in this article builds on that foundation.

Your examiner will ask how you monitor Copilot. The right answer is a dashboard, not a promise.

Frequently Asked Questions

What licensing do you need for Purview AI governance?

Basic Purview DLP for Copilot is included with Microsoft 365 E5 and Business Premium. The SIT-based prompt DLP that reached general availability in March 2026 is available to any M365 tenant with Copilot access, including E1 and E3. Advanced features like Insider Risk Management, enhanced audit retention (1 year vs. 180 days), and credential scanning may require Purview add-on licenses or the E7 Frontier Suite. Your ABT licensing specialist can map your current plan to the governance features you need.

Does Purview log Copilot interactions in Teams and other M365 apps?

Yes. Purview audit logs capture Copilot interactions across all M365 applications including Teams, SharePoint, OneDrive, Outlook, Word, Excel, and PowerPoint. Each interaction record includes the user identity, timestamp, application host, and a list of every resource Copilot accessed with file names, SharePoint URLs, and sensitivity label IDs.

How do Purview and Agent 365 differ, and do you need both?

Purview monitors Microsoft-native AI (Copilot) within the M365 ecosystem and can detect visits to third-party AI sites. Agent 365 adds agent inventory management, lifecycle governance with human sponsors, and behavioral risk scoring for autonomous AI agents. Agent 365 manages and governs agents but does not deploy or execute them. For full AI governance at a regulated financial institution, you need both working together.

How do you find Copilot interactions in the audit log?

Navigate to purview.microsoft.com, select Audit, and configure a search. Under Activities, select "Interacted with Copilot" or enter CopilotInteraction as the operation name. Set the RecordType filter to CopilotInteraction. Each record shows the user, timestamp, application host, and every file or resource Copilot accessed. Standard retention is 180 days. Audit Premium with E5 extends to one year.

Does credential scanning work with existing DLP policies?

Yes. Credential scanning integrates with your existing Purview DLP policy framework. You can create policies that detect and alert on credentials found in documents, emails, and SharePoint sites. The alerts flow into the same Purview compliance dashboard your team already monitors for other DLP violations.

Which sensitive information types should a GLBA-regulated institution configure first?

At minimum, configure your Copilot DLP policy to detect Social Security numbers, bank account numbers, credit card numbers, and Individual Taxpayer Identification Numbers (ITINs). These are the four Microsoft built-in GLBA sensitive information types. They match the same types Guardian's DLP stack monitors, giving you consistent protection across both your Copilot interactions and your broader tenant data.

Can You Answer Your Examiner's AI Governance Questions Today?

ABT deploys Purview AI governance, Agent 365 monitoring, and Guardian's Productivity Insights as a unified stack for credit unions, community banks, and mortgage companies. 750+ financial institutions trust ABT to make their AI deployment examiner-ready from day one.

Justin Kirsch

CEO, Access Business Technologies

Justin Kirsch has built compliance-first technology programs for financial institutions since 1999. As CEO of Access Business Technologies, the largest Tier-1 Microsoft Cloud Solution Provider dedicated to financial services, he helps more than 750 credit unions, community banks, and mortgage companies deploy AI governance frameworks that satisfy regulators and enable productivity.