OneDrive file sharing security: how to find every externally shared file (and fix it in one click)
Here's the question most IT admins can't answer without a week of work: right now, in your OneDrive tenant, how many files are shared externally, what's in them, and should those shares still exist?
The question sounds like it should have an easy answer. It doesn't, because the native tooling isn't built for it and the data exposure accrues one careless click at a time.
This post is the audit playbook. It covers what native Microsoft 365 tools can do, where they fall short, and how cloud DLP closes the gap. If you run a Microsoft 365 shop, this is the workflow you want.
The problem you probably already have
Two years of work-from-home, hybrid teams, and cross-functional collaboration produced a specific failure mode in OneDrive and SharePoint. Users created external share links casually, often with "Anyone with the link" permissions, and never revoked them.
Every first audit we've helped run shows the same pattern:
- Thousands of files are externally shared.
- Hundreds of those shares are to personal email addresses (Gmail, Yahoo, Outlook.com).
- Some of those addresses belong to recipients who have since left the company the file was originally shared with.
- A non-trivial slice contains customer PII, contract drafts with financial terms, source code, or exec-level HR info.
No user is being malicious. They shared a file for a specific meeting in 2023, forgot about it, and the link has been live ever since.
Native Microsoft 365 tooling: what you can do
Microsoft has been adding admin visibility to Purview and the Microsoft 365 Compliance Center. Here's what's genuinely usable.
Sharing reports. In the SharePoint admin center, you can run reports that list externally shared files across sites. Useful for a top-down view. Typically exported as CSV.
Purview Data Loss Prevention policies. You can define DLP rules that flag or block sharing of content matching certain patterns (SSN, credit card, custom classifiers). Useful for ongoing enforcement.
Sensitivity labels. Microsoft's answer to data classification. Labels can enforce protections (encryption, watermarks, access controls). Setup takes time. Adoption takes more.
Audit log. Tracks sharing events over time. Retention varies by license tier.
Microsoft Cloud App Security / Defender for Cloud Apps. Broader CASB-style controls. More expensive tier.
For orgs already committed to the Microsoft ecosystem, these are the primary tools.
A 10-minute native audit workflow
If you want to run a quick audit today with what you already have, here's the shortest path.
- Open the SharePoint admin center. Navigate to Reports > Sharing.
- Export the externally shared files report. It usually covers a rolling 30-day window. Pull it as CSV.
- Sort by recipient email domain. Count unique external domains. Flag personal email domains (gmail.com, yahoo.com, outlook.com, icloud.com, etc.).
- Spot-check the top 20 most-shared files. Open a sample. Judge sensitivity by eye.
- Pull the top users by external share count. These are either power users (heavy legitimate sharing) or risk concentrations (sharing habits that need attention).
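The sorting and flagging steps above are easy to script once you have the CSV. Here's a minimal sketch; the column names (`Resource Path`, `Shared With`, `Shared By`) are assumptions, so check them against your tenant's actual export headers before running it:

```python
import csv
from collections import Counter

# Domains treated as personal / consumer email (extend as needed).
PERSONAL_DOMAINS = {"gmail.com", "yahoo.com", "outlook.com",
                    "icloud.com", "hotmail.com"}

def triage(csv_path):
    """Summarize an exported sharing report: unique external domains,
    shares to personal email addresses, and top sharers."""
    domains = Counter()
    sharers = Counter()
    personal = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # Column names are assumptions -- adjust to your export.
            recipient = row.get("Shared With", "").strip().lower()
            if "@" not in recipient:
                continue  # skip link-only shares with no named recipient
            domain = recipient.split("@")[-1]
            domains[domain] += 1
            sharers[row.get("Shared By", "unknown")] += 1
            if domain in PERSONAL_DOMAINS:
                personal.append((row.get("Resource Path", ""), recipient))
    return {
        "unique_external_domains": len(domains),
        "top_domains": domains.most_common(10),
        "personal_email_shares": personal,
        "top_sharers": sharers.most_common(10),
    }
```

Twenty lines of Python turns the manual filter-and-count steps into something you can re-run on every monthly export and diff against last month's numbers.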
This gets you 20% of the picture in 10 minutes. It tells you whether the problem is big, medium, or "not really an issue for us."
Where the 10-minute audit falls short
The gaps are real. Most of them show up within the first hour.
Content inside files isn't classified. The report tells you "this file is shared externally." It doesn't tell you whether the file contains customer PII, contract terms, or your grandmother's cookie recipe. You have to open each one, or stand up a classification pipeline.
Nested content is invisible. A shared PowerPoint with an embedded Excel that has a PII tab looks the same in the report as a shared presentation about the company picnic.
Public-link shares without a specific recipient aren't always attributed. If someone created an "anyone with the link" share and emailed the link from Outlook, the audit log may not tell you who the link ultimately reached.
Ex-employee ownership. Files owned by departed employees often keep their sharing permissions. The reports don't always surface this cleanly.
Teams message attachments. Files attached in Teams chats live in the sender's OneDrive and inherit sharing permissions from there. Non-obvious in the default reports.
Personal Gmail recipients aren't flagged by default. You have to manually filter.
Shared folders, not files. A folder shared externally cascades permissions to everything inside. The report sometimes surfaces the folder without enumerating the downstream exposure.
None of these are criticisms of Microsoft specifically. They're natural consequences of what the tool is built for (admin operations) versus what you want right now (a risk audit).
How cloud DLP closes the gap
A purpose-built cloud DLP sits on top of Microsoft 365 (or Google Workspace) via API integration and does four things the native tools don't do well.
Continuous content scanning. Every file, on creation and on modification. Not a 30-day rolling window. The full tenant, kept current.
LLM-based content classification. Instead of regex for SSNs, the classifier reads the document the way a human reader would. Catches paraphrased sensitive content, screenshots containing PII, nested documents, and source code. Classifies into PII, PCI, PHI, and IP categories.
Cross-surface finding aggregation. Same file, same sharing, aggregated with any activity from endpoint DLP or SWG logs. Gives you one place to see everything about a risk event.
One-click remediation. Revoke external sharing. Transfer ownership. Quarantine files. Notify the original sharer with templated language. All from the findings view.
What CASB Neural adds
CASB Neural is dope.security's cloud DLP for OneDrive and Google Drive. Three things worth knowing.
AI-powered classification. LLM-based. Tuned to PII, PCI, PHI, and intellectual property. Handles the unstructured content that breaks regex DLP.
One-click remediation. From the alert view, you can revoke external sharing, reassign ownership to a live user, or quarantine a file. No ticket-opening required.
Continuous monitoring. The scan runs continuously, not on a weekly schedule. New shares get inspected as they happen.
Part of the dope.security SSE platform, so it lives in the same console as dope.SWG, Dopamine DLP, and Cloud Application Control. One alert view for the whole stack.
A 90-day remediation plan
The full audit process, at a realistic pace.
Days 1–7: Get visibility. Stand up the scan. Let it run. Don't act on findings yet.
Days 8–21: Triage. Sort findings by severity (sensitivity times exposure). Start with the top 100. Remediate the obvious high-risk items: public links to files with customer PII, PHI, or financial data.
Days 22–45: Policy. Write the sharing policy. Common elements: no "anyone with the link" shares for sensitive content, external shares expire after 90 days by default, sensitive content inherits a label that restricts sharing.
Days 46–60: Roll out labels and training. Apply sensitivity labels to the most sensitive content. Run a 15-minute training for managers. Keep it human.
Days 61–90: Automate. Set up continuous remediation for known failure patterns. Automatic revocation for links older than 180 days on sensitive content, for example.
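The "links older than 180 days on sensitive content" rule is a good first automation target. Below is a sketch of the selection logic, assuming your scanner records each share's creation date in the fields shown; the `revoke` helper is a hypothetical wrapper around the Microsoft Graph permission-delete endpoint and needs a real access token and the drive/item/permission IDs for the share:

```python
from datetime import datetime, timedelta, timezone

MAX_LINK_AGE = timedelta(days=180)

def stale_sensitive_shares(shares, now=None):
    """Select external shares on sensitive content older than the cutoff.
    Each share dict is assumed to carry 'created' (ISO 8601),
    'sensitive' (bool), and 'external' (bool)."""
    now = now or datetime.now(timezone.utc)
    stale = []
    for s in shares:
        created = datetime.fromisoformat(s["created"])
        if s["sensitive"] and s["external"] and now - created > MAX_LINK_AGE:
            stale.append(s)
    return stale

def revoke(share, token):
    """Hypothetical revocation via Microsoft Graph
    (DELETE /drives/{id}/items/{id}/permissions/{id}) -- untested sketch."""
    import urllib.request
    url = (f"https://graph.microsoft.com/v1.0/drives/{share['drive_id']}"
           f"/items/{share['item_id']}/permissions/{share['permission_id']}")
    req = urllib.request.Request(
        url, method="DELETE",
        headers={"Authorization": f"Bearer {token}"})
    urllib.request.urlopen(req)  # raises on a non-2xx response
```

Run the selection on a schedule, notify the original sharer, then revoke after a grace period. Keeping selection and revocation as separate functions lets you dry-run the rule for a few weeks before turning enforcement on.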
After 90 days, this becomes a weekly review with automated enforcement for the 80% of patterns you've already catalogued.
A companion for Google Drive
The same audit plays out on Google Drive with minor adjustments. The finding types are similar, the native tooling is different (Admin console, Data Loss Prevention rules in Workspace Enterprise), and CASB Neural supports both environments from the same console.
If you run a mixed environment, start with whichever is bigger and apply the playbook to the second one 30 days later.
Start with one scan
The fastest way to know if this matters in your org is to run one scan on one tenant. In most first scans, the number of findings is surprising. What actually gets budget approved is the specific file, shared with the specific person, containing the specific sensitive thing.