Expanding AI Beyond Copilot: Managing Security Risks with External LLMs
In our previous discussion, we focused on securing Microsoft Copilot, SharePoint AI, and Teams AI to ensure sensitive construction data remains protected within a Microsoft 365-controlled environment. However, as construction firms expand their AI strategies, many are considering external large language models (LLMs) such as OpenAI’s GPT, Google Gemini, Anthropic Claude, or other AI-powered analytics tools.
While Microsoft Copilot is designed to work within enterprise security policies, integrating third-party AI tools introduces new risks that Copilot does not inherently address.
Why Companies Using Copilot Are Considering External AI
While Microsoft Copilot is optimized for internal enterprise use, many construction firms are exploring third-party AI integrations for tasks such as industry benchmarking, market analytics, and automated compliance checks.
These integrations promise major benefits, but they also pose significant security and compliance challenges, especially when the external tools are used alongside Microsoft Copilot.
The Security Risks of Mixing Copilot with External AI Models
Unlike Copilot, which respects enterprise permissions, external AI models do not automatically follow Microsoft 365’s security policies. This creates potential security gaps:
1. Copilot Cannot Enforce Role-Based Access Controls on External AI Tools
Within Microsoft 365, Copilot only retrieves information that users already have permission to access. However, if Copilot's output is passed to an external AI system, that system may process, store, or surface the same data without honoring those permission boundaries.
Example Risk: A project manager asks Copilot:
“Summarize the latest material cost projections for our Sydney project.”
If Copilot is linked to an external LLM that processes industry reports, the proprietary cost figures in that query could leave the Microsoft 365 boundary and be processed where no enterprise permission model applies.
Without enterprise AI governance, companies risk exposing proprietary business data to external models.
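There is no built-in mechanism that carries Microsoft 365 permissions across to a third-party model, so enforcement has to happen in whatever integration layer the firm builds. As a minimal sketch, with hypothetical placeholder helpers (user_can_access, forward_to_external_llm) standing in for a real integration, a permission check before anything leaves the tenant might look like this:

```python
# Hypothetical permission table; a real check would query Microsoft Graph.
SENSITIVE_PERMISSIONS = {"pm.alice": {"cost_projections_sydney.xlsx"}}

def user_can_access(user: str, document: str) -> bool:
    """Placeholder for a real Microsoft 365 permission lookup."""
    return document in SENSITIVE_PERMISSIONS.get(user, set())

def forward_to_external_llm(prompt: str, documents: list[str]) -> str:
    """Stub for the transport to an approved external model."""
    return f"summary of {len(documents)} document(s)"

def safe_external_query(user: str, prompt: str, documents: list[str]) -> str:
    # Refuse to route anything the user cannot already see in Microsoft 365,
    # since the external model will not re-check those permissions itself.
    for doc in documents:
        if not user_can_access(user, doc):
            raise PermissionError(f"{user} may not send {doc} to an external model")
    return forward_to_external_llm(prompt, documents)

print(safe_external_query("pm.alice", "Summarize material cost projections",
                          ["cost_projections_sydney.xlsx"]))
```

In production, the permission lookup would typically go through Microsoft Graph rather than a hard-coded table, but the principle is the same: check entitlements before the data crosses the boundary, not after.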
2. Data Leakage Risks When Employees Use External AI Tools Alongside Copilot
Employees may unknowingly copy and paste sensitive company data into public AI models such as ChatGPT, Google Gemini, or Claude.
Unlike Copilot, which operates inside Microsoft's security perimeter, external AI models may store user inputs indefinitely. Sensitive project data could be retained by the provider, potentially used to train future models, or disclosed outside the company's control.
Example Risk: A compliance officer copies a construction permit application into an AI tool to generate a faster summary. If the AI provider retains query inputs, sensitive project data could be stored externally without the company’s knowledge.
3. External AI Models May Not Be Compliant with Construction Industry Regulations
Microsoft 365 and Copilot operate within ISO 27001, SOC 2, and GDPR-compliant environments, but external AI models may not meet the same security standards.
Failure to ensure compliance could lead to regulatory violations if sensitive project data is processed outside legal jurisdictions.
Example Risk: A firm integrates an AI-powered compliance checker that processes legal documents in an unapproved region. If a privacy regulator audits the company, it may face penalties for violating data residency laws.
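One low-cost safeguard is to verify, at startup, that every configured AI endpoint is deployed in an approved region before any document is sent to it. The sketch below assumes the deployment region is available from configuration; the region names and allow-list are illustrative only:

```python
# Illustrative allow-list of approved data-residency regions.
APPROVED_REGIONS = {"australiaeast", "australiasoutheast"}

def assert_endpoint_region(endpoint: str, region: str) -> None:
    """Refuse to start if an AI endpoint sits outside approved regions."""
    if region.lower() not in APPROVED_REGIONS:
        raise RuntimeError(
            f"{endpoint} is deployed in '{region}', outside the approved "
            "data-residency regions; refusing to send project data"
        )

# Passes: the endpoint is pinned to an approved Australian region.
assert_endpoint_region("https://example-ai.openai.azure.com", "australiaeast")
```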
Best Practices for Secure External AI Integration with Copilot
To ensure Copilot and third-party AI models work together securely, construction firms should:
1. Restrict Copilot from Pulling Data from External AI Tools by Default
Copilot should only retrieve information from pre-approved, enterprise-controlled sources within Microsoft 365, SharePoint, and Teams.
Example Solution: If an external AI model is used for industry benchmarking, limit its ability to process internal project data by enforcing data-sharing policies through Microsoft Defender.
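Tenant-level enforcement belongs in Microsoft Defender and network policy, but firms can add an application-level backstop: an outbound gateway that only forwards queries to pre-approved endpoints and refuses to attach internal SharePoint or Teams content. The endpoint URL and internal prefixes in this sketch are hypothetical placeholders:

```python
# Only these endpoints may receive outbound AI queries.
APPROVED_EXTERNAL_ENDPOINTS = {
    "https://benchmarking.example.com/v1/query",  # hypothetical benchmarking API
}

# Anything with these prefixes is internal content and must never be attached.
INTERNAL_PREFIXES = ("https://contoso.sharepoint.com/", "teams://")

def route_external_query(endpoint: str, prompt: str, attachments: list[str]) -> None:
    if endpoint not in APPROVED_EXTERNAL_ENDPOINTS:
        raise ValueError(f"Endpoint is not on the approved list: {endpoint}")
    internal = [a for a in attachments if a.startswith(INTERNAL_PREFIXES)]
    if internal:
        raise ValueError(f"Internal sources may not leave the tenant: {internal}")
    print(f"Forwarding query to {endpoint}")  # the real HTTP call would go here

route_external_query(
    "https://benchmarking.example.com/v1/query",
    "Average steel price per tonne, NSW, last quarter",
    attachments=[],  # no internal project files attached
)
```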
2. Use Private, Enterprise-Hosted AI Models Instead of Public AI APIs
Instead of relying on consumer-grade AI services (e.g., the public ChatGPT interface), construction firms should host models within their own cloud tenant, where data handling, retention, and residency controls remain under enterprise governance.
Example Solution: Rather than using OpenAI’s public API, a firm can deploy GPT-4 within Azure OpenAI to maintain full data security and privacy controls.
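With the official openai Python package (v1+), pointing at an Azure OpenAI deployment instead of the public API is a small code change. The endpoint, API version, and deployment name below are placeholders for the firm's own resource:

```python
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",  # placeholder
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",  # use the version your resource supports
)

response = client.chat.completions.create(
    # In Azure this is the *deployment* name you chose, not the raw model id.
    model="gpt-4",
    messages=[
        {"role": "user",
         "content": "Summarize common causes of schedule slippage on commercial builds."},
    ],
)
print(response.choices[0].message.content)
```

Because the resource lives in the firm's own Azure tenant, prompts and responses are governed by the tenant's security and residency configuration rather than the terms of a public consumer service.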
3. Establish AI Compliance Guidelines for Employees
Employees must understand what data can and cannot be shared with AI models.
Example Solution: Firms should implement an “AI Acceptable Use Policy” that explicitly outlines permissible and restricted AI interactions in company workflows.
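One way to keep such a policy enforceable rather than purely aspirational is to capture it as machine-readable rules that internal tooling can check. The tool classes and use cases below are purely illustrative, not a recommended policy:

```python
# The firm's AI Acceptable Use Policy expressed as data, so tooling can
# consult the same rules the employee handbook describes.
ACCEPTABLE_USE = {
    "public_ai": {
        "allowed": {"general industry research", "drafting generic text"},
        "forbidden": {"client contracts", "bids and estimates", "permit applications"},
    },
    "enterprise_ai": {
        "allowed": {"project summaries", "internal document search"},
        "forbidden": {"exporting results to personal accounts"},
    },
}

def is_permitted(tool_class: str, use_case: str) -> bool:
    """A use case is permitted only if it is explicitly on the allowed list."""
    rules = ACCEPTABLE_USE.get(tool_class, {})
    return use_case in rules.get("allowed", set())

print(is_permitted("public_ai", "bids and estimates"))     # False
print(is_permitted("enterprise_ai", "project summaries"))  # True
```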
4. Monitor AI Queries & Implement Data Loss Prevention (DLP) Policies
Microsoft 365 provides audit logs for Copilot, allowing security teams to track AI-generated queries and responses.
Example Solution: If an employee submits sensitive project estimates into an external AI tool, Microsoft Purview can detect and prevent the data transfer.
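The actual enforcement point here is Microsoft Purview, but the underlying idea is pattern matching on outbound content. The sketch below illustrates the concept with two invented patterns; it is not Purview's API or its real sensitive-information types:

```python
import re

# Invented patterns standing in for real sensitive-information types.
SENSITIVE_PATTERNS = {
    "cost estimate": re.compile(r"\$\s?\d{1,3}(?:,\d{3})+(?:\.\d{2})?"),
    "Australian Business Number": re.compile(r"\b\d{2} \d{3} \d{3} \d{3}\b"),
}

def scan_outbound_prompt(prompt: str) -> list[str]:
    """Return the names of sensitive patterns detected in an outbound prompt."""
    return [name for name, pattern in SENSITIVE_PATTERNS.items()
            if pattern.search(prompt)]

hits = scan_outbound_prompt("Total estimate: $4,750,000 for the Sydney tower.")
if hits:
    print(f"Blocked outbound AI request; matched patterns: {hits}")
```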
Copilot Is Secure, but External AI Requires Extra Safeguards
Microsoft Copilot is built for enterprise security, but when construction firms integrate external AI solutions, they must address new risks related to data privacy, compliance, and unauthorized AI processing.