diff --git a/.gitignore b/.gitignore
index 5c27a8e3a..170dbd82e 100644
--- a/.gitignore
+++ b/.gitignore
@@ -119,6 +119,17 @@ env.bak/
venv.bak/
**/secrets.env
+# DevTunnel and sensitive configuration files
+**/crm-mcp-connector.yaml
+**/*devtunnel*.yaml
+**/*.local.yaml
+**/*.personal.yaml
+
+# Additional sensitive files
+**/*.secret
+**/*.private
+**/local-config.*
+
# Spyder project settings
.spyderproject
.spyproject
diff --git a/MCP-COPILOTSTUDIO.md b/MCP-COPILOTSTUDIO.md
new file mode 100644
index 000000000..9c8a53621
--- /dev/null
+++ b/MCP-COPILOTSTUDIO.md
@@ -0,0 +1,159 @@
+# Exposing MCP Server via DevTunnel and Connecting to Copilot Studio
+
+This guide walks you through exposing your local MCP (Model Context Protocol) server using Microsoft DevTunnel and connecting it to Microsoft Copilot Studio using a custom MCP connector.
+
+## 📋 Prerequisites
+
+Before starting, ensure you have:
+- ✅ Python 3.11+ installed
+- ✅ DevTunnel CLI installed (`devtunnel`)
+- ✅ Access to Microsoft Copilot Studio
+- ✅ Azure subscription (for Copilot Studio)
+- ✅ The OpenAI Workshop repository cloned locally
+
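A quick way to confirm the CLI prerequisites are on your `PATH` (a sketch; adjust the command names if, for example, your Python is installed as `python3`):

```shell
# Report which prerequisite commands are available (informational only)
for cmd in python devtunnel curl; do
  if command -v "$cmd" >/dev/null 2>&1; then
    echo "OK: $cmd"
  else
    echo "MISSING: $cmd"
  fi
done
```
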
+## 🏗️ Architecture Overview
+
+```mermaid
+graph TB
+    A[Local MCP Server<br/>localhost:8000/mcp] --> B[DevTunnel<br/>Port Forwarding]
+    B --> C[Public URL<br/>https://xxx.use.devtunnels.ms/mcp]
+    C --> D[Copilot Studio<br/>Custom MCP Connector]
+    D --> E[AI Agent/Copilot<br/>with MCP Tools]
+
+    F[OpenAPI Spec<br/>crm-mcp-connector.yaml] --> D
+```
+
+## 🚀 Step 1: Start Your Local MCP Server
+
+### 1.1 Set Up Python Environment
+
+```bash
+# Navigate to the backend services directory
+cd OpenAIWorkshop/agentic_ai/backend_services
+
+# Create and activate virtual environment (if not already done)
+python -m venv ../../../venv
+source ../../../venv/Scripts/activate  # Windows (Git Bash)
+# source ../../../venv/bin/activate # Linux/Mac
+
+# Install required dependencies
+pip install fastmcp httpx rich pygments pydantic uvicorn cryptography authlib python-jose
+```
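
After installing, you can sanity-check that the key packages import cleanly (a sketch; swap in whichever packages your environment actually uses):

```shell
# Try importing each package; report missing ones instead of failing outright
for pkg in fastmcp httpx uvicorn pydantic; do
  python -c "import $pkg" 2>/dev/null && echo "OK: $pkg" || echo "MISSING: $pkg"
done
```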
+
+### 1.2 Start the MCP Service
+
+```bash
+# From the backend_services directory
+python mcp_service.py
+```
+
+You should see output similar to:
+```
+[09/01/25 09:00:01] INFO Starting MCP server 'Contoso Customer API as Tools' with transport 'streamable-http' on http://0.0.0.0:8000/mcp
+INFO: Started server process [12345]
+INFO: Waiting for application startup.
+INFO: Application startup complete.
+INFO: Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
+```
+
+### 1.3 Verify Local MCP Server
+
+Test your local server:
+```bash
+# Test the MCP endpoint (an HTTP error response such as 405/406 still confirms the server is listening)
+curl -X GET http://localhost:8000/mcp
+```
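
Because streamable-HTTP MCP servers speak JSON-RPC over POST, a plain GET may return an HTTP error even when everything is healthy. A more meaningful smoke test is to POST an `initialize` request (a sketch; the payload fields follow the MCP streamable-HTTP convention, and the `protocolVersion` value may need to match your server):

```shell
# Build a minimal JSON-RPC initialize payload
cat > /tmp/mcp-init.json <<'EOF'
{"jsonrpc": "2.0", "id": 1, "method": "initialize",
 "params": {"protocolVersion": "2024-11-05", "capabilities": {},
            "clientInfo": {"name": "smoke-test", "version": "0.1"}}}
EOF

# POST it to the local endpoint; prints the server's reply if it is up
curl -s -X POST http://localhost:8000/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  --data @/tmp/mcp-init.json || echo "server not reachable"
```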
+
+## 🌐 Step 2: Expose MCP Server via DevTunnel
+
+### 2.1 Create DevTunnel
+
+```bash
+# Create a new tunnel for your MCP server
+devtunnel create crm-mcp --allow-anonymous
+
+# Add port forwarding for port 8000
+devtunnel port create crm-mcp -p 8000
+```
+
+### 2.2 Start DevTunnel Host
+
+```bash
+# Start the tunnel host to expose your local server
+devtunnel host crm-mcp
+```
+
+You'll see output like:
+```
+Hosting port: 8000
+Connect via browser: https://p8er9sxt.use.devtunnels.ms:8000, https://p8er9sxt-8000.use.devtunnels.ms
+Inspect network activity: https://p8er9sxt-8000-inspect.use.devtunnels.ms
+
+Ready to accept connections for tunnel: crm-mcp.use
+```
+
+### 2.3 Test Public Access
+
+```bash
+# Test your public MCP endpoint
+curl -X GET https://p8er9sxt-8000.use.devtunnels.ms/mcp
+```
+
+**📝 Note**: Replace `p8er9sxt-8000.use.devtunnels.ms` with your actual tunnel URL.
+
+## 📄 Step 3: Prepare OpenAPI Specification
+
+The `crm-mcp-connector.template.yaml` file contains the OpenAPI specification for your MCP server. Copy it to `crm-mcp-connector.yaml` before editing; the working copy is git-ignored, so your tunnel URL stays out of version control.
+
+### 3.1 Update the Host URL
+
+**⚠️ Important**: Update the `host` field in `crm-mcp-connector.yaml` with your actual DevTunnel host (hostname only, no scheme or path):
+
+```yaml
+host: YOUR_ACTUAL_TUNNEL_URL.use.devtunnels.ms
+```
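
If you prefer not to edit the file by hand, the placeholder can be stamped in with `sed` (a sketch, run from `agentic_ai/backend_services`; the guard line creates a one-line stand-in template so the command also works outside the repo checkout):

```shell
# Use your real tunnel host (the value below is the example from Step 2)
TUNNEL_HOST="p8er9sxt-8000.use.devtunnels.ms"

# Create a one-line stand-in template if you are trying this outside the repo
[ -f crm-mcp-connector.template.yaml ] || \
  printf 'host: YOUR_DEVTUNNEL_URL.use.devtunnels.ms\n' > crm-mcp-connector.template.yaml

# Substitute the placeholder and write the git-ignored working copy
sed "s/YOUR_DEVTUNNEL_URL\.use\.devtunnels\.ms/${TUNNEL_HOST}/" \
  crm-mcp-connector.template.yaml > crm-mcp-connector.yaml
grep '^host:' crm-mcp-connector.yaml
# -> host: p8er9sxt-8000.use.devtunnels.ms
```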
+
+## 🤖 Step 4: Connect to Microsoft Copilot Studio
+
+### 4.1 Access Copilot Studio
+
+1. Navigate to [Microsoft Copilot Studio](https://copilotstudio.microsoft.com/)
+2. Sign in with your Azure/Microsoft 365 account
+3. Create a new copilot or select an existing one
+
+### 4.2 Add Custom MCP Connector
+
+1. **Open the Tools section of your agent**:
+   - Select **"Add a tool"**, then **"New tool"**
+
+2. **Choose the connector type**:
+   - Select **"Custom connector"**, then **"New custom connector"**
+   - Choose **"Import from an OpenAPI file"**
+
+3. **Upload the OpenAPI specification**:
+   - Rename `crm-mcp-connector.template.yaml` to `crm-mcp-connector.yaml` and update the host details
+   - Upload the `crm-mcp-connector.yaml` file
+   - Refresh and connect the newly added MCP tool in Copilot Studio
+
+## ✅ Verification
+
+If you've followed all steps correctly, you should now have:
+
+✅ Local MCP server running and accessible
+✅ Public DevTunnel exposing your MCP server
+✅ Custom connector configured in Copilot Studio
+✅ Working MCP actions in your copilot
+
+Your copilot can now access customer data, billing information, and knowledge base through the MCP protocol!
+
+## Support
+
+If you encounter issues:
+
+1. Check the MCP server logs for errors
+2. Verify DevTunnel connectivity
+3. Test the OpenAPI specification in Swagger Editor
+4. Monitor network traffic via DevTunnel inspection URL
+5. Review Copilot Studio connector test results
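
The first two checks can be scripted (a sketch; replace the tunnel URL with yours, and note that any three-digit HTTP status, even a 4xx, proves the endpoint is reachable):

```shell
# Probe both endpoints and report the HTTP status ("000" means unreachable)
for url in "http://localhost:8000/mcp" "https://p8er9sxt-8000.use.devtunnels.ms/mcp"; do
  code=$(curl -s -o /dev/null -w "%{http_code}" --max-time 5 "$url" || true)
  echo "$url -> HTTP $code"
done
```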
+
+---
diff --git a/agentic_ai/applications/.env.sample b/agentic_ai/applications/.env.template
similarity index 90%
rename from agentic_ai/applications/.env.sample
rename to agentic_ai/applications/.env.template
index e7e0362ef..920046965 100644
--- a/agentic_ai/applications/.env.sample
+++ b/agentic_ai/applications/.env.template
@@ -2,10 +2,10 @@
# Azure OpenAI – chat model configuration #
############################################
# Replace with your model-deployment endpoint in Azure AI Foundry
-AZURE_OPENAI_ENDPOINT="https://YOUR-OPENAI-SERVICE-ENDPOINT.openai.azure.com"
+AZURE_OPENAI_ENDPOINT="https://YOUR-RESOURCE-NAME.openai.azure.com/"
-# Replace with your Foundry project’s API key
-AZURE_OPENAI_API_KEY="YOUR-OPENAI-API-KEY"
+# Replace with your Foundry project's API key
+AZURE_OPENAI_API_KEY="YOUR-AZURE-OPENAI-API-KEY"
# Connection-string that identifies your Foundry project / workspace. Only needed when using Azure AI Agent.
AZURE_AI_AGENT_PROJECT_CONNECTION_STRING="YOUR-OPENAI-PROJECT-CONNECTION-STRING"
@@ -28,10 +28,10 @@ MCP_SERVER_URI="http://localhost:8000/mcp"
# AGENT_MODULE="agents.autogen.single_agent.loop_agent"
# AGENT_MODULE="agents.autogen.multi_agent.collaborative_multi_agent_round_robin"
# AGENT_MODULE="agents.autogen.multi_agent.collaborative_multi_agent_selector_group"
-# AGENT_MODULE="agents.autogen.multi_agent.handoff_multi_agent_domain"
+AGENT_MODULE="agents.autogen.multi_agent.handoff_multi_agent_domain"
# AGENT_MODULE="agents.semantic_kernel.multi_agent.collaborative_multi_agent"
# AGENT_MODULE="agents.semantic_kernel.multi_agent.a2a.collaborative_multi_agent"
-AGENT_MODULE="agents.autogen.single_agent.loop_agent"
+# AGENT_MODULE="agents.autogen.single_agent.loop_agent"
# -----------------------------------------------------------
# If you are experimenting with Logistics-A2A, uncomment:
diff --git a/agentic_ai/backend_services/crm-mcp-connector.template.yaml b/agentic_ai/backend_services/crm-mcp-connector.template.yaml
new file mode 100644
index 000000000..b4066ddd4
--- /dev/null
+++ b/agentic_ai/backend_services/crm-mcp-connector.template.yaml
@@ -0,0 +1,18 @@
+swagger: '2.0'
+info:
+ title: CRM MCP Server
+  description: Manages CRM data for specific customers, their details, and subscriptions, providing tools to list, search, update, and create customer, product, and invoice records.
+ version: 1.0.0
+host: YOUR_DEVTUNNEL_URL.use.devtunnels.ms
+basePath: /mcp
+schemes:
+ - https
+paths:
+ /:
+ post:
+ summary: CRM MCP Server
+ x-ms-agentic-protocol: mcp-streamable-1.0
+ operationId: InvokeMCP
+ responses:
+ '200':
+ description: Success
\ No newline at end of file