Last updated: 2026-04-16
An interactive banking demo that shows how databases power OLTP, OLAP, and AI workloads side-by-side, all wired together through Microsoft Fabric.
Try the live app
| Setup Guide | Get the app running locally |
| ⚙️ Explore Fabric Workloads | RTI, Power BI, Data Agent walkthroughs |
| 🧑‍💻 Learn More | Architecture, agents, embeddings, contributing |
- Transactions (OLTP) – real-time writes and reads against a Fabric SQL Database
- Analytics (OLAP) – a dashboard with charts and summaries of spending habits; this represents an OLAP workload, running complex aggregate queries over a large dataset
- AI Agents – a multi-agent LangGraph system with coordinator, support, account, Fabric Data Agent (optional), and visualization agents
- Generative UI – agents create personalized interactive visualizations on the fly
- Real-time Monitoring – app usage and content safety data streamed to Fabric Eventhouse via Eventstream
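The real-time monitoring feature streams events into Fabric through an Eventstream custom endpoint, which speaks the Event Hubs protocol. As a rough sketch of what publishing one usage event looks like with the `azure-eventhub` package (the payload fields here are illustrative, not the app's actual schema; the environment variable names match the ones configured later in setup):

```python
import json
import os


def build_usage_event(user_id: str, action: str) -> str:
    """Serialize an app-usage event as JSON (field names are illustrative)."""
    return json.dumps({"userId": user_id, "action": action, "source": "banking-app"})


def publish(event_json: str) -> None:
    """Send one event to the Eventstream custom endpoint."""
    # Requires `pip install azure-eventhub`. FABRIC_EVENT_HUB_PRIMARY_KEY holds the
    # "Connection string-primary key" from the Eventstream custom endpoint, and
    # FABRIC_EVENT_HUB_NAME its Event hub name.
    from azure.eventhub import EventData, EventHubProducerClient

    producer = EventHubProducerClient.from_connection_string(
        conn_str=os.environ["FABRIC_EVENT_HUB_PRIMARY_KEY"],
        eventhub_name=os.environ["FABRIC_EVENT_HUB_NAME"],
    )
    with producer:
        batch = producer.create_batch()
        batch.add(EventData(event_json))
        producer.send_batch(batch)


if __name__ == "__main__":
    if os.environ.get("FABRIC_EVENT_HUB_PRIMARY_KEY"):
        publish(build_usage_event("demo-user", "login"))
```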
Before you begin, install the following:
| Requirement | Notes |
|---|---|
| Node.js v18+ | Frontend |
| Python 3.11.9+ | Backend |
| Azure CLI | Auth – Windows · macOS |
| ODBC Driver 18 for SQL Server | Database connectivity |
| Microsoft Fabric capacity | Start a free 60-day trial if needed |
| Azure OpenAI resource | Create one in Azure Portal |
Recommended: VS Code (tested environment)
```shell
git clone https://github.com/Azure-Samples/agentic-app-with-fabric.git
cd agentic-app-with-fabric

# Create and activate a virtual environment
# Windows:
python -m venv venv
.\venv\Scripts\activate

# macOS / Linux:
python3 -m venv venv
source venv/bin/activate

# Install packages
pip install -r requirements.txt
npm install

# Authenticate with Azure
az login
```

Use your Microsoft Fabric account credentials. Watch for browser pop-ups.
⚠️ You must repeat `az login` any time you restart the backend.
This single command creates the workspace, deploys all Fabric artifacts, creates SQL tables, and populates backend/.env with connection strings automatically.
```shell
# Windows:
python scripts/setup_workspace.py --workspace-name "AgenticBankingApp-{yourinitials}"

# Mac:
python3 scripts/setup_workspace.py --workspace-name "AgenticBankingApp-{yourinitials}"
```

The script will prompt you to select a Fabric capacity, then create a workspace named AgenticBankingApp-{yourinitials}. Substitute your initials (or any other suffix) to make the workspace name unique.
What gets deployed:
| Artifact | Type |
|---|---|
| agentic_app_db | SQL Database |
| agentic_cosmos_db | Cosmos DB |
| agentic_lake | Lakehouse |
| banking_semantic_model | Semantic Model |
| Agentic_Insights | Power BI Report |
| Banking_DataAgent | Data Agent |
| agentic_eventhouse | Eventhouse + KQL Database |
| agentic_stream | Eventstream |
| ContentSafetyMonitoring | KQL Dashboard |
| QA_Evaluation_Notebook | Notebook |
💡 Alternative deployment: prefer Git integration? See Deploy via Git Integration.
After setup_workspace.py completes, run:
```shell
# Windows:
python scripts/finalize_views_and_report.py

# Mac:
python3 scripts/finalize_views_and_report.py
```

This finalizes the Lakehouse SQL views, patches the Semantic Model, and deploys the Power BI Report. The workspace ID is read automatically from the previous step – no argument needed.
When done, your workspace lineage should look like this:
setup_workspace.py already auto-populated part of backend/.env. You only need to fill in the Azure OpenAI values, the Cosmos DB endpoint, and the Event Hub connection details.
Open backend/.env (copy the .env.sample template file and rename it if needed) and set:
```shell
AZURE_OPENAI_KEY="<your API key>"
AZURE_OPENAI_ENDPOINT="https://<your-resource>.openai.azure.com/"
AZURE_OPENAI_DEPLOYMENT="<your chat model name, e.g. gpt-4o-mini>"
AZURE_OPENAI_EMBEDDING_DEPLOYMENT="text-embedding-ada-002"
```

You can find the API key in the Azure Portal by navigating to your Azure OpenAI resource and selecting Keys and Endpoint.
⚠️ The embedding deployment must be `text-embedding-ada-002` – the embeddings in the repo were generated with that model.
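As a quick sanity check that the Azure OpenAI values are wired up, the sketch below validates the required variables and then requests a single embedding. It assumes the `openai` Python package (v1-style `AzureOpenAI` client); the API version string is an example, not a value mandated by this repo. Only the validation helper runs without credentials:

```python
import os

# The three settings the embedding check needs from backend/.env
REQUIRED = (
    "AZURE_OPENAI_KEY",
    "AZURE_OPENAI_ENDPOINT",
    "AZURE_OPENAI_EMBEDDING_DEPLOYMENT",
)


def missing_vars(env: dict) -> list:
    """Return the names of required settings that are absent or blank."""
    return [name for name in REQUIRED if not env.get(name)]


def embed(text: str) -> list:
    """Request one embedding vector from the configured deployment."""
    # Requires `pip install openai` (v1 SDK). api_version is an example value.
    from openai import AzureOpenAI

    client = AzureOpenAI(
        api_key=os.environ["AZURE_OPENAI_KEY"],
        azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
        api_version="2024-02-01",
    )
    response = client.embeddings.create(
        model=os.environ["AZURE_OPENAI_EMBEDDING_DEPLOYMENT"],  # text-embedding-ada-002
        input=text,
    )
    return response.data[0].embedding


if __name__ == "__main__":
    gaps = missing_vars(dict(os.environ))
    if gaps:
        print(f"Missing in backend/.env: {gaps}")
    else:
        print(f"Embedding length: {len(embed('hello'))}")
```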
```shell
COSMOS_DB_ENDPOINT="<your Cosmos DB endpoint>"
```

You can find this in your Fabric workspace: open the Cosmos DB artifact, click the "Settings" -> "Connection" tab, and copy the endpoint string.

Next, in your Fabric workspace, open the agentic_stream Eventstream, click CustomEndpoint, and open the SAS Key Authentication tab:

Copy the two values into .env:
```shell
FABRIC_EVENT_HUB_NAME="<Event hub name>"
FABRIC_EVENT_HUB_PRIMARY_KEY="<Connection string-primary key>"
```

NOTE: first click the eye button next to "Connection string-primary key" to reveal the value, then copy it.

The following were written to .env automatically by the deployment script:
```shell
FABRIC_SQL_CONNECTION_URL_AGENTIC   # SQL Database connection string
FABRIC_DATA_AGENT_SERVER_URL        # Data Agent MCP endpoint
FABRIC_DATA_AGENT_TOOL_NAME         # Data Agent tool name
USE_FABRIC_DATA_AGENT               # Set to "true"
COSMOS_DB_DATABASE_NAME             # Set to "agentic_cosmos_db"
```

Open two terminal windows (both with the virtual environment activated and after az login):
Terminal 1 – Backend

```shell
cd backend

# Windows
python launcher.py

# Mac
python3 launcher.py
```

Starts two services:
- Banking API – http://127.0.0.1:5001
- Agent Analytics – http://127.0.0.1:5002
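If you want to confirm both backend services actually came up, a small probe like the one below works; it just attempts a TCP connection to the two default ports listed above, so it does not assume any particular health-check route:

```python
import socket

# Default local endpoints started by launcher.py: name -> (host, port)
SERVICES = {
    "Banking API": ("127.0.0.1", 5001),
    "Agent Analytics": ("127.0.0.1", 5002),
}


def is_listening(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    for name, (host, port) in SERVICES.items():
        status = "up" if is_listening(host, port) else "down"
        print(f"{name} ({host}:{port}): {status}")
```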
Terminal 2 – Frontend

```shell
npm run dev
```

Opens the app at http://localhost:5173
| ⚙️ Explore Fabric Workloads | Set up real-time monitoring, Power BI analytics, and the Data Agent |
| 🧑‍💻 Learn More | Understand the architecture, multi-agent design, and how to contribute |

