# 💡 AI-Powered Oracle 23ai Query Assistant (POC)

This project is a proof-of-concept (POC) system that lets users ask natural language questions and receive answers powered by Oracle Autonomous Database (ADBS), using both SQL and vector search. It integrates a locally running LLM via Ollama with a FastAPI-based MCP server that handles execution logic and communication with the database.

The system uses a modular design, enabling intelligent routing of queries, SQL generation, and response interpretation, all handled locally.

---

## 🧱 System Components

### 👤 User
- Inputs natural language questions.
- Receives final human-readable answers.

### 🧠 LLM on Ollama (LLaMA 3.2)
- Interprets user queries and generates appropriate SQL (traditional or semantic).
- Also interprets SQL results when needed for better human readability.
- Runs locally via [Ollama](https://ollama.com).

### 🤖 SQL Assistant
- Central orchestrator of the entire workflow.
- Routes queries to the LLM, forwards generated SQL to the MCP server, and interprets final results.
- Also performs intermediate logic for formatting or refining output before presenting it to the user.

### 🌐 MCP Server (FastAPI)
- Receives SQL statements from the SQL Assistant.
- Executes SQL on Oracle ADBS and returns results.
- Acts as the executor, not the decision-maker.

### 🗄️ Oracle Autonomous Database (ADBS)
- Hosts structured and unstructured data.
- Performs both standard SQL and vector-based semantic queries.
- Embedding generation and storage are handled natively using Oracle's `ALL_MINILM_L12_V2` model.

---

## 🔁 Data Flow Overview

1. **User submits a natural language query** to the SQL Assistant.
2. **SQL Assistant forwards the query** to the local LLM on Ollama.
3. **LLM generates a SQL query** (traditional or vector-based).
4. **SQL Assistant sends the SQL** to the MCP server for execution.
5. **MCP server executes the SQL** on Oracle ADBS and retrieves results.
6. **Results are returned to the SQL Assistant** by the MCP server.
7. **SQL Assistant may optionally send results to the LLM** for interpretation.
8. **LLM interprets and explains the results** (optional).
9. **Final human-readable output is shown to the user.**

This flow keeps all processing local while remaining flexible enough to adapt queries intelligently.

---

## 🔧 Technologies Used

- **Oracle Autonomous Database (23ai)**
- **Oracle Vector Search**
- **LLaMA 3.2 (via Ollama)**
- **FastAPI (MCP Server)**
- **Python + `oracledb` driver**

---

## 🚀 Highlights

- 🧠 Fully local LLM with intelligent SQL generation.
- 🗃 No external vector DB needed: all vector embedding and querying is native to Oracle ADBS.
- ⚡ Smart switching between structured SQL and semantic vector search.
- 🧩 Modular architecture for easy expansion (e.g., adding a UI, confidence scoring, etc.).
- 🔄 Optional interpretation of SQL results via the LLM for more natural answers.

---

## 📌 Status

- ✅ Embeddings inserted
- ✅ Local LLM integrated via Ollama
- ✅ MCP server connected to Oracle ADBS
- ✅ Semantic and SQL querying tested
- ✅ Diagram-based architecture established

### 🧪 Next Steps

- UI development for better usability.
- Add confidence scoring and query classification.
- Explore multi-modal inputs (e.g., PDFs, images).
- Implement a user feedback loop for query refinement.
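The data flow above can be sketched as a single orchestration function. This is a minimal illustration, not the project's actual code: the LLM and the MCP server are stubbed out as plain callables (real calls would go to Ollama's HTTP API and the MCP server's FastAPI endpoint), and all names here are hypothetical.

```python
from typing import Callable, Sequence

def answer_question(
    question: str,
    llm: Callable[[str], str],               # steps 2-3: NL question -> SQL (Ollama in the real system)
    mcp_execute: Callable[[str], Sequence],  # steps 4-6: SQL -> rows (MCP server / Oracle ADBS)
    interpret: bool = True,
) -> str:
    """Orchestrate the nine-step flow: generate SQL, execute it, optionally explain rows."""
    sql = llm(f"Write Oracle SQL for: {question}")  # steps 2-3
    rows = mcp_execute(sql)                          # steps 4-6
    if interpret:                                    # steps 7-8 (optional interpretation pass)
        return llm(f"Explain these rows for the question {question!r}: {rows}")
    return str(rows)                                 # step 9: final output
```

Keeping the LLM and executor behind plain callables mirrors the modular design: the SQL Assistant stays a thin orchestrator, and either side can be swapped without touching the flow logic.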
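The "smart switching" between structured SQL and semantic vector search could look something like the sketch below. The keyword heuristic and all identifiers (table and column names, the `:query_vec` bind) are assumptions for illustration; the `VECTOR_DISTANCE` / `FETCH FIRST` syntax follows Oracle 23ai vector search, but the project's actual routing and SQL generation live in the LLM prompt, not hard-coded rules.

```python
def classify_query(question: str) -> str:
    """Naive keyword heuristic: route fuzzy/semantic asks to vector search, the rest to plain SQL."""
    semantic_markers = ("similar", "like", "about", "related to")
    q = question.lower()
    return "vector" if any(marker in q for marker in semantic_markers) else "sql"

def build_vector_sql(table: str, text_col: str, k: int = 5) -> str:
    """Build an Oracle 23ai vector-search query; :query_vec is a bind for the query embedding."""
    return (
        f"SELECT {text_col} FROM {table} "
        f"ORDER BY VECTOR_DISTANCE(embedding, :query_vec, COSINE) "
        f"FETCH FIRST {k} ROWS ONLY"
    )
```

Because embeddings are generated and stored natively in ADBS (via `ALL_MINILM_L12_V2`), the vector query runs in the same database session as ordinary SQL, which is what makes an external vector store unnecessary.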