CausalBit/conversation_app
Conversations App

A chat application built with React and FastAPI, featuring Lakebase-backed conversation storage with per-user thread isolation.

Features

  • Lakebase Integration: Conversations stored in Databricks Lakebase for durability
  • Per-User Isolation: Each user has their own private conversation threads
  • Thread Management: Create, rename, delete, and search conversation threads
  • Real-time Streaming: SSE-based streaming responses from AI models
  • Modern UI: TailwindCSS-based responsive design

Architecture

conversations_app/
├── backend/              # FastAPI backend
│   └── app/
│       ├── api/          # Route handlers
│       ├── cache/        # Lakebase client & conversation store
│       ├── models/       # Pydantic schemas
│       └── services/     # Business logic
├── frontend/             # React frontend
│   └── src/
│       ├── api/          # API client
│       ├── components/   # React components
│       ├── hooks/        # Custom hooks
│       └── types/        # TypeScript types
├── pipelines/            # Databricks Asset Bundles
│   └── notebooks/        # Setup and seed notebooks
└── app.yaml              # Databricks App manifest

Prerequisites

  • Python 3.10+
  • Node.js 18+
  • Databricks workspace with Lakebase enabled
  • Model serving endpoint configured

Setup

1. Deploy Lakebase Infrastructure

cd pipelines
databricks bundle deploy --target dev
databricks bundle run setup_lakebase --target dev

2. Backend Setup

cd backend
python -m venv venv
source venv/bin/activate  # or `venv\Scripts\activate` on Windows
pip install -r requirements.txt

3. Frontend Setup

cd frontend
npm install

Development

Start Backend

cd backend
export LAKEBASE_INSTANCE_NAME="conversations-lakebase-dev"
export SERVING_ENDPOINT_NAME="your-endpoint-name"
export LOCAL_API_TOKEN="your-databricks-token"
uvicorn app.main:app --reload --port 8000

Start Frontend

cd frontend
npm run dev

The frontend will be available at http://localhost:3000 with API proxying to the backend.

Deployment

Deploy as Databricks App

  1. Build the frontend:

    cd frontend
    npm run build
  2. Deploy the app:

    databricks apps deploy conversations-app --source-code-path .
  3. Grant Lakebase permissions:

    cd pipelines
    databricks bundle run grant_permissions --target prod

Environment Variables

| Variable | Description | Default |
| --- | --- | --- |
| LAKEBASE_INSTANCE_NAME | Lakebase instance name | conversations-lakebase |
| LAKEBASE_DATABASE | Lakebase database name | databricks_postgres |
| SERVING_ENDPOINT_NAME | Model serving endpoint | - |
| DATABRICKS_HOST | Databricks workspace host | - |
| LOCAL_API_TOKEN | PAT for local development | - |
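As a rough sketch (not the app's actual config module), the variables above could be read once at startup with the table's defaults applied; the `Settings` class and field names here are illustrative:

```python
import os
from dataclasses import dataclass, field


@dataclass
class Settings:
    # Defaults mirror the table above; names are illustrative, not the
    # backend's real configuration object.
    lakebase_instance_name: str = field(default_factory=lambda: os.environ.get(
        "LAKEBASE_INSTANCE_NAME", "conversations-lakebase"))
    lakebase_database: str = field(default_factory=lambda: os.environ.get(
        "LAKEBASE_DATABASE", "databricks_postgres"))
    serving_endpoint_name: str = field(default_factory=lambda: os.environ.get(
        "SERVING_ENDPOINT_NAME", ""))
    databricks_host: str = field(default_factory=lambda: os.environ.get(
        "DATABRICKS_HOST", ""))
    local_api_token: str = field(default_factory=lambda: os.environ.get(
        "LOCAL_API_TOKEN", ""))


settings = Settings()
```

Instantiating `Settings` at import time keeps one place where missing variables fall back to the documented defaults.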

API Endpoints

| Method | Endpoint | Description |
| --- | --- | --- |
| GET | /api/threads | List the current user's threads |
| POST | /api/threads | Create a new thread |
| GET | /api/threads/{id} | Get a thread with its messages |
| PATCH | /api/threads/{id} | Rename a thread |
| DELETE | /api/threads/{id} | Delete a thread |
| POST | /api/chat | Send a message (SSE streaming response) |
| WS | /api/chat/ws | WebSocket chat |
| GET | /api/user | Get current user info |
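The chat endpoint streams Server-Sent Events. A minimal consumer splits the stream on blank lines and decodes each `data:` payload; note the payload shape (`delta` tokens) is an assumed example, not the app's documented contract:

```python
import json


def parse_sse_events(raw: str) -> list[dict]:
    """Parse a raw SSE stream into JSON payloads.

    SSE events are separated by blank lines; each data line begins with
    'data: '. The JSON fields below are hypothetical.
    """
    events = []
    for block in raw.split("\n\n"):
        for line in block.splitlines():
            if line.startswith("data: "):
                events.append(json.loads(line[len("data: "):]))
    return events


# Example stream as the backend might emit it (shape is assumed):
raw = 'data: {"delta": "Hel"}\n\ndata: {"delta": "lo"}\n\n'
tokens = [e["delta"] for e in parse_sse_events(raw)]
# tokens == ["Hel", "lo"]
```

In the real frontend this parsing is typically handled by `EventSource` or a fetch-based reader rather than by hand.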

Database Schema

Tables (app_state schema)

  • threads: thread_id, user_id, title, first_query, created_at, updated_at, is_active
  • messages: message_id, thread_id, user_id, content, role, model, sources, metrics, created_at
  • message_ratings: message_id, user_id, rating, created_at
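To illustrate the per-user isolation the schema implies, here is a runnable sketch using in-memory SQLite (standing in for Lakebase/Postgres) with a pared-down `threads` table; every read is scoped to the requesting `user_id`:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Pared-down version of the threads table above (SQLite types, not Postgres).
conn.execute("""
    CREATE TABLE threads (
        thread_id  TEXT PRIMARY KEY,
        user_id    TEXT NOT NULL,
        title      TEXT,
        is_active  INTEGER DEFAULT 1
    )
""")
conn.executemany(
    "INSERT INTO threads (thread_id, user_id, title) VALUES (?, ?, ?)",
    [("t1", "alice", "Quarterly report"),
     ("t2", "bob", "Travel plans"),
     ("t3", "alice", "Lakebase notes")],
)


def list_threads(user_id: str) -> list[str]:
    # Isolation hinges on this WHERE clause: rows belonging to other
    # users are never returned.
    rows = conn.execute(
        "SELECT thread_id FROM threads"
        " WHERE user_id = ? AND is_active = 1"
        " ORDER BY thread_id",
        (user_id,),
    ).fetchall()
    return [r[0] for r in rows]


# list_threads("alice") == ["t1", "t3"]; bob never sees alice's threads.
```

The same pattern applies to `messages` and `message_ratings`, which also carry a `user_id` column.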

About

A Databricks App backed by a Lakebase instance for interacting with a Knowledge Assistant built with Agent Bricks.
