30 commits
06ef885
updated readme with relevant info
Kenny-Angelikas Oct 31, 2025
60b5c87
Merge pull request #1 from KennyAngelikas/angelikask/setupBranch
NSimon1121 Oct 31, 2025
983b9dd
first
manvendersingh21 Oct 31, 2025
e511cca
Merge pull request #2 from KennyAngelikas/firstPersonna
KennyAngelikas Nov 3, 2025
5e2b0c9
refactor frontend + update UI
KennyAngelikas Nov 4, 2025
26c48dd
Merge pull request #3 from KennyAngelikas/angelikask/refactor
manvendersingh21 Nov 4, 2025
b768bf6
the gemini api is working here
KennyAngelikas Nov 6, 2025
73fe7b6
Merge pull request #4 from KennyAngelikas/angelikask/geminiapiWorking
manvendersingh21 Nov 6, 2025
052d448
Created Join Team Button
NSimon1121 Nov 6, 2025
d009c29
Merge pull request #5 from KennyAngelikas/JoinTeamBtn
manvendersingh21 Nov 7, 2025
5cf9d68
User matching done in a team
manvendersingh21 Nov 7, 2025
8f73513
Merge pull request #6 from KennyAngelikas/team-matching
pneman1 Nov 7, 2025
5a4adc4
Created DB module for model config + management
pneman1 Nov 7, 2025
e9dd805
Merge pull request #7 from KennyAngelikas/dbmodule
KennyAngelikas Nov 7, 2025
f6d9e84
Backend Refactored
manvendersingh21 Nov 11, 2025
16230ae
Merge pull request #8 from KennyAngelikas/refactoring-backend
NSimon1121 Nov 11, 2025
9af0507
Docker Update
NSimon1121 Nov 13, 2025
e0da7e0
dockerised
manvendersingh21 Nov 14, 2025
3001cb6
Merge pull request #10 from KennyAngelikas/refactoring-backend
KennyAngelikas Nov 14, 2025
9d804e6
Revise README for clarity and add testing instructions
KennyAngelikas Nov 14, 2025
30e3d4e
Merge pull request #11 from KennyAngelikas/angelikask/updateReadMe
NSimon1121 Nov 14, 2025
971ceda
Add user API key settings UI and support for custom Gemini API keys
pneman1 Nov 14, 2025
2e82975
multi teams
KennyAngelikas Nov 14, 2025
4ce35e5
Merge pull request #13 from KennyAngelikas/angelikask/multiTeams
NSimon1121 Nov 14, 2025
b0b8f14
Revise README.md for Project Overlap overview
KennyAngelikas Nov 17, 2025
97eb26b
Merge pull request #14 from KennyAngelikas/angelikask/readMeUser
pneman1 Nov 17, 2025
0071828
Merge branch 'main' into Docker
manvendersingh21 Nov 19, 2025
1ef57e7
Merge pull request #9 from KennyAngelikas/Docker
manvendersingh21 Nov 19, 2025
23022a4
Merge pull request #12 from KennyAngelikas/customAPIFeature
manvendersingh21 Nov 19, 2025
2e3891d
deploy
manvendersingh21 Nov 20, 2025
Binary file added .DS_Store
Binary file not shown.
56 changes: 56 additions & 0 deletions .github/workflows/deploy.yaml
@@ -0,0 +1,56 @@
name: Docker Build & Deploy to Vercel

on:
  push:
    branches:
      - main

jobs:
  deploy:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout Code
        uses: actions/checkout@v4

      - name: Install Vercel CLI
        run: npm install --global vercel@latest

      # Step A: Pull Vercel Config
      # We do this OUTSIDE Docker first so we have the valid .vercel folder
      # to copy INTO the Docker container.
      - name: Pull Vercel Environment Information
        run: vercel pull --yes --environment=production --token=${{ secrets.VERCEL_TOKEN }}
        env:
          VERCEL_ORG_ID: ${{ secrets.VERCEL_ORG_ID }}
          VERCEL_PROJECT_ID: ${{ secrets.VERCEL_PROJECT_ID }}

      # Step B: Build the Docker Image
      # Automates: docker build --build-arg VERCEL_TOKEN=... -t overlap-chatgpt .
      - name: Build Docker Image
        run: |
          docker build \
            --build-arg VERCEL_TOKEN=${{ secrets.VERCEL_TOKEN }} \
            -t overlap-chatgpt .

      # Step C: Extract Artifacts
      # Automates: docker create, docker cp, docker rm
      - name: Extract Prebuilt Artifacts
        run: |
          # Create a temporary container (don't run it, just create it)
          docker create --name temp_container overlap-chatgpt

          # Copy the .vercel output folder from the container to the runner
          # Note: We overwrite the local .vercel folder with the build output
          docker cp temp_container:/.vercel .

          # Cleanup
          docker rm temp_container

      # Step D: Deploy to Vercel
      # Automates: vercel deploy --prebuilt --prod
      - name: Deploy to Vercel
        run: vercel deploy --prebuilt --prod --token=${{ secrets.VERCEL_TOKEN }}
        env:
          VERCEL_ORG_ID: ${{ secrets.VERCEL_ORG_ID }}
          VERCEL_PROJECT_ID: ${{ secrets.VERCEL_PROJECT_ID }}
3 changes: 2 additions & 1 deletion .gitignore
@@ -157,4 +157,5 @@ cython_debug/
# be found at https://github.com/github/gitignore/blob/main/Global/JetBrains.gitignore
# and can be added to the global gitignore or merged into this file. For a more nuclear
# option (not recommended) you can uncomment the following to ignore the entire idea folder.
#.idea/
#.idea/
.vercel
150 changes: 150 additions & 0 deletions ClientServer.md
@@ -0,0 +1,150 @@
# Overlap-chatgptClone

This is the backend server for the Overlap ChatGPT Clone, a Flask application designed to serve a chat API. It is configured for deployment on Vercel using a custom Docker build environment.

## 🚀 How to Run Locally

Follow these steps to run the server on your local machine for development.

### 1. Prerequisites

- Python 3.12
- A Python virtual environment (recommended)
- A config.json file (see below)

### 2. Local Installation

Clone the repository:

```
git clone https://github.com/KennyAngelikas/Overlap-chatgptClone
cd Overlap-chatgptClone
```

Create and activate a virtual environment:

```
python3 -m venv venv
source venv/bin/activate
```

Install the required Python packages:

```
pip install -r requirements.txt
```

Create your configuration file. The app reads settings from config.json; create this file in the root directory:

```
{
    "site_config": {
        "host": "0.0.0.0",
        "port": 1338,
        "debug": true
    },
    "database": {
        "url": "postgresql://user:password@localhost:5432/mydb"
    },
    "api_keys": {
        "gemini": "YOUR_GEMINI_API_KEY_HERE"
    }
}
```

Run the application:

```
python run.py
```

Your server should now be running on http://localhost:1338.

## 📦 How to Deploy to Vercel

This project is deployed using a prebuilt output from a custom Docker container. This complex process is required to build psycopg2 correctly for Vercel's Amazon Linux runtime.

### 1. Prerequisites

- Docker Desktop must be installed and running.
- Vercel CLI must be installed: `npm install -g vercel`
- A Vercel account.

### 2. Required Project Files

You must have these four files in your project's root directory.

**Dockerfile**

This file builds your project inside an environment identical to Vercel's (Amazon Linux 2023).

```
# Stage 1: The "builder"
# USE THE OFFICIAL AWS LAMBDA PYTHON 3.12 IMAGE (Amazon Linux 2023)
FROM public.ecr.aws/lambda/python:3.12 AS builder

WORKDIR /app

# Install build tools, node, and npm using DNF
RUN dnf update -y && dnf install -y "Development Tools" nodejs npm

# 2. Install Python dependencies
COPY requirements.txt requirements.txt
RUN pip3 install --user --no-cache-dir -r requirements.txt
# Add Python's user bin to the PATH
ENV PATH=/root/.local/bin:$PATH

# 3. Install Vercel CLI
RUN npm install --global vercel@latest

# 4. Copy all your project files
COPY . .

# 5. Copy your Vercel project link
COPY .vercel .vercel

# 6. Build the project using Vercel CLI
ARG VERCEL_TOKEN
RUN VERCEL_TOKEN=$VERCEL_TOKEN vercel build --prod

# ---
# Stage 2: The "final output"
FROM alpine:latest

# Copy the entire .vercel folder
COPY --from=builder /app/.vercel /.vercel
```

**vercel.json**

This file tells Vercel how to build and route your Python app.

```
{
    "builds": [
        {
            "src": "run.py",
            "use": "@vercel/python",
            "config": { "pythonVersion": "3.12" }
        }
    ],
    "routes": [
        {
            "src": "/(.*)",
            "dest": "run.py"
        }
    ]
}
```

**requirements.txt**

Make sure this file uses psycopg2-binary.

```
flask
python-dotenv
requests
beautifulsoup4
psycopg2-binary
# ... any other libraries
```

**.dockerignore**

This speeds up your Docker build by ignoring unnecessary files.

```
# Venv
venv/

# Docker build output
.vercel

# Python cache
__pycache__/
*.pyc
```

### 3. ⚠️ Important: Fix config.json for Vercel

Your run.py script (which reads config.json) will fail on Vercel. Vercel uses environment variables for secrets, not JSON files. You must modify your run.py to read from os.environ.

Original run.py (local only):

```
# ...
from json import load

if __name__ == '__main__':
    config = load(open('config.json', 'r'))
    site_config = config['site_config']
    # ...
```

Modified run.py (works locally AND on Vercel):

```
from server.app import app
from server.website import Website
from server.controller.conversation_controller import ConversationController
from json import load
import os  # Import os

# --- VERCEL FIX ---
# Check if running on Vercel (or any system with ENV VARS)
db_url = os.environ.get('DATABASE_URL')
site_port = os.environ.get('PORT', 1338)  # Vercel provides a PORT

if db_url:
    # We are on Vercel or similar
    site_config = {
        "host": "0.0.0.0",
        "port": int(site_port),
        "debug": False
    }
    # You would also load other configs (like GEMINI_API_KEY) here
    # os.environ.get('GEMINI_API_KEY')
else:
    # We are local, load from config.json
    config = load(open('config.json', 'r'))
    site_config = config['site_config']
    # You would also load DB URL from config here
    # db_url = config['database']['url']
# --- END FIX ---


# This logic is now outside the __name__ block
site = Website(app)
for route in site.routes:
    app.add_url_rule(
        route,
        view_func = site.routes[route]['function'],
        methods = site.routes[route]['methods'],
    )

ConversationController(app)

# This will run for a 404
@app.route('/', methods=['GET'])
def handle_root():
    return "Flask server is running!"

# This block is for local development only
if __name__ == '__main__':
    print(f"Running on port {site_config['port']}")
    app.run(**site_config)
    print(f"Closing port {site_config['port']}")
```

### 4. Deployment Steps

**Step 1: One-Time Vercel Setup**

Log in to Vercel CLI:

```
vercel login
```

Link your project:

```
vercel link
```

Pull project settings:

```
vercel pull --yes
```

Add Vercel environment variables:

- Go to your project's dashboard on Vercel.
- Go to Settings > Environment Variables.
- Add all your secrets (e.g., DATABASE_URL, GEMINI_API_KEY). These must match the os.environ.get() keys in your run.py.

**Step 2: The 6-Step Deploy Process**

Run these commands from your project's root directory every time you want to deploy a change.

1. Build the Docker image (this will take a few minutes; get your token from Vercel Dashboard > Settings > Tokens):

   ```
   docker build --build-arg VERCEL_TOKEN="YOUR_VERCEL_TOKEN_HERE" -t overlap-chatgpt .
   ```

2. Remove the old container (in case it exists):

   ```
   docker rm temp_container
   ```

3. Create a new container from the image:

   ```
   docker create --name temp_container overlap-chatgpt
   ```

4. Copy the build output from the container to your computer:

   ```
   docker cp temp_container:/.vercel .
   ```

5. Clean up the container:

   ```
   docker rm temp_container
   ```

6. Deploy the prebuilt output:

   ```
   vercel deploy --prebuilt --prod
   ```

## 🔌 Architecture: Client-Server Interaction

This repository is a JSON API backend. It is only the "server" part of your application.

**Client (the "browser")**

1. A user visits your Vercel URL (e.g., https://overlap-chatgpt-clone.vercel.app).
2. Vercel serves your static frontend (e.g., React, HTML/JS) from the Website routes.
3. The user types a message in the chat.

**Server (this Flask app)**

1. Your frontend's JavaScript makes an HTTP request (e.g., a POST request to /api/chat) with the user's message.
2. Vercel routes this request to your run.py serverless function.
3. The ConversationController receives the request.
4. It calls services like gemini_service (to talk to an AI) and teams_service (to get data).
5. The teams_service uses db_model to query your PostgreSQL database (using psycopg2).
6. The services return data to the controller.

**Response**

1. The ConversationController formats a JSON response.
2. Flask sends this JSON back to the client.
3. Your frontend's JavaScript receives the JSON and displays the chat message to the user.
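
As a rough sketch of that round trip, the flow looks like the minimal Flask handler below. The /api/chat path, the {"message": ...} payload, and the echo reply are illustrative assumptions for this example; the real wiring goes through Website, ConversationController, and the service modules.

```
# Minimal sketch of the request/response cycle described above
# (assumptions for illustration, not the project's actual controller code).
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route('/api/chat', methods=['POST'])
def chat():
    payload = request.get_json(force=True)    # e.g. {"message": "Teach me React basics."}
    user_message = payload.get('message', '')

    # In the real app, ConversationController would call gemini_service /
    # teams_service here, which query PostgreSQL through db_model (psycopg2).
    reply = f"You said: {user_message}"

    return jsonify({"reply": reply})          # JSON back to the browser

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=1338)
```

The browser side then issues the equivalent of fetch('/api/chat', { method: 'POST', body: JSON.stringify({ message }) }) and renders the reply field from the JSON response.
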
36 changes: 24 additions & 12 deletions Dockerfile
@@ -1,22 +1,34 @@
-# Build stage
-FROM python:3.8-alpine AS build
+# Stage 1: The "builder"
+# USE THE OFFICIAL AWS LAMBDA PYTHON 3.12 IMAGE (Amazon Linux 2023)
+FROM public.ecr.aws/lambda/python:3.12 AS builder
 
 WORKDIR /app
 
+# CHANGED: Removed "Development Tools". We only need nodejs and npm.
+RUN dnf update -y && dnf install -y nodejs npm
+
+# 2. Install Python dependencies
 COPY requirements.txt requirements.txt
-RUN apk add --no-cache build-base && \
-    pip3 install --user --no-cache-dir -r requirements.txt
+RUN pip3 install --user --no-cache-dir -r requirements.txt
+# Add Python's user bin to the PATH
+ENV PATH=/root/.local/bin:$PATH
 
+# 3. Install Vercel CLI
+RUN npm install --global vercel@latest
+
+# 4. Copy all your project files
 COPY . .
 
-# Production stage
-FROM python:3.8-alpine AS production
+# 5. Copy your Vercel project link
+COPY .vercel .vercel
 
-WORKDIR /app
+# 6. Build the project using Vercel CLI
+ARG VERCEL_TOKEN
+RUN VERCEL_TOKEN=$VERCEL_TOKEN vercel build --prod
 
-COPY --from=build /root/.local /root/.local
-COPY . .
 
-ENV PATH=/root/.local/bin:$PATH
+# ---
+# Stage 2: The "final output"
+FROM alpine:latest
 
-CMD ["python3", "./run.py"]
+# Copy the entire .vercel folder
+COPY --from=builder /app/.vercel /.vercel
48 changes: 28 additions & 20 deletions README.md
@@ -1,25 +1,33 @@
Development of this repository is currently in a halt, due to lack of time. Updates are comming end of June.

working again ; )
I am very busy at the moment so I would be very thankful for contributions and PR's

## To do
- [x] Double confirm when deleting conversation
- [x] remember user preferences
- [x] theme changer
- [ ] loading / exporting a conversation
- [ ] speech output and input (elevenlabs; ex: https://github.com/cogentapps/chat-with-gpt)
- [ ] load files, ex: https://github.com/mayooear/gpt4-pdf-chatbot-langchain
- [ ] better documentation
- [ ] use react / faster backend language ? (newbies may be more confused and discouraged to use it)

# ChatGPT Clone
feel free to improve the code / suggest improvements

# Project Overlap
UPDATE THIS IMAGE WITH AN EXAMPLE IMAGE
<img width="1470" alt="image" src="https://user-images.githubusercontent.com/98614666/232768610-fdeada85-3d21-4cf9-915e-a0ec9f3b7a9f.png">

## Overview
### Project Abstract
Overlap is a collaborative chatbot ecosystem designed to enhance peer-to-peer knowledge sharing within a team. Instead of acting as a generic Q&A assistant, the system connects teammates to one another based on overlapping interests, recent learning activity, or self-declared expertise.

Each team member has a personal chatbot (e.g., in Slack). When a user asks their bot for help — for example, “Teach me React basics.” — the bot consults a shared, opt-in knowledge index to identify peers who either know or recently asked about React. If a match is found, the bot responds: “Bob has ‘React’ expertise, and Alice asked about React two days ago. Want to connect with them?”

This replaces solitary AI help with socially intelligent nudges that build team relationships while maintaining individual privacy. Later phases will visualize shared learning across the team through a mind-map view, showing connections between topics and people (opt-in only).
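
For illustration only, the matching behavior described above can be sketched roughly as follows; the data shapes and names here are hypothetical and are not the project's actual teams_service or knowledge-index implementation.

```
# Hypothetical sketch of topic-based peer matching (illustrative only).
from dataclasses import dataclass, field

@dataclass
class Member:
    name: str
    expertise: set = field(default_factory=set)         # self-declared topics
    recent_questions: set = field(default_factory=set)  # opt-in recent activity

def find_matches(topic, team):
    """Return nudge strings for teammates who know, or recently asked about, a topic."""
    nudges = []
    for member in team:
        if topic in member.expertise:
            nudges.append(f"{member.name} has '{topic}' expertise.")
        elif topic in member.recent_questions:
            nudges.append(f"{member.name} recently asked about {topic}.")
    return nudges

team = [Member("Bob", expertise={"React"}), Member("Alice", recent_questions={"React"})]
print(find_matches("React", team))
# ["Bob has 'React' expertise.", "Alice recently asked about React."]
```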

### Novelty
Why is this novel? Recent work shows that AI is eroding the social fabric inside and outside the classroom. Students are going to TAs, peers, and instructors at the lowest rates ever. That’s a problem because peer support is tightly linked to students’ wellbeing—and without it, students feel more isolated than ever.

_**Our AI directly confronts this. Instead of replacing relationships, it is designed to build them.**_

A second piece of novelty is how we handle “expertise finding.” Academic teams struggle to know who to go to for help. Our AI reduces that cognitive load by matching the questions students ask with the knowledge and experience already present in their community, connecting them to the right expert and strengthening human relationships along the way.


# Testing
feel free to improve the code / suggest improvements

## Client + Testing
To test our application, please go here --> https://overlap-chatgpt-clone-oah03fquq-manvender-singhs-projects.vercel.app/chat/
```
WARNING: PLEASE DO NOT PROMPT TOO MUCH OR IT WILL START CHARGING YOU
```

## Getting Started
## Getting Started Development
To get started with this project, you'll need to clone the repository and set up a virtual environment. This will allow you to install the required dependencies without affecting your system-wide Python installation.

### Prequisites
@@ -28,7 +36,7 @@ Before you can set up a virtual environment, you'll need to have Python installed
### Cloning the Repository
Run the following command to clone the repository:
```
git clone https://github.com/xtekky/chatgpt-clone.git
git clone this repo
```

### Setting up a Virtual Environment