⭐ Featured Project

AI-Powered Support Chatbot

Designed and built an end-to-end AI chatbot using OpenAI, vector search, and internal system integrations to provide instant learner and instructor support at scale, with automated ticket escalation for complex cases.

Technologies & Tools

OpenAI API • MongoDB • Node.js • Vector Database • Google Docs

📊 Impact: 24/7 support for 10,000+ learners annually

Problem

Tech Goes Home supports over 10,000 learners annually, generating high volumes of repetitive support questions related to course enrollment, device orders, program logistics, and learner resources. These requests traditionally required manual staff intervention, creating operational bottlenecks and slower response times. No centralized, searchable interface existed for institutional knowledge, and support requests scaled faster than staff capacity.

My Role

I owned the project end-to-end, including system architecture and technology selection, backend API design and implementation, AI prompt strategy and retrieval logic, knowledge ingestion and document synchronization, ticket escalation logic and integrations, and deployment with iteration based on real usage. This was built as a production-ready internal system.

Solution

Built a chatbot system using retrieval-augmented generation (RAG) to ground responses in internal knowledge, preventing hallucinations and ensuring policy-accurate answers. Used a vector database for semantic search over institutional knowledge, with Google Docs as a living, non-technical content store so staff can update content without engineering help. Implemented confidence thresholds with automated escalation to the internal ticketing system when the AI cannot confidently answer, keeping a human in the loop.
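A minimal sketch of how such a confidence gate could work, written in TypeScript for a Node.js backend. The threshold value, the use of the top retrieval similarity score as the confidence signal, and the `generateAnswer`/`createTicket` helpers are illustrative assumptions, not the production implementation.

```typescript
// Hypothetical confidence gate: names and threshold are illustrative.
interface RetrievedChunk {
  text: string;
  similarity: number; // similarity score from the vector search, 0..1
}

interface ChatResult {
  answer: string;
  escalated: boolean;
}

const CONFIDENCE_THRESHOLD = 0.75; // assumed cutoff for "answerable by the bot"

async function answerOrEscalate(
  question: string,
  chunks: RetrievedChunk[],
  generateAnswer: (question: string, context: string) => Promise<string>,
  createTicket: (question: string, context: string) => Promise<void>,
): Promise<ChatResult> {
  const best = chunks[0]?.similarity ?? 0;
  const context = chunks.map((c) => c.text).join("\n---\n");

  // Below the threshold: hand off to a human, with the retrieved context attached.
  if (best < CONFIDENCE_THRESHOLD) {
    await createTicket(question, context);
    return {
      answer: "I'm not sure about this one, so I've forwarded it to our support team.",
      escalated: true,
    };
  }

  // Otherwise answer, grounding the LLM in the retrieved knowledge only.
  return { answer: await generateAnswer(question, context), escalated: false };
}
```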

Architecture

Core components:

  • LLM Layer: OpenAI APIs for natural language understanding
  • Vector Database: semantic search over embedded knowledge
  • Node.js backend: request handling, context building, and routing
  • MongoDB: conversation state, metadata, and logs
  • Integrations: internal ticketing system for escalation, plus device order data sources

Data flow:

  1. User submits a question
  2. Query is embedded and matched against vectorized knowledge
  3. Relevant context is injected into the LLM prompt
  4. Grounded response is returned
  5. If the confidence threshold is not met, a support ticket is automatically created with the conversation context attached
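The retrieval path (steps 1–4) might look like the following TypeScript sketch. It assumes the official `openai` Node SDK; the model names, prompt wording, and the abstract `searchKnowledge` function standing in for the vector database are assumptions rather than the actual implementation.

```typescript
// Sketch of the RAG request path, assuming the `openai` Node SDK (v4) and an
// injected vector-search function.
import OpenAI from "openai";

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

// Assumed shape of a hit returned by the vector database.
interface KnowledgeHit {
  text: string;
  score: number;
}

async function answerQuestion(
  question: string,
  searchKnowledge: (embedding: number[], topK: number) => Promise<KnowledgeHit[]>,
): Promise<{ answer: string; hits: KnowledgeHit[] }> {
  // 1. Embed the user's question.
  const embedding = await openai.embeddings.create({
    model: "text-embedding-3-small",
    input: question,
  });

  // 2. Semantic search over the vectorized knowledge base.
  const hits = await searchKnowledge(embedding.data[0].embedding, 5);

  // 3. Inject the retrieved context into the prompt so answers stay grounded.
  const context = hits.map((h) => h.text).join("\n---\n");
  const completion = await openai.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [
      {
        role: "system",
        content:
          "Answer using ONLY the provided context. If the context does not " +
          "cover the question, say you are not sure.",
      },
      { role: "user", content: `Context:\n${context}\n\nQuestion: ${question}` },
    ],
  });

  // 4. Return the grounded response (step 5, escalation, happens upstream).
  return { answer: completion.choices[0].message.content ?? "", hits };
}
```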

Key Design Decisions

🔹Retrieval-Augmented Generation (RAG) to prevent hallucinations and stay aligned with Tech Goes Home policies
🔹Google Docs as knowledge source to allow non-technical staff updates without engineering help (see the sync sketch after this list)
🔹Human-in-the-loop escalation to avoid AI dead ends and maintain operational trust
🔹Transparent responses with clear fallback to humans for complex cases
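To illustrate the Google Docs decision, a sync job might flatten a Doc into chunks for re-embedding, as in this sketch. It uses the `googleapis` client; the authentication setup, fixed-size chunking, and the downstream embed/upsert step are assumptions rather than the actual pipeline.

```typescript
// Sketch of the knowledge-sync job: pull a Google Doc, flatten it to plain
// text, and chunk it for embedding.
import { google } from "googleapis";

async function syncDoc(documentId: string): Promise<string[]> {
  // Assumes Application Default Credentials with read-only Docs scope.
  const auth = new google.auth.GoogleAuth({
    scopes: ["https://www.googleapis.com/auth/documents.readonly"],
  });
  const docs = google.docs({ version: "v1", auth });

  const { data } = await docs.documents.get({ documentId });

  // Flatten the Doc's structural elements into plain text.
  const text = (data.body?.content ?? [])
    .flatMap((el) => el.paragraph?.elements ?? [])
    .map((el) => el.textRun?.content ?? "")
    .join("");

  // Split into fixed-size chunks for embedding (a real chunker would respect
  // headings and paragraphs rather than raw character counts).
  const CHUNK_SIZE = 1000;
  const chunks: string[] = [];
  for (let i = 0; i < text.length; i += CHUNK_SIZE) {
    chunks.push(text.slice(i, i + CHUNK_SIZE));
  }
  return chunks; // downstream: embed each chunk and upsert into the vector DB
}
```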

Results

  • Enabled 24/7 automated support for learners and instructors
  • Reduced repetitive manual support requests significantly
  • Improved response time from hours/days to seconds
  • Created reusable AI platform for future internal tools
  • Supported operations at scale of 10,000+ users annually

Technologies Used

OpenAI API • Vector Database • MongoDB • Node.js • Google Workspace • RAG Architecture