2025 · Solo (AI course) · Archived

AirBean Coffee Chatbot

Local LLM recommending coffee from live menu data.

Summary

A frontend-only chatbot for a coffee app. Fetches live menu data from AirBean's API and uses it as context in the prompt to a local LLM. Custom prompt template with few-shot prompting and recommendation logic based on customer preferences.

Stack

  • React
  • LangChain.js
  • Ollama (local LLM)

Highlights

  • Live menu data injected into the prompt context
  • Few-shot prompting for consistent recommendations
  • Fully local - no calls to external LLM providers

Background

AI course project. The assignment: build a frontend-only chatbot for a coffee app (AirBean) that can recommend drinks based on live menu data.

How it works

  1. On app start, live menu data is fetched from AirBean's API
  2. The menu data is injected as context into the prompt to a local LLM via Ollama
  3. The user chats with the bot - "I want something strong and not too milky" - and the bot recommends matching drinks
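Steps 1 and 2 can be sketched in plain JavaScript. The endpoint URL and the menu item shape below are illustrative assumptions; the real AirBean API may differ:

```javascript
// Hypothetical menu item shape: { title, desc, price } — the real
// AirBean response may be structured differently.
function buildMenuContext(menu) {
  // Turn the raw menu JSON into a compact text block the LLM can read.
  return menu
    .map((item) => `- ${item.title} (${item.price} kr): ${item.desc}`)
    .join("\n");
}

// On app start: fetch the live menu and prepare the prompt context.
// (URL is illustrative; the function is not invoked in this sketch.)
async function fetchMenuContext() {
  const res = await fetch("https://airbean.example/api/menu");
  const { menu } = await res.json();
  return buildMenuContext(menu);
}
```

Keeping the menu-to-text step as a pure function makes it easy to test without a network call.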

Custom prompt template

This is where the project got interesting. A plain prompt without structure performed poorly - the bot often forgot the menu or recommended drinks that didn't exist. The fix was a structured template with:

  • Few-shot prompting - 2-3 example conversations in the prompt that show the format and tone
  • Explicit constraint - "only recommend from the following list: ..."
  • Recommendation logic - instructions on how preferences (strong / sweet / vegan / etc.) should be weighted
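A structured template along these lines can be sketched as plain string assembly. The example conversations and the exact wording of the constraint and weighting instructions are illustrative, not the project's actual template:

```javascript
// Illustrative few-shot examples that demonstrate format and tone.
const EXAMPLES = [
  {
    user: "I want something strong and not too milky.",
    bot: "Try the Americano: a double espresso with hot water, bold but black.",
  },
  {
    user: "Something sweet and vegan?",
    bot: "The Oat Latte with vanilla syrup is sweet and fully plant-based.",
  },
];

function buildPrompt(menuContext, question) {
  const examples = EXAMPLES.map(
    (ex) => `Customer: ${ex.user}\nBarista: ${ex.bot}`
  ).join("\n\n");
  return [
    "You are a barista for the AirBean coffee app.",
    // Explicit constraint: keeps the bot from inventing drinks.
    "Only recommend drinks from the following list:",
    menuContext,
    // Recommendation logic: how stated preferences should be weighted.
    "Weight explicit preferences (strong, sweet, vegan) over price.",
    "Examples:",
    examples,
    `Customer: ${question}\nBarista:`,
  ].join("\n\n");
}
```

The constraint sits directly above the menu so the model sees the allowed list immediately after being told it is exhaustive.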

Technical choices

  • Frontend-only - no backend, no database. All state in React, all AI logic via LangChain.js directly in the client against Ollama on localhost.
  • Local LLM via Ollama - no external API costs, but requires Ollama running locally for the app to work.
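LangChain.js ultimately talks to Ollama's local HTTP server; a minimal sketch of the same round trip with bare `fetch` against Ollama's `/api/generate` endpoint looks like this (the model name is an assumption, and any locally pulled model works):

```javascript
// Build the request body for Ollama's /api/generate endpoint.
// Model name is an assumption; substitute whatever is pulled locally.
function buildOllamaRequest(prompt) {
  return {
    model: "llama3",
    prompt,
    stream: false, // one complete JSON response instead of a token stream
  };
}

// Requires `ollama serve` running on localhost; not invoked in this sketch.
async function ask(prompt) {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildOllamaRequest(prompt)),
  });
  const data = await res.json();
  return data.response;
}
```

With `stream: false`, Ollama returns the whole completion in one JSON object, which keeps the client-side handling simple at the cost of no incremental rendering.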

What I took away

Few-shot prompting is far more effective than instruction-only prompting when consistency matters. Two or three examples turned outputs from randomly formatted text blobs into responses that strictly followed the rules.