Chat

Backend Example

Stream chunks of text from an LLM.

# src/chat.py
import morph
from morph import MorphGlobalContext
from langchain_openai import ChatOpenAI
from langchain_core.messages import HumanMessage

@morph.func
def langchain_chat(context: MorphGlobalContext):
    llm = ChatOpenAI(model="gpt-4o")
    # Read the user's prompt from Morph's variable context.
    messages = [HumanMessage(context.vars["prompt"])]
    # Yield each token as it arrives so the client can render the
    # response incrementally instead of waiting for the full reply.
    for token in llm.stream(messages):
        yield token.content
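The key idea above is that the function is a generator: each `yield` sends one chunk downstream. A minimal sketch of that consumption pattern, using a hypothetical `stub_chat` stand-in (no API key or LLM required) that yields string chunks the same way the `llm.stream(...)` loop would:

```python
# stub_chat is a hypothetical stand-in for langchain_chat: it yields
# string chunks one at a time, just like the streaming loop above.
def stub_chat(prompt: str):
    for token in ["Hello", ", ", "world", "!"]:
        yield token

# A caller can render each chunk as it arrives...
for chunk in stub_chat("hi"):
    print(chunk, end="")

# ...or join all chunks into the complete reply.
full_reply = "".join(stub_chat("hi"))
print()
print(full_reply)  # → Hello, world!
```

The same contract applies to the real function: the frontend Chat component concatenates the streamed chunks to build the visible message.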

Installation

npx shadcn@latest add https://morph-components.vercel.app/r/chat.json