Open Source LLM based Web Chat Interface

import os
import requests
import json
from fastapi import FastAPI
from fastapi.responses import HTMLResponse
from pydantic import BaseModel
import uvicorn

app = FastAPI()

# This is the list of free models on OpenRouter as of Dec 7, 2023.

class Prompt(BaseModel):
    text: str
    model: str

About this template

This app is a web interface that lets the user send prompts to open-source LLMs through OpenRouter. It requires an OpenRouter API key, which is free to obtain, and OpenRouter hosts a number of free open-source models, so you can run a chatbot at no cost. The interface works as follows:

  • The user chooses a model from a list and holds a conversation with it.
  • The conversation history is displayed in chronological order, oldest message at the top, and each message is labelled with who said it.
  • User messages are colored green and model messages grey, with some space between messages.
  • The chat bar is a sticky bar at the bottom of the page with 10 pixels of padding below it, and the conversation div has a height of 80% to leave space for the model selection and input fields, so the last message is displayed above the sticky input block.
  • The input field is three times wider than the default size but never exceeds the width of the page; the send button sits on its right, always fits on the page, and has padding on the right side to match the left.
  • The user can press Enter to send a message in addition to clicking the send button, and the input field is cleared after sending.
  • While waiting for the model's response, the input is blocked, the send button is disabled, and a loader/spinner is shown on the send button.

Introduction to Open Source LLM based Web Chat Interface

Welcome to the step-by-step guide on how to set up and use the Open Source LLM based Web Chat Interface. This template allows you to create a web chat interface that connects to various open-source language models provided through OpenRouter. You'll be able to send prompts to these models and receive responses, simulating a conversation. The chat history will be displayed on the web page, with user messages in green and model responses in grey.

To begin using this template, click on "Start with this Template" on the Lazy platform.

Setting Environment Secrets

Before you can interact with the OpenRouter API, you need to set up an environment secret for your API key. Follow these steps to configure it:

  • Visit OpenRouter and register for an account to obtain your free API key.
  • Once you have your API key, go to the Environment Secrets tab within the Lazy Builder interface.
  • Create a new secret with the key `OPENROUTER_API_KEY` and paste your API key as the value.

This API key will be used to authenticate your requests to the OpenRouter API.
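Environment secrets set in the Lazy Builder are exposed to the app as ordinary environment variables, so the app can read the key at request time. A minimal sketch (the helper name is illustrative):

```python
import os


def get_openrouter_key() -> str:
    # The secret created in the Environment Secrets tab is available
    # under the same name as an environment variable.
    key = os.environ.get("OPENROUTER_API_KEY", "")
    if not key:
        raise RuntimeError(
            "OPENROUTER_API_KEY is not set; add it as an environment secret"
        )
    return key
```

Failing fast with a clear error when the key is missing makes misconfiguration obvious instead of producing opaque 401 responses from OpenRouter.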

Using the Test Button

After setting up your environment secret, you can use the Test button to deploy your app. The Lazy CLI will handle the deployment process, and you will not need to provide any additional input at this stage.

Interacting with the Web Chat Interface

Once your app is deployed, Lazy will provide you with a dedicated server link to access your web chat interface. Because the app is built with FastAPI, you will also receive a link to its automatically generated API documentation.

To interact with the chat interface:

  • Open the provided server link in your web browser.
  • You will see a web page with a chat interface and a dropdown menu to select one of the available language models.
  • Type your message into the input field at the bottom of the page.
  • Choose the model you wish to converse with from the dropdown menu.
  • Click the "Send" button or press "Enter" to submit your prompt.
  • The conversation will update with your message in green and the model's response in grey.
  • While the model is generating a response, the send button will show a spinner, indicating that the process is ongoing.

Enjoy your conversation with the open-source language models! Remember, you can always switch between different models to explore various responses and capabilities.


By following these steps, you should now have a fully functional Open Source LLM based Web Chat Interface. This guide has walked you through setting up your environment secret, deploying the app with the Test button, and interacting with the web chat interface. If you encounter any issues or have further questions, please refer to the documentation provided by OpenRouter or reach out for support.

Last published
July 20, 2024
