It’s 2023: everyone is playing around with OpenAI’s ChatGPT, and a new and improved version of GPT is released every couple of months, each one making you even more effective.
The problem with these new models is that they’re only available to customers with a ChatGPT Plus subscription, setting you back $20 per month (excluding tax). Instead of using OpenAI’s version, you can also set up your own GPT-4 instance on Azure OpenAI.
Great! But what’s in it for me?
- You pay as you go: instead of a flat fee of $20 per month, you pay a small amount per query
- Your data isn’t used for training the models, whereas conversations with OpenAI’s ChatGPT are continuously used to improve the model
- More flexibility: you can tune the GPT model’s temperature and customize the system prompt
- Larger context windows for your queries: at the time of writing, Azure OpenAI supports a 32k-token limit versus the 8k-token limit of a ChatGPT Plus subscription
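To make the flexibility point concrete, here is a minimal sketch of the request you send to the Azure OpenAI chat-completions REST endpoint, showing where the system prompt and temperature go. The endpoint, deployment name, and API version are placeholder assumptions — substitute your own resource details:

```python
import json

# Placeholder values -- replace with your own Azure OpenAI resource details.
ENDPOINT = "https://my-resource.openai.azure.com"  # assumption
DEPLOYMENT = "gpt-4-32k"                           # assumption: your deployment name
API_VERSION = "2023-05-15"

def build_chat_request(system_prompt: str, user_prompt: str, temperature: float):
    """Build the URL and JSON body for an Azure OpenAI chat-completions call."""
    url = (
        f"{ENDPOINT}/openai/deployments/{DEPLOYMENT}"
        f"/chat/completions?api-version={API_VERSION}"
    )
    body = {
        "messages": [
            # The system prompt is fully under your control here.
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_prompt},
        ],
        # Lower temperature = more deterministic answers.
        "temperature": temperature,
    }
    return url, body

url, body = build_chat_request("You are a terse assistant.", "Explain pay-as-you-go.", 0.2)
print(json.dumps(body, indent=2))
```

You would POST this body to the URL with your `api-key` header; any HTTP client will do.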
But where’s my nice ChatGPT-like UI?
Luckily there are lots of open-source solutions that have you covered, all very much inspired by ChatGPT’s interface: a prompt history, word-by-word output, and nice formatting.
A personal favorite of mine is chatbot-ui. I run it as a Docker container on a Raspberry Pi, accessible only from within my local network or through a VPN connection:
Simply replace `OPENAI_API_KEY`, `AZURE_DEPLOYMENT_ID`, and `OPENAI_API_HOST` with your primary key, your deployment name, and your Azure OpenAI endpoint, respectively. Your UI will be available at http://localhost:3000.
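For reference, a docker-compose setup along these lines could look like the sketch below. The image tag and the `OPENAI_API_TYPE` variable are assumptions — check the chatbot-ui README for the exact names your version expects:

```yaml
# Sketch of a docker-compose.yml for chatbot-ui; image tag and variable
# names are assumptions -- verify against the project's documentation.
services:
  chatbot-ui:
    image: ghcr.io/mckaywrigley/chatbot-ui:main
    restart: unless-stopped
    ports:
      - "3000:3000"
    environment:
      OPENAI_API_TYPE: azure                        # assumption: switches the UI to Azure mode
      OPENAI_API_KEY: <your-primary-key>
      AZURE_DEPLOYMENT_ID: <your-deployment-name>
      OPENAI_API_HOST: https://<resource>.openai.azure.com
```

A `docker compose up -d` then brings the UI up on port 3000.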
Update May 2024: Earlier this year, the original maintainer of chatbot-ui moved to a new interface, adding more features but also lots of extra dependencies. Therefore I'm using a personal fork of the old version, published on Docker Hub under bartmsft.