Welcome to the home of Mini-AutoGPT, the pocket-sized AI with the heart of a giant. Here to demonstrate that local LLMs can still rock your digital world, Mini-AutoGPT runs on pure Python 3.11 and communicates through the magic of python-telegram-bot. It's experimental, autonomous, and always ready to chat. What more could you ask for in a desktop companion?
This repository is mainly a demonstration of using local LLMs for fully autonomous AI. It is meant as a guide for readers of my book "Unlocking the Power of Auto-GPT and Its Plugins".
This repository is also a preview of Sophie-AI, a fully autonomous AI that runs on your local machine and can be used for various tasks; it accomplishes more complex thoughts and tasks than Mini-AutoGPT.
Mini-AutoGPT isn't just a stripped-down version of some monolithic AI; it's your friendly neighborhood bot that lives right in your Telegram! It's designed to be simple enough for anyone to tinker with, yet robust enough to handle the sophisticated needs of modern chat applications.
Because local LLMs (3B, 4B, 7B, 8B, etc.) are smaller than their cloud counterparts, Mini-AutoGPT is a great way to experiment with AI on your local machine without having to pay for cloud services.
Here's a sneak peek of its main ingredients:
- Python 3.11: Fresh and powerful.
- python-telegram-bot: Connects you directly to your users via Telegram.
- Autonomy: Runs fully on its own, no hand-holding required.
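The autonomy ingredient can be pictured as a simple think-and-act loop. The sketch below is purely illustrative and is not Mini-AutoGPT's actual code; `fake_llm` is a stand-in for whatever local model you run:

```python
# Illustrative sketch of an autonomous think-act loop.
# NOT Mini-AutoGPT's real implementation; fake_llm stands in for a local LLM.

def fake_llm(prompt: str) -> str:
    """Pretend model: returns a canned 'thought' for the demo."""
    return f"THOUGHT: considering '{prompt}' -> ACTION: reply"

def autonomy_loop(task: str, max_steps: int = 3) -> list[str]:
    """Run the think-act cycle a few times and collect each step."""
    history = []
    prompt = task
    for _ in range(max_steps):
        thought = fake_llm(prompt)   # 1. think: ask the model
        history.append(thought)      # 2. act: here we merely record the step
        prompt = thought             # 3. feed the result back in
    return history

steps = autonomy_loop("greet the user")
print(len(steps))  # 3
```

In the real bot, the "act" step would send a Telegram message or call a tool instead of just appending to a list.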
To get started, you'll need:

- Python 3.11 installed on your machine.
- A Telegram bot token (get yours from @BotFather).
- Clone this repository and install dependencies:

  git clone https://github.com/yourusername/mini-autogpt.git
  cd mini-autogpt
  pip install -r requirements.txt

- Update the .env file with your Telegram bot token.
- Run one of these local LLM servers with its API enabled:
  - LMStudio
  - oobabooga/text-generation-webui
- Update the .env file with the API URL of your local LLM.
Recommended models to serve locally include:

- meta-llama/Meta-Llama-3-8B-Instruct (preferably a variant with a longer context length)
- NousResearch/Hermes-2-Pro-Mistral-7B
- argilla/CapybaraHermes-2.5-Mistral-7B
- Intel/neural-chat-7b-v3-1
- Nexusflow/Starling-LM-7B-beta
- mistralai/Mixtral-8x7B-Instruct-v0.1
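A minimal .env might look like the following. The variable names here are illustrative guesses, not the project's actual keys, so check the .env template shipped with the repository for the exact names it expects:

```
# Illustrative .env sketch -- variable names may differ from the repo's template
TELEGRAM_API_KEY=123456:your-bot-token-from-botfather
API_URL=http://localhost:5000/v1
```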
Just run the script, and your bot will come to life:
./run.sh
or
python3.11 main.py
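Under the hood, both LMStudio and text-generation-webui can expose an OpenAI-compatible HTTP API, so talking to the local model boils down to a JSON POST. The sketch below only builds such a request with the standard library; the URL, endpoint path, and model name are assumptions you should adjust to your own server:

```python
import json
import urllib.request

def build_chat_request(api_url: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completion request for a local LLM server.

    The /chat/completions path and payload shape follow the OpenAI-compatible
    API that LMStudio and text-generation-webui expose; adjust as needed.
    """
    payload = {
        "model": "local-model",  # many local servers ignore or override this
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    return urllib.request.Request(
        url=api_url.rstrip("/") + "/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("http://localhost:5000/v1", "Hello, Mini-AutoGPT!")
print(req.full_url)  # http://localhost:5000/v1/chat/completions
# urllib.request.urlopen(req) would actually send it once a server is running
```

The point of keeping the request-building separate is that you can unit-test it without a model running at all.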
Mini-AutoGPT is still experimental. It might get a little too excited and repeat what you say or surprise you with unexpected wisdom. Handle it with care and affection!
Feel free to fork, star, and submit pull requests. Bugs can be reported in the issues section. Help Mini-AutoGPT learn the ways of this vast digital universe!
Mini-AutoGPT in Action
Here's a snippet of what to expect when you fire up Mini-AutoGPT:
[ASCII-art Mini-AutoGPT banner]
Hello my friend!
I am Mini-Autogpt, a small version of Autogpt for smaller llms.
I am here to help you and will try to contact you as soon as possible!
Note: I am still in development, so please be patient with me! <3
Forgetting everything...
My memory is empty now, I am ready to learn new things!
*** I am thinking... ***
Mini-AutoGPT is the small bot with a big dream: to make LLMs accessible on your local machine. Join us in nurturing this tiny digital marvel!
Mini-AutoGPT: Small in size, big on personality.
This project is licensed under the MIT License. For more information, please refer to the LICENSE file.