An updated Emacs doctor.
This package provides a Large Language Model (LLM)-independent backend with multiple applications. For example, to honor its ancestor, you can talk to a new Artificial Intelligence (AI)-enhanced Emacs doctor via M-x elaiza-doctor.
For general chatting, you can use M-x elaiza-chat directly from the minibuffer.
By default, you can chat with a local Rocket-3B model (see getting started).
You can switch backends by prefixing your commands with C-u and selecting an available LLM (see supported backends).
Similar to OpenAI’s GPTs, you can create your own assistants. For example, M-x elaiza-editor is a multilingual editor, giving you suggestions for your current buffer. With elaiza-jupyter you can get org-mode responses that are optimized for Python code, similar to Jupyter Notebooks.
Using ELAIZA as the backend, you can build all kinds of applications, like elaiza-businesscard, which parses an image of a business card and returns the annotated, OCR’ed result in org-mode format.
Add the following to your packages.el (Doom Emacs):
(package! elaiza :recipe (:host github :repo "SFTtech/emacs-elaiza" :branch "main"))
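If you are not using Doom Emacs, a minimal sketch for a vanilla setup is to install straight from the repository with the built-in package-vc-install (Emacs 29 and later); whether this package installs cleanly that way is an assumption, not something this document guarantees.
;; Vanilla Emacs sketch (29+), assuming the package builds directly from Git.
(package-vc-install "https://github.com/SFTtech/emacs-elaiza")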
If you have no prior experience with LLMs and want to try ELAIZA, simply use M-x elaiza-chat. This will download a small model (Rocket-3B, 1.89 GB) to ~/llamafiles and start the server automatically. Afterwards, you can start chatting with the model.
Note that, due to its size, it has weaker capabilities than larger models such as GPT-4.
You can stop the server with M-x elaiza-llamafile-stop.
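If you prefer not to leave the local server running after a session, one optional sketch is to call the stop command on Emacs shutdown; elaiza-llamafile-stop is the command mentioned above, and wiring it into kill-emacs-hook is only a convenience suggestion.
;; Optional: stop the Llamafile server automatically when Emacs exits.
(add-hook 'kill-emacs-hook #'elaiza-llamafile-stop)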
In elaiza-mode, you can use C-c <RET> to continue the conversation (elaiza-chat-continue) and C-c C-k to interrupt the response (elaiza-chat-interrupt).
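If you prefer different bindings, a hedged sketch is to rebind the two commands in the mode's keymap. The name elaiza-mode-map is assumed here from the usual <mode>-map naming convention and is not confirmed by this document.
;; Rebind continue/interrupt; assumes `elaiza-mode-map' exists (standard naming convention).
(with-eval-after-load 'elaiza
  (define-key elaiza-mode-map (kbd "C-c C-c") #'elaiza-chat-continue)
  (define-key elaiza-mode-map (kbd "C-c C-k") #'elaiza-chat-interrupt))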
Currently, the following applications are available:
- M-x elaiza-chat for direct minibuffer prompts
- M-x elaiza-doctor for an AI-enhanced Emacs doctor experience
- M-x elaiza-editor as a multilingual editor for suggestions in your current buffer
- M-x elaiza-jupyter like elaiza-chat, but with special Python instructions
- M-x elaiza-businesscard parses an image of a business card using GPT-4o mini
You can specify a default backend using customize. Alternatively, prefix the elaiza commands (C-u) or call M-x elaiza-change-default-model.
For example, in Doom Emacs, insert the following into your config.el to use GPT-4o as your default model.
(use-package! elaiza
  :config (setq elaiza-default-model (make-elaiza-gpt-4o)))
If you have already downloaded a Llamafile, for example from https://github.com/mozilla-Ocho/llamafile, you can select it as the default model by specifying its name and location.
(use-package! elaiza
  :config
  (setq elaiza-default-model (make-elaiza-llamafile
                              :name "Llamafile: LLaMA-3 8B"
                              :filename "~/llamafiles/llama3.llamafile")))
Some backends, such as ChatGPT and Claude 3, require an API key. To securely store and retrieve API keys, use auth-source, as documented in the Emacs Auth Manual. Add the following to your auth-sources file, for example .authinfo.gpg, to store them:
Create a key at https://platform.openai.com/api-keys.
Insert into your ~/.authinfo.gpg:
machine api.openai.com port https login elaiza password <your-api-key>
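To check that the entry is picked up, you can query auth-source directly from Emacs; this uses the standard auth-source API and is independent of elaiza.
;; Should return your key if the .authinfo.gpg entry above is readable.
(auth-source-pick-first-password :host "api.openai.com" :user "elaiza")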
Use GPT-4 Turbo as your default model by adding the following to your config.el:
(use-package! elaiza
  :config (setq elaiza-default-model (make-elaiza-gpt-4-turbo)))
Create a key at https://console.anthropic.com/settings/keys.
Insert into your ~/.authinfo.gpg:
machine api.anthropic.com port https login elaiza password <your-api-key>
Use Claude 3 Opus as your default model by adding the following to your config.el:
(use-package! elaiza
  :config (setq elaiza-default-model (make-elaiza-claude-opus)))
| Model | Provider | Source code |
|---|---|---|
| GPT 4o | OpenAI | elaiza-openai.el |
| GPT 4o mini | OpenAI | elaiza-openai.el |
| GPT 4 | OpenAI | elaiza-openai.el |
| GPT 4 Turbo | OpenAI | elaiza-openai.el |
| GPT 3.5 Turbo | OpenAI | elaiza-openai.el |
| Claude 3 Opus | Anthropic | elaiza-claude.el |
| Claude 3 Sonnet | Anthropic | elaiza-claude.el |
| Claude 3 Haiku | Anthropic | elaiza-claude.el |
| Available Llamafiles | Llamafile | elaiza-llamafile.el |
| Available Ollama models | Ollama | elaiza-ollama.el |
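The model constructors can be combined with ordinary Emacs Lisp. As a sketch, you could prefer a local Llamafile when one is present and otherwise fall back to an API-backed model; the fallback logic below is only an illustration and reuses the constructors already shown in this document.
;; Prefer a local Llamafile if it has been downloaded, else use GPT-4o.
(setq elaiza-default-model
      (if (file-exists-p "~/llamafiles/llama3.llamafile")
          (make-elaiza-llamafile :name "Llamafile: LLaMA-3 8B"
                                 :filename "~/llamafiles/llama3.llamafile")
        (make-elaiza-gpt-4o)))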
- karthink/gptel: a simple LLM client for Emacs
- ahyatt/llm: llm package for Emacs
- s-kostyaev/ellama: a tool for interacting with large language models from Emacs