Ellama is a tool for interacting with large language models from Emacs. It allows you to ask questions and receive responses from LLMs. Ellama can perform various tasks such as translation, code review, summarization, enhancing grammar, spelling, or wording, and more through the Emacs interface. It natively supports streaming output, making it effortless to use with your preferred text editor.
28.10.2023
- Switched from ollama's API to the llm library. Many providers are now supported.
Install the ellama package from MELPA: M-x package-install RET ellama RET.
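If you prefer to script the installation, the same steps can be written in Emacs Lisp. This is a minimal sketch; it assumes MELPA is already present in your package-archives:

```elisp
;; Sketch: install ellama programmatically.
;; Assumes MELPA has already been added to `package-archives'.
(require 'package)
(unless (package-installed-p 'ellama)
  (package-refresh-contents)
  (package-install 'ellama))
```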
By default, ellama uses the ollama provider with the zephyr model. If that suits you, install ollama and pull zephyr. To use ellama with another model or another llm provider, customize the ellama configuration like this:
(use-package ellama
:init
(setopt ellama-language "German")
(require 'llm-ollama)
(setopt ellama-provider
(make-llm-ollama
:chat-model "zephyr:7b-alpha-q5_K_M" :embedding-model "zephyr:7b-alpha-q5_K_M")))
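Because ellama delegates to the llm library, other providers can be plugged in the same way. As a hedged sketch, an Open AI provider could be configured as follows; the :key and :chat-model values are placeholders, not values from this document:

```elisp
;; Sketch: use the Open AI provider from the llm library instead of ollama.
;; :key and :chat-model are placeholder values you must supply yourself.
(use-package ellama
  :init
  (require 'llm-openai)
  (setopt ellama-provider
          (make-llm-openai
           :key "YOUR-OPENAI-API-KEY"
           :chat-model "gpt-3.5-turbo")))
```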
- Ask Ellama about something by entering a prompt in an interactive buffer, and continue the conversation.
- Ask Ellama about a selected region or the current buffer.
- Ask Ellama to translate a selected region or the word at point.
- Find the definition of the current word using Ellama.
- Summarize a selected region or the current buffer using Ellama.
- Review code in a selected region or the current buffer using Ellama.
- Change text in a selected region or the current buffer according to a provided change using Ellama.
- Enhance the grammar and spelling in the currently selected region or buffer using Ellama.
- Enhance the wording in the currently selected region or buffer using Ellama.
- Make the text of the currently selected region or buffer concise and simple using Ellama.
- Change selected code or code in the current buffer according to a provided change using Ellama.
- Complete selected code or code in the current buffer using Ellama.
- Add new code according to a description, generating it with a provided context from the selected region or the current buffer using Ellama.
- Render the currently selected text or the text in the current buffer in a specified format using Ellama.
- Create a markdown list from the active region or the current buffer using Ellama.
- Create a markdown table from the active region or the current buffer using Ellama.
- Summarize a webpage fetched from a URL using Ellama.
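The commands above can be bound to keys like any other Emacs command. A sketch follows; the command names here are assumptions based on the descriptions above, not taken from this document, so verify them with M-x ellama- TAB before binding:

```elisp
;; Sketch: optional keybindings for a few Ellama commands.
;; The command names are assumptions; check the package's actual
;; command names (M-x ellama- TAB) before using these bindings.
(global-set-key (kbd "C-c e a") #'ellama-ask)
(global-set-key (kbd "C-c e t") #'ellama-translate)
(global-set-key (kbd "C-c e s") #'ellama-summarize)
```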
The following variables can be customized for the Ellama client:
- ellama-buffer: The default Ellama buffer name.
- ellama-user-nick: The user nick in logs.
- ellama-assistant-nick: The assistant nick in logs.
- ellama-buffer-mode: The major mode for the Ellama logs buffer. The default mode is markdown-mode.
- ellama-language: The target language for translation. The default language is English.
- ellama-provider: The llm provider for ellama. The default provider is ollama with the zephyr model. There are many supported providers: ollama, open ai, vertex, and GPT4All. For more information, see the llm documentation.
- ellama-spinner-type: The spinner type for ellama. The default type is progress-bar.
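These variables can be set together with setopt in your init file. The values below are purely illustrative, not recommendations from this document:

```elisp
;; Sketch: illustrative values for the customizable variables listed above.
(setopt ellama-buffer "*ellama*"
        ellama-user-nick "User"
        ellama-assistant-nick "Ellama"
        ellama-language "English"
        ellama-spinner-type 'progress-bar)
```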
Thanks to Jeffrey Morgan for the excellent ollama project. This project could not exist without it.
Thanks to zweifisch - I got some ideas from ollama.el about what an ollama client in Emacs can do.
Thanks to Dr. David A. Kunz - I got more ideas from gen.nvim.
Thanks to Andrew Hyatt for the llm library. Without it, only ollama would be supported.