
daanturo/starhugger.el


https://melpa.org/packages/starhugger-badge.svg https://stable.melpa.org/packages/starhugger-badge.svg

  • AI-powered text & code completion client.

Currently supported backends: Ollama, Hugging Face’s Inference API.

(Demo video: demo.webm in the repository.)

Mirror (may not be up to date): https://github.com/daanturo/starhugger.el

Installation

Starhugger can be installed from MELPA - Starhugger, using the command M-x package-install.

Or, if you want to install from source, add one of the following to your configuration:

;; package-vc.el (built-in from Emacs 29 and above)
(unless (package-installed-p 'starhugger)
  (package-vc-install '(starhugger :url "https://gitlab.com/daanturo/starhugger.el")))

;; straight.el
(straight-use-package '(starhugger :files (:defaults "*.py")))

;; Doom
(package! starhugger :recipe (:files (:defaults "*.py")))

;; elpaca.el
(elpaca (starhugger :repo "https://gitlab.com/daanturo/starhugger.el" :files (:defaults "*.py")))

Or use any other package manager of your choice.

Usage

Primary commands

  • starhugger-trigger-suggestion: display the suggestion as a previewing overlay.
  • starhugger-show-next-suggestion / starhugger-show-prev-suggestion: cycle through suggestions.
  • starhugger-accept-suggestion: insert the current suggestion.
  • starhugger-dismiss-suggestion (bound to C-g by default while the suggestion is shown): cancel.

There is also starhugger-auto-mode (a non-global minor mode), but use it with care when the Hugging Face text inference backend is active, because of its rate limit.
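Since starhugger-auto-mode is buffer-local, one way to enable it selectively is via a major-mode hook (a minimal sketch):

```elisp
;; Enable automatic suggestions only in programming buffers.
(add-hook 'prog-mode-hook #'starhugger-auto-mode)
```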

Example configuration

;; Optionally use another model; best set before loading this package,
;; or use Emacs 29+'s `setopt'
(setq starhugger-model-id "codellama")

(global-set-key (kbd "M-\\") #'starhugger-trigger-suggestion)

(with-eval-after-load 'starhugger
  ;; `starhugger-inline-menu-item' makes a conditional binding that is only active at the inline suggestion start
  (define-key starhugger-inlining-mode-map (kbd "TAB") (starhugger-inline-menu-item #'starhugger-accept-suggestion))
  (define-key starhugger-inlining-mode-map (kbd "M-[") (starhugger-inline-menu-item #'starhugger-show-prev-suggestion))
  (define-key starhugger-inlining-mode-map (kbd "M-]") (starhugger-inline-menu-item #'starhugger-show-next-suggestion))
  (define-key starhugger-inlining-mode-map (kbd "M-f") (starhugger-inline-menu-item #'starhugger-accept-suggestion-by-word)))

Use the Hugging Face backend:

(setq starhugger-completion-backend-function #'starhugger-hugging-face-inference-api)

(setq starhugger-hugging-face-api-token "hf_ your token here")

(setopt starhugger-model-id "organization/some-model")
;; Appropriate tokens for the model
(setq starhugger-fill-tokens '("<PRE>" "<SUF>" "<MID>"))
(setq starhugger-stop-tokens '("<|endoftext|>" "<EOT>"))
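For context: fill-in-the-middle models typically expect the text before and after point to be framed by these tokens, roughly like so (an illustrative sketch only, not starhugger's internal code; `my-fim-prompt' is a hypothetical helper):

```elisp
;; Illustrative only: compose a fill-in-the-middle prompt from the buffer
;; text before point (PRE) and after point (SUF), using the fill tokens.
(defun my-fim-prompt (pre suf)
  "Frame PRE and SUF with fill-in-the-middle tokens."
  (concat "<PRE>" pre "<SUF>" suf "<MID>"))

;; The model generates the middle part after "<MID>" and is expected to
;; finish at one of the configured stop tokens, such as "<EOT>".
```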

Additional settings:

;; For evil users, dismiss after pressing ESC twice
(defvar my-evil-force-normal-state-hook '())
(defun my-evil-run-force-normal-state-hook-after-a (&rest _)
  (run-hooks 'my-evil-force-normal-state-hook))

(advice-add #'evil-force-normal-state
 :after #'my-evil-run-force-normal-state-hook-after-a)

;; Workaround conflict with `blamer.el'
;; (https://github.com/Artawower/blamer.el): when at the end of line, blamer's
;; overlay's `after-string' property will display before starhugger's
;; `display' property, which will result in starhugger's part of suggestion on
;; current line (1) being pushed out of the display

;; <before point>|                            commit info<right edge of the window><suggestion after point, before newline>
;; <the rest of suggestion>

;; workaround: disable `blamer-mode' while `starhugger-inlining-mode'

(defvar-local my-starhugger-inlining-mode--blamer-mode-state nil)
(defvar-local blamer-mode nil)

(defun my-starhugger-inlining-mode-h ()
  (if starhugger-inlining-mode
      (progn
        (add-hook 'my-evil-force-normal-state-hook
                  (lambda () (starhugger-dismiss-suggestion t))
                  nil t)
        (setq my-starhugger-inlining-mode--blamer-mode-state blamer-mode)
        (when my-starhugger-inlining-mode--blamer-mode-state
          (blamer-mode 0)))
    (progn
      (when (and my-starhugger-inlining-mode--blamer-mode-state
                 (not blamer-mode))
        (blamer-mode 1)))))

(add-hook 'starhugger-inlining-mode-hook #'my-starhugger-inlining-mode-h)

Notes

When using the Hugging Face text inference backend, remember to set starhugger-hugging-face-api-token (get one from https://huggingface.co/settings/tokens); otherwise you may quickly hit the rate limit.

Known quirks

From the model (https://huggingface.co/bigcode/starcoder):

  • Doesn’t support num_return_sequences (detailed_parameters) for returning multiple responses; the workaround is to make multiple requests.
  • Doesn’t honor use_cache; the current workaround is forcing a different response by randomizing the temperature.
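The second workaround can be pictured as follows (a hypothetical sketch; starhugger's actual parameter handling may differ):

```elisp
;; Hypothetical sketch: jitter the sampling temperature slightly on each
;; request so the endpoint cannot return an identical cached response.
(defun my-jittered-temperature (&optional base)
  "Return BASE (default 0.7) plus a random offset in [0, 0.3)."
  (+ (or base 0.7) (/ (random 300) 1000.0)))
```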

Emacs overlays are used under the hood to display the inline suggestion; this approach has some shortcomings:

  • Not possible to display PRE|<ov>SUF without using 2 different overlay properties, depending on whether SUF is non-empty (in the middle of the buffer) or empty (at buffer end)
  • At the end of the buffer (overlay-start = overlay-end), the overlay’s keymap property doesn’t work
  • Conflict with https://github.com/Artawower/blamer.el, mentioned in “Example configuration”
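For readers unfamiliar with the overlay trick, the two cases above roughly look like this (a minimal sketch, not starhugger's actual implementation; `my-ghost-overlay' is hypothetical):

```elisp
;; Mid-buffer, the overlay covers the next character and uses the
;; `display' property to show the ghost text followed by that character;
;; at buffer end there is no character to cover, so `after-string' is
;; used instead on a zero-length overlay.
(defun my-ghost-overlay (text)
  "Show TEXT at point as a dimmed inline suggestion; return the overlay."
  (let ((ghost (propertize text 'face 'shadow)))
    (if (eobp)
        ;; Zero-length overlay at buffer end: `display' would have no
        ;; buffer text to apply to, so fall back to `after-string'.
        (let ((ov (make-overlay (point) (point))))
          (overlay-put ov 'after-string ghost)
          ov)
      ;; Cover the next character and redraw it after the ghost text.
      (let ((ov (make-overlay (point) (1+ (point)))))
        (overlay-put ov 'display
                     (concat ghost (buffer-substring (point) (1+ (point)))))
        ov))))
```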