e2538b3666
Delete outdated packages
2024-08-19 12:02:37 -04:00
d2b7817dc1
Add optional region support to aimenu
This change modifies aimenu-get-buffer-with-line-numbers to accept optional start and end arguments, so it can add line numbers to a specified region instead of the entire buffer. It also adjusts the call in aimenu-show-outline to use the active region when one is present. A hedged sketch of the interface follows this entry.
2024-08-15 00:04:09 -04:00
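A minimal sketch of the interface described above, assuming the function returns the numbered text as a string; the body is illustrative, not aimenu's actual implementation:

;; Sketch only: return the region (or the whole buffer when START and
;; END are nil) with "N: " prefixed to each line.
(defun aimenu-get-buffer-with-line-numbers (&optional start end)
  "Return buffer text between START and END with line numbers prepended."
  (let* ((start (or start (point-min)))
         (end (or end (point-max)))
         (lineno (line-number-at-pos start)))
    (mapconcat (lambda (line)
                 (prog1 (format "%d: %s" lineno line)
                   (setq lineno (1+ lineno))))
               (split-string (buffer-substring-no-properties start end) "\n")
               "\n")))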
f32fbd670b
Add optional instruction input for custom outlines
Modified the aimenu function to accept an optional instruction argument, letting users supply specific instructions for generating the outline. Updated the related logic to handle the new argument and adjusted the system message accordingly. A sketch of one plausible shape for this follows the entry.
2024-08-12 12:34:36 -04:00
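One plausible way to thread the optional instruction into the prompt, sketched with an assumed variable name and default text rather than the package's actual system message:

;; Hypothetical: `aimenu-system-message' and its default text are
;; assumptions about the change described above.
(defvar aimenu-system-message
  "Produce an imenu-style outline of the buffer below."
  "Base system message sent to the language model.")

(defun aimenu--build-system-message (&optional instruction)
  "Return the system message, appending INSTRUCTION when non-nil."
  (if instruction
      (concat aimenu-system-message "\n\n" instruction)
    aimenu-system-message))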
862c55dfdc
Require gptel
2024-08-12 11:48:53 -04:00
71861b026a
Add aimenu for AI-powered imenu outlines
This commit introduces aimenu, a new package that generates imenu-like outlines using a language model. The package includes functions for generating and handling outlines, managing a cache, and interactively prompting the user to select a header; a sketch of that selection step follows this entry.
2024-08-12 11:38:59 -04:00
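The selection step mentioned above could be as simple as a completing-read over the model's headers; the (TITLE . LINE) alist shape used here is an assumption, not aimenu's actual data model:

;; Hypothetical sketch of interactive header selection.
(defun aimenu--select-header (headers)
  "Prompt for one of HEADERS, an alist of (TITLE . LINE), and jump there."
  (let* ((title (completing-read "Header: " headers nil t))
         (line (cdr (assoc title headers))))
    (goto-char (point-min))
    (forward-line (1- line))))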
9acc16c9aa
Add some utility AI replacement functions
2024-07-02 15:07:57 -04:00
11755d6201
Add spinner indicating when the AI is responding, and inhibit input during that time
2024-07-02 14:41:14 -04:00
c140740a32
When prompting about a whole buffer, don't put the whole buffer in the chat
2024-06-27 21:49:16 -05:00
78bc060018
Open a new Llama chat if you call llama-ask* without an existing chat
2024-06-18 16:00:09 -04:00
f1698e3247
Remove eval, add filter mechanism
2024-05-29 14:36:26 -04:00
d36967e371
Minor tweaks
2024-05-24 01:03:44 -04:00
8db9cc7e8f
Make a version of llama-replace-in-region that does not have context
2024-05-23 22:59:43 -04:00
6191654511
Give the AI the ability to evaluate elisp
2024-05-23 18:31:13 -04:00
cdd674fdcd
Many more llama improvements
2024-05-23 13:28:24 -04:00
e91b1d6480
Make llama chat nicer
2024-05-23 10:49:23 -04:00
cdd892d4e1
Switch out llm for llama
2024-05-23 00:53:18 -04:00
5faa99158d
Make the default llm chat aware that it's inside of Emacs
2024-05-21 11:01:13 -04:00
3d39b0c3df
Fix llm output buffer sentinel formatting and add autoloads
2024-05-21 10:26:45 -04:00
df79eee678
Add llm-chat command and improve prompt code structure
2024-05-20 11:33:32 -04:00
3c09eb3cbd
Enable llm.el to work with ollama models
2024-05-17 13:30:09 -04:00
87584357a9
Set some customizations for llm integration
2024-05-16 10:37:26 -04:00
f4da3fc9e2
Add interface to the llm command-line tool
2024-05-15 13:28:36 -04:00
650a90b812
Make origami-treesit a proper minor mode
2024-05-08 10:59:18 -04:00
889abf2606
Add folding for Ruby hash pairs and fix folding for calls
2024-05-08 10:36:41 -04:00
a0c0f6d1e2
Add Ruby support to origami-treesit and make the folding level configurable
2024-05-07 16:58:51 -04:00
81c2ad6963
Fix navi behavior when the path doesn't exist, and fix autoloads
2024-05-06 10:02:59 -04:00
a9ea6b4b2f
Enable treesit folding in regular yaml-mode
2024-05-05 20:36:28 -04:00
f7c927bca1
Make origami-treesit more generic
2024-05-04 22:02:51 -04:00
81106aae04
Autoload Navi class definitions
2024-05-03 21:50:11 -04:00
7a74212089
Write a navi integration
2024-05-03 15:29:31 -04:00