Commit Graph

1379 Commits

SHA1 Message Date
f49adb3b73 Update OpenAI reference to ChatGPT in init-ai.el 2024-07-21 08:06:51 -04:00
9fb96b3099 Refactor gptel backend selection and configuration
Reorganized backend definitions and selection process for gptel, introducing `gptel-backend-openai` and `gptel-backend-ollama` variables. Simplified backend selection logic and improved the `gptel-select-backend` function to dynamically prompt for models.
2024-07-21 00:49:29 -04:00
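The refactor above names two backend variables and a `gptel-select-backend` command. A rough sketch of what such a setup could look like, built on gptel's `gptel-make-openai` and `gptel-make-ollama` constructors; the API key source, host, and model lists are placeholder assumptions, not the repository's actual values.

```emacs-lisp
;; Sketch only: key lookup, host, and model names below are placeholders.
(require 'gptel)

(defvar gptel-backend-openai
  (gptel-make-openai "ChatGPT"
    :key (getenv "OPENAI_API_KEY")
    :stream t
    :models '("gpt-4o" "gpt-4o-mini"))
  "gptel backend that talks to the OpenAI API.")

(defvar gptel-backend-ollama
  (gptel-make-ollama "Ollama"
    :host "localhost:11434"
    :stream t
    :models '("llama3"))
  "gptel backend that talks to a local Ollama server.")

(defun gptel-select-backend (backend)
  "Switch gptel to BACKEND and prompt for one of its models."
  (interactive
   (list (if (y-or-n-p "Use the OpenAI backend? ")
             gptel-backend-openai
           gptel-backend-ollama)))
  (setq gptel-backend backend
        gptel-model (completing-read "Model: " (gptel-backend-models backend))))
```

Depending on the gptel version, `gptel-model` may expect a symbol rather than a string, so the final `setq` may need adjusting.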
70599ad053 Add ace-link navigation configuration 2024-07-19 23:45:01 -04:00
ea7185c359 Get rid of spotify git hosts 2024-07-19 23:40:35 -04:00
ac0b59b569 Add eww and arc-mode setup to evil-collection 2024-07-19 23:40:21 -04:00
91ce6952ea Set avy-style to pre 2024-07-19 23:40:14 -04:00
6b9161a797 Add AI keybindings 2024-07-19 23:40:04 -04:00
31220b41b9 Support multiple GPT backends in gptel config
Enhanced the gptel configuration to support multiple backends (Ollama and OpenAI) and added a function to select the desired backend. The default backend is set to OpenAI.
2024-07-16 22:42:22 -04:00
9360580a94 Add Forge diff-for-pr function for pull request creation
This commit adds a new function `forge-diff-for-pr` to generate a Git diff for use in pull request creation. The function is called automatically when creating a pull request using Forge, providing an easy way to view changes between the source and target branches.
2024-07-16 14:24:30 -04:00
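As a rough illustration of the idea in the commit above (not the actual implementation), one could show a Magit diff of the range between the two branches and run it before Forge creates the pull request; the advice target and its argument order are assumptions.

```emacs-lisp
(require 'magit)
(require 'forge)

(defun forge-diff-for-pr (source target)
  "Show the changes that merging SOURCE into TARGET would introduce."
  (magit-diff-range (format "%s..%s" target source)))

;; Assumption: `forge-create-pullreq' receives SOURCE and TARGET in this order.
(advice-add 'forge-create-pullreq :before #'forge-diff-for-pr)
```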
3537134c23 Add Terraform console functionality to Emacs
This commit adds the ability to run a Terraform console at a specified path in a comint buffer, making it easier to interactively explore and manage Terraform state.
2024-07-16 14:24:16 -04:00
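A minimal sketch of that feature: run `terraform console` in a comint buffer rooted at a chosen directory. The function name and prompt are assumptions rather than the repository's actual code.

```emacs-lisp
(require 'comint)

(defun terraform-console (dir)
  "Run `terraform console' in DIR inside a comint buffer."
  (interactive "DTerraform directory: ")
  (let ((default-directory dir))        ; comint inherits this as its working directory
    (pop-to-buffer
     (make-comint-in-buffer "terraform-console" "*terraform-console*"
                            "terraform" nil "console"))))
```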
be07e673f9 Update gptel-commit-message prompt 2024-07-16 14:23:02 -04:00
bcb8d468a6 Add function to generate commit messages with gptel 2024-07-15 16:16:11 -04:00
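A hedged sketch of how such a function might look with `gptel-request`, feeding it the staged diff and inserting the reply at point; the function name, prompt wording, and callback handling are assumptions.

```emacs-lisp
;; Sketch only; assumes lexical binding and an already configured gptel backend.
(require 'gptel)

(defun my/gptel-commit-message ()
  "Insert an AI-generated commit message for the staged changes at point."
  (interactive)
  (let ((buf (current-buffer))
        (diff (shell-command-to-string "git diff --cached")))
    (gptel-request
        (concat "Write a concise git commit message for this diff:\n\n" diff)
      :callback (lambda (response _info)
                  (when (stringp response)
                    (with-current-buffer buf
                      (insert response)))))))
```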
0b8be7b2f7 Add function to jump to tf plan output 2024-07-15 16:16:11 -04:00
8b75209521 Remove casual-calc 2024-07-13 09:27:45 -04:00
57203210b8 Source local zshrc if exists 2024-07-11 11:30:22 -04:00
e29116f2f6 Add utility function to render ansi colors in a file 2024-07-09 13:20:24 -04:00
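The utility likely amounts to applying `ansi-color` to the buffer visiting the file; a minimal sketch, with the function name as an assumption:

```emacs-lisp
(require 'ansi-color)

(defun render-ansi-colors ()
  "Interpret ANSI color escape sequences in the current buffer."
  (interactive)
  (let ((inhibit-read-only t))
    (ansi-color-apply-on-region (point-min) (point-max))))
```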
f394365ff9 Replace pixel-scroll with ultra-scroll-mac 2024-07-09 13:20:13 -04:00
fbc12a5434 Add casual-calc 2024-07-09 10:42:36 -04:00
6a33416fad Add "ask AI" mu4e action 2024-07-08 11:06:39 -04:00
8ac32f0955 Add posframe 2024-07-03 16:15:18 -04:00
e0f367faad Add gptel-quick 2024-07-03 16:09:55 -04:00
ccaf60abbd Add dbg macro 2024-07-03 15:21:40 -04:00
893a673616 Add gptel 2024-07-03 15:21:34 -04:00
9acc16c9aa Add some utility AI replacement functions 2024-07-02 15:07:57 -04:00
11755d6201 Add a spinner indicating when the AI is responding, and inhibit input during that time 2024-07-02 14:41:14 -04:00
211904a190 Fix terraform devdocs eldoc docs 2024-06-28 13:58:29 -04:00
f8c3f76cbd Add ollama-copilot-mode to switch to local LLM for copilot 2024-06-28 13:21:47 -04:00
c140740a32 When prompting about a whole buffer, don't put the whole buffer in the chat 2024-06-27 21:49:16 -05:00
78bc060018 Open a new Llama chat if you call llama-ask* without an existing chat 2024-06-18 16:00:09 -04:00
93bc223e5b Fix plist header 2024-06-18 15:03:14 -04:00
b13cc823f5 Add witchcraft script injector 2024-06-18 15:02:50 -04:00
f3c529781c Fix delete-file implementation 2024-06-18 12:35:05 -04:00
68955452a5 Add dape for debugging 2024-06-13 13:49:30 -04:00
7aea6199cd Fix org-daily capture template 2024-06-13 13:49:23 -04:00
5f63c40b97 Use quit-window instead of bury-buffer 2024-06-13 13:49:10 -04:00
33cd6207f9 Enable evil-mode in the minibuffer 2024-06-06 16:04:19 -04:00
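Evil ships a switch for this; assuming that is what the commit flips (rather than a custom hook), the setting is a one-liner that must be evaluated before evil is loaded:

```emacs-lisp
;; Assumption: the commit uses evil's built-in option rather than a custom hook.
(setq evil-want-minibuffer t)
```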
2960fc5ad1 Set up Avy 2024-06-06 11:49:48 -04:00
78dc31a1d9 Use pixel-scroll-precision-mode to scroll by pixel instead of by line 2024-06-06 11:49:31 -04:00
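For reference, enabling the built-in mode (Emacs 29+) looks like this:

```emacs-lisp
;; Smooth, pixel-level scrolling instead of jumping line by line.
(pixel-scroll-precision-mode 1)
```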
5362bb63e3 Don't let switch-to-buffer mess with window layout 2024-06-06 09:50:17 -04:00
f993466d81 Run rspec in spec root if found 2024-06-06 09:50:08 -04:00
5169285be0 Add comfy-ui command 2024-06-05 13:13:57 -04:00
f1698e3247 Remove eval, add filter mechanism 2024-05-29 14:36:26 -04:00
d36967e371 Minor tweaks 2024-05-24 01:03:44 -04:00
8db9cc7e8f Make a version of llama-replace-in-region that does not have context 2024-05-23 22:59:43 -04:00
6191654511 Give the AI the ability to evaluate elisp 2024-05-23 18:31:13 -04:00
cdd674fdcd Many more llama improvements 2024-05-23 13:28:24 -04:00
e91b1d6480 Make llama chat nicer 2024-05-23 10:49:23 -04:00
8ae6544709 Use correct straight-use-package recipe format for a local repo 2024-05-23 00:53:32 -04:00
cdd892d4e1 Switch out llm for llama 2024-05-23 00:53:18 -04:00
5faa99158d Make the default llm chat aware that it's inside Emacs 2024-05-21 11:01:13 -04:00