Merged
27 changes: 27 additions & 0 deletions lua/lspconfig/configs/lsp_ai.lua
@@ -0,0 +1,27 @@
return {
  default_config = {
    cmd = { 'lsp-ai' },
    filetypes = {},
    root_dir = nil,
    single_file_support = true,
    init_options = {
      memory = {
        file_store = vim.empty_dict(),
      },
      models = vim.empty_dict(),
    },
  },
  docs = {
@khaneliman (Contributor) commented on Oct 15, 2024:
We're getting an issue downstream with this being a table instead of just a string. Do we need to support the desc sometimes being a table of strings?

A Member replied:
PR welcome.

@khaneliman (Contributor) replied on Oct 15, 2024:
I was just double-checking whether it was supposed to be strings only, or whether you do expect tables here; if it can be changed, I can do a quick PR. Created #3374 to address it.

The PR author (Contributor) replied:
Yeah, that looks like a mistake on my part. Thanks for the PR 👍

[[
https://github.com/SilasMarvin/lsp-ai

LSP-AI is an open source language server that serves as a backend for AI-powered functionality in your favorite code
editors. It offers features like in-editor chatting with LLMs and code completions.
A Member commented:
How does chat work? Is that a custom LSP request?

The PR author (Contributor) replied:
There's a custom request available (document/textGeneration, IIRC), but it feels somewhat low-level. The more user-friendly way is to define a prefix in the config that enables the chat code action when that prefix is present in the buffer.



You will need to provide configuration for the inference backends and models you want to use, as well as configure
completion/code actions. See the [wiki docs](https://github.com/SilasMarvin/lsp-ai/wiki/Configuration) and
[examples](https://github.com/SilasMarvin/lsp-ai/blob/main/examples/nvim) for more information.
]],
},
}
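Since the defaults above leave `models` empty, a downstream user has to supply their own backend configuration at setup time. The sketch below shows one way that might look; the `models`, `completion`, and `chat` key names follow the lsp-ai wiki's configuration format as of this PR and are not part of the diff itself, so treat the exact fields (model name, endpoint, trigger prefix) as illustrative assumptions to check against the wiki:

```lua
-- A sketch, not verified against current lsp-ai: key names follow the
-- lsp-ai wiki's init_options format and may have changed since this PR.
require('lspconfig').lsp_ai.setup {
  init_options = {
    memory = {
      file_store = vim.empty_dict(),
    },
    models = {
      -- hypothetical model entry; adjust type/endpoint for your backend
      model1 = {
        type = 'open_ai',
        chat_endpoint = 'https://api.openai.com/v1/chat/completions',
        model = 'gpt-4o',
        auth_token_env_var_name = 'OPENAI_API_KEY',
      },
    },
    completion = {
      model = 'model1',
      parameters = { max_tokens = 64 },
    },
    -- chat prefix discussed in the review thread above: typing the
    -- trigger in a buffer exposes the chat code action
    chat = {
      {
        trigger = '!C',
        action_display_name = 'Chat',
        model = 'model1',
        parameters = { max_tokens = 1024 },
      },
    },
  },
}
```

Note the use of `vim.empty_dict()` rather than `{}`: an empty Lua table would otherwise be serialized to JSON as an array (`[]`) instead of the object (`{}`) that the server expects.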