# gpt-contextfiles

A VS Code extension: add files to the context, then ask GPT-3.5 a question with the contents of every selected file included, up to 16k tokens.

**Currently in development. If you would like to contribute or provide feedback, see the contributing guide.**

I was tired of copying code from multiple files into ChatGPT and other LLMs while debugging, so I built an extension that handles it from inside the editor.

Simply right-click each file you want to include, check or uncheck it in the panel, enter your question, and the extension sends the question together with the selected file contents to the model over the API.

This extension uses the OpenAI API; there are many models available:

https://openai.com/pricing

However, because the extension is oriented around passing whole files, it defaults to the 16k-token context window of gpt-3.5-turbo-16k.

If you wish to use a different model, change the model name in extension.js; the available models are listed here:

https://platform.openai.com/docs/models/gpt-3-5
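For reference, the relevant call in extension.js looks roughly like the sketch below. This assumes the openai v3 Node SDK; the function and variable names are illustrative and may not match the actual source:

```js
const { Configuration, OpenAIApi } = require("openai");

const openai = new OpenAIApi(
    new Configuration({ apiKey: process.env.OPENAI_API_KEY })
);

async function askModel(prompt) {
    // Swap the model string here to use a different OpenAI model,
    // e.g. "gpt-3.5-turbo" or "gpt-4" (subject to your API access).
    const completion = await openai.createChatCompletion({
        model: "gpt-3.5-turbo-16k",
        messages: [{ role: "user", content: prompt }],
    });
    return completion.data.choices[0].message.content;
}
```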

## Installation

Set the `OPENAI_API_KEY` environment variable to your OpenAI API key on Windows or Linux (tested with a system-level environment variable).
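To confirm the key is visible to the process that launches VS Code, you can run a quick check with Node (this snippet is only a sanity check, not part of the extension):

```js
// On Windows:  setx OPENAI_API_KEY "sk-..."   (then restart VS Code)
// On Linux:    export OPENAI_API_KEY="sk-..." (e.g. in ~/.profile)
if (!process.env.OPENAI_API_KEY) {
    console.error("OPENAI_API_KEY is not set for this process");
} else {
    console.log("OPENAI_API_KEY found");
}
```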

## Features

- Clear -> Clears the list of files currently added to the context
- Submit -> Submits the question and the selected files to the API
- Refresh -> Refreshes the panel so that newly created files become available in the current session

Open the command palette with Ctrl+Shift+P and select the Open GPT Context Panel option, add files (before or after opening the panel), then enter your question.
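Under the hood this is a standard VS Code command that opens a webview panel; a minimal sketch of the wiring is shown below (the command id and panel names are illustrative, and the real extension.js may differ):

```js
const vscode = require("vscode");

function activate(context) {
    context.subscriptions.push(
        // The command that appears in the Ctrl+Shift+P palette.
        vscode.commands.registerCommand("gpt-contextfiles.openPanel", () => {
            const panel = vscode.window.createWebviewPanel(
                "gptContextFiles",      // internal view type
                "GPT Context Panel",    // title shown on the editor tab
                vscode.ViewColumn.One,
                { enableScripts: true } // required for the Clear/Submit/Refresh buttons
            );

            // Buttons in the webview post messages back to the extension.
            panel.webview.onDidReceiveMessage(async (message) => {
                if (message.command === "submit") {
                    // Build the prompt from the checked files and call the OpenAI API here.
                }
            });
        })
    );
}

module.exports = { activate };
```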

## Examples

We can add two files to the panel, uncheck one of them to leave it out for now, and enter our question:

What does this do?
c:\dev\test\gpt-contextfiles-test\program.js:
```
	window.alert("Hello World!")
```

Selected Files:
[x] c:\dev\test\gpt-contextfiles-test\program.js
[ ] c:\dev\test\gpt-contextfiles-test\program2.js

Expected Output:

The window.alert() method is a built-in JavaScript function that displays an alert box with a specified message and an OK button. In this case, the message is "Hello World!".
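The block above mirrors the prompt the extension sends to the model. A rough sketch of how such a prompt could be assembled from the checked files (the helper and field names here are hypothetical):

```js
const fs = require("fs");

// files: [{ path: "c:\\dev\\test\\gpt-contextfiles-test\\program.js", checked: true }, ...]
function buildPrompt(question, files) {
    let prompt = question + "\n";

    // Include the contents of every checked file, fenced as in the example above.
    for (const file of files.filter((f) => f.checked)) {
        const contents = fs.readFileSync(file.path, "utf8");
        prompt += `${file.path}:\n\`\`\`\n${contents}\n\`\`\`\n`;
    }

    // List every file with its checkbox state so the model knows what was included.
    prompt += "\nSelected Files:\n";
    for (const file of files) {
        prompt += `[${file.checked ? "x" : " "}] ${file.path}\n`;
    }
    return prompt;
}
```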