codesherpa

A code interpreter and ChatGPT plugin.

iamgreggarcia · Last update: Feb 29, 2024

codesherpa is a code interpreter ChatGPT plugin and a standalone code interpreter (experimental). Read the Quickstart section to try it out.

You can use it as:

  • a code interpreter plugin with ChatGPT
    • an API ChatGPT can call to execute code, with file persistence and no timeout
  • a standalone code interpreter (experimental)
    • Using OpenAI's GPT function calling, I've tried to recreate the ChatGPT Code Interpreter experience with functions. For many reasons, this implementation differs significantly from the Code Interpreter created by OpenAI. It's still very buggy and inconsistent, but I wanted to release it for those interested.
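To make the function-calling approach concrete, here is a minimal, hypothetical sketch of the pattern: a function schema is advertised to the model, and when the model responds with a function call, the arguments are parsed and dispatched to a local executor. The names (RUN_CODE_TOOL, dispatch) are illustrative, not codesherpa's actual implementation, and the in-process exec shown here has no sandboxing.

```python
import contextlib
import io
import json

# Illustrative schema for a "run_code" function, in the shape the OpenAI
# function-calling API expects. Not codesherpa's actual spec.
RUN_CODE_TOOL = {
    "name": "run_code",
    "description": "Execute Python code and return captured stdout.",
    "parameters": {
        "type": "object",
        "properties": {"code": {"type": "string"}},
        "required": ["code"],
    },
}

def run_code(code: str) -> str:
    """Execute code in-process and capture stdout (no sandboxing here!)."""
    buf = io.StringIO()
    with contextlib.redirect_stdout(buf):
        exec(code, {})
    return buf.getvalue()

def dispatch(tool_call: dict) -> str:
    """Route a model-produced function call to the local executor."""
    if tool_call["name"] == "run_code":
        args = json.loads(tool_call["arguments"])
        return run_code(args["code"])
    raise ValueError(f"unknown function: {tool_call['name']}")

# Simulate the model asking to run a snippet:
result = dispatch({"name": "run_code",
                   "arguments": json.dumps({"code": "print(2 + 2)"})})
print(result)  # 4
```

In the real flow, the model's reply would carry the function call, the result would be sent back as a function message, and the model would summarize it for the user.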

Standalone code interpreter demo:

CS-standalone.mp4

ChatGPT Plugin demo:

codesherpa.ChatGPT.Code.Interpreter.prompts.-.music.csv.demo-1-highlight.mp4

See more examples here

Recent Updates

  • July 13, 2023:
    • A basic standalone UI is now available. Read the Quickstart section to try it (requires an OpenAI API key). NOTE: expect many bugs and shortcomings, especially if you've been binging OpenAI's Code Interpreter since it was made generally available to Plus subscribers
    • New contributions from emsi (#18) and PeterDaveHello (#28)! đź‘Ź
  • June 21, 2023:
    • The ChatGPT plugin service will now fetch the openapi.json generated by the server. Also added example request data to the API spec. This reduces the size of the plugin manifest's description_for_model.
    • Updated the README section on future work.
Previous Updates
  • June 18, 2023: Added docker-compose.yml
  • May 31, 2023: Introduced a new file upload interface via upload.html and a corresponding server endpoint. You can upload files at localhost:3333/upload, or by telling ChatGPT you want to upload a file or have a file you want to work with: upload-demo. Refactored Python code execution using the ast module for improved efficiency. Updated the local server and manifest file to support these features. Minor updates to REPL execution, error handling, and code formatting.
  • May 22, 2023: Refactored README to provide clear and concise instructions for building and running codesherpa.
  • May 20, 2023: codesherpa now supports multiple programming languages, including Python, C++, and Rust.

Quickstart

NEW: Standalone code interpreter (experimental)

To try the new chat interface:

# Clone the repository
git clone https://github.com/iamgreggarcia/codesherpa.git

Add your OPENAI_API_KEY to a copy of .env.example:

cd codesherpa/frontend
cp .env.example .env.local

Install dependencies and start the Next.js app:

pnpm install
pnpm dev

OR

npm install
npm run dev

Download the Docker image OR run the codesherpa API locally (beware!):

Docker image:

# Pull the Docker image
docker pull ghcr.io/iamgreggarcia/codesherpa:latest

# Run the Docker image locally
docker compose up

Run the server locally (potentially risky!):

cd codesherpa
make dev

Navigate to http://localhost:3000. Expect bugs and inconsistencies.

Installation and Running codesherpa as a ChatGPT Plugin

Prerequisites

Ensure the following software is installed on your system:

  • Python 3.10
  • Docker
  • Docker Compose (optional). Install Docker Desktop or the Docker Compose plugin to use Docker Compose

Option 1: Using the Docker image from GitHub Packages

# Pull the Docker image
docker pull ghcr.io/iamgreggarcia/codesherpa:latest

# Run the Docker image locally
docker compose up

Option 2: Using the repository and Make commands

# Clone the repository
git clone https://github.com/iamgreggarcia/codesherpa.git

# Navigate to the repository directory
cd codesherpa

# Build the Docker image using Make
make build-docker

# Run the Docker image locally
make run-docker-localserver

Option 3: Using the repository and Docker commands

Instead of the Make commands, you can use the following Docker commands directly, or use Docker Compose:

# Clone the repository
git clone https://github.com/iamgreggarcia/codesherpa.git

# Navigate to the repository directory
cd codesherpa

# Build the Docker image
docker build -t codesherpa .

# Run the Docker image locally
docker run -p 3333:3333 codesherpa python3 -c "import localserver.main; localserver.main.start()"

# OR use Docker Compose

docker compose up

Whichever option you choose, codesherpa will be accessible at localhost:3333.
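With the server up, you can exercise the API directly before wiring it into ChatGPT. This is a minimal sketch: the /repl endpoint name, the {code, language} payload shape, and the result field are assumptions drawn from the plugin's API spec, so check localhost:3333/openapi.json for the authoritative schema.

```python
import json
import urllib.request

def build_payload(code: str, language: str = "python") -> bytes:
    # Assumed request shape for the code-execution endpoint.
    return json.dumps({"code": code, "language": language}).encode()

def run_on_codesherpa(code: str, language: str = "python") -> str:
    # Assumed endpoint path; verify against localhost:3333/openapi.json.
    req = urllib.request.Request(
        "http://localhost:3333/repl",
        data=build_payload(code, language),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read())
    return body.get("result", "")  # "result" field name is an assumption

# run_on_codesherpa("print('hello from codesherpa')")  # needs the server on :3333
```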

Connecting codesherpa to ChatGPT

  1. Navigate to the ChatGPT UI, and access the plugin store.
  2. Select "Develop your own plugin".
  3. In the plugin URL input, enter localhost:3333. ChatGPT should now be able to use codesherpa's features.
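If step 3 fails, a quick sanity check is to fetch the plugin manifest yourself: ChatGPT discovers a plugin by requesting ai-plugin.json from the well-known path below. This snippet only defines the check; run fetch_manifest() with the server up.

```python
import json
import urllib.request

# Standard ChatGPT plugin manifest location on the plugin's host.
MANIFEST_URL = "http://localhost:3333/.well-known/ai-plugin.json"

def fetch_manifest(url: str = MANIFEST_URL) -> dict:
    """Fetch and parse the plugin manifest; succeeds only if codesherpa is running."""
    with urllib.request.urlopen(url) as resp:
        return json.loads(resp.read())

# fetch_manifest()  # requires codesherpa running on localhost:3333
```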

Examples

Below are some examples. Please note that portions of these videos are edited and/or sped up for brevity.

Plotting a vector field on a sphere and creating an animated GIF (short):

Vector.field.on.a.sphere.animation.mp4

Demo of the Demo: Recreating the ChatGPT Code Interpreter Video Demo

Most of us have seen the ChatGPT Code Interpreter Video Demo, which is the inspiration for this project. So I thought it fitting to ask similar questions as those in the OpenAI video demo.

  • Asking about properties of the function 1/sin(x):
codesherpa.ChatGPT.Code.Interpreter.prompts.-.demo.mp4
codesherpa.ChatGPT.Code.Interpreter.prompts.-.music.csv.demo-1.mp4

Future Work

  • Improve the UI (maybe). The UI needs a lot of love:
    • multi-conversation support
    • previous message editing
    • granular control over parameters
    • consistent function call rendering

Contributing

I welcome contributions! If you have an idea for a feature or want to report a bug, please open an issue or submit a pull request.

Steps to contribute:

  1. Fork this repository.
  2. Create a feature branch: git checkout -b feature/YourAmazingIdea.
  3. Commit your changes: git commit -m 'Add YourAmazingIdea'.
  4. Push to the branch: git push origin feature/YourAmazingIdea.
  5. Submit a Pull Request.

Disclaimer

codesherpa is independently developed and not affiliated, endorsed, or sponsored by OpenAI.

License

This project is licensed under the terms of the MIT license.
