Creating a ChatGPT API Swiss Army Knife

The world has been taken by storm with the use of LLMs. We see them everywhere at the moment, with some of the most famous examples being OpenAI’s ChatGPT, Bing Chat (also OpenAI-powered), and Facebook’s LLaMA (which you can run on your own hardware).

It would behoove us to learn how to use this new technology, as it really does feel like another Dot-Com moment.

We were inspired by shell_gpt and wanted to build our own tool for some different use cases. The result is our internal tool Thyme, which acts as a command-line client for the OpenAI ChatGPT API. It can currently accept general queries or use one of the prompts defined in the program. If you just want a tool for querying the API, we suggest shell_gpt, as it is a solid program.

We use a Go library called go-openai, which wraps the API so we can call it more easily.

We want two ways to use the tool: passing a query directly, like a search engine, and using pre-defined prompts that do more advanced manipulation.

type Prompt struct {
    Name        string
    Text        string
    Description string
    Examples    map[int]PromptExample // Numbered in the order they are passed
}

type PromptExample struct {
    Name string // example_assistant, example_user
    Text string
}

Each prompt in our case consists of the name of the prompt (“listify”, “active_voice”, or whatever you want here), the actual text of the initial prompt, a description to display to the user, and examples to feed the API. While “examples” are seemingly not in the official API documentation on the OpenAI website, they are listed in the Python documentation, so we have included them for use in future projects.

Currently our prompts are fairly simple but we are doing this because we will be reusing them frequently. For example, we take audio notes of specific meetings or events. We use Whisper to transcribe our voice memos that we save into a specific directory (the code for this can be found here). We would like to be able to say a phrase such as “this memo is a list”, which our script could pick up on, and then call thyme to run the listify prompt.

prompts["listify"] = Prompt{
        Name:        "listify",
        Text:        "Return a numbered list of action items from the following text",
        Description: "This prompt takes a block of text and returns a numbered list of action items.",
}

The results could look something like this:

Bender :: src/letseatlabs/Thyme ‹main› % thyme -p listify --text "Mix equal parts of baking soda and vinegar to create a paste. Apply the paste onto the discolored areas of the sink and allow it to sit for 10-15 minutes. Scrub gently with a sponge, then rinse with warm water."
1. Mix equal parts of baking soda and vinegar to create a paste.
2. Apply the paste onto the discolored areas of the sink.
3. Allow the paste to sit for 10-15 minutes.
4. Scrub gently with a sponge.
5. Rinse with warm water. 

Except in the case of our automated script, we would have the process capture the output of thyme and then do stuff() with it.

We also took the time to add fancy animations in the terminal, as well as a way to disable them when we want to capture just the raw output for inter-process work. We will probably add color soon too, but that is mainly for fun and education.

We wrote this short article hoping to inspire others to make their own little multi-purpose GPT tools. Even if you do not end up using yours, just knowing how to interact with these APIs will benefit you as a programmer going forward.