# Kino PromptBuddy
An Elixir Livebook smart cell for pair programming with LLMs, keeping a conversation about the notebook you are actively creating.
## Inspiration

This project was inspired by Jeremy Howard's Solve.it, an app and methodology designed to augment human capabilities with AI. PromptBuddy brings this pair-programming methodology to Elixir and Livebook.
## What is PromptBuddy?
PromptBuddy is a Livebook Smart Cell that allows you to give prompts in the context of all the cells that precede it in the notebook. When you submit a prompt, it automatically includes the source code from all previous cells as context for the LLM, enabling contextual assistance as you develop your notebook.
## Features
- Contextual LLM interaction: Automatically includes all preceding cells as context
- Streaming responses: See the LLM's response in real-time as it generates
- Multiple LLM support: Works with any OpenRouter-compatible model via ReqLLM
- Simple UI: Clean form-based interface integrated into Livebook
- Session introspection: Automatically discovers and includes notebook context
## Installation
kino_promptbuddy can be installed by adding it to your list of dependencies in your Livebook setup section:
```elixir
Mix.install([
  {:kino_promptbuddy, "~> 0.0.1"}
])
```

If you want to track the latest commit from the repository directly, point Mix at GitHub:

```elixir
Mix.install([
  {:kino_promptbuddy, github: "fredguth/kino_promptbuddy"}
])
```

## Configuration
Before using PromptBuddy, you need to configure an API key for your LLM provider. The library uses OpenRouter by default.
Add your OpenRouter API key to Livebook's Secrets menu (accessible from the navbar):
- Secret name: `LB_OPENROUTER_API_KEY`
- Value: your OpenRouter API key
Then in your setup cell:
```elixir
if key = System.get_env("LB_OPENROUTER_API_KEY") do
  ReqLLM.put_key(:openrouter_api_key, key)
end
```

## Usage
1. Add the PromptBuddy package to your Livebook setup section
2. Configure your API key as described above
3. Insert a "Prompt Buddy" smart cell anywhere in your notebook
4. Type your prompt and submit
5. The LLM receives context from all preceding cells and provides a contextual response
The smart cell will:
- Collect all cell sources from the beginning of the notebook up to (but not including) the current cell
- Use the first cell as a system message
- Use all subsequent cells as user messages
- Append your prompt as the final user message
- Stream the response back in real-time
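The message-building steps above can be sketched as a pure function. Note that `ContextSketch`, `build_messages/2`, and the message-map shape are illustrative assumptions for this sketch, not PromptBuddy's actual internals:

```elixir
# Hypothetical sketch of how cell sources become a conversation.
# The first cell source becomes the system message, each subsequent
# cell a user message, and the prompt is appended last.
defmodule ContextSketch do
  def build_messages([system_source | cell_sources], prompt) do
    [%{role: :system, content: system_source}] ++
      Enum.map(cell_sources, &%{role: :user, content: &1}) ++
      [%{role: :user, content: prompt}]
  end
end
```

For example, a notebook with a setup cell and one code cell, plus the prompt "Explain", yields a three-message conversation ending with the prompt.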
## How It Works
PromptBuddy uses Livebook's introspection capabilities to:
- Identify the current cell and session
- Connect to the Livebook node via ERPC
- Retrieve the notebook structure
- Extract source code from all preceding cells
- Build a conversation context for the LLM
- Stream responses back through Kino frames
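The last step, streaming through a Kino frame, can be sketched as below. This only runs inside Livebook with Kino installed, and the literal `chunks` list is a stand-in for the real stream of text deltas coming from the LLM:

```elixir
# Minimal streaming sketch: append each text delta to a frame so the
# response builds up live in the notebook output.
frame = Kino.Frame.new()
Kino.render(frame)

# Stand-in for the real LLM token stream:
chunks = ["Streaming ", "responses ", "appear incrementally."]

for chunk <- chunks do
  Kino.Frame.append(frame, Kino.Markdown.new(chunk))
end
```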
## Example
Once you have a few cells in your notebook (code, markdown, etc.), add a Prompt Buddy cell and ask questions like:
- "Explain what the code above does"
- "How can I optimize the function in the previous cell?"
- "Add error handling to the code"
- "What would be a good next step?"
The LLM will have full context of everything that came before.
## Documentation
Documentation can be found at https://hexdocs.pm/kino_promptbuddy.
## Development

To understand how PromptBuddy was built from scratch, check out the `from_scratch.livemd` notebook, which walks through the entire development process step-by-step.
## License
MIT