
Conditional flow with tool calls #33

Open
@r-leyshon


Hi there,

I'm interested in this package for a more complex chatbot that I've been developing. Ideally, I need to stream OpenAI responses with tool calls. Getting this to work with the Shiny chat component has been a little tricky, but then I came across a reference to chatlas in the docstring for the Shiny `Chat` class. Working through your docs and examples, I can only get so far with it.

This approach allows for streaming responses with tool calls and Shiny UI widgets, though the tool calls need to be self-contained: whatever the tool returns is fed back into the model for a subsequent response, rather than being available for additional conditional flow.

I would instead like to receive the function name and parameter values, call `get_current_temperature()` myself, and pass the JSON response back to the model. This would allow me to display a notification without relying on a side effect inside the tool.
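A rough sketch of the dispatch I have in mind — all names here are hypothetical, not existing chatlas API, and the tool is a stub rather than a real HTTP call:

```python
import json

def get_current_temperature(latitude: float, longitude: float) -> dict:
    # Stub standing in for the real Open-Meteo request; note there is no
    # UI side effect inside the tool itself.
    return {"temperature_2m": 13.8, "latitude": latitude, "longitude": longitude}

# Registry mapping tool names (as the model emits them) to callables.
TOOLS = {"get_current_temperature": get_current_temperature}

def dispatch_tool_call(tool_call: dict) -> str:
    """Run the requested tool and return a JSON string to hand back to the model."""
    func = TOOLS[tool_call["name"]]
    args = json.loads(tool_call["arguments"])  # model sends arguments as a JSON string
    result = func(**args)
    # This is where I could conditionally call ui.notification_show(...)
    # instead of burying it inside the tool.
    return json.dumps(result)

# Hand-written example payload in the shape the model would request:
payload = {
    "name": "get_current_temperature",
    "arguments": '{"latitude": 51.5, "longitude": -0.13}',
}
print(dispatch_tool_call(payload))
```

The point is that the caller, not the tool, owns the side effect, so the UI behaviour can branch on the tool result before anything goes back to the model.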

```python
from chatlas import ChatOpenAI, Turn
from shiny.express import ui
import dotenv
import requests

openai_key = dotenv.dotenv_values()["OPENAI_KEY"]
messages = [Turn(role="system", contents="You are a helpful but terse assistant.")]
messages.append(Turn(role="assistant", contents="Hi! How can I help you today?"))


# This function contains a side effect (the notification) which I would
# prefer to handle outside the tool itself.
def get_current_temperature(latitude: float, longitude: float):
    """
    Get the current weather given a latitude and longitude.

    Parameters
    ----------
    latitude
        The latitude of the location.
    longitude
        The longitude of the location.
    """
    lat_lng = f"latitude={latitude}&longitude={longitude}"
    url = (
        "https://api.open-meteo.com/v1/forecast"
        f"?{lat_lng}&current=temperature_2m,wind_speed_10m"
        "&hourly=temperature_2m,relative_humidity_2m,wind_speed_10m"
    )
    response = requests.get(url)
    weather_now = response.json()["current"]
    ui.notification_show(f"Queried weather API... {weather_now}")
    return weather_now


chat_model = ChatOpenAI(
    api_key=openai_key,
    turns=messages,
)
chat_model.register_tool(get_current_temperature)

chat = ui.Chat(
    id="ui_chat",
    messages=["Hi! How can I help you today?"],
)
chat.ui()


@chat.on_user_submit
async def handle_user_input():
    response = chat_model.stream(chat.user_input())
    await chat.append_message_stream(response)
```
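For context on what I mean by "passing the JSON response back to the model": with the raw OpenAI chat-completions format, the round trip after receiving a tool call looks roughly like this. This is an offline sketch — the tool-call payload and result values are hand-written stand-ins, not a real API response:

```python
import json

# Stand-in for an assistant message's tool call, shaped like the OpenAI
# chat-completions response format.
assistant_tool_call = {
    "id": "call_123",  # hypothetical id
    "type": "function",
    "function": {
        "name": "get_current_temperature",
        "arguments": '{"latitude": 51.5, "longitude": -0.13}',
    },
}

# After running the function myself (and deciding whether to show a
# notification), I would append a "tool" role message carrying the result:
result = {"temperature_2m": 13.8, "wind_speed_10m": 9.4}  # stub values
tool_message = {
    "role": "tool",
    "tool_call_id": assistant_tool_call["id"],
    "content": json.dumps(result),
}
print(tool_message["content"])
```

Chatlas currently performs this round trip internally when a registered tool is called; I'd like a hook at the point between receiving the tool call and sending the result back.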
