Maitai supports some of Langchain's tool calling features, easing the migration from Langchain to Maitai.

Python Tool Calling

Using the Maitai tool wrapper, you can easily turn existing functions into tool calls.

Previously, you may have been defining tools as JSON objects:

import json

import maitai

weather_tool = {
    "type": "function",
    "strict": True,
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The city and state, e.g. San Francisco, CA"
                },
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}
            },
            "required": ["location"]
        }
    }
}
# Map tool names to your existing Python functions
available_functions = {
    "get_current_weather": get_current_weather,
}
response = maitai.chat.completions.create(
    messages=messages,
    model="llama3-groq-70b-8192-tool-use-preview", ## Remove this line to set model in Portal
    session_id="YOUR_SESSION_ID",
    intent="CONVERSATION",
    application="demo_app",
    tool_choice="auto",
    tools=[weather_tool],
)

for tool_call in response.choices[0].message.tool_calls:
    function_to_call = available_functions[tool_call.function.name]
    function_args = json.loads(tool_call.function.arguments)
    function_response = function_to_call(**function_args)

And then you had to map each tool call back to its function, parse the arguments, and pass them yourself.
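
To get a final answer, you also had to append those results onto your message list and make a second request. Here is a minimal sketch of that round trip, assuming Maitai's chat completions API accepts the standard OpenAI-style "role": "tool" follow-up messages (the message keys below follow that convention, not a Maitai-specific API):

# Dispatch each tool call, then feed the results back to the model.
# Assumes the response message can be appended to the history directly,
# as with the OpenAI SDK.
messages.append(response.choices[0].message)
for tool_call in response.choices[0].message.tool_calls:
    function_to_call = available_functions[tool_call.function.name]
    function_args = json.loads(tool_call.function.arguments)
    function_response = function_to_call(**function_args)
    messages.append({
        "role": "tool",
        "tool_call_id": tool_call.id,
        "content": json.dumps(function_response),
    })

follow_up = maitai.chat.completions.create(
    messages=messages,
    session_id="YOUR_SESSION_ID",
    intent="CONVERSATION",
    application="demo_app",
)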

You can continue doing that, or you can use the @tool decorator to build that JSON object automatically from your function's docstring and arguments.

The same tool call above simply becomes:

import maitai
from maitai.tools import tool, Tools

@tool(strict=True)
def get_current_weather(location, unit="fahrenheit"):
    """
    Get the current weather in a given location

    :param location: The city and state, e.g. San Francisco, CA
    :param unit: The unit of measurement (default is "fahrenheit")
    """
    ...

tools = Tools(get_current_weather)

response = maitai.chat.completions.create(
    messages=messages,
    model="llama3-groq-70b-8192-tool-use-preview", ## Remove this line to set model in Portal
    session_id="YOUR_SESSION_ID",
    intent="CONVERSATION",
    application="demo_app",
    tool_choice="auto",
    tools=tools,
)

for tool_call in response.choices[0].message.tool_calls:
    tool_response = tools.invoke(tool_call)
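
The feedback step looks the same with the Tools helper; only the dispatch shrinks to a single invoke call. A minimal sketch, assuming tools.invoke returns your function's return value and that the standard OpenAI-style "role": "tool" follow-up messages apply:

messages.append(response.choices[0].message)  # assistant message carrying the tool_calls
for tool_call in response.choices[0].message.tool_calls:
    tool_response = tools.invoke(tool_call)  # parses arguments and calls your function
    messages.append({
        "role": "tool",
        "tool_call_id": tool_call.id,
        "content": str(tool_response),  # serialize however suits your tool's return type
    })

follow_up = maitai.chat.completions.create(
    messages=messages,
    session_id="YOUR_SESSION_ID",
    intent="CONVERSATION",
    application="demo_app",
    tools=tools,
)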

JavaScript Tool Calling

Maitai provides a drop-in chat model that you can substitute into your Langchain tool calling flow.

Previously, you may have been using Langchain tools in JavaScript like this:

import { ChatOpenAI } from "@langchain/openai";
import { createOpenAIFunctionsAgent, AgentExecutor } from "langchain/agents";

const llm = new ChatOpenAI({
  model: "gpt-3.5-turbo",
  temperature: 0,
});
...
const agent = await createOpenAIFunctionsAgent({
  llm,
  tools,
  prompt,
});

const agentExecutor = new AgentExecutor({
  agent,
  tools,
  verbose: true,
});

const result = await agentExecutor.invoke({
  input: `What is the value of foo?`,
});

In order to use Maitai, all you need to change is:

import Maitai from "maitai";

const llm = new Maitai.MaitaiChat({
  application: "iantest",
});
...
const result = await agentExecutor.invoke({
  input: llm.input(`What is the value of foo?`, {
    session_id: "YOUR_SESSION_ID",
    intent: "YOUR APPLICATION CALL INTENT",
  }),
});