Swarm with local LLaMa - Handoff supported?
The Swarm library introduced a key new primitive, agent handoff: a tool function returns an object of type Agent, and control flow continues in the new agent's code.
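For reference, a minimal handoff in Swarm looks like this (agent names and instructions here are illustrative):

```python
from swarm import Swarm, Agent

sales_agent = Agent(
    name="Sales Agent",
    instructions="You handle sales inquiries.",
)

def transfer_to_sales():
    """Transfer the conversation to the sales agent."""
    # Returning an Agent object is the handoff primitive:
    # Swarm continues the loop with this agent in control.
    return sales_agent

manager = Agent(
    name="Manager",
    instructions="You are a helpful agent.",
    functions=[transfer_to_sales],
)

client = Swarm()
response = client.run(
    agent=manager,
    messages=[{"role": "user", "content": "Transfer me to sales."}],
)
print(response.agent.name)  # "Sales Agent"
```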
It works great with the OpenAI API. But can you get handoff to work with a local model served by Ollama? That's not clear.
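In principle, Swarm accepts any OpenAI-compatible client, so you can point it at Ollama's /v1 endpoint. A minimal sketch (untested; whether handoff then works is exactly the question):

```python
from openai import OpenAI
from swarm import Swarm

# Ollama serves an OpenAI-compatible API at /v1; the api_key is a dummy value.
ollama_client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")
client = Swarm(client=ollama_client)
```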
Under the hood, OpenAI's Swarm modifies the tool-calling payload by adding a new field, 'sender', to assistant messages:
```python
{'model': 'llama-3.1-70b-versatile',
 'messages': [
     {'role': 'system', 'content': 'You are a helpful agent.'},
     {'role': 'user', 'content': 'Transfer me to sales.'},
     {'role': 'assistant',
      'tool_calls': [
          {'function': {'arguments': '{}', 'name': 'transfer_to_sales'},
           'id': 'call_7vaz', 'type': 'function'}],
      'sender': 'Manager'},
     {'role': 'tool', 'tool_call_id': 'call_7vaz',
      'tool_name': 'transfer_to_sales',
      'content': '{"assistant": "Sales Agent"}'}],
 'stream': False, 'extra_body': {}}
```
The value of this field, together with the content returned by the tool call, is used to decide which agent to hand off to, via Swarm's Result datatype.
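For context, this is roughly how Swarm turns a returned Agent into a Result and into the tool message content seen above (a simplified sketch of the library's internals, which may differ across versions):

```python
import json
from typing import Optional

from pydantic import BaseModel
from swarm import Agent

class Result(BaseModel):
    """Return type for tool calls (simplified sketch)."""
    value: str = ""                # becomes the tool message 'content'
    agent: Optional[Agent] = None  # the handoff target, if any
    context_variables: dict = {}

def handle_function_result(raw_result) -> Result:
    # A tool that returns an Agent triggers a handoff: that agent becomes
    # the active agent, and its name is serialized into the tool message,
    # which is where '{"assistant": "Sales Agent"}' above comes from.
    if isinstance(raw_result, Agent):
        return Result(value=json.dumps({"assistant": raw_result.name}),
                      agent=raw_result)
    return Result(value=str(raw_result))
```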
The open LLM APIs are unable to interpret sender or Result as of now; in fact, groq/llama3 rejects requests containing sender. If agent handoff gains traction, the OpenAI-compatibility layers may need to catch up soon.
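One possible workaround (an untested sketch; strip_sender is a hypothetical helper, not part of Swarm) is to drop the non-standard key before the request reaches a strict backend:

```python
def strip_sender(messages: list[dict]) -> list[dict]:
    """Remove the non-standard 'sender' key that strict OpenAI-compatible
    backends such as groq/llama3 reject."""
    return [{k: v for k, v in m.items() if k != "sender"} for m in messages]
```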