# Hayhooks
Deploy Haystack Pipelines and Agents as REST APIs
Hayhooks makes it easy to deploy and serve Haystack Pipelines and Agents as REST APIs: wrap a pipeline or agent in a small Python class and Hayhooks exposes it over HTTP, including an OpenAI-compatible chat completion endpoint with streaming support.
📚 For detailed guides, examples, and API reference, check out our comprehensive documentation.
```shell
# Install Hayhooks
pip install hayhooks
```
```shell
# Start the Hayhooks server
hayhooks run
```
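Once the server is running you can sanity-check it from Python. This is a minimal sketch, assuming the default port 1416 (the same one the curl examples below use) and assuming the FastAPI-provided OpenAPI docs route is enabled:

```python
# Minimal sanity check that the Hayhooks server is up.
# Assumptions: default port 1416, and the FastAPI /docs route is
# enabled -- both are assumptions, not guarantees.
import urllib.request

with urllib.request.urlopen("http://localhost:1416/docs") as resp:
    print(resp.status)  # expect 200 if the server is running
```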
Create a minimal agent wrapper with streaming chat support and a simple HTTP POST API:
```python
from typing import AsyncGenerator

from haystack.components.agents import Agent
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage
from haystack.tools import Tool

from hayhooks import BasePipelineWrapper, async_streaming_generator


# Define a Haystack Tool that provides weather information for a given location.
def weather_function(location):
    return f"The weather in {location} is sunny."


weather_tool = Tool(
    name="weather_tool",
    description="Provides weather information for a given location.",
    parameters={
        "type": "object",
        "properties": {"location": {"type": "string"}},
        "required": ["location"],
    },
    function=weather_function,
)


class PipelineWrapper(BasePipelineWrapper):
    def setup(self) -> None:
        self.agent = Agent(
            chat_generator=OpenAIChatGenerator(model="gpt-4o-mini"),
            system_prompt="You're a helpful agent",
            tools=[weather_tool],
        )

    # This will create a POST /my_agent/run endpoint.
    # `question` will be the input argument and will be auto-validated by a Pydantic model.
    async def run_api_async(self, question: str) -> str:
        result = await self.agent.run_async(messages=[ChatMessage.from_user(question)])
        # The agent's final reply is the last message in the conversation.
        return result["messages"][-1].text

    # This will create an OpenAI-compatible /chat/completions endpoint.
    async def run_chat_completion_async(
        self, model: str, messages: list[dict], body: dict
    ) -> AsyncGenerator[str, None]:
        chat_messages = [
            ChatMessage.from_openai_dict_format(message) for message in messages
        ]

        return async_streaming_generator(
            pipeline=self.agent,
            pipeline_run_args={"messages": chat_messages},
        )
```
Save it as `my_agent_dir/pipeline_wrapper.py`.
```shell
# Deploy the agent
hayhooks pipeline deploy-files -n my_agent ./my_agent_dir
```
Call the HTTP POST API (/my_agent/run):
```shell
curl -X POST http://localhost:1416/my_agent/run \
  -H 'Content-Type: application/json' \
  -d '{"question": "What can you do?"}'
```
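The same request from Python, as a minimal sketch (it uses the third-party `requests` package; the response body is whatever Hayhooks wraps around the string returned by `run_api_async`, so inspect it rather than assuming a particular key):

```python
# Minimal sketch of calling the generated POST endpoint from Python.
import requests

resp = requests.post(
    "http://localhost:1416/my_agent/run",
    json={"question": "What can you do?"},
    timeout=60,
)
resp.raise_for_status()
# The exact response shape depends on how Hayhooks wraps the wrapper's
# return value -- print it to see what you get back.
print(resp.json())
```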
Call the OpenAI-compatible chat completion API (streaming enabled):
```shell
curl -X POST http://localhost:1416/chat/completions \
  -H 'Content-Type: application/json' \
  -d '{
    "model": "my_agent",
    "messages": [{"role": "user", "content": "What can you do?"}]
  }'
```
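Since the endpoint is OpenAI-compatible, any OpenAI client can talk to it. Here is a minimal streaming sketch with the official `openai` Python package; the `api_key` value is a placeholder, assuming your server does not enforce authentication:

```python
# Minimal sketch: stream from the OpenAI-compatible endpoint with the
# official `openai` client. The api_key is a placeholder -- it is an
# assumption that this Hayhooks server does not require authentication.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1416", api_key="not-needed")

stream = client.chat.completions.create(
    model="my_agent",  # the name used at deploy time
    messages=[{"role": "user", "content": "What can you do?"}],
    stream=True,
)
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
```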
Or integrate it with Open WebUI and start chatting with it!
Hayhooks is actively maintained by the deepset team.