Metadata-Version: 2.2
Name: ai_puti
Version: 0.1.0b1
Summary: puti: MultiAgent-based package for LLM
Home-page: https://github.com/aivoyager/puti
Author: obstaclews
Author-email: obstaclesws@qq.com
Maintainer: obstaclews
Keywords: llm,multiagent,package,agent,twikit,openai,websearch,terminal,python,file,fastapi
Classifier: Programming Language :: Python :: 3
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Requires-Python: ==3.12
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: pytz==2022.6
Requires-Dist: click==8.1.3
Requires-Dist: requests==2.28.1
Requires-Dist: celery==5.4.0
Requires-Dist: colorama==0.4.6
Requires-Dist: fastapi==0.115.11
Requires-Dist: httpx==0.28.1
Requires-Dist: langchain==0.3.20
Requires-Dist: langchain_core==0.3.43
Requires-Dist: langchain_openai==0.3.8
Requires-Dist: loguru==0.7.3
Requires-Dist: openai==1.65.4
Requires-Dist: Pillow==11.1.0
Requires-Dist: pydantic==2.10.6
Requires-Dist: pydantic_core
Requires-Dist: pydantic_settings==2.8.1
Requires-Dist: pytest==8.3.5
Requires-Dist: PyYAML==6.0.2
Requires-Dist: twikit==2.3.3
Requires-Dist: uvicorn==0.34.0
Requires-Dist: websocket_client==1.8.0
Requires-Dist: mypy==1.15.0
Requires-Dist: ollama==0.4.7
Requires-Dist: pandas==2.2.3
Requires-Dist: python-box==7.3.2
Requires-Dist: mcp==1.6.0
Requires-Dist: faiss-cpu==1.10.0
Requires-Dist: psutil==6.0.0
Requires-Dist: tenacity==8.2.3
Requires-Dist: redis==5.2.1
Requires-Dist: gevent==25.4.2
Requires-Dist: googlesearch-python==1.3.0
Requires-Dist: pymysql==1.1.1
Requires-Dist: llama_index==0.12.37
Dynamic: author
Dynamic: author-email
Dynamic: classifier
Dynamic: description
Dynamic: description-content-type
Dynamic: home-page
Dynamic: keywords
Dynamic: maintainer
Dynamic: requires-dist
Dynamic: requires-python
Dynamic: summary

# Puti - Multi-Agent Framework

<p align="center">
    <em>Tackle complex tasks with autonomous agents.</em>
</p>

<p align="center">
    <a href="./README.md">
        <img src="https://img.shields.io/badge/document-English-blue.svg" alt="EN doc">
    </a>
    <a href="https://opensource.org/licenses/MIT">
        <img src="https://img.shields.io/badge/License-MIT-blue.svg" alt="License: MIT">
    </a>
    <a href="./docs/ROADMAP.MD">
        <img src="https://img.shields.io/badge/ROADMAP-ROADMAP-blue.svg" alt="Roadmap">
    </a>
</p>

<p align="center">
    <!-- Project Stats -->
    <a href="https://github.com/aivoyager/puti/issues">
        <img src="https://img.shields.io/github/issues/aivoyager/puti" alt="GitHub issues">
    </a>
    <a href="https://github.com/aivoyager/puti/network">
        <img src="https://img.shields.io/github/forks/aivoyager/puti" alt="GitHub forks">
    </a>
    <a href="https://github.com/aivoyager/puti/stargazers">
        <img src="https://img.shields.io/github/stars/aivoyager/puti" alt="GitHub stars">
    </a>
    <a href="https://github.com/aivoyager/puti/blob/main/LICENSE">
        <img src="https://img.shields.io/github/license/aivoyager/puti" alt="GitHub license">
    </a>
    <a href="https://star-history.com/#aivoyager/puti">
        <img src="https://img.shields.io/github/stars/aivoyager/puti?style=social" alt="GitHub star chart">
    </a>
</p>

## ✨ Introduction

Puti is a Multi-Agent framework designed to tackle complex tasks through collaborative autonomous agents. It provides a flexible environment for building, managing, and coordinating various agents to achieve specific goals.

## 🚀 Features

*   **Multi-Agent Collaboration**: Supports communication and collaboration between multiple agents.
*   **Flexible Agent Roles**: Allows defining agent roles with different goals and capabilities (e.g., Talker, Debater).
*   **Environment Management**: Provides an environment for managing agent interactions and message passing.
*   **Configurable**: Easily configure LLM providers and other settings through YAML files.
*   **Extensible**: Easily build and integrate your own agents and tools.

## 📦 Installation

Clone the repository and install the required dependencies:

```bash
git clone https://github.com/aivoyager/puti.git
cd puti
pip install -r requirements.txt
```
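Alternatively, the package metadata indicates the project is distributed on PyPI under the name `ai_puti` (currently a beta release), so it may also be installable directly with pip:

```bash
# Install the published beta release from PyPI.
# Note: the distribution name is ai_puti, not puti.
pip install ai_puti
```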

## 💡 Usage Examples

### 1. Basic Chat

Simple conversation with a single agent.

```python
from llm.roles.talker import PuTi
from llm.nodes import ollama_node  # Or your preferred LLM node

msg = 'What is calculus?'
talker = PuTi(agent_node=ollama_node)
response = talker.cp.invoke(talker.run, msg)
print(response)
```

### 2. Chat with MCP

Interact with an agent that can call external tools using the Model Context Protocol (MCP).

```python
import asyncio
from llm.envs import Env
from llm.roles.talker import PuTiMCP
from llm.messages import Message

env = Env()
talker = PuTiMCP()  # This agent can potentially call tools via MCP server
env.add_roles([talker])

msg = 'How long is the flight from New York (NYC) to Los Angeles (LAX)?'
env.publish_message(Message.from_any(msg))

asyncio.run(env.run())
print(env.history) # View the conversation history
```

### 3. Agent Debate

Set up two agents for a debate.

```python
import asyncio
from llm.envs import Env
from llm.messages import Message
from llm.roles.debater import Debater

env = Env(name='debate_game', desc='Agents debating a topic')

# Define two debaters with opposing goals
debater1 = Debater(name='Alex', goal='Present positive arguments in each debate round. Your opponent is Rock.')
debater2 = Debater(name='Rock', goal='Present negative arguments in each debate round. Your opponent is Alex.')

env.add_roles([debater1, debater2])

# Start the debate
topic = 'Is technological development beneficial or harmful?'
initial_message = Message.from_any(
    f'You are now participating in a debate. The topic is: {topic}',
    receiver=debater1.address, # Start with debater1
    sender='user'
)

# Add message to the other debater's memory as well
debater2.rc.memory.add_one(initial_message)

env.publish_message(initial_message)

# Run the environment asynchronously
asyncio.run(env.run())

# Print the debate history
print(env.history)
```

## ⚙️ Configuration

Configure your LLM provider and other settings in `conf/config.yaml`:

```yaml
# conf/config.yaml
llm:
    - openai:
        MODEL: "gpt-4o-mini"  # Or your preferred model
        BASE_URL: "YOUR_OPENAI_COMPATIBLE_API_BASE_URL" # e.g., https://api.openai.com/v1
        API_KEY: "YOUR_API_KEY"
        MAX_TOKEN: 4096
    - llama: # Example for Ollama
        BASE_URL: "http://localhost:11434" # Your Ollama server address
        MODEL: "llama3.1:latest"
        STREAM: true
    # Add other LLM configurations as needed
```
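As a rough illustration of the layout above (this uses plain PyYAML, one of the declared dependencies, rather than the framework's own loader), each provider appears as a single-key mapping inside the `llm` list, so the list can be flattened into a name-to-settings dict:

```python
import yaml  # PyYAML is among the declared dependencies

# A config snippet in the same shape as conf/config.yaml above.
raw = """
llm:
  - openai:
      MODEL: "gpt-4o-mini"
      BASE_URL: "https://api.openai.com/v1"
      API_KEY: "YOUR_API_KEY"
      MAX_TOKEN: 4096
  - llama:
      BASE_URL: "http://localhost:11434"
      MODEL: "llama3.1:latest"
      STREAM: true
"""

config = yaml.safe_load(raw)

# Flatten [{'openai': {...}}, {'llama': {...}}]
# into {'openai': {...}, 'llama': {...}}.
providers = {name: settings
             for block in config["llm"]
             for name, settings in block.items()}

print(providers["openai"]["MODEL"])  # gpt-4o-mini
print(providers["llama"]["STREAM"])  # True
```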

Access configuration in your code:

```python
from conf.llm_config import OpenaiConfig, LlamaConfig

# Access OpenAI configuration
openai_conf = OpenaiConfig()
print(f"Using OpenAI Model: {openai_conf.MODEL}")

# Access Llama configuration
llama_conf = LlamaConfig()
print(f"Using Llama Model: {llama_conf.MODEL}")
```

## 🤝 Contributing

Contributions are welcome! Please contribute by opening an Issue or submitting a Pull Request:

1.  Fork the repository
2.  Create your Feature branch (`git checkout -b feature/AmazingFeature`)
3.  Commit your changes (`git commit -m 'Add some AmazingFeature'`)
4.  Push to the branch (`git push origin feature/AmazingFeature`)
5.  Open a Pull Request

## 📜 License

This project is licensed under the MIT License. See the [LICENSE](LICENSE) file for details.

---

_Let the Puti framework empower your multi-agent application development!_

