Metadata-Version: 2.1
Name: LLM-Mediator
Version: 0.9.14
Summary: Just a simple mediator for different LLM models.
Home-page: https://github.com/zeuscsc/llm_mediator.git
Author: Zeus Chiu
Author-email: zeuscsc@gmail.com
License: MIT
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: Implementation :: CPython
Classifier: Programming Language :: Python :: Implementation :: PyPy
Requires-Python: >=3.9.0
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: openai
Requires-Dist: torch
Requires-Dist: torchvision
Requires-Dist: torchaudio
Requires-Dist: transformers
Requires-Dist: peft
Requires-Dist: chardet
Requires-Dist: sentence-transformers
Requires-Dist: pynvml


# LLM Mediator
Just a simple mediator for different LLM models.
During development it caches the response for identical input text, saving you money on repeated calls.
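The caching idea can be sketched as a small memoizing wrapper. This is a simplified illustration, not the library's actual implementation; `cached_call` and `fake_llm` are hypothetical names:

```python
import hashlib
import json

_cache = {}  # maps a hash of the inputs to the cached response

def cached_call(llm_call, system, assistant, user):
    """Return the cached response for identical inputs, invoking the model only once."""
    key = hashlib.sha256(json.dumps([system, assistant, user]).encode()).hexdigest()
    if key not in _cache:
        _cache[key] = llm_call(system, assistant, user)
    return _cache[key]

# Stand-in for an expensive model call
calls = {"n": 0}
def fake_llm(system, assistant, user):
    calls["n"] += 1
    return f"echo: {user}"

print(cached_call(fake_llm, "sys", "", "hi"))  # first call hits the model
print(cached_call(fake_llm, "sys", "", "hi"))  # second call is served from the cache
```

Identical `(system, assistant, user)` triples hit the cache instead of the paid API, which is where the savings come from during debugging.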

## Features
- [x] Cache
- [x] GPT-3.5
- [x] GPT-3.5-16k
- [x] GPT-4
- [x] GPT-4-32k
- [ ] GPT-4-vision
- [ ] LLaMA2
- [ ] Falcon

## Quick Usage
Install:
~~~shell
# Install from PyPI
pip install LLM-Mediator
# Or install the latest version from GitHub
pip install git+https://github.com/zeuscsc/llm_mediator.git
~~~
Usage:
~~~python
# Import locations assumed from the package layout
from llm_mediator.gpt import GPT
from llm_mediator.llm import LLM

model_name = "GPT-4-32k"
model = LLM(GPT)
model.model_class.set_model_name(model_name)
response = model.get_response(system, assistant, user)
~~~
Here `system`, `assistant`, and `user` are the prompt strings, and `response` is the model's output text.
Alternatively, you can follow the chat-completion conventions from the OpenAI docs:
~~~python
generator = model.get_chat_completion(
    messages=messages, functions=functions, function_call=function_call,
    stream=True, temperature=0,
    completion_extractor=GPT.AutoGeneratorExtractor, print_chunk=False)
~~~
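With `stream=True` the call returns a generator of text chunks. A minimal sketch of consuming such a stream, using a stand-in generator since the real one requires API keys:

```python
def fake_stream():
    # Stand-in for the generator returned when stream=True
    yield from ["Hello", ", ", "world", "!"]

pieces = []
for chunk in fake_stream():
    pieces.append(chunk)  # in practice you might print each chunk as it arrives
full_text = "".join(pieces)
print(full_text)
```

The same loop applies to the real generator: accumulate (or print) each chunk, then join them for the complete response.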
## Set Environment Variables
Unix:
~~~shell
export OPENAI_API_KEY="your openai key"  # necessary for GPT
export TECKY_API_KEY="your tecky key"    # necessary for GPT
~~~
Windows:
~~~powershell
$ENV:OPENAI_API_KEY="your openai key"  # necessary for GPT
$ENV:TECKY_API_KEY="your tecky key"    # necessary for GPT
~~~
Python:
~~~python
from llm_mediator import gpt
gpt.OPENAI_API_KEY = "your openai key"  # necessary for GPT
gpt.TECKY_API_KEY = "your tecky key"    # necessary for GPT
~~~
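However you set them, you can verify the keys from Python with the standard `os.environ`. This is a generic check, not part of the library's API; `require_key` is a hypothetical helper:

```python
import os

def require_key(name):
    """Fail fast with a clear message if an API key is missing."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"Environment variable {name} is not set")
    return value

os.environ["OPENAI_API_KEY"] = "sk-example"  # demo value only
print(require_key("OPENAI_API_KEY"))
```

Checking for the keys once at startup gives a clearer error than a failed API call deep inside a request.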
