# Generative Agents

This serves as yet another cobbled-together application of [generative agents](https://arxiv.org/pdf/2304.03442.pdf), using [LangChain](https://github.com/hwchase17/langchain/tree/master/langchain) as the core dependency and substituting a "proxy" for GPT-4. In short, by using a language model to summarize, rank, and query information, immersive agents can be created.

## Installation

```
pip install -r requirements.txt
```

## Usage

Set your environment variables accordingly:

* `LLM_TYPE`: (`oai`, `llamacpp`): the LLM backend to use in LangChain. OpenAI requires some additional environment variables:
  - `OPENAI_API_BASE`: URL for your target OpenAI endpoint
  - `OPENAI_API_KEY`: authentication key for OpenAI
  - `OPENAI_API_MODEL`: target model
* `LLM_MODEL`: (`./path/to/your/llama/model.bin`): path to your GGML-formatted LLaMA model, if using `llamacpp` as the LLM backend
* `LLM_EMBEDDING_TYPE`: (`oai`, `llamacpp`, `hf`): the embedding model to use for similarity computation

To run:

```
python .\src\main.py
```

## Plans

I ***do not*** plan on making this as user-friendly as [mrq/ai-voice-cloning](https://git.ecker.tech/mrq/ai-voice-cloning), as this is just a stepping stone for a bigger project integrating generative agents. I do, however, plan on adding a simple Gradio web UI for easier interaction.
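The `LLM_TYPE` switch described above can be sketched roughly as follows. This is a minimal illustration, not this repository's actual code: the function name `make_llm` and the specific LangChain classes and constructor arguments are assumptions based on LangChain's common LLM wrappers.

```python
import os

def make_llm(llm_type: str):
    """Hypothetical sketch: pick a LangChain LLM backend from LLM_TYPE."""
    if llm_type == "oai":
        # OpenAI-compatible backend; OPENAI_API_BASE and OPENAI_API_KEY
        # are read from the environment by the LangChain wrapper.
        from langchain.llms import OpenAI
        return OpenAI(model_name=os.environ["OPENAI_API_MODEL"])
    if llm_type == "llamacpp":
        # Local GGML-formatted LLaMA model via llama.cpp bindings.
        from langchain.llms import LlamaCpp
        return LlamaCpp(model_path=os.environ["LLM_MODEL"])
    raise ValueError(f"unsupported LLM_TYPE: {llm_type}")
```

The imports are deferred inside each branch so that only the dependencies for the selected backend need to be installed.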