
Generative Agents

This serves as yet another cobbled-together application of generative agents, using LangChain as the core dependency and substituting a "proxy" for GPT-4.

In short, by using a language model to summarize, rank, and query against stored information, immersive agents can be built.
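The ranking step is the core trick: each stored memory is scored by how recent, how important, and how relevant it is, and only the top-scoring memories are fed back into the prompt. Below is a minimal, dependency-free sketch of that idea; the names (`Memory`, `rank_memories`) are illustrative and not the actual code in `src/main.py`.

```python
# Dependency-free sketch of the "rank" step: score each stored memory by
# recency, importance, and relevance, then keep the best few for the prompt.
# Names here are illustrative, not the actual classes in src/main.py.
import math
import time
from dataclasses import dataclass


@dataclass
class Memory:
    text: str
    importance: float     # 0..1, e.g. rated by the LLM when the memory is stored
    relevance: float      # 0..1, similarity of its embedding to the current query
    last_accessed: float  # unix timestamp of the last retrieval


def rank_memories(memories, decay=0.995, top_k=5):
    def score(memory: Memory) -> float:
        hours_idle = (time.time() - memory.last_accessed) / 3600.0
        recency = decay ** hours_idle  # decays exponentially as memories go stale
        return recency + memory.importance + memory.relevance

    return sorted(memories, key=score, reverse=True)[:top_k]
```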

Installation

pip install -r requirements.txt

Usage

Set your environment variables accordingly:

  • LLM_TYPE: (oai, llamacpp): the LLM backend LangChain should use. The oai backend requires some additional environment variables:
    • OPENAI_API_BASE: base URL of the OpenAI-compatible endpoint to target
    • OPENAI_API_KEY: authentication key for OpenAI
    • OPENAI_API_MODEL: name of the target model
  • LLM_MODEL: (./path/to/your/llama/model.bin): path to your GGML-formatted LLaMA model, if using llamacpp as the LLM backend
  • LLM_EMBEDDING_TYPE: (oai, llamacpp, hf): the embedding backend to use for similarity computations
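
For reference, here is a rough sketch of how these variables could map onto LangChain backends. This is illustrative only; the actual wiring lives in `src/main.py` and may differ.

```python
# Illustrative only: rough mapping of the environment variables above onto
# LangChain's OpenAI / llama.cpp backends. Not the actual code in src/main.py.
import os

from langchain.embeddings import (
    HuggingFaceEmbeddings,
    LlamaCppEmbeddings,
    OpenAIEmbeddings,
)
from langchain.llms import LlamaCpp, OpenAI


def build_llm():
    if os.environ.get("LLM_TYPE") == "oai":
        # OpenAI() reads OPENAI_API_KEY and OPENAI_API_BASE from the environment.
        return OpenAI(model_name=os.environ["OPENAI_API_MODEL"])
    # llamacpp: load the GGML model from the path in LLM_MODEL.
    return LlamaCpp(model_path=os.environ["LLM_MODEL"])


def build_embeddings():
    kind = os.environ.get("LLM_EMBEDDING_TYPE", "hf")
    if kind == "oai":
        return OpenAIEmbeddings()
    if kind == "llamacpp":
        return LlamaCppEmbeddings(model_path=os.environ["LLM_MODEL"])
    return HuggingFaceEmbeddings()
```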

To run:

python src/main.py

Plans

I do not plan on making this über user-friendly like mrq/ai-voice-cloning, as this is just a stepping stone for a bigger project that integrates generative agents.

I do, however, plan on adding a simple Gradio web UI to make interacting with this easier.
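
For a sense of scale, such a UI can be a few lines of Gradio wrapped around the agent loop. The `agent_reply()` helper below is a hypothetical placeholder, not an existing function in this repo.

```python
# Sketch of a minimal Gradio front-end. agent_reply() is a hypothetical
# placeholder for whatever function ends up driving the agent.
import gradio as gr


def agent_reply(message: str) -> str:
    # Placeholder: query the generative agent and return its response.
    return f"(agent response to: {message})"


gr.Interface(fn=agent_reply, inputs="text", outputs="text",
             title="Generative Agents").launch()
```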