Open Source or Cloud LLMs

You can bring your own OpenAI key or use one of 10+ cloud-hosted models (LLAMA2, Mistral 7B, Hugging Face). Agent Cloud open source users can also connect to their own OpenAI-compatible endpoints.

How does it work?

Bring your own OpenAI key or connect Agent Cloud to 10+ open source models.

For highly secure environments, open source users can connect Agent Cloud to their own self-hosted LLMs, giving a fully private architecture with no internet access.
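
As a rough sketch, connecting to a self-hosted model typically just means pointing an OpenAI-compatible client at your own endpoint. The URL, key, and model name below are placeholders for your deployment (for example a local vLLM or Ollama server), not Agent Cloud's built-in configuration.

```python
# Minimal sketch: pointing an OpenAI-compatible client at a self-hosted endpoint.
# The base_url and model name are placeholders for your own deployment;
# no traffic leaves your network.
from openai import OpenAI

client = OpenAI(
    base_url="http://llm.internal:8000/v1",  # your self-hosted, OpenAI-compatible endpoint
    api_key="not-needed",                    # many local servers ignore the key
)

response = client.chat.completions.create(
    model="llama-2-13b-chat",                # whatever model your server exposes
    messages=[{"role": "user", "content": "Summarise our onboarding policy."}],
)
print(response.choices[0].message.content)
```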

Get started with Open Source or Cloud LLMs:
Supports all major cloud LLMs (OpenAI, Cohere, Anthropic Claude)
Point to your own self-hosted LLM (LLAMA2, Hugging Face)
Empower employees with digital assistants


The end-to-end RAG pipeline

Select your connector

Use our collection of data source connectors to sync data from systems like Confluence, or upload your own PDF, DOCX, TXT, or CSV files.
When connecting to databases (Postgres, Snowflake, BigQuery), you can choose which tables, and even which columns, to ingest.
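
For illustration, a narrowed-down database source might look something like the hypothetical configuration below; the field names are made up for the example and are not Agent Cloud's actual connector schema.

```python
# Hypothetical connector configuration (illustrative only; not Agent Cloud's
# real schema). It shows the idea of narrowing a database source down to
# specific tables and columns before ingestion.
postgres_source = {
    "connector": "postgres",
    "host": "db.internal",
    "database": "support",
    "streams": [
        {
            "table": "tickets",
            "columns": ["id", "subject", "body", "created_at"],  # only these are synced
        },
        {
            "table": "customers",
            "columns": ["id", "name", "plan"],
        },
    ],
}
```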

Prep your data

For files, you can provide instructions on how to split and chunk your data. Use OpenAI's latest text-embedding-3-small for embedding, or select from open source models like BGE/base.
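
A minimal sketch of this prep step, assuming simple fixed-size chunking and OpenAI's text-embedding-3-small (the chunk size and overlap are illustrative defaults, not the platform's settings):

```python
# Sketch of the prep step: naive fixed-size chunking plus embedding with
# OpenAI's text-embedding-3-small. Chunk size and overlap are illustrative.
from openai import OpenAI

client = OpenAI()  # uses OPENAI_API_KEY from the environment

def chunk_text(text: str, size: int = 800, overlap: int = 100) -> list[str]:
    """Split text into overlapping character windows."""
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]

document = open("handbook.txt").read()
chunks = chunk_text(document)

embeddings = client.embeddings.create(
    model="text-embedding-3-small",
    input=chunks,
)
vectors = [item.embedding for item in embeddings.data]  # 1536-dimensional vectors
```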

Vector store your data

Once your data has been embedded, the platform stores it in a vector database. We also expose
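
As an illustration of this step, the sketch below upserts embedded chunks into Qdrant; the vector store, collection name, and sample data are stand-ins rather than the platform's own setup.

```python
# Illustrative sketch: writing embedded chunks to a vector database.
# Qdrant is used purely as an example store; names and data are placeholders.
from qdrant_client import QdrantClient
from qdrant_client.models import Distance, VectorParams, PointStruct

# Placeholder chunks and vectors standing in for the output of the embedding step.
chunks = [
    "Employees accrue 20 days of leave per year.",
    "Leave requests go through the HR portal.",
]
vectors = [[0.1] * 1536, [0.2] * 1536]

qdrant = QdrantClient(url="http://localhost:6333")

qdrant.create_collection(
    collection_name="handbook",
    vectors_config=VectorParams(size=1536, distance=Distance.COSINE),
)

qdrant.upsert(
    collection_name="handbook",
    points=[
        PointStruct(id=i, vector=vec, payload={"text": chunk})
        for i, (vec, chunk) in enumerate(zip(vectors, chunks))
    ],
)
```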

Keep data fresh

Select the frequency at which you would like to sync data from the source: manual, scheduled, or a cron expression. This means users can query fresh data and know how recently the source was updated.
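
For a sense of how a cron-based schedule works, here is a small sketch using APScheduler; the sync_source function and the every-six-hours expression are illustrative stand-ins for the platform's own sync jobs.

```python
# Sketch of a scheduled sync driven by a cron expression.
# The sync_source() body and the "0 */6 * * *" schedule are illustrative.
from apscheduler.schedulers.blocking import BlockingScheduler
from apscheduler.triggers.cron import CronTrigger

def sync_source() -> None:
    print("Re-syncing data from the source...")  # re-ingest, re-embed, upsert

scheduler = BlockingScheduler()
scheduler.add_job(sync_source, CronTrigger.from_crontab("0 */6 * * *"))  # every 6 hours
scheduler.start()  # blocks and runs the job on the cron schedule
```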

Start chatting with your data!

Now that your data is synced, simply create an agent with your choice of LLM and start a session to talk to your data.
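
Putting the pieces together, a single chat turn over the synced data might look like the sketch below: embed the question, retrieve the closest chunks, and pass them to the LLM as context. The collection name and model choices are placeholders, not fixed defaults.

```python
# End-to-end sketch of a chat turn over the synced data: embed the question,
# retrieve the nearest chunks, and hand them to the LLM as context.
from openai import OpenAI
from qdrant_client import QdrantClient

llm = OpenAI()                                    # or your self-hosted, OpenAI-compatible endpoint
qdrant = QdrantClient(url="http://localhost:6333")

question = "How many days of leave do employees get?"

query_vector = llm.embeddings.create(
    model="text-embedding-3-small",
    input=[question],
).data[0].embedding

hits = qdrant.search(collection_name="handbook", query_vector=query_vector, limit=3)
context = "\n".join(hit.payload["text"] for hit in hits)

answer = llm.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model choice
    messages=[
        {"role": "system", "content": f"Answer using only this context:\n{context}"},
        {"role": "user", "content": question},
    ],
)
print(answer.choices[0].message.content)
```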