Airy supports multiple connectors and apps that extend the streaming platform into a complete AI streaming app.
By installing the LLM apps and components, you can connect your existing data to Airy and build powerful automations.
An LLM (Large Language Model) is a type of AI model designed to understand and generate human-like text, trained on large amounts of data. It learns patterns from a wide range of online content in order to predict the next word in a sequence, which enables it to answer questions, generate content, and assist with many tasks. Airy offers a modular interface compatible with various LLMs.
Currently supported LLM components:
- Llama2: Llama2 is a large language model that can be downloaded and deployed in a self-hosted environment.
- OpenAI: The OpenAI connector is a bidirectional connector that connects Airy to the OpenAI LLM.
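To illustrate what a bidirectional LLM connector does conceptually, here is a minimal sketch: consume an incoming message, build a prompt from recent conversation context, call the model, and emit the reply. The `call_llm` stub and message shape are hypothetical illustrations, not Airy's actual API; in a real deployment the stub would be replaced by a client for OpenAI or a self-hosted Llama2 endpoint.

```python
# Hypothetical sketch of a bidirectional LLM connector flow.

def call_llm(prompt: str) -> str:
    # Stub standing in for a real model call (e.g. an OpenAI or
    # self-hosted Llama2 endpoint); replace with a real client.
    return f"Echo: {prompt.splitlines()[-1]}"

def handle_message(history: list[str], new_message: str) -> str:
    """Relay a message plus recent context to the LLM and return its reply."""
    context = history[-5:]  # keep the last few turns as context
    prompt = "\n".join(context + [new_message])
    reply = call_llm(prompt)
    history.extend([new_message, reply])  # record both sides of the turn
    return reply

history: list[str] = []
print(handle_message(history, "What is Airy?"))
```

In an actual connector the inbound messages would arrive from the platform's message stream and the reply would be produced back onto it; the sketch only shows the request/response shape.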
A vector database is a datastore designed for high-dimensional data, well suited to persistently storing data for natural language processing or computer vision. Data is represented as vectors, and retrieval is based on similarity, enabling fast similarity searches and context construction. Vector databases are a good fit for storing vectorized representations of continuous data streams, supporting real-time queries and enriching the context supplied to an LLM alongside a question.
Currently supported vector databases:
- FAISS: FAISS is a library for efficient similarity search and clustering of dense vectors, allowing developers to quickly group and search embeddings of documents that are similar to each other.
- Weaviate: An open-source vector database that allows storing object and vector embeddings from various ML-models.
- FAISS connector: A connector that streams Airy data from Kafka topics into the FAISS vector store.
- Pinecone: A fully managed vector database solution, used to store and search vector embeddings.
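To make the similarity-search idea behind these stores concrete, here is a minimal brute-force sketch in plain Python. It has no FAISS, Weaviate, or Pinecone dependency; the toy 2-d embeddings and document names are invented for illustration, and real vector databases index the vectors rather than scanning them all.

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def search(query: list[float], store: dict[str, list[float]], k: int = 1) -> list[str]:
    """Return the k document ids whose embeddings are most similar to the query."""
    ranked = sorted(store, key=lambda doc: cosine_similarity(query, store[doc]),
                    reverse=True)
    return ranked[:k]

# Toy 2-d "embeddings"; real ones have hundreds of dimensions.
store = {
    "billing": [0.9, 0.1],
    "shipping": [0.1, 0.9],
}
print(search([0.8, 0.2], store))  # the query vector lies closest to "billing"
```

A vector database performs the same nearest-neighbour lookup, but over millions of vectors with approximate indexes, which is what makes real-time, similarity-based context retrieval for LLM queries practical.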