Introduction

Airy supports multiple connectors and apps that extend the functionality of a streaming platform into a complete AI streaming app.

By installing the LLM apps and components, you can connect your existing data to Airy and build powerful automations.

LLMs

An LLM (large language model) is an AI model trained to understand and generate human-like text, using large amounts of data as its foundation. It learns from a wide range of online content to predict the next word in a sequence, which enables it to answer questions, generate content, and assist with many tasks. Airy offers a modular interface compatible with various LLMs.

Currently supported LLM components:

Apps:

  • Llama2: Llama2 is a large language model that can be downloaded and deployed in a self-hosted environment.

Connectors:

  • OpenAI: A bidirectional connector between Airy and the OpenAI LLM.
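
To illustrate the kind of request the OpenAI connector relays, here is a minimal Python sketch of a direct call to the OpenAI chat completions API. The model name and prompts are placeholders, and the connector itself exchanges these messages through Airy rather than calling the API from your own code, so treat this only as a rough picture of the underlying exchange.

    # Minimal sketch: a direct call to the OpenAI chat completions API.
    # Model name and prompts are placeholders, not the Airy connector itself.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumption: any chat-capable model works here
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Summarize the last customer message."},
        ],
    )
    print(response.choices[0].message.content)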

Vector databases

A vector database is a datastore with high-dimensional capacity that is well suited to persistently storing data for natural language processing or computer vision. Data is represented as vectors, and retrieval is based on similarity, enabling fast similarity searches and context construction. Vector databases are therefore a good fit for storing vectorized representations of continuous data streams, supporting real-time queries and enriching the context of questions sent to LLMs.
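
As a minimal, Airy-agnostic illustration of similarity-based retrieval, the sketch below represents a few documents as vectors, finds the one closest to a query by cosine similarity, and uses it to build context for an LLM prompt. The toy vectors are placeholders for the output of a real embedding model.

    import numpy as np

    # Toy embeddings standing in for the output of a real embedding model.
    documents = {
        "Order #123 was shipped on Monday.":        np.array([0.9, 0.1, 0.0]),
        "Our return policy allows 30-day refunds.": np.array([0.1, 0.8, 0.3]),
        "The store opens at 9am on weekdays.":      np.array([0.0, 0.2, 0.9]),
    }
    query = "When will my order arrive?"
    query_vec = np.array([0.8, 0.2, 0.1])  # placeholder query embedding

    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    # Retrieve the most similar document and use it as context for the LLM.
    best_doc = max(documents, key=lambda d: cosine(documents[d], query_vec))
    prompt = f"Context: {best_doc}\n\nQuestion: {query}"
    print(prompt)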

Currently supported vector databases:

Apps:

  • FAISS: FAISS is a vector database that lets developers quickly group and search embeddings of documents that are similar to each other (see the sketch after this list).
  • Weaviate: An open-source vector database that stores objects and vector embeddings from various ML models.
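
For a sense of what FAISS does, here is a minimal usage sketch with the FAISS Python bindings; the random vectors stand in for real document embeddings, and the dimension and number of neighbours are arbitrary choices.

    import faiss
    import numpy as np

    d = 128                                    # embedding dimension (arbitrary)
    doc_vectors = np.random.random((1000, d)).astype("float32")

    index = faiss.IndexFlatL2(d)               # exact L2-distance index
    index.add(doc_vectors)                     # store document embeddings

    query = np.random.random((1, d)).astype("float32")
    distances, ids = index.search(query, 4)    # 4 nearest neighbours
    print(ids[0])                              # indices of the most similar documents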

Connectors:

  • FAISS connector: A connector that streams Airy data from Kafka topics into the FAISS vector database (see the sketch after this list).
  • Pinecone: A fully managed vector database solution, used to store and search vector embeddings.
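
To make the streaming idea concrete, the rough sketch below consumes messages from a Kafka topic and adds their embeddings to a FAISS index. The topic name, broker address, and embed() function are illustrative assumptions, not the actual FAISS connector implementation.

    import json
    import faiss
    import numpy as np
    from confluent_kafka import Consumer

    def embed(text: str) -> np.ndarray:
        """Placeholder: replace with a real embedding model."""
        rng = np.random.default_rng(abs(hash(text)) % (2**32))
        return rng.random(128, dtype=np.float32)

    index = faiss.IndexFlatL2(128)

    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",  # assumption: local Kafka broker
        "group.id": "faiss-indexer",
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["application.communication.messages"])  # illustrative topic name

    try:
        while True:
            msg = consumer.poll(1.0)
            if msg is None or msg.error():
                continue
            payload = json.loads(msg.value())
            text = payload.get("content", "")
            index.add(embed(text).reshape(1, -1))  # stream embeddings into FAISS
    finally:
        consumer.close()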