LlamaIndex, formerly GPT Index, is a Python data framework designed to manage and structure LLM-based applications, with a particular emphasis on the storage, indexing, and retrieval of data.
LlamaIndex provides a complete set of tools to automate tasks such as data ingestion from heterogeneous sources (PDF files, web pages, ...) and retrieval-augmented generation (RAG). It also features a rich ecosystem of plugins for connecting to third-party components, from vector stores to data readers.
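To make the RAG pattern concrete, here is a minimal, self-contained sketch of the retrieval step that frameworks like LlamaIndex automate. This is not the LlamaIndex API: the toy `embed` function (a bag-of-words vector) and the prompt template are stand-ins for a real embedding model and a real LLM call.

```python
# Illustrative sketch of retrieval-augmented generation (RAG), NOT the
# LlamaIndex API: embed() is a toy stand-in for a real embedding model.
import math
import re
from collections import Counter

def embed(text):
    # Hypothetical embedding: a bag-of-words frequency vector.
    return Counter(re.findall(r"\w+", text.lower()))

def cosine(a, b):
    # Cosine similarity between two sparse frequency vectors.
    dot = sum(a[k] * b[k] for k in a if k in b)
    norm = lambda v: math.sqrt(sum(x * x for x in v.values()))
    return dot / (norm(a) * norm(b)) if a and b else 0.0

def retrieve(query, documents, k=1):
    # Rank stored documents by similarity to the query embedding.
    q = embed(query)
    ranked = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

docs = [
    "Cassandra is a distributed NoSQL database.",
    "LlamaIndex structures data for LLM applications.",
]
question = "what is cassandra?"
context = retrieve(question, docs, k=1)
# In a real RAG pipeline, the retrieved context is passed to an LLM:
prompt = f"Context: {context[0]}\nQuestion: {question}"
print(prompt)
```

In a production setting the vector comparisons run against a proper vector store (such as a vector-capable Cassandra table), and the assembled prompt is sent to an LLM to generate the final answer.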
Most of the examples in this section can be run straight away as Colab notebooks, provided the prerequisites are met.
If you prefer to run them locally in Jupyter, set up the LlamaIndex Python environment first.
CassIO powers the integration with LlamaIndex, making it easy to develop LLM applications within this framework while using Astra DB / Cassandra as the storage backend.
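As a hedged sketch of what this looks like in practice: the snippet below wires a Cassandra-backed vector store into a LlamaIndex ingestion flow. The package layout assumes llama-index >= 0.10 with the `llama-index-vector-stores-cassandra` integration installed alongside `cassio`; the table name, directory path, and credentials are placeholders to adapt to your setup.

```python
# Sketch only: assumes `pip install cassio llama-index
# llama-index-vector-stores-cassandra` and real Astra DB credentials
# in place of the placeholders below.
import cassio
from llama_index.core import SimpleDirectoryReader, StorageContext, VectorStoreIndex
from llama_index.vector_stores.cassandra import CassandraVectorStore

# Initialize CassIO against Astra DB (a Cassandra Session works here too).
cassio.init(token="AstraCS:...", database_id="...")

# Cassandra-backed vector store; embedding_dimension must match your model.
vector_store = CassandraVectorStore(table="llamaindex_demo", embedding_dimension=1536)
storage_context = StorageContext.from_defaults(vector_store=vector_store)

# Ingest documents and persist their embeddings in the database.
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)

print(index.as_query_engine().query("What is in these documents?"))
```

Once the index is built, subsequent sessions can reattach to the same table with `VectorStoreIndex.from_vector_store(vector_store)` rather than re-ingesting the documents.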
The following examples can be run either locally as Jupyter notebooks (see the local setup section for instructions) or directly in Colab (use the icon at the top of each page).
- Check out the Vector Store Quickstart for a primer on creating and using a Vector Store on top of a (vector-capable) Cassandra database.
This list will grow over time as new needs are addressed and the current extensions are refined.