
🦖RAPTOR🦖 Implementation Code Walk-through

Angelina Yang
2 min read · Apr 22, 2024


Last week we talked about RAPTOR — an advanced RAG technique that features recursive hierarchical clustering. If you would like to revisit:

🦖RAPTOR🦖 for Advanced RAG

“Retrieval-augmented language models can better adapt to changes in world state and incorporate long-tail knowledge.” Yet, the majority of existing RAG methodologies only retrieve short, contiguous chunks from a retrieval corpus, which limits the holistic understanding of the document’s overall context.
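In code, the core idea is surprisingly compact: embed the chunks, cluster them, summarize each cluster with an LLM, then repeat the whole process on the summaries until you're left with a small tree of increasingly abstract nodes. Here's a minimal sketch of that loop. This is our simplification, not the notebook's exact code: we assume sentence-transformers embeddings and scikit-learn's GaussianMixture, the summarize() stub stands in for a real LLM call, and we skip the UMAP dimensionality reduction and soft cluster assignments that the paper's implementation uses.

import numpy as np
from sklearn.mixture import GaussianMixture
from sentence_transformers import SentenceTransformer

embedder = SentenceTransformer("all-MiniLM-L6-v2")  # assumed model choice

def summarize(texts: list[str]) -> str:
    # Hypothetical stand-in for an LLM summarization call; swap in your model.
    return " ".join(texts)[:500]

def build_raptor_tree(chunks: list[str], n_levels: int = 3, n_clusters: int = 5):
    """Recursively cluster texts and summarize each cluster, level by level."""
    tree = {0: chunks}            # level 0 holds the raw leaf chunks
    current = chunks
    for level in range(1, n_levels + 1):
        if len(current) <= n_clusters:   # too few nodes left to cluster
            break
        embeddings = embedder.encode(current)
        labels = GaussianMixture(n_components=n_clusters,
                                 random_state=0).fit_predict(embeddings)
        # Each cluster of texts becomes one summarized parent node.
        summaries = [summarize([t for t, l in zip(current, labels) if l == c])
                     for c in range(n_clusters)]
        tree[level] = summaries
        current = summaries       # recurse on the summaries at the next level
    return tree

Retrieval then works over every node in this tree, not just the leaves; more on that below.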


Today, we’ll share an example implementation. To keep it simple, we recorded a walkthrough:

Check it out in the video below!👇

And here’s the code for your entertainment:

Check out the Colab notebook!
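If you just want the gist of the retrieval side before opening the notebook: the paper's "collapsed tree" strategy pools the leaf chunks and all the summary nodes into one candidate set and runs a plain top-k similarity search over it. Here's a hedged sketch, again our simplification rather than the notebook's exact code, reusing the embedder and the tree built in the snippet above:

import numpy as np

def retrieve(tree: dict[int, list[str]], query: str, k: int = 5) -> list[str]:
    # Flatten leaves and summaries from every level into one candidate pool.
    nodes = [text for level in tree.values() for text in level]
    node_vecs = embedder.encode(nodes)
    query_vec = embedder.encode([query])[0]
    # Cosine similarity between the query and every node in the pool.
    sims = node_vecs @ query_vec / (
        np.linalg.norm(node_vecs, axis=1) * np.linalg.norm(query_vec))
    return [nodes[i] for i in np.argsort(sims)[::-1][:k]]

# Example: stitch the top nodes into context for your generator LLM.
# context = "\n\n".join(retrieve(tree, "What is RAPTOR?", k=5))

The appeal of the collapsed tree is that a broad question can land on a high-level summary while a detail question lands on a leaf chunk, all through the same search.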

Subscribe to Our YouTube Channel!


We are kicking off our YouTube channel in the new year, and we invite you on board as we walk through the intricacies of AI, fueled by feedback from our readers, friends, and colleagues!

We want our channel to make AI accessible to everyone. As in this newsletter, we'll cover new AI products, the latest trends, the nitty-gritty engineering details, career insights for AI enthusiasts, and, of course, one of our favorite topics: the entrepreneurial side of AI. 🥳 We're here to show you how you can ride the AI wave and become your own entrepreneur using the tools already on the market.

🛠️✨ Happy practicing and happy building! 🚀🌟

Thanks for reading our newsletter. You can follow us here: Angelina on LinkedIn or Twitter, and Mehdi on LinkedIn or Twitter.

Source of images/quotes:

🗞️ Paper: RAPTOR: Recursive Abstractive Processing for Tree-Organized Retrieval: https://arxiv.org/pdf/2401.18059.pdf

🔨 Implementation: https://colab.research.google.com/drive/1jbjC4Sh2YVZkpyUE4EB6y8wnZgO7uPUV?usp=sharing

📚 Also, if you’d like to learn more about RAG systems, check out our book:

Order a copy!

📬 Don’t miss out on the latest updates — Subscribe to our newsletter: The MLnotes Newsletter
