# Dataset Card for Enwiki Dataset
This is an automatically updating dataset containing ~7 million English Wikipedia articles, with templates expanded and the wikitext converted into Markdown. It was created to provide a bite-sized, LLM-readable version of Wikipedia for various applications, including retrieval-augmented generation (RAG).
## Dataset Overview
There are two main versions of the dataset:
- **`merged-articles`** - Complete Wikipedia dump with all articles merged into a single file.
- **`merged-article-chunked`** - Articles chunked into ~700-word segments, each prefixed with a hierarchical breadcrumb built from its Markdown headers, as sketched below (Gemini embeddings coming soon!)
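The actual chunker is not published here, but the scheme the description implies can be sketched as follows: walk an article line by line, track the current header path, and emit ~700-word chunks prefixed with that path. The function name, the `" > "` breadcrumb separator, and the rule of starting a fresh chunk at every header are all assumptions for illustration.

```python
import re

def chunk_markdown(text: str, max_words: int = 700) -> list[str]:
    """Illustrative sketch: split a Markdown article into ~max_words chunks,
    each prefixed with a breadcrumb of the headers above it (e.g. "Title > Section")."""
    breadcrumbs: list[str] = []  # current header path, one entry per heading level
    chunks: list[str] = []
    buffer: list[str] = []       # words accumulated for the chunk being built

    def flush() -> None:
        if buffer:
            crumb = " > ".join(breadcrumbs)
            body = " ".join(buffer)  # simplified: paragraph breaks are not preserved
            chunks.append(f"{crumb}\n\n{body}" if crumb else body)
            buffer.clear()

    for line in text.splitlines():
        m = re.match(r"^(#{1,6})\s+(.*)", line)
        if m:
            flush()                      # assumed rule: new chunk at every header
            level = len(m.group(1))
            del breadcrumbs[level - 1:]  # drop headers at this level or deeper
            breadcrumbs.append(m.group(2).strip())
            continue
        buffer.extend(line.split())
        if len(buffer) >= max_words:
            flush()
    flush()
    return chunks
```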
The dataset was last updated on `2025-08-16`.
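For reference, loading either version with the 🤗 `datasets` library might look like the following. The repository id `user/enwiki-dataset` is a placeholder, and the config and split names are assumptions based on the version names above.

```python
from datasets import load_dataset

# Placeholder repo id; substitute the actual Hub path of this dataset.
# Streaming avoids downloading the full ~7M-article dump up front.
articles = load_dataset("user/enwiki-dataset", "merged-articles",
                        split="train", streaming=True)
chunks = load_dataset("user/enwiki-dataset", "merged-article-chunked",
                      split="train", streaming=True)

# Peek at the first row of the chunked version.
for row in chunks.take(1):
    print(row)
```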
| Blogpost coming soon! |