# Tutorial

FlagEmbedding offers a whole curriculum for retrieval, embedding models, RAG, and more. This section is being actively updated. Whether you are new to NLP or a veteran, we hope you can find something helpful!

If you are new to embedding and retrieval, check out the 5 minute [quick start](./quick_start.ipynb)!
<details>
<summary>Tutorial roadmap</summary>
</details>

## [Embedding](./1_Embedding)

This module includes tutorials and demos showing how to use BGE and Sentence Transformers, as well as other embedding-related topics.
- [x] Intro to embedding model
- [x] BGE series
- [x] Usage of BGE
- [x] BGE-M3
- [ ] BGE-ICL
- ...
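
As a taste of what these tutorials cover, here is a minimal sketch of encoding sentences with a BGE model through FlagEmbedding, following the pattern in the FlagEmbedding README (the model name and instruction string are just one common choice):

```python
from FlagEmbedding import FlagModel

# Load a BGE embedding model (downloaded from Hugging Face on first use).
model = FlagModel(
    "BAAI/bge-base-en-v1.5",
    query_instruction_for_retrieval="Represent this sentence for searching relevant passages:",
    use_fp16=True,  # half precision: faster inference, minor accuracy cost
)

sentences = [
    "FlagEmbedding is a toolkit for retrieval.",
    "BGE models produce dense embeddings.",
]
embeddings = model.encode(sentences)  # numpy array of shape (2, hidden_dim)
print(embeddings.shape)
```
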
## [Similarity](./2_Similarity)
In this part, we show popular similarity functions and techniques for searching.
- [x] Similarity metrics
- ...
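
For instance, cosine similarity between two embeddings can be computed directly with NumPy (a minimal illustration with made-up vectors; the tutorial walks through this and other metrics):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity: dot product of the two L2-normalized vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

q = np.array([0.1, 0.3, 0.5])  # toy "query" embedding
d = np.array([0.2, 0.2, 0.6])  # toy "document" embedding
print(cosine_similarity(q, d))
```
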
## [Indexing](./3_Indexing)
Although not covered in the quick start, indexing is a very important part of practical systems. This module shows how to use popular libraries like Faiss and Milvus for indexing.
- [x] Intro to Faiss
- [x] Using GPU in Faiss
- [ ] Index and Quantizer
- [ ] Milvus
- ...
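
As a preview, building a simple exact-search Faiss index over a batch of embeddings looks roughly like this (the dimension and random vectors are placeholders standing in for real embeddings):

```python
import faiss
import numpy as np

dim = 768  # embedding dimension, e.g. of BGE base models
corpus_embeddings = np.random.rand(1000, dim).astype("float32")  # placeholder vectors

# IndexFlatIP does exact inner-product search; L2-normalizing first
# makes inner product equivalent to cosine similarity.
faiss.normalize_L2(corpus_embeddings)
index = faiss.IndexFlatIP(dim)
index.add(corpus_embeddings)

query = np.random.rand(1, dim).astype("float32")
faiss.normalize_L2(query)
scores, ids = index.search(query, 5)  # top-5 nearest corpus vectors
print(ids[0], scores[0])
```
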
## [Evaluation](./4_Evaluation)
In this module, we'll show the full pipeline of evaluating an embedding model, and introduce popular benchmarks like MTEB and C-MTEB.
- [x] Evaluate MSMARCO
- [x] Intro to MTEB
- [x] MTEB Leaderboard Eval
- [ ] C-MTEB
- ...
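
For example, running a single MTEB task on a Sentence Transformers model follows this rough pattern (the task and model names are illustrative, and details of the `mteb` API vary slightly across versions):

```python
from mteb import MTEB
from sentence_transformers import SentenceTransformer

# Any model exposing an encode() method works; BGE models load this way too.
model = SentenceTransformer("BAAI/bge-base-en-v1.5")

# Evaluate on one retrieval task; results are written as JSON to the output folder.
evaluation = MTEB(tasks=["SciFact"])
evaluation.run(model, output_folder="results/bge-base-en-v1.5")
```
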
## [Reranking](./5_Reranking/)
To balance the tradeoff between accuracy and efficiency, many retrieval systems use an efficient retriever to quickly narrow down the candidates, then use more accurate models to rerank the final results.
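
A quick sketch of scoring query-passage pairs with a BGE reranker through FlagEmbedding, following the pattern in the FlagEmbedding README (the model name is one common choice):

```python
from FlagEmbedding import FlagReranker

# Cross-encoder reranker: encodes query and passage jointly to produce a relevance score.
reranker = FlagReranker("BAAI/bge-reranker-base", use_fp16=True)

pairs = [
    ["what is a cross-encoder?", "A cross-encoder scores a query and a passage jointly."],
    ["what is a cross-encoder?", "Paris is the capital of France."],
]
scores = reranker.compute_score(pairs)  # higher score = more relevant
print(scores)
```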