Commit 6a478a0

Merge pull request #1176 from 545999961/master
Update pip
2 parents c6ba4b2 + 790255b commit 6a478a0

5 files changed: 31 additions & 19 deletions


README.md

Lines changed: 11 additions & 0 deletions
@@ -92,20 +92,31 @@ It is the first embedding model which supports all three retrieval methods, achi
 
 ## Installation
 ### Using pip:
+If you do not want to finetune the models, you can install the package without the finetune dependency:
 ```
 pip install -U FlagEmbedding
 ```
+If you want to finetune the models, you can install the package with the finetune dependency:
+```
+pip install -U FlagEmbedding[finetune]
+```
 ### Install from sources:
 
 Clone the repository and install
 ```
 git clone https://github.com/FlagOpen/FlagEmbedding.git
 cd FlagEmbedding
+# If you do not want to finetune the models, you can install the package without the finetune dependency:
 pip install .
+# If you want to finetune the models, you can install the package with the finetune dependency:
+# pip install .[finetune]
 ```
 For development in editable mode:
 ```
+# If you do not want to finetune the models, you can install the package without the finetune dependency:
 pip install -e .
+# If you want to finetune the models, you can install the package with the finetune dependency:
+# pip install -e .[finetune]
 ```
 
 ## Quick Start
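The hunk above splits installation into a base package and a `[finetune]` extra. A quick way to check which variant an environment ended up with is to probe for the optional dependencies. A minimal sketch, assuming the extra maps to the `deepspeed` and `flash-attn` packages mentioned elsewhere in this commit:

```python
import importlib.util

def has_finetune_extras(modules=("deepspeed", "flash_attn")):
    # True only if every optional finetune dependency is importable.
    return all(importlib.util.find_spec(m) is not None for m in modules)
```

For example, `has_finetune_extras()` returns `False` in a base install, so a training script can fail fast and point the user at `pip install -U FlagEmbedding[finetune]`.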

README_zh.md

Lines changed: 13 additions & 1 deletion
@@ -33,7 +33,8 @@
 <a href="#license">License</a>
 <p>
 </h4>
-[English](README.md) | [中文](README_zh.md)
+
+[English](https://github.com/FlagOpen/FlagEmbedding/blob/master/README.md) | [中文](https://github.com/FlagOpen/FlagEmbedding/blob/master/README_zh.md)
 
 BGE (BAAI General Embedding) focuses on retrieval-augmented LLMs and currently includes the following projects:
 
@@ -85,20 +86,31 @@ BGE (BAAI General Embedding) focuses on retrieval-augmented LLMs and currently includes the following
 
 ## Installation
 ### Using pip:
+If you do not want to finetune the models, you can install the package directly, without the finetune dependency:
 ```
 pip install -U FlagEmbedding
 ```
+If you want to finetune the models, install with the finetune dependency:
+```
+pip install -U FlagEmbedding[finetune]
+```
 ### Install from sources:
 
 Clone and install FlagEmbedding:
 ```
 git clone https://github.com/FlagOpen/FlagEmbedding.git
 cd FlagEmbedding
+# If you do not want to finetune the models, you can install the package directly, without the finetune dependency:
 pip install .
+# If you want to finetune the models, install with the finetune dependency:
+# pip install .[finetune]
 ```
 Install in editable mode:
 ```
+# If you do not want to finetune the models, you can install the package directly, without the finetune dependency:
 pip install -e .
+# If you want to finetune the models, install with the finetune dependency:
+# pip install -e .[finetune]
 ```
 
 ## Quick Start

examples/README.md

Lines changed: 1 addition & 0 deletions
@@ -81,6 +81,7 @@ print(scores)
 
 We support fine-tuning a variety of BGE series models, including `bge-large-en-v1.5`, `bge-m3`, `bge-en-icl`, `bge-multilingual-gemma2`, `bge-reranker-v2-m3`, `bge-reranker-v2-gemma`, and `bge-reranker-v2-minicpm-layerwise`, among others. As examples, we use the basic models `bge-large-en-v1.5` and `bge-reranker-large`. For more details, please refer to the [embedder](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune/embedder) and [reranker](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune/reranker) sections.
 
+If you do not have the `deepspeed` and `flash-attn` packages installed, you can install them with the following commands:
 ```shell
 pip install deepspeed
 pip install flash-attn --no-build-isolation
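The companion READMEs in this commit fold these two packages into a `[finetune]` extra. As a rough illustration of how a spec like `FlagEmbedding[finetune]` is read, here is a loose split of `Name[extra1,extra2]` requirement strings (a sketch only, not a full PEP 508 parser):

```python
import re

def split_extras(spec):
    # Split 'Name[extra1,extra2]' into (name, [extras]); loose sketch only.
    m = re.fullmatch(r"([A-Za-z0-9._-]+)(?:\[([^\]]*)\])?", spec.strip())
    if m is None:
        raise ValueError(f"unparsable requirement: {spec!r}")
    name, extras = m.group(1), m.group(2)
    if not extras:
        return name, []
    return name, [e.strip() for e in extras.split(",")]
```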

examples/finetune/embedder/README.md

Lines changed: 3 additions & 9 deletions
@@ -17,27 +17,21 @@ In this example, we show how to finetune the embedder with your data.
 - **with pip**
 
 ```shell
-pip install -U FlagEmbedding
-pip install deepspeed
-pip install flash-attn --no-build-isolation
+pip install -U FlagEmbedding[finetune]
 ```
 
 - **from source**
 
 ```shell
 git clone https://github.com/FlagOpen/FlagEmbedding.git
 cd FlagEmbedding
-pip install .
-pip install deepspeed
-pip install flash-attn --no-build-isolation
+pip install .[finetune]
 ```
 
 For development, install as editable:
 
 ```shell
-pip install -e .
-pip install deepspeed
-pip install flash-attn --no-build-isolation
+pip install -e .[finetune]
 ```
 
 ## 2. Data format
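The three install variants in this change (plain, from source, editable) differ only in the spec handed to pip. A small hypothetical helper that builds, but does not run, the matching command line:

```python
import sys

def pip_install_cmd(finetune=False, editable=False, package="."):
    # Build the pip command line matching the README variants above.
    # Hypothetical helper for illustration; it does not execute anything.
    spec = package + ("[finetune]" if finetune else "")
    cmd = [sys.executable, "-m", "pip", "install"]
    if editable:
        cmd.append("-e")
    cmd.append(spec)
    return cmd
```

For example, `pip_install_cmd(finetune=True, editable=True)` ends with `["-e", ".[finetune]"]`, mirroring the editable-mode line above.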

examples/finetune/reranker/README.md

Lines changed: 3 additions & 9 deletions
@@ -16,27 +16,21 @@ In this example, we show how to finetune the reranker with your data.
 - **with pip**
 
 ```shell
-pip install -U FlagEmbedding
-pip install deepspeed
-pip install flash-attn --no-build-isolation
+pip install -U FlagEmbedding[finetune]
 ```
 
 - **from source**
 
 ```shell
 git clone https://github.com/FlagOpen/FlagEmbedding.git
 cd FlagEmbedding
-pip install .
-pip install deepspeed
-pip install flash-attn --no-build-isolation
+pip install .[finetune]
 ```
 
 For development, install as editable:
 
 ```shell
-pip install -e .
-pip install deepspeed
-pip install flash-attn --no-build-isolation
+pip install -e .[finetune]
 ```
 
 ## 2. Data format
