
Commit ee80eac

fix: fix bug when not using knowledge_distillation in fine-tuning embedder
1 parent: 073651f

1 file changed: 3 additions and 2 deletions

File changed: FlagEmbedding/abc/finetune/embedder/AbsDataset.py

@@ -416,8 +416,9 @@ def _create_batch_data(self, batch_raw_data):
 
             passages.extend(tmp_passages)
 
-        if len(teacher_scores) > 0 and len(passages) > 0:
-            assert len(teacher_scores) == len(passages)
+        if teacher_scores is not None:
+            if len(teacher_scores) > 0 and len(passages) > 0:
+                assert len(teacher_scores) == len(passages)
 
         return queries, passages, teacher_scores
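For context, a minimal standalone sketch (a hypothetical helper, not the repository's actual code) of the failure this commit guards against: when fine-tuning without knowledge distillation, teacher_scores is None rather than a list, so the previous len(teacher_scores) check raised a TypeError; the added None guard skips the length comparison in that case.

# Minimal sketch (hypothetical function name and arguments) illustrating the
# fixed logic; it is not the repository's actual _create_batch_data method.
def create_batch_data(raw_passages, teacher_scores):
    passages = []
    for tmp_passages in raw_passages:
        passages.extend(tmp_passages)

    # Old behavior: len(teacher_scores) raised
    # "TypeError: object of type 'NoneType' has no len()" when knowledge
    # distillation was disabled and teacher_scores was None.
    # New behavior: only compare lengths when teacher scores are provided.
    if teacher_scores is not None:
        if len(teacher_scores) > 0 and len(passages) > 0:
            assert len(teacher_scores) == len(passages)

    return passages, teacher_scores

# Works both with and without teacher scores:
create_batch_data([["p1", "p2"]], None)         # no distillation: no error
create_batch_data([["p1", "p2"]], [0.9, 0.1])   # with distillation: lengths checked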
