`packages/doc/docs/extensions/huggingface/huggingface-text-generation.mdx` (2 additions, 2 deletions)
@@ -6,8 +6,8 @@ The [Text Generation](https://huggingface.co/docs/api-inference/detailed_paramet
The result will be a string from `huggingface_text_generation`.
- :::📢 Notice
- The **Text Generation** default model is **gpt2**, If you would like to use the [Meta LLama2](https://huggingface.co/meta-llama) models, you have two method to do:
+ :::info
+ The **Text Generation** default model is **gpt2**, If you would like to use the [Meta LLama2](https://huggingface.co/meta-llama) models, you have two methods to do:
1. Subscribe to the [Pro Account](https://huggingface.co/pricing#pro).
- Set the Meta LLama2 model using the `model` keyword argument in `huggingface_text_generation`, e.g: `meta-llama/Llama-2-13b-chat-hf`.
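The `model` keyword argument mentioned above is the one exercised by this PR's test code; a minimal VulcanSQL template sketch of that usage (the `data` source and `context.params.value` parameter follow the test in this diff and are illustrative, not a complete project):

```sql
-- Sketch: select a Llama 2 model via the `model` keyword argument of
-- `huggingface_text_generation`; `data` and `context.params.value` are
-- illustrative names taken from the test shown later in this diff.
SELECT {{ data | huggingface_text_generation(
    query=context.params.value,
    model="meta-llama/Llama-2-13b-chat-hf",
    wait_for_model=true,
    use_cache=false
) }}
```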
`packages/extension-huggingface/README.md` (1 addition, 1 deletion)
@@ -112,7 +112,7 @@ The [Text Generation](https://huggingface.co/docs/api-inference/detailed_paramet
Using the `huggingface_text_generation` filter. The result will be a string from `huggingface_text_generation`.
- **📢 Notice**: The **Text Generation** default model is **gpt2**, If you would like to use the [Meta LLama2](https://huggingface.co/meta-llama) models, you have two method to do:
+ **📢 Notice**: The **Text Generation** default model is **gpt2**, If you would like to use the [Meta LLama2](https://huggingface.co/meta-llama) models, you have two methods to do:
1. Subscribe to the [Pro Account](https://huggingface.co/pricing#pro).
- Set the Meta LLama2 model using the `model` keyword argument in `huggingface_text_generation`, e.g: `meta-llama/Llama-2-13b-chat-hf`.
)} %}SELECT {{ data | huggingface_text_generation(query=context.params.value,model="meta-llama/Llama-2-13b-chat-hf", wait_for_model=true, use_cache=false) }}`;
await compileAndLoad(sql);
await execute({ value: 'what repository has most stars?' });

// Assert
const queries = await getExecutedQueries();
const bindings = await getCreatedBinding();

expect(queries[0]).toBe('SELECT $1');
- expect(bindings[0].get('$1')).toEqual('Answer: Based on the information provided, the repository with the most stars is "vulcan-sql" with 1000 stars.');
+ expect(bindings[0].get('$1')).toEqual(
+   'Answer: Based on the information provided, the repository with the most stars is "vulcan-sql" with 1000 stars.'
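The `wait_for_model` and `use_cache` arguments passed to the filter in this test correspond to the `options` object of the Hugging Face Inference API. A minimal sketch of how such a request could be assembled; `buildTextGenerationRequest` is a hypothetical helper, not part of `@vulcan-sql/extension-huggingface`:

```typescript
// Hypothetical helper illustrating what the filter's keyword arguments map to
// on the Hugging Face Inference API; not the extension's actual implementation.
interface HfOptions {
  wait_for_model?: boolean; // wait for the model to load instead of failing fast
  use_cache?: boolean;      // allow a cached result for identical inputs
}

function buildTextGenerationRequest(
  model: string,
  inputs: string,
  options: HfOptions = {}
): { url: string; body: string } {
  return {
    url: `https://api-inference.huggingface.co/models/${model}`,
    body: JSON.stringify({ inputs, options }),
  };
}

const req = buildTextGenerationRequest(
  'meta-llama/Llama-2-13b-chat-hf',
  'what repository has most stars?',
  { wait_for_model: true, use_cache: false }
);
console.log(req.url);
```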