Commit e0a3e1b

feature: add example links
1 parent 72914d9 commit e0a3e1b

2 files changed

Lines changed: 29 additions & 4 deletions

packages/doc/docs/extensions/dbt.mdx

Lines changed: 22 additions & 2 deletions
@@ -34,9 +34,25 @@ We need to install an additional package to integrate with dbt:
 
 :::
 
-## Using models in VulcanSQL
+3. Setup `profiles.yaml` (if you are using DuckDB as the data source for your dbt project)
 
-Using models of dbt is extremely easy, you only need to use the following syntax.
+Add `persistent-path` to the `profiles.yaml` in the root of your VulcanSQL project as follows:
+
+```yaml
+- name: duckdb
+  type: duckdb
+  connection:
+    persistent-path: [duckdb db file path of your dbt project]
+  allow: "*"
+```
+
+## Setup of your dbt project
+
+Please refer to the [dbt Quickstart tutorials](https://docs.getdbt.com/quickstarts).
+
+## Using the dbt extension
+
+Using dbt models is extremely easy; you only need to use the following syntax in your VulcanSQL project:
 
 ```sql
 {% dbt "model.<project-name>.<model-name>" %}
@@ -47,3 +63,7 @@ For example, to query all data from model `my_first_dbt_model` in the project `d
 ```sql
 select * from {% dbt "model.demo.my_first_dbt_model" %}
 ```
+
+## Examples
+
+You can check out this [dbt-jaffle-shop](https://github.com/Canner/vulcan-sql-examples/tree/main/dbt-jaffle-shop) example for further details!
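
For context, the `{% dbt %}` tag added above is used inside a VulcanSQL API template. Here is a minimal sketch of such a template; the file name `orders.sql`, the project name `jaffle_shop`, and the model name `orders` are hypothetical illustrations, not part of this commit:

```sql
-- sqls/orders.sql — a hypothetical VulcanSQL API template.
-- The {% dbt %} tag references the table materialized by the dbt model
-- "orders" in a hypothetical dbt project named "jaffle_shop".
SELECT *
FROM {% dbt "model.jaffle_shop.orders" %}
WHERE status = 'completed'
```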

packages/doc/docs/extensions/huggingface/huggingface-table-question-answering.mdx

Lines changed: 7 additions & 2 deletions
@@ -2,7 +2,7 @@
 
 The [Table Question Answering](https://huggingface.co/docs/api-inference/detailed_parameters#table-question-answering-task) is one of the Natural Language Processing tasks supported by Hugging Face.
 
-Using the `huggingface_table_question_answering` filter.
+## Using the `huggingface_table_question_answering` filter
 
 The result of `huggingface_table_question_answering` is converted to a JSON string. You can parse the JSON string and use the result yourself.

@@ -63,7 +63,7 @@ SELECT {{ products.value() | huggingface_table_question_answering(query=question
 ]
 ```
 
-### Arguments
+## Arguments
 
 Please check [Table Question Answering](https://huggingface.co/docs/api-inference/detailed_parameters#table-question-answering-task) for further information.
 
@@ -73,3 +73,8 @@ Please check [Table Question Answering](https://huggingface.co/docs/api-inferenc
 | model | N | google/tapas-base-finetuned-wtq | The model id of a pretrained model hosted inside a model repo on huggingface.co. See: https://huggingface.co/models?pipeline_tag=table-question-answering |
 | use_cache | N | true | There is a cache layer on the inference API to speed up requests we have already seen. |
 | wait_for_model | N | false | If the model is not ready, wait for it instead of receiving a 503. This limits the number of requests required to get your inference done. |
+
+## Examples
+
+You can check out this [table-question-answering](https://github.com/Canner/vulcan-sql-examples/tree/main/huggingface/table-question-answering) example for further details!
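
To connect the Arguments table above to actual usage, here is a hedged sketch of a VulcanSQL template that passes explicit arguments to the filter. The request name `products` and its `value()` call appear in the diff context above; the question text and the surrounding query are assumptions for illustration:

```sql
-- Hypothetical VulcanSQL template: run a named request, then feed its
-- result table to the huggingface_table_question_answering filter with
-- the arguments documented in the table above.
{% req products %}
SELECT * FROM products
{% endreq %}

SELECT {{ products.value() | huggingface_table_question_answering(
  query="How many products are there?",
  model="google/tapas-base-finetuned-wtq",
  use_cache=true,
  wait_for_model=true
) }} AS result
```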
