Commit f376695

add examples
1 parent 4ef1564 commit f376695

23 files changed: 415 additions & 9 deletions

README.md

Lines changed: 35 additions & 4 deletions
@@ -43,14 +43,45 @@ Check out also the docusaurus [documentation](https://scrapegraph-doc.onrender.c
 You can use the `SmartScraper` class to extract information from a website using a prompt.
 
 The `SmartScraper` class is a direct graph implementation that uses the most common nodes present in a web scraping pipeline. For more information, please see the [documentation](https://scrapegraph-ai.readthedocs.io/en/latest/).
-### Case 1: Extracting informations using a local LLM
+### Case 1: Extracting information using Ollama
+Remember to download the model on Ollama separately!
+```python
+from scrapegraphai.graphs import SmartScraperGraph
+
+graph_config = {
+    "llm": {
+        "model": "ollama/mistral",
+        "temperature": 0,
+        "format": "json",  # Ollama needs the format to be specified explicitly
+        "base_url": "http://localhost:11434",  # set your Ollama server URL
+    },
+    "embeddings": {
+        "model": "ollama/nomic-embed-text",
+        "temperature": 0,
+        "base_url": "http://localhost:11434",  # set your Ollama server URL
+    }
+}
+
+smart_scraper_graph = SmartScraperGraph(
+    prompt="List me all the news with their description.",
+    # also accepts a string with the already downloaded HTML code
+    source="https://perinim.github.io/projects",
+    config=graph_config
+)
+
+result = smart_scraper_graph.run()
+print(result)
+```
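The `run()` call above returns a Python dict whose shape follows the prompt. A minimal sketch of inspecting and iterating such a result — the `news` key and its fields here are hypothetical placeholders, since the actual keys depend on your prompt and on what the LLM extracts from the page:

```python
import json

# Hypothetical result shape -- the real keys depend on the prompt
# and on what the model extracts from the source page.
result = {
    "news": [
        {"title": "Project A", "description": "A computer vision demo."},
        {"title": "Project B", "description": "An NLP experiment."},
    ]
}

# Pretty-print the raw structure for inspection
print(json.dumps(result, indent=2))

# Iterate over the extracted items
for item in result.get("news", []):
    print(f"- {item['title']}: {item['description']}")
```

Using `result.get(...)` with a default keeps the loop safe when the model returns an unexpected structure.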
+
+### Case 2: Extracting information using Docker
 
 Note: before using the local model, remember to create the Docker container!
 ```text
 docker-compose up -d
 docker exec -it ollama ollama run stablelm-zephyr
 ```
-You can use which model you want instead of stablelm-zephyr
+You can use any model available on Ollama, or your own model, instead of stablelm-zephyr.
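Swapping the model in means changing the `model` entry of the `graph_config` shown in Case 1. A sketch of what that might look like for the container-served model — assuming the container publishes Ollama's default port 11434 on localhost, as the docker-compose step above implies:

```python
# Sketch: same config shape as Case 1, pointed at the model running
# inside the Docker container (assumes the container exposes Ollama's
# default port 11434 on localhost).
graph_config = {
    "llm": {
        "model": "ollama/stablelm-zephyr",  # or any model you have pulled
        "temperature": 0,
        "format": "json",
        "base_url": "http://localhost:11434",
    },
    "embeddings": {
        "model": "ollama/nomic-embed-text",
        "base_url": "http://localhost:11434",
    },
}
```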
 ```python
 from scrapegraphai.graphs import SmartScraperGraph

@@ -75,7 +106,7 @@ print(result)
 ```
 
 
-### Case 2: Extracting informations using Openai model
+### Case 3: Extracting information using the OpenAI model
 ```python
 from scrapegraphai.graphs import SmartScraperGraph
 OPENAI_API_KEY = "YOUR_API_KEY"
@@ -98,7 +129,7 @@ result = smart_scraper_graph.run()
 print(result)
 ```
 
-### Case 3: Extracting informations using Gemini
+### Case 4: Extracting information using Gemini
 ```python
 from scrapegraphai.graphs import SmartScraperGraph
 GOOGLE_APIKEY = "YOUR_API_KEY"
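Both API-based cases hard-code the key in the script for brevity. A safer sketch reads it from an environment variable instead — the variable name `OPENAI_API_KEY` and the model name `gpt-3.5-turbo` are common conventions assumed here, not something this commit prescribes:

```python
import os

# Read the key from the environment instead of hard-coding it in the
# script (the variable name is a convention, not mandated by the library).
api_key = os.environ.get("OPENAI_API_KEY", "YOUR_API_KEY")

graph_config = {
    "llm": {
        "api_key": api_key,
        "model": "gpt-3.5-turbo",  # assumed model name for illustration
    },
}
print("key configured:", api_key != "YOUR_API_KEY")
```

The same pattern applies to the Gemini case with its own key variable.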

examples/gemini/readme.md

Lines changed: 1 addition & 0 deletions
@@ -0,0 +1 @@
+This folder contains an example of how to use ScrapeGraph-AI with Gemini, a large language model (LLM) from Google AI. The example shows how to extract information from a website using a natural language prompt.
File renamed without changes.

examples/local_models/inputs/plain_html_example.txt renamed to examples/local_models/Docker/inputs/plain_html_example.txt

File renamed without changes.

examples/local_models/Docker/readme.md

Whitespace-only changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
