README.md: 35 additions & 4 deletions
You can use the `SmartScraper` class to extract information from a website using a prompt.

The `SmartScraper` class is a direct graph implementation that uses the most common nodes present in a web scraping pipeline. For more information, please see the [documentation](https://scrapegraph-ai.readthedocs.io/en/latest/).

### Case 1: Extracting information using Ollama
Remember to download the model on Ollama separately!
```python
from scrapegraphai.graphs import SmartScraperGraph

graph_config = {
    "llm": {
        "model": "ollama/mistral",
        "temperature": 0,
        "format": "json",  # Ollama needs the format to be specified explicitly
        "base_url": "http://localhost:11434",  # set the URL of your local Ollama instance
    },
    "embeddings": {
        "model": "ollama/nomic-embed-text",
        "temperature": 0,
        "base_url": "http://localhost:11434",  # set the URL of your local Ollama instance
    }
}

smart_scraper_graph = SmartScraperGraph(
    prompt="List me all the news with their description.",
    # also accepts a string with the already downloaded HTML code
    source="https://perinim.github.io/projects",
    config=graph_config
)

result = smart_scraper_graph.run()
print(result)
```
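The `run()` call returns the extracted data as a plain Python object (typically a dict built from the JSON the model emits). As a minimal, hypothetical sketch of post-processing, the result can be persisted to disk for later inspection; the `result` dict below is made up for illustration and is not real scraper output:

```python
import json

# hypothetical result, shaped like what smart_scraper_graph.run() might return
result = {
    "news": [
        {"title": "Example project", "description": "A short project description"},
    ]
}

# write the extracted data to disk so it can be inspected or post-processed later
with open("scraped_result.json", "w", encoding="utf-8") as f:
    json.dump(result, f, indent=2, ensure_ascii=False)

print(f"saved {len(result['news'])} item(s)")
```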

### Case 2: Extracting information using Docker

Note: before using the local model, remember to create the Docker container!
```text
docker-compose up -d
docker exec -it ollama ollama run stablelm-zephyr
```
You can use any model available on Ollama, or your own model, instead of stablelm-zephyr.
```python
from scrapegraphai.graphs import SmartScraperGraph
# ... (graph configuration and run, elided in this diff)
print(result)
```
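Since only the model name changes between the Ollama-backed cases, the config construction can be factored into a small helper. This is an illustrative sketch, not part of the library: `make_ollama_config` is a hypothetical name, and the model passed in must already be pulled locally (e.g. via `ollama run <model>`):

```python
# hypothetical helper (not part of scrapegraphai): build an Ollama-backed
# graph config for any locally pulled model
def make_ollama_config(model_name: str, base_url: str = "http://localhost:11434") -> dict:
    return {
        "llm": {
            "model": f"ollama/{model_name}",
            "temperature": 0,
            "format": "json",  # Ollama needs the format specified explicitly
            "base_url": base_url,
        },
        "embeddings": {
            "model": "ollama/nomic-embed-text",
            "temperature": 0,
            "base_url": base_url,
        },
    }

config = make_ollama_config("stablelm-zephyr")
print(config["llm"]["model"])  # ollama/stablelm-zephyr
```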

### Case 3: Extracting information using the OpenAI model

```python
from scrapegraphai.graphs import SmartScraperGraph
OPENAI_API_KEY="YOUR_API_KEY"
# ... (graph configuration elided in this diff)
result = smart_scraper_graph.run()
print(result)
```
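The middle of the OpenAI example is elided in the hunk above, so here is a hedged sketch of what the config likely looks like, by analogy with the Ollama config in Case 1. The key names (`api_key`, the `gpt-3.5-turbo` model string) are assumptions, not taken from this diff; verify them against the ScrapeGraphAI documentation:

```python
OPENAI_API_KEY = "YOUR_API_KEY"  # placeholder; keep real keys out of source

# assumed config shape, mirroring the Ollama example; key names unverified
graph_config = {
    "llm": {
        "api_key": OPENAI_API_KEY,
        "model": "gpt-3.5-turbo",  # assumed model identifier
        "temperature": 0,
    },
}

print(sorted(graph_config["llm"]))  # ['api_key', 'model', 'temperature']
```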

### Case 4: Extracting information using Gemini

```python
from scrapegraphai.graphs import SmartScraperGraph
# ... (rest of the Gemini example elided in this diff)
```

This folder contains an example of how to use ScrapeGraph-AI with Gemini, a large language model (LLM) from Google AI. The example shows how to extract information from a website using a natural language prompt.
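By analogy with the other cases, a Gemini run presumably only swaps the model name and API key in the config. A hedged, untested sketch follows; `gemini-pro` and the `api_key` field are assumptions inferred from the pattern of the other cases, not from this diff:

```python
GOOGLE_API_KEY = "YOUR_GOOGLE_API_KEY"  # placeholder

# assumed config shape, mirroring the other cases; verify against the docs
graph_config = {
    "llm": {
        "api_key": GOOGLE_API_KEY,
        "model": "gemini-pro",  # assumed model identifier
        "temperature": 0,
    },
}

print(graph_config["llm"]["model"])  # gemini-pro
```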