chore(docs): add full website config examples
j-mendez committed Nov 28, 2023
1 parent 7b5800c commit 50d71a1
Showing 4 changed files with 107 additions and 5 deletions.
2 changes: 1 addition & 1 deletion Cargo.toml
@@ -13,7 +13,7 @@ compact_str = "0.7.1"
napi = { version = "2.14.1", default-features = false, features = ["napi4", "async", "tokio_rt"] }
napi-derive = "2.14.2"
num_cpus = "1.16.0"
-spider = { version = "1.50.8", features = ["napi", "budget", "cron", "regex", "cookies"] }
+spider = { version = "1.50.8", features = ["napi", "budget", "cron", "regex", "cookies", "socks"] }

[target.x86_64-unknown-linux-gnu.dependencies]
openssl-sys = { version = "0.9.96", features = ["vendored"] }
2 changes: 1 addition & 1 deletion book/src/SUMMARY.md
@@ -11,6 +11,6 @@

- [Website](./website.md)

-# Features
+# Usage

- [Cron Job](./cron-job.md)
6 changes: 3 additions & 3 deletions book/src/cron-job.md
@@ -3,14 +3,14 @@
Use a cron job that can run any time of day to gather website data.

```ts
-import { Website, type NPage } from "@spider-rs/spider-rs";
+import { Website } from "@spider-rs/spider-rs";

const website = new Website("https://choosealicense.com")
.withCron("1/5 * * * * *")
.build();

-const onPageEvent = (err: Error | null, value: NPage) => {
-  links.push(value);
+const onPageEvent = (err, value) => {
+  console.log(value);
};

const handle = await website.runCron(onPageEvent);
102 changes: 102 additions & 0 deletions book/src/website.md
@@ -49,3 +49,105 @@ const website = new Website("https://choosealicense.com")
```

See the [cron](./cron-job.md) section for details on how to use cron jobs.

### Budget

Add a crawl budget that limits how many pages are crawled.

```ts
const website = new Website("https://choosealicense.com")
.withBudget({
"*": 1,
})
.build();
```
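
Budgets are not limited to the global `*` wildcard; the underlying spider crate's budget feature also supports per-path limits, so a route-scoped budget might look like the sketch below (the `/licenses` key is a hypothetical path used only for illustration):

```ts
import { Website } from "@spider-rs/spider-rs";

// Cap the whole crawl at 100 pages, and pages under the
// hypothetical /licenses route at 10.
const website = new Website("https://choosealicense.com")
  .withBudget({
    "*": 100,
    "/licenses": 10,
  })
  .build();
```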

### Subdomains

Include subdomains in the crawl.

```ts
const website = new Website("https://choosealicense.com")
.withSubdomains(true)
.build();
```

### TLD

Include TLD variants of the domain in the crawl.

```ts
const website = new Website("https://choosealicense.com")
.withTlds(true)
.build();
```

### External Domains

Add external domains to include in the crawl.

```ts
const website = new Website("https://choosealicense.com")
.withExternalDomains(["https://www.myotherdomain.com"])
.build();
```

### Proxy

Use a proxy to crawl a website.

```ts
const website = new Website("https://choosealicense.com")
.withProxies(["https://www.myproxy.com"])
.build();
```
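
This commit also enables the `socks` feature in Cargo.toml, so SOCKS proxies should be usable here as well; a sketch assuming the conventional `socks5://` URL scheme and a hypothetical local proxy address:

```ts
import { Website } from "@spider-rs/spider-rs";

// Route requests through a hypothetical local SOCKS5 proxy.
const website = new Website("https://choosealicense.com")
  .withProxies(["socks5://127.0.0.1:1080"])
  .build();
```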

### Delays

Add a delay between page requests, in milliseconds.

```ts
const website = new Website("https://choosealicense.com")
.withDelays(200)
.build();
```

### User-Agent

Use a custom User-Agent.

```ts
const website = new Website("https://choosealicense.com")
.withUserAgent("mybot/v1")
.build();
```

### Request Timeout

Set the request timeout per page, in milliseconds. The example below uses 30 seconds.

```ts
const website = new Website("https://choosealicense.com")
.withRequestTimeout(30000)
.build();
```

### Respect Robots

Respect the robots.txt file.

```ts
const website = new Website("https://choosealicense.com")
.withRespectRobotsTxt(true)
.build();
```

### Http2 Prior Knowledge

Use HTTP/2 prior knowledge to connect if you know the website's server supports it.

```ts
const website = new Website("https://choosealicense.com")
.withHttp2PriorKnowledge(true)
.build();
```
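
The builder methods shown above all chain, so several options can be combined into a single configuration; for example, a crawl with a budget, a delay, a custom user agent, and robots.txt handling:

```ts
import { Website } from "@spider-rs/spider-rs";

// One builder chain combining several of the options above.
const website = new Website("https://choosealicense.com")
  .withSubdomains(true)
  .withBudget({ "*": 50 })
  .withDelays(200)
  .withUserAgent("mybot/v1")
  .withRespectRobotsTxt(true)
  .build();
```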
