deploy: 302a328
j-mendez committed Nov 29, 2023
1 parent 78f09fe commit 4d48fdb
Showing 5 changed files with 26 additions and 8 deletions.
3 changes: 2 additions & 1 deletion getting-started.html
@@ -172,7 +172,8 @@ <h1 class="menu-title">spider-rs</h1>
 <div id="content" class="content">
 <main>
 <h1 id="getting-started"><a class="header" href="#getting-started">Getting Started</a></h1>
-<p>Install the package.</p>
+<p>Make sure to have <a href="https://nodejs.org/en/download">Node.js</a> v10 or higher installed.</p>
+<p>Install the package with your favorite package manager.</p>
 <pre><code class="language-sh">yarn add @spider-rs/spider-rs
 # or
 npm install @spider-rs/spider-rs
15 changes: 12 additions & 3 deletions print.html
@@ -185,7 +185,8 @@ <h1 id="introduction"><a class="header" href="#introduction">Introduction</a></h
 console.log(website.getLinks());
 </code></pre>
 <div style="break-before: page; page-break-before: always;"></div><h1 id="getting-started"><a class="header" href="#getting-started">Getting Started</a></h1>
-<p>Install the package.</p>
+<p>Make sure to have <a href="https://nodejs.org/en/download">Node.js</a> v10 or higher installed.</p>
+<p>Install the package with your favorite package manager.</p>
 <pre><code class="language-sh">yarn add @spider-rs/spider-rs
 # or
 npm install @spider-rs/spider-rs
@@ -233,7 +234,6 @@ <h3 id="events"><a class="header" href="#events">Events</a></h3>
 <h2 id="builder-pattern"><a class="header" href="#builder-pattern">Builder pattern</a></h2>
 <p>We use the builder pattern to configure the website for crawling.</p>
 <p>Note: replace <code>https://choosealicense.com</code> in the examples below with your target website URL.</p>
-<p>All of the examples use typescript by default.</p>
 <pre><code class="language-ts">import { Website } from &quot;@spider-rs/spider-rs&quot;;
 
 const website = new Website(&quot;https://choosealicense.com&quot;);
@@ -292,7 +292,7 @@ <h3 id="proxy"><a class="header" href="#proxy">Proxy</a></h3>
 .build();
 </code></pre>
 <h3 id="delays"><a class="header" href="#delays">Delays</a></h3>
-<p>Add delays between pages.</p>
+<p>Add delays between pages. Defaults to none.</p>
 <pre><code class="language-ts">const website = new Website(&quot;https://choosealicense.com&quot;)
 .withDelays(200)
 .build();
@@ -321,6 +321,15 @@ <h3 id="http2-prior-knowledge"><a class="header" href="#http2-prior-knowledge">H
 .withHttp2PriorKnowledge(true)
 .build();
 </code></pre>
+<h2 id="chaining"><a class="header" href="#chaining">Chaining</a></h2>
+<p>You can chain all of the configs together for simple configuration.</p>
+<pre><code class="language-ts">const website = new Website(&quot;https://choosealicense.com&quot;)
+.withSubdomains(true)
+.withTlds(true)
+.withUserAgent(&quot;mybot/v1&quot;)
+.withRespectRobotsTxt(true)
+.build();
+</code></pre>
 <div style="break-before: page; page-break-before: always;"></div><h1 id="crawl"><a class="header" href="#crawl">Crawl</a></h1>
 <p>Crawl a website concurrently.</p>
 <pre><code class="language-ts">import { Website } from &quot;@spider-rs/spider-rs&quot;;
2 changes: 1 addition & 1 deletion searchindex.js

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion searchindex.json

Large diffs are not rendered by default.

12 changes: 10 additions & 2 deletions website.html
@@ -176,7 +176,6 @@ <h1 id="website"><a class="header" href="#website">Website</a></h1>
 <h2 id="builder-pattern"><a class="header" href="#builder-pattern">Builder pattern</a></h2>
 <p>We use the builder pattern to configure the website for crawling.</p>
 <p>Note: replace <code>https://choosealicense.com</code> in the examples below with your target website URL.</p>
-<p>All of the examples use typescript by default.</p>
 <pre><code class="language-ts">import { Website } from &quot;@spider-rs/spider-rs&quot;;
 
 const website = new Website(&quot;https://choosealicense.com&quot;);
@@ -235,7 +234,7 @@ <h3 id="proxy"><a class="header" href="#proxy">Proxy</a></h3>
 .build();
 </code></pre>
 <h3 id="delays"><a class="header" href="#delays">Delays</a></h3>
-<p>Add delays between pages.</p>
+<p>Add delays between pages. Defaults to none.</p>
 <pre><code class="language-ts">const website = new Website(&quot;https://choosealicense.com&quot;)
 .withDelays(200)
 .build();
@@ -264,6 +263,15 @@ <h3 id="http2-prior-knowledge"><a class="header" href="#http2-prior-knowledge">H
 .withHttp2PriorKnowledge(true)
 .build();
 </code></pre>
+<h2 id="chaining"><a class="header" href="#chaining">Chaining</a></h2>
+<p>You can chain all of the configs together for simple configuration.</p>
+<pre><code class="language-ts">const website = new Website(&quot;https://choosealicense.com&quot;)
+.withSubdomains(true)
+.withTlds(true)
+.withUserAgent(&quot;mybot/v1&quot;)
+.withRespectRobotsTxt(true)
+.build();
+</code></pre>
 
 </main>
 
