Commit

deploy: 10d47dc
j-mendez committed Dec 27, 2023
1 parent dc0ad8f commit f9d4a95
Showing 4 changed files with 42 additions and 2 deletions.
20 changes: 20 additions & 0 deletions print.html
@@ -303,6 +303,26 @@ <h3 id="proxy"><a class="header" href="#proxy">Proxy</a></h3>
async def main():
website = Website(&quot;https://choosealicense.com&quot;).with_proxies([&quot;https://www.myproxy.com&quot;])

asyncio.run(main())
</code></pre>
<h3 id="depth-limit"><a class="header" href="#depth-limit">Depth Limit</a></h3>
<p>Set the depth limit, the maximum number of forward pages to follow from the start URL.</p>
<pre><code class="language-py">import asyncio
from spider_rs import Website

async def main():
website = Website(&quot;https://choosealicense.com&quot;).with_depth(3)

asyncio.run(main())
</code></pre>
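Conceptually, <code>with_depth(3)</code> bounds how many link hops from the start URL are followed. A minimal breadth-first sketch of that idea in plain Python (illustrative only, not the spider_rs internals; <code>links</code> is a stand-in for real page fetching):

```python
from collections import deque

def crawl_depth_limited(start, links, max_depth):
    """Collect pages reachable from start within max_depth link hops.
    links maps a page URL to the URLs it links to."""
    seen = {start}
    queue = deque([(start, 0)])
    visited = []
    while queue:
        page, depth = queue.popleft()
        visited.append(page)
        if depth == max_depth:
            continue  # do not follow links past the depth limit
        for nxt in links.get(page, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, depth + 1))
    return visited

site = {"/": ["/a", "/b"], "/a": ["/a/x"], "/a/x": ["/a/x/y"]}
print(crawl_depth_limited("/", site, 2))  # → ['/', '/a', '/b', '/a/x']
```

With a limit of 2, <code>/a/x</code> (two hops out) is visited but <code>/a/x/y</code> (three hops) is not.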
<h3 id="cache"><a class="header" href="#cache">Cache</a></h3>
<p>Enable HTTP caching; this is useful when running the spider on a server.</p>
<pre><code class="language-py">import asyncio
from spider_rs import Website

async def main():
website = Website(&quot;https://choosealicense.com&quot;).with_caching(True)

asyncio.run(main())
</code></pre>
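Caching pays off on a long-running server because repeated requests for the same URL can skip the network entirely. A toy TTL response cache sketch of the concept (a hypothetical helper, not the spider_rs implementation):

```python
import time

class ResponseCache:
    """Cache fetched bodies by URL for ttl_seconds."""
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.store = {}  # url -> (fetched_at, body)

    def get(self, url, fetch):
        entry = self.store.get(url)
        if entry and time.monotonic() - entry[0] < self.ttl:
            return entry[1]  # cache hit: no network round trip
        body = fetch(url)
        self.store[url] = (time.monotonic(), body)
        return body

calls = []
def fake_fetch(url):
    calls.append(url)  # record each simulated network hit
    return f"<html>{url}</html>"

cache = ResponseCache(ttl_seconds=60)
cache.get("https://choosealicense.com", fake_fetch)
cache.get("https://choosealicense.com", fake_fetch)  # served from cache
print(len(calls))  # → 1
```

The second <code>get</code> returns the stored body, so the fetch function runs only once.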
<h3 id="delays"><a class="header" href="#delays">Delays</a></h3>
2 changes: 1 addition & 1 deletion searchindex.js

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion searchindex.json

Large diffs are not rendered by default.

20 changes: 20 additions & 0 deletions website.html
@@ -275,6 +275,26 @@ <h3 id="proxy"><a class="header" href="#proxy">Proxy</a></h3>
async def main():
website = Website(&quot;https://choosealicense.com&quot;).with_proxies([&quot;https://www.myproxy.com&quot;])

asyncio.run(main())
</code></pre>
<h3 id="depth-limit"><a class="header" href="#depth-limit">Depth Limit</a></h3>
<p>Set the depth limit, the maximum number of forward pages to follow from the start URL.</p>
<pre><code class="language-py">import asyncio
from spider_rs import Website

async def main():
website = Website(&quot;https://choosealicense.com&quot;).with_depth(3)

asyncio.run(main())
</code></pre>
<h3 id="cache"><a class="header" href="#cache">Cache</a></h3>
<p>Enable HTTP caching; this is useful when running the spider on a server.</p>
<pre><code class="language-py">import asyncio
from spider_rs import Website

async def main():
website = Website(&quot;https://choosealicense.com&quot;).with_caching(True)

asyncio.run(main())
</code></pre>
<h3 id="delays"><a class="header" href="#delays">Delays</a></h3>
