Is a Raspberry Pi too weak? Crontab settings possible? #264
Replies: 6 comments
-
I think it's not that the Raspberry Pi is too weak; SerpBear simply has trouble with too many keywords. All of them are requested at the same time, but the scraper can only handle so many at once, so some get updated while many others are never scraped and end up forgotten by SerpBear.
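To illustrate the bottleneck (a minimal sketch, not SerpBear's actual code — `scrapeKeyword` is a hypothetical stand-in): if every keyword is fired at once but the scraper plan only allows a small number of concurrent connections, everything beyond that limit just waits.

```ts
// Minimal concurrency-limiter sketch (TypeScript / Node).
// scrapeKeyword is hypothetical; it stands in for one scraper API call.
async function scrapeKeyword(keyword: string): Promise<void> {
  await new Promise((resolve) => setTimeout(resolve, 100)); // fake network delay
  console.log(`scraped: ${keyword}`);
}

async function scrapeWithLimit(keywords: string[], limit: number): Promise<void> {
  const queue = [...keywords];
  // Start `limit` workers that pull keywords off a shared queue.
  const workers = Array.from({ length: limit }, async () => {
    while (queue.length > 0) {
      const kw = queue.shift();
      if (kw) await scrapeKeyword(kw);
    }
  });
  await Promise.all(workers);
}

// With limit = 1 (like a free scraper plan), keywords run strictly one
// after another; raising the limit shortens the total wall time.
scrapeWithLimit(["keyword one", "keyword two", "keyword three"], 1);
```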
-
Okay, found the problem. In my case it's because I'm using the free scraper from ScrapingRobot, which is very slow (it allows only 1 concurrent connection), so it takes roughly 30 seconds per keyword. With hundreds or even thousands of keywords, that is very, very slow.
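A back-of-the-envelope estimate, using the numbers from this thread, shows why that matters:

```ts
// Rough full-refresh time on a 1-connection scraper plan.
const secondsPerKeyword = 30; // observed with ScrapingRobot's free tier
for (const keywords of [120, 1000]) {
  const hours = (keywords * secondsPerKeyword) / 3600;
  console.log(`${keywords} keywords ≈ ${hours.toFixed(1)} h per pass`);
}
// 120 keywords ≈ 1.0 h per pass; 1000 keywords ≈ 8.3 h per pass.
```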
-
Hi, and thanks. Do you edit this setting?
-
Yes, you edit that setting: change the scraper to serper.dev (it comes with 2,500 free credits). Scraping will be a lot faster (10x or more).
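For reference, serper.dev is a simple JSON search API; once you select it in SerpBear's settings and paste your API key, SerpBear calls it for you. A standalone request looks roughly like this (a sketch assuming serper.dev's documented `POST https://google.serper.dev/search` endpoint with an `X-API-KEY` header; the response fields you care about may differ):

```ts
// Standalone serper.dev search call (TypeScript, Node 18+ global fetch).
async function searchSerper(query: string): Promise<unknown> {
  const res = await fetch("https://google.serper.dev/search", {
    method: "POST",
    headers: {
      "X-API-KEY": process.env.SERPER_API_KEY ?? "", // your serper.dev key
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ q: query }),
  });
  if (!res.ok) throw new Error(`serper.dev returned ${res.status}`);
  return res.json(); // includes an "organic" results array, among other fields
}

searchSerper("raspberry pi").then((data) => console.log(data));
```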
-
Perfect. And what are the best Scraping Frequency and Keyword Scrape Delay settings?
-
Keyword Scrape Delay: 0
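For context on what that setting controls: a scrape delay is typically just a fixed pause inserted between successive keyword requests, to avoid hammering a rate-limited scraper. A minimal sketch, not SerpBear's actual implementation:

```ts
// Per-keyword scrape delay sketch (not SerpBear's actual code).
const sleep = (ms: number) => new Promise<void>((r) => setTimeout(r, ms));

async function scrapeSequentially(keywords: string[], delayMs: number) {
  for (const keyword of keywords) {
    console.log(`scraping: ${keyword}`); // stand-in for the real scraper call
    if (delayMs > 0) await sleep(delayMs);
  }
}

// With a fast scraper like serper.dev there is no need to throttle,
// so a delay of 0 sends requests back-to-back.
scrapeSequentially(["kw1", "kw2"], 0);
```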
-
I have 7 domains and 120 keywords running on my Raspberry Pi with Docker.
But it doesn't manage to update all 120 keywords once a day.
For many keywords, the ‘Position’ column permanently shows the loading symbol.
Is this due to the Raspberry Pi's performance, or something else?
Is it possible to set the scan time and intervals per domain?
Thanks, Sebastian