This repository contains a Python script to scrape job listings from Google Jobs using the SerpAPI. The script fetches job listings for a specified search term and location, and saves the results in a CSV file.
Elevate your job search experience with this interactive app, JobJob! 🎉
- Click Here: 🌐 Explore Now
- Create Your Account and get your API key at: 🔑 Get API Key
- Discover Your Dream Job using this powerful job board aggregator!
🔽 The following is for developers only 🔽
- 🚀 Installation
- 📖 Usage
- ✨ Features
- 🤝 Contributing
- 📜 License
## 🚀 Installation

1. Clone the repository:

   ```bash
   git clone https://github.com/Yohan-GRNR/Job-Seeker.git
   cd Job-Seeker
   ```

2. Create a virtual environment and activate it (optional):

   ```bash
   python -m venv venv
   source venv/bin/activate  # On Windows use `venv\Scripts\activate`
   ```

3. Install the required packages:

   ```bash
   pip install -r requirements.txt
   ```

4. Set up your SerpAPI key:

   - Obtain an API key from SerpAPI.
   - Save the API key in a file located at `../API keys/serpai.txt`.
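Loading the key at runtime can be sketched as follows. This is an illustrative helper, not code from the script itself; the path mirrors the setup step above, and the function name is hypothetical:

```python
from pathlib import Path

def load_serpapi_key(path: str = "../API keys/serpai.txt") -> str:
    """Read the SerpAPI key saved during setup, stripping stray whitespace."""
    key = Path(path).read_text(encoding="utf-8").strip()
    if not key:
        raise ValueError(f"No API key found in {path}")
    return key
```

Keeping the key in a file outside the repository (note the `../`) helps avoid committing it by accident.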
## 📖 Usage

1. Modify the search parameters:

   Edit the `search_term`, `search_location`, and `search_radius` variables in the script to suit your needs. Example:

   ```python
   search_term = "data analyst"
   search_location = "Geneva, Switzerland"
   search_radius = 20
   ```
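These variables feed the API request. A minimal sketch of how they might be passed to SerpAPI's Google Jobs engine — `lrad` is SerpAPI's search-radius parameter, and `YOUR_SERPAPI_KEY` is a placeholder; check the script for how it actually builds the request:

```python
search_term = "data analyst"
search_location = "Geneva, Switzerland"
search_radius = 20  # kilometres

# Query parameters for SerpAPI's Google Jobs engine.
params = {
    "engine": "google_jobs",
    "q": search_term,
    "location": search_location,
    "lrad": search_radius,
    "api_key": "YOUR_SERPAPI_KEY",  # placeholder: load your real key here
}
# import requests
# results = requests.get("https://serpapi.com/search.json", params=params).json()
```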
2. Run the script:

   Run the notebook.
3. View the results:

   The results are saved in a CSV file named `DB_data-analyst.csv`.
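A quick way to inspect the saved results with pandas — the helper name is illustrative, and the default filename matches the step above:

```python
import pandas as pd

def summarize_results(path: str = "DB_data-analyst.csv") -> pd.DataFrame:
    """Load the saved job listings and print a one-line summary."""
    df = pd.read_csv(path)
    print(f"{len(df)} listings loaded from {path}")
    return df
```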
## ✨ Features

- Pagination: Fetches multiple pages of results.
- Error Handling: Stops fetching when there are no more results.
- Data Normalization: Flattens nested JSON data.
- Data Persistence: Saves results to a CSV file, appending new data and removing duplicates.
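The features above can be sketched roughly as follows. This is an illustrative outline, not the script itself: pagination here uses SerpAPI's `start` offset, the stop condition triggers when a page comes back empty, `pandas.json_normalize` flattens the nested JSON, and persistence is a concat-plus-`drop_duplicates` step (the deduplication columns are assumptions):

```python
import os

import pandas as pd

def fetch_all_jobs(params: dict, max_pages: int = 5) -> list:
    """Fetch pages of results, stopping when a page returns no jobs."""
    import requests  # deferred so the persistence helper works without it
    jobs = []
    for page in range(max_pages):
        params["start"] = page * 10  # Google Jobs returns ~10 results per page
        data = requests.get("https://serpapi.com/search.json", params=params).json()
        batch = data.get("jobs_results", [])
        if not batch:  # no more results: stop fetching
            break
        jobs.extend(batch)
    return jobs

def save_jobs(jobs: list, path: str) -> pd.DataFrame:
    """Flatten nested JSON, append to any existing CSV, and drop duplicates."""
    new = pd.json_normalize(jobs)
    if os.path.exists(path):
        new = pd.concat([pd.read_csv(path), new], ignore_index=True)
    new = new.drop_duplicates(subset=["title", "company_name"])
    new.to_csv(path, index=False)
    return new
```

Running `save_jobs` repeatedly on the same file keeps appending new listings while duplicates are dropped, which is what makes repeated scrapes safe.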
## 🤝 Contributing

Contributions are welcome! Please create a pull request or open an issue for any improvements or bug fixes.
## 📜 License

This project is licensed under the MIT License. See the LICENSE file for details.
Note: Ensure you comply with SerpAPI's terms of use and respect the scraping rules of the websites you are accessing.
For any questions or issues, feel free to open an issue in this repository. Happy scraping!