Searching for jobs online can be cumbersome, and filtering through jobs to see if they fit your skills and preferences is often a hassle. This project simplifies the process by scraping LinkedIn job postings based on configured search terms and using an AI agent to evaluate job descriptions for compatibility with the user's skills and preferences. It provides a match percentage and justifications for each job.
- Configurable Job Filters: Users can specify search terms, posting recency, location, and more.
- AI-Driven Matching: Upload a text file with your skills and preferences for the AI agent to score job compatibility.
- Data Management: Updates a Google Sheet with all found jobs and their data.
- Notifications: Sends notifications about qualified jobs to a Slack channel.
- Technologies Used:
- Web Scraping: BeautifulSoup
- AI Agent: LangGraph framework and Google Vertex AI's LLaMA3.2-90B model
- Prerequisites:
- Create a `.env` file specifying:
  - Google credentials file location
  - API key
  - Slack bot token (`SLACK_BOT_TOKEN`)
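As a sketch, the `.env` file might look like the following. Only `SLACK_BOT_TOKEN` is named explicitly in this README; the other variable names are illustrative assumptions:

```
# SLACK_BOT_TOKEN is required; the other variable names are hypothetical.
GOOGLE_CREDENTIALS_FILE=/path/to/credentials.json
API_KEY=your-api-key
SLACK_BOT_TOKEN=xoxb-your-token
```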
- Configure search terms in `config.json`.
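A minimal `config.json` might look like this; the exact keys are assumptions, since this README does not document the schema:

```json
{
  "search_terms": ["data engineer", "machine learning engineer"],
  "location": "Remote",
  "time_posted": "past_week"
}
```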
- User info: In `./job_crawler/linkedin_crawler/utils/user_info.py`, update `candidate_skills` and `candidate_preferences`.
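As a sketch, `user_info.py` presumably defines the two variables as simple Python lists; the exact format and values here are assumptions, not the real file's contents:

```python
# Hypothetical contents of user_info.py; the real file's format may differ.

# Skills the AI agent compares against each job description.
candidate_skills = [
    "Python",
    "SQL",
    "web scraping (BeautifulSoup)",
]

# Preferences such as location, seniority, or work arrangement.
candidate_preferences = [
    "remote-first roles",
    "mid-level positions",
]
```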
- Dependencies: Install the required Python packages from `requirements.txt`:

  ```shell
  pip install -r requirements.txt
  ```
- Run the batch script located at `./task_scheduler_scripts/run_job_agent.bat` to execute the scraping and AI matching process.
- Once complete:
- The Google Sheet will be updated with job details and match scores.
- A Slack notification will be sent with the qualified jobs.
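The notification step could be sketched like this using Slack's `chat.postMessage` Web API; the project may use a Slack SDK instead, and the job-dict keys (`title`, `company`, `match`) are illustrative:

```python
import json
import urllib.request


def format_job_message(jobs):
    """Render qualified jobs as a Slack-friendly text block."""
    lines = ["Qualified jobs found:"]
    for job in jobs:
        lines.append(f"- {job['title']} at {job['company']} ({job['match']}% match)")
    return "\n".join(lines)


def post_to_slack(token, channel, text):
    """Send a message via Slack's chat.postMessage API (not called here)."""
    req = urllib.request.Request(
        "https://slack.com/api/chat.postMessage",
        data=json.dumps({"channel": channel, "text": text}).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json; charset=utf-8",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


# Build the message without sending anything (no token or network needed).
message = format_job_message(
    [{"title": "Data Engineer", "company": "Acme", "match": 87}]
)
```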
- Model: Google Vertex AI's LLaMA3.2-90B
- Framework: LangGraph
- Functionality: Evaluates job descriptions against user-uploaded skills and preferences, providing match percentages and explanations.
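The evaluate-and-score step can be sketched as prompt construction plus response parsing. The JSON reply schema and function names below are assumptions for illustration, not the project's actual LangGraph graph or Vertex AI call:

```python
import json


def build_match_prompt(job_description, skills, preferences):
    """Compose the evaluation prompt sent to the LLM (structure is assumed)."""
    return (
        "Rate how well this job matches the candidate.\n"
        f"Skills: {', '.join(skills)}\n"
        f"Preferences: {', '.join(preferences)}\n"
        f"Job description: {job_description}\n"
        'Reply as JSON: {"match_percentage": <0-100>, "justification": "<text>"}'
    )


def parse_match_reply(reply_text):
    """Extract the match percentage and justification from the model's JSON reply."""
    data = json.loads(reply_text)
    return int(data["match_percentage"]), data["justification"]


# Example with a canned model reply (no model call is made here).
score, why = parse_match_reply(
    '{"match_percentage": 78, "justification": "Strong skill overlap."}'
)
```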
- Web Scraping Logic: Adapted from linkedinscraper.