Recruiter: Wilson Bittencourt
About Us
Property Shield is an Atlanta-based data security firm focused on building the fraud prevention infrastructure for residential real estate. Since our start in 2023, we've helped save US renters over $1B in potential losses to rental scams.
Position Overview
We are seeking an experienced Mid / Senior Web Crawling + Backend Engineer to take ownership of our large-scale data acquisition and processing systems. The ideal candidate will have deep expertise in advanced web crawling techniques and proven experience managing high-volume, unstructured data pipelines. This role is critical to our mission of identifying and preventing rental fraud through comprehensive data collection and analysis. This is a remote position, and you will collaborate daily with our existing development team.
Key Responsibilities
- Design, develop, and maintain sophisticated web crawling systems capable of handling millions of property listings and rental advertisements
- Implement advanced crawling techniques including JavaScript rendering, anti-bot detection evasion, and distributed crawling architectures
- Build and optimize backend systems for processing, validating, and storing large volumes of unstructured data from diverse sources
- Develop robust data validation models to ensure data quality and integrity across our fraud detection pipeline
- Architect scalable data processing workflows that can handle real-time and batch processing requirements
- Implement monitoring, alerting, and recovery systems for mission-critical data acquisition processes
- Collaborate with cross-functional teams to translate business requirements into technical data solutions
- Optimize system performance and costs while maintaining high availability and reliability
- Stay current with emerging web scraping technologies, anti-detection techniques, and data processing best practices
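To give a flavor of the distributed-crawling work described above: frontier management, deduplication, and per-domain politeness are recurring building blocks. The sketch below is a deliberately simple, single-process, standard-library illustration (all names are hypothetical, not Property Shield's actual codebase); a production system would shard the frontier and persist the seen-set.

```python
from __future__ import annotations

import time
from collections import deque
from urllib.parse import urlsplit


class Frontier:
    """Deduplicating URL frontier with a per-domain politeness delay.

    A toy sketch of one building block of a large-scale crawler.
    """

    def __init__(self, delay_seconds: float = 1.0):
        self.delay = delay_seconds
        self.queue: deque[str] = deque()
        self.seen: set[str] = set()
        self.last_hit: dict[str, float] = {}  # domain -> last fetch time

    def add(self, url: str) -> bool:
        """Enqueue a URL unless it has been seen before; report whether it was new."""
        if url in self.seen:
            return False
        self.seen.add(url)
        self.queue.append(url)
        return True

    def next_url(self) -> str | None:
        """Pop the next URL, sleeping if its domain was fetched too recently."""
        if not self.queue:
            return None
        url = self.queue.popleft()
        domain = urlsplit(url).netloc
        elapsed = time.monotonic() - self.last_hit.get(domain, 0.0)
        if elapsed < self.delay:
            time.sleep(self.delay - elapsed)
        self.last_hit[domain] = time.monotonic()
        return url
```

In practice this logic usually lives inside a framework (Scrapy's scheduler and dupefilter, for example) rather than being hand-rolled.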
Required Qualifications
- 4+ years of experience in backend development with significant focus on web crawling and data processing
- Expert knowledge of web scraping frameworks and libraries (Scrapy, Selenium, Playwright, BeautifulSoup, or similar)
- Deep understanding of web technologies including HTML parsing, JavaScript execution, session management, and proxy rotation
- Strong experience with anti-detection techniques, CAPTCHA solving, and handling dynamic content
- Proficiency in backend technologies such as Python, Node.js, or Java with emphasis on data-intensive applications
- Extensive experience working with unstructured data formats (JSON, XML, HTML) and data transformation pipelines
- Strong background in data validation, cleaning, and quality assurance methodologies
- Experience with database design and management for high-volume data storage (MongoDB, PostgreSQL, or similar)
- Proven ability to build and deploy systems that process millions of records efficiently
- Strong problem-solving skills and experience debugging complex distributed systems
- Excellent communication skills in English
- Ability to work effectively in a remote team environment
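As a concrete illustration of the validation and cleaning background asked for above, here is a minimal sketch of vetting and normalizing one scraped listing record before storage. The field names and sanity bounds are hypothetical; real pipelines typically layer schema validation (e.g. pydantic or jsonschema) and source-specific rules on top of checks like these.

```python
from __future__ import annotations

REQUIRED_FIELDS = ("address", "price", "source_url")


def clean_listing(raw: dict) -> dict | None:
    """Validate and normalize one scraped listing; return None to reject it."""
    if any(not raw.get(f) for f in REQUIRED_FIELDS):
        return None  # reject records missing required fields

    # Normalize price strings like "$1,250 / mo" to an integer dollar amount.
    price = raw["price"]
    if isinstance(price, str):
        digits = "".join(ch for ch in price if ch.isdigit())
        if not digits:
            return None
        price = int(digits)
    if not (100 <= price <= 100_000):  # crude sanity bounds for a monthly rent
        return None

    return {
        "address": " ".join(str(raw["address"]).split()),  # collapse whitespace
        "price": price,
        "source_url": raw["source_url"].strip(),
    }
```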
Preferred Qualifications
- Experience with cloud platforms, particularly AWS (EC2, Lambda, S3, SQS, RDS)
- Knowledge of containerization technologies like Docker and orchestration with Kubernetes
- Experience with message queues and event-driven architectures (Redis, RabbitMQ, Apache Kafka)
- Familiarity with machine learning techniques for data classification and anomaly detection
- Experience with real estate data, property listings, or rental market analysis
- Knowledge of legal and ethical considerations in web scraping
- Experience with CI/CD pipelines and infrastructure as code
- Previous work in fraud detection, fintech, or data-intensive applications
- Experience working in a startup environment
- Familiarity with distributed computing frameworks (Apache Spark, Hadoop)
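On the anomaly-detection point above: a too-good-to-be-true rent is a classic rental-scam signal, and even a simple statistical baseline catches the crudest cases. The sketch below is just a z-score outlier flag, not a full ML model; it is illustrative only, and production systems would use far richer features.

```python
from statistics import mean, stdev


def flag_price_outliers(prices: list[float], z_cutoff: float = 3.0) -> list[bool]:
    """Flag prices whose z-score against the sample exceeds the cutoff.

    A deliberately simple statistical baseline for price anomaly detection.
    """
    if len(prices) < 2:
        return [False] * len(prices)  # not enough data to judge
    mu, sigma = mean(prices), stdev(prices)
    if sigma == 0:
        return [False] * len(prices)  # all prices identical
    return [abs(p - mu) / sigma > z_cutoff for p in prices]
```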
Technical Portfolio Requirements
Candidates should submit a portfolio showcasing:
- Examples of large-scale web crawling projects with technical details about challenges overcome
- Backend systems designed for high-volume data processing and validation
- Documentation of data pipeline architectures and performance optimizations
- Side projects demonstrating expertise in handling unstructured data
- Contributions to open-source projects related to web scraping or data processing (if applicable)
What We Offer
- Opportunity to build and lead critical data infrastructure that directly impacts fraud prevention
- Competitive salary and benefits package
- Flexible remote work arrangement
- Collaborative and innovative work environment focused on solving complex technical challenges
- Professional growth and skill development opportunities in cutting-edge data technologies
- Chance to make a significant societal impact in the fight against rental fraud
- Access to large-scale datasets and the resources to build robust, scalable systems