EPAM is a leading global provider of digital platform engineering and development services. We are committed to having a positive impact on our customers, our employees, and our communities. We embrace a dynamic and inclusive culture. Here you will collaborate with multi-national teams, contribute to a myriad of innovative projects that deliver the most creative and cutting-edge solutions, and have an opportunity to continuously learn and grow. No matter where you are located, you will join a dedicated, creative, and diverse community that will help you discover your fullest potential.
We are looking for a Lead Data Engineer to become part of our Customer Success and Support (CSS) Data Engineering team, where you will help create impactful data solutions and applications that inform key decisions across the business. This role is ideal for someone who excels at building scalable systems and values working in a dynamic, team-oriented setting.
Responsibilities
- Provide technical support to Data Science teams
- Design and manage tables to support analytics and data storage needs
- Monitor and resolve support tickets that require limited direct engagement
- Receive guidance and mentorship from established team members
- Build datamarts and develop specialized data models for analytical projects
- Participate in sprint planning and deliver on Data Scientist requests within quarterly deadlines
- Handle urgent requests and provide on-call support when necessary
- Develop and implement data ingestion workflows as part of assigned tasks
- Create thorough and accessible documentation to support data usage
- Work within a 12/5 support schedule to maintain reliable service
Requirements
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related discipline
- Minimum 5 years of hands-on experience in data engineering
- At least one year of experience leading and supervising development teams
- Expert-level SQL skills
- Strong background with Databricks SQL
- In-depth understanding of Spark
- Proficient in Python, including PySpark and Airflow integration
- Experience in data integration and data modeling
- Knowledge of big data principles, AWS technologies, and Airflow
- Skilled in designing data models and utilizing REST APIs
- Understanding of ETL concepts and pipeline development
- Effective collaborator with strong communication and documentation skills
- Experience working directly with customer teams
- Open to receiving and applying constructive feedback
- Motivated to quickly learn and adapt to new technologies
- English proficiency at B2+ level or above, both spoken and written
We offer
- International projects with top brands
- Work with global teams of highly skilled, diverse peers
- Healthcare benefits
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling and certification courses
- Unlimited access to the LinkedIn Learning library and 22,000+ courses
- Global career opportunities
- Volunteer and community involvement opportunities
- EPAM Employee Groups
- Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn