Position: Python Data Engineer
Location: Remote
Duration: Contract
Experience: 9 years minimum
Must Have:
- Python (building and developing data pipelines)
- SQL for data validations
- Shell scripting
- AWS S3 & Lambda
Requirements:
- Minimum 7 years of experience in data engineering and data pipelines
- Minimum 5 years of extensive experience in Python programming
- Minimum 3 years of extensive experience in SQL and Unix/Linux shell scripting
- Hands-on experience writing complex SQL queries and exporting and importing large volumes of data using utilities
- Minimum 3 years of AWS experience
- Basic knowledge of CI/CD
- Excellent communication skills and a strong customer focus
Project Name: OneStream Convergence
Expectations:
- Understand existing workflows and underlying frameworks
- Migrate these existing workflows to the client's specific data ingestion framework
- Collaborate with and across Agile teams to gather metadata that meets current data governance standards
- Perform unit testing and conduct reviews with other team members to ensure your code is rigorously designed, elegantly coded, and effectively tuned for performance
- Build scripts/utilities to accelerate migration
- Analyze data & generate reports
- Learn-unlearn-relearn concepts with an open and analytical mindset.
- Apply troubleshooting & critical thinking
- Develop & review technical documentation for delivered artifacts
Nice to Haves:
- Prior experience with data migration projects
- Experience building data-intensive streaming applications (stream processing, e.g., Kafka Streams, Spark Streaming)
- Experience with or knowledge of Scala or Java programming
- Experience with at least one cloud data warehouse, such as Snowflake
- Experience with Distributed Computing Platforms
Job Types: Full-time, Contract
Salary: $47.60 - $57.32 per hour
Benefits:
- 401(k)
- Dental insurance
- Health insurance
Work Location: Remote