Senior Data and Software Engineer
Remote
Contract-to-Hire or Full-Time
W2 only
Note: Looking for a senior engineer who can build secure, privacy-focused data systems using SQL, Python, Snowflake, and AWS.
About Role:
We are seeking a highly skilled and experienced Senior Data and Software Engineer to join our team. In this role, you will be responsible for designing, implementing, and maintaining robust data structures and controls for privacy features and services, enabling secure multi-party collaboration. You will ensure compliance with data governance standards, implement strong data security measures, and leverage your expertise in SQL and Python to manage large data volumes. This position requires a deep understanding of various data technologies, cloud environments, data clean rooms, and AI concepts.
Responsibilities:
- Design and implement data privacy features and services that enable secure multi-party collaboration, including query constraints, data clean room construction, deployment, and monitoring at scale (illustrated in the sketch following this list)
- Uphold data governance standards and practices, ensuring compliance with data quality requirements
- Implement and enforce data security measures to protect sensitive data, including personally identifiable information (PII) and financial data
- Leverage proficiency in SQL and Python to extract, transform, clean, and interpret large data volumes (200 billion+ records)
- Design highly performant data structures to ensure optimal storage and retrieval of data
- Utilize advanced database technologies to enhance data storage and processing capabilities
- Participate in code reviews to maintain code quality and consistency; collaborate with the infrastructure team to plan and execute deployments
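
For context on the query-constraint and clean-room responsibilities above, here is a minimal sketch of Snowflake-style access controls. The table name (shared_events), column (partner_id), and minimum group size of 25 are hypothetical placeholders, not details of this team's actual environment:

```sql
-- Hypothetical row access policy: a consumer account sees only its own partner's rows
CREATE OR REPLACE ROW ACCESS POLICY partner_row_policy
  AS (partner_id VARCHAR) RETURNS BOOLEAN ->
    partner_id = CURRENT_ACCOUNT();

ALTER TABLE shared_events
  ADD ROW ACCESS POLICY partner_row_policy ON (partner_id);

-- Hypothetical aggregation policy: consumers may only run aggregate queries
-- over groups of at least 25 rows, a common clean-room query constraint
CREATE OR REPLACE AGGREGATION POLICY min_group_size_policy
  AS () RETURNS AGGREGATION_CONSTRAINT ->
    AGGREGATION_CONSTRAINT(MIN_GROUP_SIZE => 25);

ALTER TABLE shared_events
  SET AGGREGATION POLICY min_group_size_policy;
```
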
Qualifications:
- Master's degree in Computer Science, Engineering, Data Science, or a related field, or equivalent experience
- 6 or more years of experience as a back-end data/software engineer working with various data technologies, with proficiency in SQL, Python, and JavaScript (Node.js)
- 2 or more years of hands-on experience with the Snowflake ecosystem, including expert knowledge of Snowpipe, Streams, Views, performance tuning, data modeling, ELT pipelines, data visualization, and standard data warehousing (DWH) concepts, plus experience implementing complex SQL stored procedures
- Minimum of 3 years' experience with AWS cloud technologies and data lake management, such as S3, Lambda, Airflow, Redshift, Athena, and Glue
- Demonstrable knowledge of data clean room technologies, such as creating secure data shares using RBAC (see the sketch following this list); knowledge of Snowflake Native Apps (v6+) preferred
- Knowledge of all aspects of the SDLC as well as experience with Jenkins and setting up CI/CD processes
- Experience with data privacy and security concepts such as CCPA, GDPR, SSO, and JWT
- Proficiency in data access controls, including aggregation constraints, projection policies, row access policies, column masking, and differential privacy; knowledge of data modeling and catalog tooling, including semantic models, dbt, Honeydew, Iceberg tables, various catalogs (Glue, REST, Nessie, Hadoop, etc.), and catalog syncing concepts
- Experience with cloud and data platforms such as Snowflake, AWS, Databricks, LiveRamp, and GCP
- Familiarity with BI tools such as ThoughtSpot, Sigma, Domo, Looker, QuickSight, and Tableau; detailed knowledge of AI concepts such as generative AI chatbots, Cortex Analyst, agent training, various LLMs, and prompt engineering
- Experience in ML modeling with Snowpark ML and TensorFlow
- Expertise in building and maintaining APIs; Swagger/OpenAPI specifications, Node.js, and Next.js a plus
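
As a hypothetical illustration of the secure data share and column masking qualifications above, the sketch below uses Snowflake syntax; all object names, the PII_ANALYST role, and the consumer account identifier are placeholders, and the exact setup will vary by environment:

```sql
-- Hypothetical masking policy: only a designated role sees raw email addresses
CREATE OR REPLACE MASKING POLICY mask_email AS (val STRING) RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() IN ('PII_ANALYST') THEN val ELSE '***MASKED***' END;

ALTER TABLE clean_room_db.raw.customer_profiles
  MODIFY COLUMN email SET MASKING POLICY mask_email;

-- Hypothetical secure share exposing only a curated, aggregated view to a consumer account
CREATE OR REPLACE SECURE VIEW clean_room_db.shared.customer_summary AS
  SELECT region, COUNT(*) AS customer_count
  FROM clean_room_db.raw.customer_profiles
  GROUP BY region;

CREATE SHARE IF NOT EXISTS partner_share;
GRANT USAGE ON DATABASE clean_room_db TO SHARE partner_share;
GRANT USAGE ON SCHEMA clean_room_db.shared TO SHARE partner_share;
GRANT SELECT ON VIEW clean_room_db.shared.customer_summary TO SHARE partner_share;
ALTER SHARE partner_share ADD ACCOUNTS = partner_org.partner_account;  -- placeholder account identifier
```

Secure views and shares pair with provider-side RBAC so a consumer account can query only the curated objects it has been granted.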