Sr. Data Engineer (FTE & Fully Remote)
Location: REMOTE
Duration: Permanent
Salary: $120,000-$200,000, plus bonus eligibility
- Dependent on education, relevant work experience, qualifications, certifications, location, etc.
- Significant performance-based bonus plan, profit sharing, and generous benefits
Interview Process:
- 45-minute Teams call with the hiring manager
- Technical assessment + Teams call with the team
- 2-hour final panel Teams call
*Candidates must be willing to travel (~10%) to the corporate office in Brookfield, WI, as well as to client sites, industry conferences, etc.*
Must-have:
- 7+ years of relevant experience in design, development, and testing of Data Platform solutions, such as Data Warehouses, Data Lakes, and Data Products
- Expert level experience working in Databricks and AWS
- Expert level experience working in both relational and non-relational databases such as SQL Server, PostgreSQL, and MongoDB
- Experience managing and standardizing clinical data from structured and unstructured sources
- Experience building and managing solutions on AWS
- Expert knowledge in healthcare standards including FHIR, C-CDA, and traditional HL7
- Expert knowledge of clinical standards/ontologies, including ICD-10, SNOMED, NDC, LOINC, and RxNorm
- Expert in building data models and data warehouses and in designing data lakes for enterprise and product use
- Familiarity with designing and building APIs, ETL, and data ingestion processes, and with using supporting tooling for enterprise solutions
- Experience in performance tuning, query optimization, security, monitoring, and release management.
- Experience working with and managing large, disparate, identified, and de-identified data sets from multiple data sources
Plus:
- Bachelor's or master's degree in computer science, data engineering, or a related field
- 10+ years of relevant experience in design, development, and testing of Data Platform solutions, such as Data Warehouses, Data Lakes, and Data Products
- Health and Life Insurance business experience
- Associate or Professional level solution architecture certification in Azure and/or AWS
- Experience in Snowflake
Day-to-Day:
Insight Global is seeking a Sr. Data Engineer to join our actuarial consultant client 100% remotely. As a Senior Data Engineer on our client's Data Platform team, you will be responsible for designing and implementing robust Data Platform solutions that meet business objectives while ensuring compliance with industry-leading data privacy standards. You will collaborate closely with cross-functional agile teams to drive data architecture decisions, implement best practices, and contribute to the success of our client's projects. Responsibilities include:
Data Platform: Creating Databricks Data Warehouse and Lakehouse solutions for a healthcare-data-focused enterprise
- Data Governance: Configuring and maintaining Unity Catalog to enable enterprise data lineage and data quality
- Data Security: Building out Data Security protocols and best practices including the management of identified and de-identified (PHI/PII) solutions
- External Data Products: Building data solutions for clients while upholding the best standards for reliability, quality, and performance
- ETL: Building solutions with Delta Live Tables and automating transformations
- Medallion Architecture: Building out performant enterprise-level medallion architecture(s)
- Streaming and Batch Processing: Building fit-for-purpose near real-time streaming and batch solutions
- Large Data Management: Building out performant and efficient enterprise solutions for internal and external users for both structured and unstructured healthcare data
- Platform Engineering: Building out Infrastructure as Code using Terraform and Asset Bundles
- Costs: Working with the business to build cost effective and cost transparent Data solutions
Pipeline/ETL Management: You will help architect, build, and maintain robust, scalable data pipelines while monitoring and optimizing their performance
- Experience working with migration tools (e.g., AWS DMS, AWS Glue, Fivetran, Integrate.io)
- Identify and implement improvements to enhance data processing efficiency
- Experience with building out effective pipeline monitoring solutions
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, Delta Live Tables, Python, Scala, and cloud-based ‘big data’ technologies.
Data Modeling: Lead the design, implementation, and maintenance of standards-based (FHIR, OMOP, etc.) and efficient data models for both structured and unstructured data
- Assemble large, complex data sets that meet functional and non-functional business requirements
- Develop and maintain data models, ensuring they align with business objectives and data privacy regulations
Collaboration: Partner internally and externally with key stakeholders to ensure we are providing meaningful, functional, and valuable data
- Work effectively with Data, Development, Analyst, Data Science, and Business team members to gather requirements and to propose and build solutions.
- Communicate complex technical concepts to non-technical stakeholders and provide guidance on best practices.
- Ensure that technology execution aligns with business strategy and provides efficient, secure solutions and systems
Processes and Tools: Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build analytics tools that utilize the data pipeline to provide actionable insights into operational efficiency and other key business performance metrics.
- Create data tools for clinical, analytics, and data science team members that help them build and optimize our product into an innovative industry leader.