Dice is the leading career destination for tech experts at every stage of their careers. Our client, Technogen, Inc., is seeking the following. Apply via Dice today!
TECHNOGEN, Inc. has been a proven leader in providing full IT services, software development, and solutions for 15 years.
TECHNOGEN is a Small & Woman-Owned Minority Business with GSA Advantage certification. We have offices in VA and MD and offshore development centers in India. We have successfully executed 100+ projects for clients ranging from small businesses and non-profits to Fortune 50 companies and federal, state, and local agencies.
Position: Data Scientist/Engineer
Location: Plano, TX
Duration: Long-term
Job Description:
Our primary requirement is for a hands-on engineer with expertise in Python, PySpark, Azure Data Factory, DevOps pipelines, Databricks, and related technologies.
4+ years of experience supporting data science projects in a consulting environment.
4+ years of experience participating in projects focused on one or more of the following areas: Predictive Analytics, Data Design, Statistics, AI/ML, ML Ops
3+ years of experience using Python and/or R to analyze disparate datasets.
Experience with Databricks, PySpark, Azure Data Factory, GitHub, Azure Data Lake Storage (ADLS)
Experience testing ML models
Work on projects leveraging your expertise in data science, artificial intelligence, and machine learning.
Assist in breaking down complex business problems, developing solutions, and delivering with a high degree of focus on client satisfaction.
Conduct market research, develop a point-of-view and communicate effectively back to clients and stakeholders.
Bring innovative thinking, resourcefulness, and creativity, leveraging best practices to achieve successful client outcomes.
Establish relationships with our clients at the appropriate levels and gain an understanding of the project work and the problems encountered.
Work with data sets of varying size and complexity, including both structured and unstructured data.
Pipe and process massive data streams in distributed computing environments.
Implement batch and real-time model scoring.
Assemble large, complex data sets that meet functional / non-functional business requirements.
Apply business knowledge to analyze data, develop reports, and solve problems. Perform ad hoc analyses of data depending on business needs.
Participate with data stakeholders in the analysis and resolution of issues related to information flow and content.