Job Description
As part of our Artificial Intelligence Team, you will help shape the future of our software.
You will develop, test, and maintain data architectures that keep data accessible and ready for analysis. Your tasks will include data modelling, ETL (Extract, Transform, Load), data architecture construction and development, and testing of the database architecture.
Daily responsibilities
- Create and maintain optimal data pipeline architecture
- Assemble large, complex data sets that meet functional / non-functional business requirements
- Identify, design, and implement process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and Cloud ‘big data’ technologies.
- Build/use analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
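The extract-transform-load responsibilities above can be sketched in miniature. This is a toy illustration only: it uses Python's built-in sqlite3 standing in for the Postgres / cloud ‘big data’ stack named in the qualifications, and the table, columns, and event data are invented for the example.

```python
import sqlite3

# Extract: raw order events, as they might arrive from a source system
raw_events = [
    {"order_id": 1, "amount": "19.99", "country": "RO"},
    {"order_id": 2, "amount": "5.00", "country": "DE"},
    {"order_id": 3, "amount": "12.50", "country": "RO"},
]

def transform(event):
    # Transform: normalize types and casing before loading
    return (event["order_id"], float(event["amount"]), event["country"].lower())

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL, country TEXT)")

# Load: bulk-insert the transformed rows
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [transform(e) for e in raw_events])

# A downstream analytics query: revenue per country
revenue = dict(conn.execute(
    "SELECT country, SUM(amount) FROM orders GROUP BY country"
).fetchall())
print(revenue)
```

In a production pipeline the same three stages would typically be distributed (e.g. Spark jobs feeding a warehouse), but the extract/transform/load separation is the same.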
Real impact one step at a time
Your impact will start within the project's context and extend beyond it, through the Competence Area community you will be part of, with a strong focus on your technical skills.
Professional Opportunities
You will have access to AI Community training sessions and programs that build both technical and tactical skills, while engaging with new projects and opportunities arriving in our business line.
Community insights
The community consists of Data Scientists, Machine Learning Engineers, and Data Engineers who share knowledge and project insights on a regular basis. We engage in projects spanning Computer Vision, NLP, Advanced Analytics, Prevention and Trend Analysis.
Qualifications
Must have
- 3+ years of professional experience
- Experience working in Agile teams
- Experience building and optimizing ‘big data’ data pipelines, architectures and data sets
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets
- Experience building processes that support data transformation, data structures, metadata, dependency and workload management
- Technical experience with:
- Big data tools: Spark
- Object-oriented languages: Python
- Visualization tools: Power BI, etc.
- Relational Databases: Postgres
Nice to have:
- Experience in working directly with customer stakeholders
- Knowledge of manipulating, processing and extracting value from large disconnected datasets
- Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores
- Technical experience with:
- Big Data tools: Databricks
- Data pipeline and workflow management tools: Airflow
- Stream-processing systems: Storm, Spark-Streaming, etc.
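The stream-processing systems listed above (Storm, Spark Streaming) share one core idea: events are grouped into small time windows (micro-batches) and aggregated per window. A minimal sketch of that idea in plain Python, with invented page-view events, assuming fixed non-overlapping windows:

```python
from collections import defaultdict

# Toy event stream: (timestamp_seconds, page) pairs,
# as a message-queue consumer might yield them
events = [(0, "home"), (1, "home"), (4, "pricing"), (6, "home"), (9, "pricing")]

def micro_batches(stream, window_seconds=5):
    """Group events into fixed, non-overlapping time windows (micro-batches)."""
    windows = defaultdict(list)
    for ts, page in stream:
        windows[ts // window_seconds].append(page)
    return dict(windows)

# Aggregate per window: page-view counts, as a stream processor would emit them
counts_per_window = {
    w: {page: pages.count(page) for page in set(pages)}
    for w, pages in micro_batches(events).items()
}
print(counts_per_window)
```

Real systems add what this sketch omits: continuous ingestion, late or out-of-order events, and fault-tolerant state, which is exactly what the listed tools provide.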
Additional Information
At Accesa you can:
Enjoy our holistic benefits program, built on the four pillars that we believe come together to support our wellbeing: physical, emotional, social, and work-life fusion.
- Physical: premium medical package for both our colleagues and their children, dental coverage up to a yearly amount, eyeglasses reimbursement every two years, a voucher for sports equipment expenses, in-house personal trainer
- Emotional: individual therapy sessions with a certified psychotherapist, webinars on self-development topics
- Social: virtual activities, sports challenges, get-togethers on special occasions
- Work-life fusion: yearly increase in days off, flexible working schedule, birthday, holiday and loyalty gifts for major milestones, work from home bonuses