Personal details

Wes T. - Remote data engineer

Based in: đŸ‡§đŸ‡· Brazil
Timezone: Brasilia (UTC-3)

Summary

I have more than five years of experience in data and software engineering and have had the privilege of working across several industries. My entire career has been in consulting firms, so I have learned to operate in different environments and businesses, such as investment firms, paper producers, and retail. Beyond data engineering, I have also taken on solutions architecture and tech lead roles. I have built data lakes, ETL/ELT pipelines, data models, data migrations, DevOps tooling, business intelligence solutions, data governance, web scrapers, chatbots, and automation as a whole. I am familiar with machine learning concepts and frameworks such as TensorFlow, Keras, and PyTorch. My main programming languages are Python, JavaScript, and SQL. I am proficient with the main cloud providers (AWS, Azure) and have significant experience with Docker, Kubernetes, Git, CI/CD, Terraform, Airflow, Spark, Snowflake, and dbt.

Work Experience

Data Engineer
Blue Orange | Feb 2020 - Present
Python
SQL
MySQL
MongoDB
Docker
Amazon Redshift
AWS (Amazon Web Services)
  • Developed data engineering, analytics, and machine learning solutions for clients.
  • Worked with various technologies and tools, including Microsoft Azure, AWS, Docker, Git, Selenium, SSIS, SQL Server, Redshift, PostgreSQL, MySQL, MongoDB, SQLite, and Terraform; a representative Redshift loading sketch follows this list.
  • Took on tech lead responsibilities and conducted technical interviews of candidates.
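
As an illustration of the kind of warehouse loading work these engagements involved, here is a minimal sketch of bulk-loading S3 files into Redshift via a COPY statement with psycopg2. The cluster, database, table, bucket, and IAM role names are all hypothetical placeholders, not details from an actual client project.

```python
# Hypothetical sketch: bulk-loading files from S3 into Amazon Redshift.
# All connection details, table names, and ARNs below are placeholders.
import psycopg2

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439,
    dbname="analytics",
    user="etl_user",
    password="...",  # in practice, fetched from a secrets manager
)

with conn, conn.cursor() as cur:
    # Redshift's COPY ingests S3 files in parallel across the cluster,
    # which is far faster than row-by-row INSERTs.
    cur.execute("""
        COPY staging.orders
        FROM 's3://example-bucket/orders/2020-02-01/'
        IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
        FORMAT AS CSV
        IGNOREHEADER 1;
    """)
```
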
Data Engineer
Fifteen Hundred | Dec 2017 - Feb 2020
Python
SQL
Machine Learning
React
JavaScript
Amazon Redshift
Express.js
AWS (Amazon Web Services)

Worked on a wide variety of projects requiring a versatile skill set: web scraping, APIs, web applications, analytics with Power BI and Tableau, data migration, data ingestion with SSIS, bots, chatbots, and automation on the main cloud providers (AWS, Azure). Python, JavaScript, and SQL were the main programming languages, alongside tools such as Selenium, Azure Data Factory, Apache Spark, Redshift, Microsoft Cognitive Services, Flask, ReactJS, ExpressJS, SQL Server, and machine learning frameworks.
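To give a flavor of the API work mentioned above, here is a minimal sketch of a Flask endpoint for a hypothetical reporting service; the route and payload shape are illustrative assumptions, not taken from a real client project.

```python
# Minimal sketch of a Flask API endpoint; the route and payload shape
# are hypothetical, not taken from a real client project.
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/reports/<report_id>", methods=["GET"])
def get_report(report_id: str):
    # A real service would query SQL Server or Redshift here;
    # this stub just echoes the request back.
    return jsonify({"report_id": report_id, "status": "ready"})

if __name__ == "__main__":
    app.run(port=5000)
```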

Education

Universidade do Oeste Paulista
Bachelor's degree・Computer Science
Mar 2014 - Sep 2018

Personal Projects

Web Crawler for Investment Firm
2020
Python
Selenium
AWS (Amazon Web Services)
Developed a web crawler using Python, Selenium, Scrapy, AWS, OpenCV, and Tesseract to scrape hundreds of websites daily, handling several formats, including image-based PDFs. Built models based on the business logic, aggregated the data after each daily run, and delivered reports with the gathered information.
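An illustrative sketch of the crawler's two core steps follows: rendering a page with headless Selenium and extracting text from an image-based PDF with pdf2image and pytesseract. The URLs and paths are assumptions, and the OpenCV image preprocessing used in the real project is omitted for brevity.

```python
# Illustrative sketch of the crawler's scraping and OCR steps; all
# paths and URLs are hypothetical, and the OpenCV preprocessing from
# the real project is omitted for brevity.
from selenium import webdriver
from selenium.webdriver.common.by import By
from pdf2image import convert_from_path
import pytesseract

def fetch_page_text(url: str) -> str:
    # Headless Chrome renders JavaScript-heavy sites that plain
    # HTTP clients cannot scrape.
    options = webdriver.ChromeOptions()
    options.add_argument("--headless")
    driver = webdriver.Chrome(options=options)
    try:
        driver.get(url)
        return driver.find_element(By.TAG_NAME, "body").text
    finally:
        driver.quit()

def ocr_pdf(path: str) -> str:
    # Image-based PDFs carry no embedded text layer, so each page is
    # rasterized and run through Tesseract OCR.
    pages = convert_from_path(path, dpi=300)
    return "\n".join(pytesseract.image_to_string(page) for page in pages)
```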
Data project for Private Equity Company
2021
Python
Snowflake
Airflow
DevOps
Over the past two years, I have worked on a project that stands out as the highlight of my career. My responsibilities included data ingestion, standardization, processing, transformation, and modeling, using modern tools such as Apache Airflow, Meltano, Snowflake, and dbt.

My superiors credited me as one of the people most responsible for the project's success. As the small team grew significantly, I took on expanded responsibilities as a tech lead, successfully leading part of the engineering team across domains such as data engineering and DevOps and supporting other engineers to maximize client value, while collaborating with cross-functional teams to achieve business goals. The project's success has been driven by strong analytical, problem-solving, and communication skills, which let me engage with technical, non-technical, and business stakeholders to make informed technical and architectural decisions in line with the client's needs.
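Below is a hedged sketch of what such an orchestration layer could look like: a daily Airflow DAG that runs a Meltano extract-load into Snowflake, then dbt models on top. The project paths, tap/target names, and schedule are placeholders, not the actual pipeline configuration.

```python
# Hedged sketch of the orchestration layer: an Airflow DAG running a
# Meltano extract-load into Snowflake, then dbt transformations.
# Paths, tap/target names, and the schedule are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_ingestion",
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Extract-load: Meltano pulls from the source tap and lands raw
    # tables in Snowflake.
    extract_load = BashOperator(
        task_id="meltano_elt",
        bash_command="cd /opt/meltano_project && meltano elt tap-postgres target-snowflake",
    )

    # Transform: dbt builds the standardized models on top of the raw data.
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt_project && dbt run",
    )

    extract_load >> transform
```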