Skills & Expertise:
Proficient in data science tasks, encompassing cluster analysis, time series analysis, dimensionality reduction (PCA, UMAP), and anomaly detection.
Seasoned in machine learning, including scikit-learn, gradient boosting, model interpretation with SHAP values, and hyperparameter tuning with Optuna.
Skilled in deploying models, optimizing for scalability, and using tools such as Luigi, MLflow, Docker, FastAPI/Flask, and pytest.
Well-versed in using Python libraries like NumPy and Pandas for data mining and manipulation, as well as Selenium for web scraping.
Adept at data analysis and visualization, using tools such as Plotly, Seaborn, and Matplotlib.
Grounded in statistics, with experience in SciPy, hypothesis testing, Bayesian statistics, and probability theory.
Experienced in deep learning frameworks, including TensorFlow, PyTorch, and Keras.
Skilled in various NLP tasks, such as sentiment analysis, topic modeling, classification, and working with GenAI APIs.
Proficient with databases, including Google BigQuery, MySQL, PostgreSQL, PL/SQL, and Redis, and experienced in handling large-scale (Big Data) workloads.
Familiar with network analysis and graph theory.
Comfortable using Git for version control and conducting code reviews.
Skilled in cloud computing, including Google Cloud Platform (GCP) and its Vertex AI service as well as AWS, and proficient in Linux shell scripting (Ubuntu, Debian).
Committed to software development best practices and experienced in agile methodologies, particularly Scrum.
Seasoned in stakeholder communication, providing detailed reports on project progress and outcomes.