Well-versed Data/Cloud Engineer, Business Intelligence and Data Science Consultant with 10+ years of experience building data-intensive applications, with extensive hands-on knowledge across multiple industries (real estate, oil, insurance brokerage, etc.). I strive to make an impact by creating scalable data platforms and transforming big data into valuable insights through data science and analytics. My primary interests include distributed systems, large-scale structured storage and query optimization, and data exploration and prediction with ML/AI.
Create and maintain tools to monitor cloud applications and services for a platform that combines big data analytics, discrete customer insights, and intuitive visual segmentation.
Utilized Amazon cloud-based computing environments to manage and support full application stacks.
Analyzed and monitored key performance metrics for multiple clients, maintaining tools that automate operational processes (see the sketch below).
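A minimal sketch of the kind of metric-monitoring automation described above, assuming boto3 and AWS CloudWatch; the namespace, metric name, and alert threshold are hypothetical placeholders, not the actual client setup.

```python
# Minimal sketch: pull a key performance metric from CloudWatch and flag breaches.
# Assumes AWS credentials are configured; namespace/metric/threshold are hypothetical.
from datetime import datetime, timedelta, timezone

import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

def average_latency_last_hour(namespace: str = "MyApp/API",
                              metric_name: str = "Latency") -> float:
    """Return the average of a CloudWatch metric over the last hour."""
    end = datetime.now(timezone.utc)
    resp = cloudwatch.get_metric_statistics(
        Namespace=namespace,
        MetricName=metric_name,
        StartTime=end - timedelta(hours=1),
        EndTime=end,
        Period=300,                  # 5-minute buckets
        Statistics=["Average"],
    )
    points = resp.get("Datapoints", [])
    return sum(p["Average"] for p in points) / len(points) if points else 0.0

if __name__ == "__main__":
    latency = average_latency_last_hour()
    if latency > 500:                # hypothetical SLA threshold in ms
        print(f"ALERT: average latency {latency:.1f} ms exceeds threshold")
    else:
        print(f"OK: average latency {latency:.1f} ms")
```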
Coordinated the development and revision of the curriculum for a 6-month Data Science course that includes an introduction to Python's pandas, NumPy, Matplotlib, and seaborn libraries.
Acted as a Data Science evaluator, grading exams for 7 course projects covering: Exploration (NumPy), Transformation (pandas, seaborn), Regression (Decision Tree, KNN), Parameter Optimization (GridSearch), NLP (TF-IDF, SVM, Random Forest, AdaBoost, Voting), Recommendation Systems (scikit-surprise), and Model Deployment (IBM Cloud, IBM Watson); see the illustrative sketch below.
Supported process improvements and established a culture of rapid experimentation.
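An illustrative sketch in the spirit of the NLP and parameter-optimization projects listed above, combining TF-IDF, a linear SVM, and grid search with scikit-learn; the toy data and parameter grid are hypothetical, not the course's actual solution.

```python
# Illustrative TF-IDF + SVM text classifier tuned with grid search.
# The sample texts, labels, and parameter grid are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.svm import LinearSVC

texts = ["great product", "terrible support", "loved it", "awful experience"]
labels = [1, 0, 1, 0]  # toy sentiment labels

pipeline = Pipeline([
    ("tfidf", TfidfVectorizer()),   # turn raw text into TF-IDF features
    ("clf", LinearSVC()),           # linear SVM classifier
])

param_grid = {
    "tfidf__ngram_range": [(1, 1), (1, 2)],
    "clf__C": [0.1, 1.0, 10.0],
}

search = GridSearchCV(pipeline, param_grid, cv=2)
search.fit(texts, labels)
print("best params:", search.best_params_)
print("best CV score:", search.best_score_)
```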
Worked on data analysis for the real estate group at Zaplabs, performing tasks such as computing distance metrics between county listings in Python. Created processes and DAGs with Apache Airflow (a minimal DAG sketch follows).
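A minimal Airflow DAG sketch (Airflow 2.x style) for scheduling the kind of listing-distance process described above; the DAG id, schedule, and task logic are hypothetical placeholders.

```python
# Minimal Airflow 2.x DAG sketch; dag_id, schedule, and task body are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def compute_listing_distances(**context):
    """Placeholder: compute pairwise distance metrics between county listings."""
    print("computing distance metrics between listings...")

with DAG(
    dag_id="listing_distance_metrics",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    compute_distances = PythonOperator(
        task_id="compute_distances",
        python_callable=compute_listing_distances,
    )
```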
Analyzed and designed new ETLs for the Inter-American Development Bank. Led a small team that coordinated efforts, reported progress, and managed data warehouse requirements and the incident queue for HUB International.
Developed the financial data warehouse of Funding Circle. Tools: Pentaho Data Integrator, MySQL, Cassandra.
Developed and maintained multidimensional data models for the Management Control area, built on Oracle Hyperion EPM (Hyperion Essbase and Hyperion Planning 11.1) technology.
Supported Oracle Planning forms and Essbase cubes for the region (Chile, Peru, Argentina, and Colombia).
Integrated customized applications with Oracle standard application modules.
Extracted financial information from different oil subsidiary companies (financial balances, intercompany operations, investments) for the financial consolidation of the CVP Group.
Built ETLs with Pentaho Data Integrator, and designed and maintained the DB2 data warehouse.
Reported financial information using Pentaho Report Designer and created multidimensional cubes with Pentaho Schema Workbench.
Migrated a marketing platform to AWS, which was chosen as the target cloud infrastructure.
Migrated the pipeline from SSIS packages to PySpark ETL processes, using S3 for storage, AWS Lambda for monitoring, EC2 for computation, and RDS/Redshift for the database layer (see the sketch after this list).
Created a cloud environment for the on-premises solution, leveraging S3 for data storage.
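A simplified sketch of the SSIS-to-PySpark migration pattern described above, assuming raw data lands in S3 as CSV and curated output is written back as Parquet; bucket names, paths, and columns are hypothetical, and the downstream Redshift load (e.g. a COPY from S3) is omitted.

```python
# Simplified PySpark ETL sketch for the SSIS-to-PySpark migration described above.
# Bucket names, paths, and columns are hypothetical; the Redshift load is out of scope.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("marketing-etl").getOrCreate()

# Extract: raw campaign events delivered to S3 as CSV.
events = spark.read.csv("s3a://example-raw-bucket/campaign_events/", header=True)

# Transform: basic cleanup and a daily aggregate per campaign.
daily_stats = (
    events
    .withColumn("event_date", F.to_date("event_timestamp"))
    .groupBy("campaign_id", "event_date")
    .agg(F.count("*").alias("event_count"))
)

# Load: write partitioned Parquet back to S3, ready for a Redshift COPY or Spectrum.
daily_stats.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3a://example-curated-bucket/campaign_daily_stats/"
)

spark.stop()
```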