Arthur F.

Senior Data Engineer

Florianópolis, Brazil

About Me

Arthur has over 15 years of experience in software development and systems analysis, including seven years working with SQL and data analysis. He is experienced with Oracle, BigQuery, Snowflake, and other data sources, and has a solid background in dbt, Python, data engineering, and business intelligence. In addition to building excellent relationships with teams and customers, Arthur is committed, adaptable, and creative.

Data Visualization, Data Analysis, Analytics, SQL, Databases, Relational Databases, ANSI SQL, Oracle SQL, Business Intelligence (BI), PL/SQL, Snowflake, BI Reports, Data Analytics, Data Warehousing, Data Engineering

Work History

Appex Group, Inc.
Snowflake Data Engineer
2022 - Present (2 years), Remote
  • Designed and built a new data architecture for the company in collaboration with the new data team, helping automate data ingestion, modeling, and visualization.

  • Implemented dbt (data build tool) for data modeling and trained data analysts on how to use it properly.

  • Set up Fivetran connectors from several data sources to a Snowflake data warehouse.

  • Implemented Airflow DAGs (MWAA) to run and test dbt models and for custom data ingestions using API calls (a minimal DAG sketch follows this entry).

  • Built controls for data quality checks in the extracted sources using dbt tests and Airflow.

Snowflake, Data Warehousing, Data Warehouse Design, Data Engineering, Data Build Tool (dbt), Apache Airflow, GitHub, Fivetran, Looker, Data Architecture, Data Modeling, Amazon Web Services (AWS), SQL, Python
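
For illustration, here is a minimal sketch of the kind of Airflow DAG referenced above, assuming dbt is invoked through a BashOperator on Airflow 2.x (MWAA); the project path, schedule, and task layout are placeholders rather than the production setup.

```python
# Illustrative Airflow 2.x DAG (MWAA-style): run dbt models, then dbt tests.
# The dbt project path, schedule, and profiles location are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

DBT_DIR = "/usr/local/airflow/dags/dbt_project"  # hypothetical project location

with DAG(
    dag_id="dbt_daily_build",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command=f"cd {DBT_DIR} && dbt run --profiles-dir .",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command=f"cd {DBT_DIR} && dbt test --profiles-dir .",
    )

    # Build the models first, then run the data quality tests against them.
    dbt_run >> dbt_test
```
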
205 Data Lab
Analytics Engineer
2020 - 2022 (2 years), Remote
  • Worked 100% remotely for a US-based company, providing analytics engineering services for customers in the San Francisco Bay Area.

  • Created automated custom data reports using Python and Excel VBA scripts.

  • Extracted data for reports from Presto DB and Snowflake using complex SQL scripts.

  • Transformed and modeled raw data through ELT processes using dbt (data build tool).

  • Developed Python scripts for Prefect Cloud to automate and orchestrate data reports (a sketch follows this role's description).

  • Integrated data between Snowflake and Salesforce to generate reports using the Salesforce Bulk API.
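
A minimal sketch of the Prefect-based report automation mentioned above, assuming Prefect 2.x decorators plus the snowflake-connector-python and pandas packages; the query, credentials, and output path are hypothetical and would normally come from secrets, not literals.

```python
# Illustrative Prefect 2.x flow: pull report data from Snowflake, write a CSV.
import pandas as pd
import snowflake.connector
from prefect import flow, task


@task
def extract(query: str) -> pd.DataFrame:
    # Placeholder credentials; a real flow would load these from Prefect blocks/secrets.
    conn = snowflake.connector.connect(
        account="my_account",
        user="report_bot",
        password="***",
        warehouse="REPORTING_WH",
        database="ANALYTICS",
    )
    try:
        return pd.read_sql(query, conn)
    finally:
        conn.close()


@task
def publish(df: pd.DataFrame, path: str) -> None:
    df.to_csv(path, index=False)  # the real reports also used Excel/VBA templates


@flow
def weekly_report():
    df = extract("SELECT * FROM reporting.weekly_kpis")
    publish(df, "weekly_kpis.csv")


if __name__ == "__main__":
    weekly_report()
```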

Spin
Data Analyst
2019 - 2020 (1 year), Remote
  • Analyzed data using complex SQL queries on Google BigQuery.

  • Presented data analytics reports for managers using dashboards from Mode Analytics.

  • Defined business metrics to support and determine company OKRs.

  • Provided monitoring and decision support reports for different areas such as growth, product, and operations.

  • Worked on geospatial analysis for scooters and generated map charts using Python and Jupyter Notebook (a query sketch follows this entry).

  • Supported A/B tests to analyze the adoption of product features.

Dashboards, Amplitude, Jupyter Notebook, Python, BigQuery, SQL
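
As an illustration of the geospatial analysis mentioned above, here is a sketch of a BigQuery query counting trips that end near designated parking spots, run from Python via the google-cloud-bigquery client; the project, dataset, table, and column names are invented.

```python
# Illustrative BigQuery geospatial query: count scooter trips that end within
# 50 meters of a designated parking spot. Tables and columns are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

QUERY = """
SELECT
  p.spot_id,
  COUNT(*) AS trips_ended_nearby
FROM `my-project.mobility.trips` AS t
JOIN `my-project.mobility.parking_spots` AS p
  ON ST_DWITHIN(
       ST_GEOGPOINT(t.end_lng, t.end_lat),
       ST_GEOGPOINT(p.lng, p.lat),
       50)  -- distance in meters
GROUP BY p.spot_id
ORDER BY trips_ended_nearby DESC
"""

# The resulting dataframe feeds the map charts built in the Jupyter notebook.
df = client.query(QUERY).to_dataframe()
print(df.head())
```
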
Optiva Inc.
Solution Integration Architect
2018 - 2019 (1 year), Remote
  • Worked 100% remotely as part of a global professional services team (English- and Spanish-speaking teams).

  • Integrated and configured Microsoft Dynamics CRM (DCRM) and customer portals for telecom customers in different countries.

  • Wrote system integration tests, user acceptance tests, test scenarios, configuration handbooks, user training, and production rollout documents.

  • Analyzed data from different CRM environments, exporting data to Excel and comparing it with formulas such as VLOOKUP to troubleshoot missing configuration parameters.

BSS, ASPX, Microsoft Dynamics CRM, Microsoft SQL Server
Wedo Technologies
Data Analyst | Technical Leader
2011 - 2018 (7 years), Remote
  • Worked with large volumes of data, applying data analytics techniques to find revenue leakages and trends across telecom systems (a schematic reconciliation example follows this entry).

  • Integrated several telecom systems into the RAID ETL tool, reading different data sources (Oracle, SQL Server, Excel, CSV, ASN.1) containing telecom events and customer data.

  • Performed Oracle database and SQL performance tuning, developed PL/SQL procedures, and wrote Python and shell scripts.

  • Provided system analysis and solution design documentation, scope and architecture definition, and technical proposals for sales.

  • Designed user reports, dashboards, and KPIs to support data-driven decisions and identify revenue leakages.

  • Oversaw unit tests, integration tests, UAT, and production rollouts, and delivered technical and functional training to customers and teams.

  • Provided technical leadership and project management in different projects.

  • Worked on various projects for telecommunications customers in different countries, such as Brazil, Chile, and Peru.

PL/SQL, Business Intelligence (BI), ETL, Python, Shell, Unix, RAID, Microsoft SQL Server, Oracle
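
The actual revenue assurance work used the RAID ETL tool and PL/SQL; purely as an illustration of the reconciliation idea behind it, here is a schematic pandas sketch with hypothetical file and column names.

```python
# Schematic revenue-assurance reconciliation: compare event counts and charged
# amounts between a mediation feed and the billing system to flag leakage.
import pandas as pd

mediation = pd.read_csv("mediation_events.csv")  # e.g., exported from Oracle
billing = pd.read_csv("billed_events.csv")       # e.g., exported from SQL Server


def summarize(df: pd.DataFrame) -> pd.DataFrame:
    # Aggregate per subscriber: how many events and how much money.
    return df.groupby("subscriber_id").agg(
        events=("event_id", "count"),
        amount=("amount", "sum"),
    )


diff = summarize(mediation).join(
    summarize(billing), lsuffix="_mediation", rsuffix="_billing", how="outer"
).fillna(0)

diff["missing_events"] = diff["events_mediation"] - diff["events_billing"]
diff["amount_gap"] = diff["amount_mediation"] - diff["amount_billing"]

# Subscribers with events in mediation that never reached billing are leakage candidates.
leakage = diff[diff["missing_events"] > 0].sort_values("amount_gap", ascending=False)
print(leakage.head(20))
```
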
Seventh
Developer
2009 - 2011 (2 years), Remote
  • Contributed to Delphi programming (MVC, object-oriented) for video surveillance software.

  • Integrated different kinds of devices such as IP cameras and video encoders.

  • Led network protocol integration using CGI, SOAP, HTTP, TCP, and RTSP.

  • Reverse-engineered protocols using Wireshark. Decoded video and audio using VLC libraries.

CGI, SOAP, Network Protocols, TCP/IP, Delphi
Alliance Consultoria
Trainee | System Analyst
2004 - 2007 (3 years), Remote
  • Developed software using the Uniface language (Compuware).

  • Integrated databases including Oracle, SQL Server, and DB2.

  • Contributed to development using Agile methodologies.

  • Participated in a CMMI level 2 implementation project.

IBM DB2, Microsoft SQL Server, Oracle
SimplyWise
Data Engineer
2024 - Present, Remote
  • Designed and built a data architecture to help the company make data-driven decisions.

  • Integrated multiple data sources such as Apple Search, Google Ads, MySQL, and Amplitude into a Redshift data warehouse.

  • Configured data integration services such as Fivetran and Stitch to collect data from different sources.

  • Developed a Python data pipeline to read and parse JSON files, upload them to Amazon S3, and load them into Redshift tables (sketched after this entry).

  • Installed and configured dbt (data build tool) to perform data transformation through ELT data modeling.

  • Built dashboards and reports using Tableau Online.

Amplitude, Redshift, Python, Data Build Tool (dbt), Tableau, Amazon S3 (AWS S3), Amazon EC2, Fivetran, Stitch Data, SQL, ELT, ETL, Data Pipelines, Data Engineering, Data Architecture
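
A simplified sketch of the JSON-to-S3-to-Redshift pipeline described above, using boto3 and psycopg2; the bucket, key, IAM role, table, and connection settings are placeholders, not the production values.

```python
# Simplified JSON -> S3 -> Redshift pipeline with placeholder identifiers.
import json

import boto3
import psycopg2

BUCKET = "my-data-lake"              # hypothetical bucket
KEY = "events/2024/events.json"      # hypothetical key
IAM_ROLE = "arn:aws:iam::123456789012:role/redshift-copy"  # hypothetical role


def upload_events(path: str) -> None:
    # Parse the file first so malformed JSON never reaches the warehouse.
    with open(path) as fh:
        json.load(fh)
    boto3.client("s3").upload_file(path, BUCKET, KEY)


def load_into_redshift() -> None:
    conn = psycopg2.connect(
        host="redshift-cluster.example.internal",
        dbname="analytics", user="loader", password="***", port=5439,
    )
    # COPY pulls the staged file straight from S3 into the target table.
    with conn, conn.cursor() as cur:
        cur.execute(
            f"""
            COPY raw.events
            FROM 's3://{BUCKET}/{KEY}'
            IAM_ROLE '{IAM_ROLE}'
            FORMAT AS JSON 'auto';
            """
        )


if __name__ == "__main__":
    upload_events("events.json")
    load_into_redshift()
```
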
Projeto 22
Data Engineer
2024 - Present, Remote
  • Worked as a part-time data engineer, supporting the data squad on a new data lake project for WebMotors, a vehicle sales company.

  • Supported the data engineering team by provisioning AWS resources: EC2, S3, VPC, RDS (Aurora PostgreSQL), and Redshift.

  • Developed CloudFormation templates to automate AWS resource provisioning.

  • Deployed an Apache NiFi cluster for data ingestion processing using ZooKeeper and NiFi Registry.

  • Integrated AWS DMS (Database Migration Service), mapping tables from SQL Server and Aurora MySQL to S3 buckets as Parquet files.

  • Used Apache Airflow and an EMR cluster to support complex data processing.

  • Built CloudWatch alarms for server and application monitoring.

  • Used Lambda and Boto3 to automatically stop and start EC2 and RDS instances on a schedule, reducing costs (a handler sketch follows this entry).

  • Developed Python scripts for several purposes, such as database stress tests.

Amazon Web Services (AWS), Apache Airflow, Apache NiFi, EMR, PostgreSQL, Relational Database Services (RDS), Amazon Virtual Private Cloud (VPC), Amazon S3 (AWS S3), Amazon EC2
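
A minimal sketch of the scheduled cost-saving shutdown described above, written as a Lambda handler using Boto3; the EventBridge trigger, tag key and value, and cluster identifier are hypothetical.

```python
# Illustrative Lambda handler, triggered by an EventBridge schedule, that stops
# tagged EC2 instances and a dev Aurora cluster outside working hours.
import boto3

ec2 = boto3.client("ec2")
rds = boto3.client("rds")


def lambda_handler(event, context):
    # Find running instances tagged for scheduled shutdown (tag values are placeholders).
    reservations = ec2.describe_instances(
        Filters=[
            {"Name": "tag:Schedule", "Values": ["office-hours"]},
            {"Name": "instance-state-name", "Values": ["running"]},
        ]
    )["Reservations"]
    instance_ids = [
        i["InstanceId"] for r in reservations for i in r["Instances"]
    ]
    if instance_ids:
        ec2.stop_instances(InstanceIds=instance_ids)

    # Aurora is stopped at the cluster level rather than per instance.
    rds.stop_db_cluster(DBClusterIdentifier="dev-aurora-cluster")

    return {"stopped_ec2_instances": instance_ids}
```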

Portfolio

Data Analyst for a Toptal Client

Designed a geospatial dashboard (map) for a vehicle-sharing company, displaying user adoption and behavior changes related to defined parking places. The data was extracted with SQL using Google BigQuery's geospatial functions, and the dashboard was built with Mode, Python, and Jupyter Notebook.

Data Engineer for Fin Tech Company

Designed and built a data architecture to help the company make data-driven decisions. I integrated multiple data sources such as Apple Search, Google Ads, MySQL, and Amplitude into an AWS Redshift data warehouse, and configured data integration services like Fivetran and Stitch to collect data from different sources. I also developed a Python pipeline to read and parse JSON files, upload them to AWS S3, and load them into Redshift tables. Finally, I installed and configured dbt (data build tool) to perform data transformation (ELT data modeling) and designed the dashboards and reports using Tableau Online.

New Data Architecture

I supported the company in building a new data architecture in two stages, automating data ingestion, storing all the data in one data warehouse, and letting the data analysis team focus on the business intelligence side. In the first stage, we made the data available to data analysts as quickly as possible by using Fivetran and storing it in Snowflake. In the second stage, we built custom connectors using Airflow and implemented more organized data modeling with dbt and SQL.

Education

Master of Business Administration (MBA) in Project Management
Fundação Getulio Vargas (FGV)
2011 - 2012 (1 year)

Bachelor's Degree in Computer Engineering
Universidade Metodista de São Paulo
2002 - 2007 (5 years)