Jeff is a Senior BI Consultant with a penchant for working on the cutting edge of high-performance computing, data science, and advanced analytics projects; providing oversight for design decisions, best practices, and processes. He employs modern BI/ETL tools to process large volumes of data; creating, maintaining, and optimizing data delivery and extraction - having successfully delivered 4 data warehouses, 8 BI reporting systems, and dozens of data analysis projects for clients.
Provided engineering expertise for the implementation of ETL/BI/Data Warehouse solutions on BC Hydro's smart meter project; promoting data quality, integrity, and accuracy, and supporting data standards.
Worked on data transition, transformation, and movement into the BI repository; supporting software installations, maintenance, and upgrades - testing software updates and patches on the project.
Designed, developed, and deployed interactive Business Intelligence (BI) dashboards on the project; integrating data from various data sources into the dashboard.
Worked on the end-to-end design and implementation of BI/ETL (reporting and analytics) solutions for ATCO's clients in the Oil and Gas and Utility industries.
Utilized key technical resources from clients for the modelling of data warehouse systems; providing optimal, cost-effective solutions that helped ATCO's clients achieve their business objectives.
Provided technical leadership in the analysis of solution data; working to improve accuracy, speed of delivery, and effectiveness through changes to and implementation of processes and systems.
Provided accurate reports on business-critical KPIs and related parameters to support business decisions in the design and implementation of ETL solutions for URS' smart plan and heat system.
Provided thought leadership for the delivery of value-added insights and analysis; constructing data decomposition diagrams, creating data flow diagrams, and documenting processes.
Identified development and data quality issues on projects; working with leads to implement new or mitigating solutions.
Applied functional knowledge to the design of the data model and datamart (star schema) for the OLAP web-based GIS management system.
Offered leadership for the implementation of solutions on the inspection and collision systems; steering them through best practices, resolving issues, and ensuring knowledge transfer between consultants and the Government of Alberta's resources.
Provided solution strategy, oversight and guidance for large, complex projects; acting as a point of contact for Oracle implementations, managing solution scope, and keeping functional processes on track.
Identified areas for improvement on the Alberta Blue Cross platform and built both long-term and ad-hoc solutions.
Reviewed and resolved production defect issues for business users through debugging objects on the production landscape.
Performed production support activities including daily job monitoring, troubleshooting break/fix issues and remediation, and system enhancements in support of Alberta Blue Cross' platform.
Designed, analyzed, implemented and supported solutions for the reporting system and dashboards; evaluating available alternatives and managing resource planning and constraints on the project.
Empowered end-users on projects through instruction and functional knowledge transfer of business solutions.
Provided support for Oracle BIEE solutions by addressing system function anomalies and end-user questions/issues; fixing performance, data quality, and related issues during production support.
Developed databases and systems that allowed for the creation of dashboards, KPI scorecards, and additional reports on solutions for CGI.
Analyzed BI requirements for the design and development of reporting solutions to support CGI's standard reports and dashboards.
Administered data warehouse solutions on the project; troubleshooting technical issues and coordinating with the support team to resolve product bugs and issues.
This project synchronized data from various sources into target databases such as PostgreSQL and Redshift; syncing data through a batch process using an incremental approach. It can generate mappings between data sources and targets - eliminating the need for manual ETL mapping on the solution. It also comes with a commit feature that restarts failed sync operations from the point of failure; saving users time and cost.
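Below is a minimal Python sketch of that checkpointed incremental sync. SQLite stands in for the PostgreSQL/Redshift targets so the example is self-contained, and the table and function names (sync_checkpoint, incremental_sync) are illustrative assumptions, not the project's actual code.

```python
import sqlite3

CHECKPOINT_TABLE = "sync_checkpoint"  # hypothetical bookkeeping table for the commit feature

def incremental_sync(source: sqlite3.Connection, target: sqlite3.Connection,
                     table: str, key_col: str = "id", batch_size: int = 500) -> int:
    """Copy rows whose key_col exceeds the last committed checkpoint.

    The checkpoint is committed after every batch, so a failed run
    restarts from the point of failure rather than from scratch.
    Assumes the target table already exists with the source's schema
    and that key_col is the first column selected.
    """
    target.execute(f"CREATE TABLE IF NOT EXISTS {CHECKPOINT_TABLE} "
                   "(tbl TEXT PRIMARY KEY, last_key INTEGER)")
    row = target.execute(f"SELECT last_key FROM {CHECKPOINT_TABLE} WHERE tbl = ?",
                         (table,)).fetchone()
    last_key = row[0] if row else 0

    copied = 0
    while True:
        # Incremental pull: only rows newer than the checkpoint, in batches.
        batch = source.execute(
            f"SELECT * FROM {table} WHERE {key_col} > ? ORDER BY {key_col} LIMIT ?",
            (last_key, batch_size)).fetchall()
        if not batch:
            break
        placeholders = ",".join("?" * len(batch[0]))
        target.executemany(f"INSERT INTO {table} VALUES ({placeholders})", batch)
        last_key = batch[-1][0]  # key_col assumed to be the first column
        target.execute(f"INSERT INTO {CHECKPOINT_TABLE} (tbl, last_key) VALUES (?, ?) "
                       "ON CONFLICT(tbl) DO UPDATE SET last_key = excluded.last_key",
                       (table, last_key))
        target.commit()  # per-batch commit is what makes restart-from-failure possible
        copied += len(batch)
    return copied
```

Committing the checkpoint in the same transaction as each batch is the key design choice: an interrupted sync never re-copies committed rows and never skips uncommitted ones.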
Worked on a complex query solution involving a large number of SQL queries in its Metabase reporting tool. The new solution parses all queries and easily maps data and queries from source to target; it lists all columns and sub-queries, mapping them automatically to targets.
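A short sketch of how such column and sub-query extraction can work, using the open-source sqlglot parser (an assumption for illustration; the project's actual parsing approach is not specified here):

```python
import sqlglot
from sqlglot import exp

def extract_lineage(sql: str) -> dict:
    """List every column and sub-query in a SQL statement so each
    can be mapped from source to target automatically."""
    tree = sqlglot.parse_one(sql)
    columns = sorted({c.sql() for c in tree.find_all(exp.Column)})
    subqueries = [s.sql() for s in tree.find_all(exp.Subquery)]
    return {"columns": columns, "subqueries": subqueries}

if __name__ == "__main__":
    # Hypothetical reporting query with a nested scalar sub-query.
    query = """
        SELECT o.id, o.total,
               (SELECT name FROM customers c WHERE c.id = o.customer_id) AS customer
        FROM orders o
    """
    print(extract_lineage(query))
```

Walking the parsed syntax tree instead of matching text is what lets every column and nested sub-query be enumerated reliably, even in deeply nested queries.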
Developed a new reporting tool for the client's big data lake platform hosted on AWS. Used Redshift Spectrum to map S3 data into Redshift, created aggregation reports in SQL, and presented results as CSV through an AWS Lambda function built on the Serverless Framework. The tool delivers results through an API link, uses Python to customize ETL processes, and uses Airflow to orchestrate incremental processes on the solution.
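A minimal sketch of that Lambda-backed reporting flow: run an aggregation query against Redshift (where Spectrum external tables map the S3 data) and return the result as CSV through the API response. The schema, query, and environment variable names are illustrative assumptions, and psycopg2 is assumed to be packaged with the deployment (e.g. as a Lambda layer).

```python
import csv
import io
import os

import psycopg2  # assumed to be bundled as a Lambda layer

AGG_SQL = """
    SELECT region, SUM(amount) AS total_amount   -- hypothetical aggregation report
    FROM spectrum_schema.sales_events             -- external (Spectrum) table over S3
    GROUP BY region
    ORDER BY total_amount DESC
"""

def handler(event, context):
    conn = psycopg2.connect(
        host=os.environ["REDSHIFT_HOST"],
        port=int(os.environ.get("REDSHIFT_PORT", "5439")),
        dbname=os.environ["REDSHIFT_DB"],
        user=os.environ["REDSHIFT_USER"],
        password=os.environ["REDSHIFT_PASSWORD"],
    )
    try:
        with conn.cursor() as cur:
            cur.execute(AGG_SQL)
            headers = [col[0] for col in cur.description]
            rows = cur.fetchall()
    finally:
        conn.close()

    # Serialize the aggregation result as CSV for the API caller.
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(headers)
    writer.writerows(rows)
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "text/csv"},
        "body": buf.getvalue(),
    }
```

Exposed behind an API Gateway endpoint, the function turns each API link request into a fresh CSV report, while Airflow handles the incremental ETL runs that keep the underlying tables current.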