Megha M.

Data Engineer

Bangalore, India

About Me

Diverse and comprehensive IT experience across multiple roles, including Technical Manager, Senior Consultant, Data Engineer, Team Leader, and Scrum Master, with deep expertise in ETL processes and data migration projects. Has led numerous data-centric projects to go-live, with end-to-end delivery responsibility spanning design, planning, coding, project execution, automation, resourcing, mentoring, quality assurance, and delivery.

Skills

JSON Web Tokens (JWT), Kafka Streams, Stock Trading, Amazon Web Services (AWS), Guice, Role-based Access Control (RBAC), Cloud Databases, Full-stack, Jenkins, Sybase IQ, Amazon API Gateway, Architecture, Cloud Architecture, Debugging, MySQL, Test-driven Development (TDD), Serverless Framework, Data Analytics, Back-end Architecture, FinTech, Pandas, Team Mentoring, Amazon EC2, CDC, Integration, REST APIs, Amazon RDS, AWS Data Pipeline Service, KNIME, Python, Serverless, C, Google Guava, Subversion (SVN), Amazon S3 (AWS S3), Amazon Virtual Private Cloud (VPC), AWS AppSync, Kubernetes, REST, Software Design, Sybase, Android SDK, Back-end, Kibana, Material Design, Mockito, Optimization, Swagger, TypeScript, CSS, DevOps, PostgreSQL, Project Estimation, Spring, Microservice, System Architecture, Technical Architecture, Third-party APIs, Android APIs, Database Design, Mobile Web Development, Amazon API, CI/CD Pipelines, Hadoop, NoSQL, Amazon DynamoDB, AWS CloudFormation, Big Data, Design Patterns, Elasticsearch, Solution Architecture, AWS Lambda, Data Analysis, Java, Slang, Interactive Brokers API, Mobile Development, Near-field Communication (NFC), AWS Amplify, Back4App, Data Engineering, SQL, Trading, Vue, Amazon Aurora, ETL, Google Guice, Leadership, Microsoft SQL Server, Next.js, Amazon Ion, Distributed Systems, Project Management, Spring Boot, Apache Spark, Kotlin, MongoDB, Python 3, React, Scalability, Spark, Apache Kafka, Microservices, Flask, Gradle, JavaScript, Microservices Architecture, Docker, OAuth 2, Oracle SQL, Algorithms, AWS Elastic Beanstalk, AWS Glue, Perl, AWS Cloud Computing Services, Redshift, Bitcoin, Data Pipelines, Data Structures, HTML, NumPy, Parse Server, SAML, Automated Trading Software, AWS DevOps, Blockchain, Computer Science, Git, JSI

Work history

Oracle
Data Migration ETL Lead | Data Engineer
2012 - Present (13 years)
Remote
  • Led and mentored a team of junior engineers on a high-stakes, multi-million-dollar data migration project.

  • Delivered an extensive ETL data migration engagement covering large-volume data extraction, staging, cleansing, enrichment, formatting, application of complex business rules, and the final load of 10 years of data into FLEXCUBE tables, leveraging several automation tools built by my team and me.

  • Collaborated and coordinated with multiple stakeholders, including FLEXCUBE teams, SCB Bank, and FBC Holdings, to ensure seamless integration of systems.

  • Implemented the project by writing PL/SQL migration scripts and running multiple mock and DR runs on static and dynamic data, meeting daily deliverables under strict, non-negotiable deadlines.

  • Automated processes, created reusable generic utilities, conducted POCs, and carried out performance tuning to accommodate the 10-year data migration.

  • Designed and oversaw development of the migration of the highly complex Signatures module, extracting data from XML files, converting it to CSV, and reconciling it with JPEG images for over 10 years of customer records.

  • Ran the project's Agile ceremonies for a team of more than 11 members, including daily scrums, story creation and assignment in Jira, and retrospectives.
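The Signatures migration above involved converting XML exports to CSV. A minimal sketch of that conversion step in Python, using hypothetical element names (the real FLEXCUBE export schema differs):

```python
import csv
import io
import xml.etree.ElementTree as ET

def signatures_xml_to_csv(xml_text, out_stream):
    """Flatten <signature> records from an XML export into CSV rows.
    Element and field names here are illustrative assumptions."""
    root = ET.fromstring(xml_text)
    writer = csv.writer(out_stream)
    writer.writerow(["customer_id", "signed_on", "image_file"])
    for sig in root.iter("signature"):
        writer.writerow([
            sig.findtext("customer_id", default=""),
            sig.findtext("signed_on", default=""),
            sig.findtext("image_file", default=""),
        ])

sample = """<signatures>
  <signature>
    <customer_id>C001</customer_id>
    <signed_on>2014-03-02</signed_on>
    <image_file>C001.jpg</image_file>
  </signature>
</signatures>"""

buf = io.StringIO()
signatures_xml_to_csv(sample, buf)
print(buf.getvalue().strip())
```

In the real engagement the `image_file` column would then be reconciled against the JPEG signature images on disk.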

Infosys Technologies
Technology Lead | Software Engineer | Lead Tester
2006 - 2012 (6 years)
Remote
  • Implemented and configured the data product PMM to enable real-time data integration, transitioning the application from batch to real-time processing. As a result, real-time data became available in the Netezza reporting database, letting the campaigning team track customers' purchasing behavior as it happened and create more meaningful campaigns and coupons.

  • Enhanced the features and supported a web application that processes and reconciles travel data from multiple agencies, generating financial reports.

  • Developed PL/SQL procedures, debugged applications, created i-Reports, and automated testing using Rational Functional Tester (RFT).

Portfolio

Back-end Web Server

I developed a comprehensive back-end server utilizing Python and Flask, which was deployed on AWS Lambda and API Gateway. This server seamlessly integrated numerous APIs into a front-end website, Android application, and various app scripts. Additionally, I implemented JWT authentication and role-based access control to ensure secure access across all the APIs.
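The JWT-plus-RBAC pattern described above can be sketched with a hand-rolled HS256 token verifier. This is illustrative only: a production server would use a library such as PyJWT, and the secret and claim names here are assumptions.

```python
import base64, hashlib, hmac, json, time

SECRET = b"demo-secret"  # illustrative only; real secrets come from config

def _b64(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def make_jwt(claims: dict) -> str:
    header = _b64(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = _b64(json.dumps(claims).encode())
    sig = hmac.new(SECRET, f"{header}.{payload}".encode(), hashlib.sha256).digest()
    return f"{header}.{payload}.{_b64(sig)}"

def verify_jwt(token: str) -> dict:
    header, payload, sig = token.split(".")
    expected = hmac.new(SECRET, f"{header}.{payload}".encode(), hashlib.sha256).digest()
    if not hmac.compare_digest(_b64(expected), sig):
        raise PermissionError("bad signature")
    claims = json.loads(base64.urlsafe_b64decode(payload + "=" * (-len(payload) % 4)))
    if claims.get("exp", 0) < time.time():
        raise PermissionError("token expired")
    return claims

def require_role(token: str, role: str) -> dict:
    """Role-based access control: verify the token, then check the role claim."""
    claims = verify_jwt(token)
    if role not in claims.get("roles", []):
        raise PermissionError("missing role")
    return claims

tok = make_jwt({"sub": "user1", "roles": ["admin"], "exp": time.time() + 60})
print(require_role(tok, "admin")["sub"])  # → user1
```

In a Flask app, `require_role` would typically be wrapped in a decorator applied to each protected route.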

Rule Execution Platform for Amazon

I was a full-stack developer contributing to an Ion-based rule-execution platform that facilitated microservice creation on the Amazon platform. During my tenure, I designed and implemented the following components:

  • An Ion data storage and retrieval solution utilizing DynamoDB and Elasticsearch.
  • A user interface portal powered by REST APIs, enabling seamless CRUD operations on all system resources for clients.
  • An on-box caching solution.
  • A robust permission control system to ensure secure data sharing among clients.
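The on-box caching component can be illustrated with a minimal TTL cache. This is a sketch of the general technique only; the production cache presumably also handled size limits and invalidation.

```python
import time

class TTLCache:
    """Minimal on-box cache sketch: entries expire after ttl_seconds."""

    def __init__(self, ttl_seconds=60.0, clock=time.monotonic):
        self.ttl = ttl_seconds
        self.clock = clock
        self._store = {}  # key -> (expires_at, value)

    def get(self, key, loader):
        now = self.clock()
        entry = self._store.get(key)
        if entry is not None and entry[0] > now:
            return entry[1]            # cache hit, still fresh
        value = loader(key)            # miss or stale: reload from the backend
        self._store[key] = (now + self.ttl, value)
        return value

calls = []
def load_rule(key):
    # Stand-in for a DynamoDB/Elasticsearch lookup
    calls.append(key)
    return {"id": key}

cache = TTLCache(ttl_seconds=5)
cache.get("rule-1", load_rule)
cache.get("rule-1", load_rule)   # second call served from cache
print(len(calls))  # → 1
```

Injecting the clock makes expiry behavior easy to unit-test without sleeping.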

Messaging App for Android

I developed a native Android application for non-instant messaging. As a full-stack developer, I designed the frontend using Material Design components and used a Back4App-hosted Parse Server as the backend.

Swift Claim API for Amazon

Utilized AWS native technologies, including Lambda, CloudAuth, and API Gateway, to implement a client-facing REST API that lets external clients claim Amazon e-Gift cards directly into customer accounts. Established the AWS CloudFormation stack connecting API Gateway with Amazon's internal network of microservices. Developed a Java client library widely used by external clients to integrate with the API. Mitigated the Lambda cold-start issue by leveraging scheduled AWS CloudWatch events, reducing average latency by nearly 50%.
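The cold-start mitigation above typically works by having a scheduled CloudWatch Events rule ping the function so its container stays resident. A hedged sketch of the handler side (the payload shape of the real rule and the claim path are assumptions):

```python
import json

def handler(event, context=None):
    """Lambda handler sketch that short-circuits scheduled keep-warm pings."""
    if event.get("source") == "aws.events":
        # Scheduled keep-warm ping from CloudWatch Events: return
        # immediately so the container stays warm without doing real work.
        return {"warmed": True}
    # Hypothetical real path: claim the gift card named in the request.
    return {"statusCode": 200,
            "body": json.dumps({"claimed": event["gift_card_id"]})}

print(handler({"source": "aws.events"}))   # → {'warmed': True}
```

Scheduled CloudWatch/EventBridge events carry `"source": "aws.events"`, which is what the early return keys on.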

Data Pruning Solution for GS Data Lake

As a back-end developer, I worked closely with the senior developers who designed the HDFS export feature, building a full understanding of the system to implement a client-requested data-pruning capability for Sybase IQ-based virtual warehouses. The feature validates that data exists in the underlying HDFS before executing the pruning workflow, while maintaining backward compatibility and long-term maintainability.

Automated Trading App

I designed and built an automated trading application in Python using the Interactive Brokers API to buy and sell securities. Trading decisions were driven by technical indicators, including DMI and ADX.
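The DMI/ADX indicator math can be sketched as follows. This is a simplified, self-contained illustration of the standard Wilder formulas on synthetic data, not the production app's code, which consumed live Interactive Brokers feeds.

```python
def wilder_smooth(values, period):
    """Wilder smoothing: seed with the sum of the first `period` values,
    then s[i] = s[i-1] - s[i-1]/period + values[i]."""
    out = [sum(values[:period])]
    for v in values[period:]:
        out.append(out[-1] - out[-1] / period + v)
    return out

def dmi_adx(highs, lows, closes, period=14):
    tr, plus_dm, minus_dm = [], [], []
    for i in range(1, len(highs)):
        # True range and directional movement per bar
        tr.append(max(highs[i] - lows[i],
                      abs(highs[i] - closes[i - 1]),
                      abs(lows[i] - closes[i - 1])))
        up, down = highs[i] - highs[i - 1], lows[i - 1] - lows[i]
        plus_dm.append(up if up > down and up > 0 else 0.0)
        minus_dm.append(down if down > up and down > 0 else 0.0)
    atr = wilder_smooth(tr, period)
    plus_di = [100 * p / a for p, a in zip(wilder_smooth(plus_dm, period), atr)]
    minus_di = [100 * m / a for m, a in zip(wilder_smooth(minus_dm, period), atr)]
    dx = [100 * abs(p - m) / (p + m) if p + m else 0.0
          for p, m in zip(plus_di, minus_di)]
    # ADX is the Wilder-smoothed average of DX
    adx = [sum(dx[:period]) / period]
    for x in dx[period:]:
        adx.append((adx[-1] * (period - 1) + x) / period)
    return plus_di, minus_di, adx

# Synthetic steady uptrend: +DI dominates and ADX saturates near 100.
highs = [10.0 + i for i in range(40)]
lows = [9.0 + i for i in range(40)]
closes = [9.5 + i for i in range(40)]
plus_di, minus_di, adx = dmi_adx(highs, lows, closes)
print(round(plus_di[-1], 1), round(minus_di[-1], 1), round(adx[-1], 1))  # → 66.7 0.0 100.0
```

A typical trading rule then treats ADX above some threshold (often 25) as "trending" and uses the +DI/−DI crossover for direction.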

Data Pipeline for Amazon

I redesigned a data warehousing pipeline, reducing the team's operational burden by 70%. This involved engaging stakeholders to separate desirable from essential requirements for the streaming data; for instance, some missing data was acceptable, but duplicate data was not. After exploring multiple candidate solutions and weighing the technical trade-offs of each, I designed and implemented a new serverless solution using Fast Data Pipelines, which included a backward-compatible modification in all six related microservices. I also ran daily SQL jobs to reconcile data between the old and new solutions, and cut over to the new solution once its accuracy was verified.
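The daily reconciliation jobs can be sketched with two queries, shown here against an in-memory SQLite database. The table and column names are hypothetical; the real jobs ran against the warehouse.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE old_pipeline (event_id TEXT, amount REAL);
    CREATE TABLE new_pipeline (event_id TEXT, amount REAL);
    INSERT INTO old_pipeline VALUES ('e1', 10.0), ('e2', 20.0), ('e3', 30.0);
    -- e3 missing (tolerable), e1 duplicated (must be flagged)
    INSERT INTO new_pipeline VALUES ('e1', 10.0), ('e1', 10.0), ('e2', 20.0);
""")

# Duplicates in the new pipeline were unacceptable:
dupes = conn.execute("""
    SELECT event_id, COUNT(*) FROM new_pipeline
    GROUP BY event_id HAVING COUNT(*) > 1
""").fetchall()

# Rows the new pipeline dropped (acceptable, but worth tracking):
missing = conn.execute("""
    SELECT o.event_id FROM old_pipeline o
    LEFT JOIN new_pipeline n ON n.event_id = o.event_id
    WHERE n.event_id IS NULL
""").fetchall()

print(dupes)    # → [('e1', 2)]
print(missing)  # → [('e3',)]
```

The asymmetry in the two checks mirrors the requirement above: duplicates fail the run, while missing rows are merely reported.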

Education

B. Tech in Electronics and Communication Engineering
Guru Jambheshwar University of Science & Technology, Haryana
2003 - 2007 (4 years)