DevOps Engineer - Created a Web Application to Manage AWS WorkSpaces
Created a web portal to help staff manage (request, create, rebuild, and delete) AWS WorkSpaces for remote work. The solution was built on several AWS services, with Jekyll for the static site and Terraform for deployment. Amazon Cognito was integrated into the Jekyll site to authenticate staff, and AWS Step Functions orchestrated the Lambda functions that request, create, rebuild, and delete WorkSpaces. API Gateway was integrated with the Lambda functions on the backend, and its endpoints were wired into the Jekyll site. AWS Directory Service managed the users and groups signing in to the WorkSpaces desktops, and the static site was deployed from an S3 bucket behind CloudFront, with AWS WAF for protection. Designed the solution, built the website, and created the IaC templates. The web portal launched successfully and is used by thousands of staff across the agency.
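The Step Functions orchestration described above can be sketched as a Lambda handler that builds and submits a WorkSpaces creation request. This is a minimal illustration, not the production code: the function names, event fields, and example IDs are all assumptions.

```python
# Sketch of a Lambda handler a Step Functions state machine could invoke to
# create a WorkSpace. All names and fields here are illustrative assumptions.

def build_workspace_request(directory_id: str, user_name: str, bundle_id: str) -> dict:
    """Build the payload shape for the WorkSpaces CreateWorkspaces API call."""
    return {
        "Workspaces": [
            {
                "DirectoryId": directory_id,  # AWS Directory Service directory
                "UserName": user_name,        # staff member requesting the desktop
                "BundleId": bundle_id,        # hardware/software bundle
            }
        ]
    }

def handler(event, context):
    # In production this would call the AWS API via boto3:
    #   boto3.client("workspaces").create_workspaces(**request)
    request = build_workspace_request(
        event["directoryId"], event["userName"], event["bundleId"]
    )
    return {"status": "REQUESTED", "request": request}
```

A state machine would chain handlers like this one for the rebuild and delete paths, with Step Functions handling retries and failure states between them.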
Senior DevOps Engineer - Cloud Hosting Solution
Built a highly available, reliable cloud hosting environment for static sites and serverless workloads, serving developers, SMEs, and enterprises, using GKE, EKS, Anthos, Kafka, SQS, Redis, MongoDB, Postgres, XState, Prometheus, Grafana, Alertmanager, PagerDuty, Honeycomb, GraphQL, Apollo Server, OpenResty, Lua, Google Cloud Run, Datadog, and Knative. Designed and developed the hosting environment for frontend web pages, implementing multiple approaches to build and deploy apps into any desired cloud storage system. Deployed serverless functions on the backend using Knative.
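A Knative-served backend function is essentially an HTTP container that reads its listening port from the environment. The sketch below shows that shape under stated assumptions: the service name, handler logic, and payload format are placeholders, not the actual deployed function.

```python
# Minimal sketch of a Knative-style HTTP service. The business logic in
# handle() is a placeholder assumption; only the PORT convention is standard.
import json
import os
from http.server import BaseHTTPRequestHandler, HTTPServer

def handle(payload: dict) -> dict:
    """Placeholder business logic: acknowledge and echo the payload."""
    return {"status": "ok", "received": payload}

class Handler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        body = json.dumps(handle(payload)).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

def serve():
    # Knative injects the listening port via the PORT environment variable.
    port = int(os.environ.get("PORT", "8080"))
    HTTPServer(("", port), Handler).serve_forever()
```

Packaged into a container, a service like this lets Knative handle scale-to-zero and request-driven autoscaling without any server-management code in the function itself.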
DevOps Engineer / Infrastructure Manager - Designed an Asynchronous Infrastructure for Processing Millions of Records Ingested into an S3 Bucket (Marketing Recommendation Solution for E-Commerce Companies)
Worked on a marketing recommendation service deployed across the e-commerce platforms of major US brands. The system tracked out-of-stock products on the e-commerce providers' websites and notified customers when products were back in stock. Ingestion was set up with a FileZilla FTP server on an EC2 instance as the entry point, with per-partner credentials for uploading feed files. A CRON job moved each uploaded file to an S3 bucket, which triggered a Lambda function to read every record in the file and send it into a Kinesis stream. The application comprised 10 microservices working in an asynchronous flow, exchanging messages through SQS queues. Each fully processed record was written to an RDS table and sent as an email to the end customer. Created new features for the system, fixed application bugs, and created deployment templates in Terraform and CloudFormation. The system processes over 600 million records daily for e-commerce brands in the US serving over 5 million users.
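The S3-to-Kinesis fan-out step above might look roughly like the following sketch. The feed-file format, field positions, and helper names are assumptions; the real AWS calls are only indicated in comments.

```python
# Illustrative sketch (names and record format assumed) of the S3-triggered
# Lambda that reads a feed file and fans records out to a Kinesis stream.

def chunk(records, size=500):
    """Yield batches; Kinesis PutRecords accepts at most 500 records per call."""
    for i in range(0, len(records), size):
        yield records[i:i + size]

def to_kinesis_entries(lines, partition_field=0):
    """Map CSV lines to PutRecords entries, keyed on an assumed SKU column."""
    return [
        {"Data": line.encode(), "PartitionKey": line.split(",")[partition_field]}
        for line in lines
    ]

def handler(event, context, fetch=lambda bucket, key: []):
    # `fetch` stands in for the boto3 S3 read; in production roughly:
    #   boto3.client("s3").get_object(Bucket=bucket, Key=key)["Body"]
    s3 = event["Records"][0]["s3"]
    lines = fetch(s3["bucket"]["name"], s3["object"]["key"])
    sent = 0
    for batch in chunk(to_kinesis_entries(lines)):
        # boto3.client("kinesis").put_records(StreamName="<stream>", Records=batch)
        sent += len(batch)
    return {"recordsSent": sent}
```

Partitioning on a per-product key spreads records across Kinesis shards, which is what lets the downstream microservices consume the stream in parallel.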