Rohil is a Data Engineer who enables efficient, effective analysis and empowers users working with multiple data types by building clean pipelines and maintaining data products on projects. He builds and maintains reports, dashboards, and metrics that improve data performance, summarizing analyses, making recommendations, and providing the information that analysts and business owners need.
Developed new database solutions that reduced licensing costs and improved the performance of the read servers.
Migrated the existing database infrastructure to MySQL, deployed Kafka pipelines for replication tasks, and integrated third-party dashboards.
Set up 5 new servers for Media.net, optimized an existing system to run 100x faster, added new features to multiple systems, and improved their integration.
Designed and implemented a new data infrastructure that generates different reports on a primary reporting interface for Media.net.
Delivered generic code for the primary reporting interface and implemented solutions that limit the maximum waiting time on the download queue to 2 seconds.
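One way the 2-second cap on the download queue could work is a bounded wait on the consumer side. This is a minimal sketch, not the project's actual implementation; the function names and queue contents are assumptions for illustration.

```python
import queue
from typing import Optional

# Shared queue of pending download jobs (job IDs are hypothetical).
download_queue: "queue.Queue[str]" = queue.Queue()

def request_download(job_id: str) -> None:
    """Enqueue a download request."""
    download_queue.put(job_id)

def next_download(max_wait_s: float = 2.0) -> Optional[str]:
    """Return the next queued job, or None if nothing arrives
    within max_wait_s seconds (the 2-second cap described above)."""
    try:
        return download_queue.get(timeout=max_wait_s)
    except queue.Empty:
        return None
```

The timeout ensures a waiting consumer is never blocked longer than the cap, so the caller can fall back (retry, report "busy") instead of hanging.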
Implemented a new MSSQL server to handle failover requirements, wrote PowerShell scripts for file handling, and reduced maintenance time 10x using a generic package.
Worked on a primary reporting interface, implementing solutions to speed up querying on the interface and to generate different types of reports from multiple tables with 100M+ rows within seconds. Implemented load-balanced schedulers to handle downloads and built a system that sends database and server error alerts directly to the instant messaging system.
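An alerting bridge like the one described is often a small webhook poster. The sketch below is an assumption-laden illustration: the webhook URL, payload shape, and messaging system are hypothetical, not the integration actually used on the project.

```python
import json
import urllib.request
from datetime import datetime, timezone

# Hypothetical incoming-webhook endpoint of the chat/messaging system.
WEBHOOK_URL = "https://chat.example.com/hooks/db-alerts"

def format_alert(host: str, error: str) -> bytes:
    """Build a JSON payload describing a database/server error."""
    payload = {
        "text": f"[ALERT] {host}: {error}",
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(payload).encode("utf-8")

def send_alert(host: str, error: str) -> None:
    """POST the alert to the messaging webhook (fire-and-forget;
    a production version would add retries and rate limiting)."""
    req = urllib.request.Request(
        WEBHOOK_URL,
        data=format_alert(host, error),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)
```

Keeping payload formatting separate from the network call makes the alert text easy to unit-test without hitting the webhook.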
Designed and deployed 5 new servers to relieve the load on existing servers and implemented solutions to download data reports and deliver the latest reports over email at regular intervals. Monitored and administered critical systems on the project, fixing bugs, optimizing performance, and adding new features.
Designed and built new Go-based connectors to fetch data from Netsuite, Zendesk, Promoter.io, Google Sheets, and Superset for Glint. Used Airflow pipelines to ingest data from these sources into the data lake, set up data streaming with Kafka, and ensured data quality on the project.
Education
Post-graduate Degree in Data Science
International Institute of Information Technology Bangalore