Avinash is an experienced Python developer with 10+ years of industry experience building web applications and performing back-end development. He has mostly worked in the fintech domain, with companies ranging from startups to large MNCs and in areas from hedge funds to risk management. Avinash has a strong grasp of software fundamentals and is an excellent problem solver. His latest project involved building REST APIs and a model calculator for Wells Fargo, a community-based financial services company.
Developed REST APIs and a model calculator for Wells Fargo, a community-based financial services company. Developed automated data copy functionality using Celery and Airflow.
Worked on designing a platform for modelers to write and develop models and perform data analysis.
Designed and developed a dataset processing library using Pandas and exposed a UI using Django, allowing users to fine-tune the processing.
Designed and developed a data check engine microservice as part of a larger Data Check framework using Flask, Pandas, and NumPy.
Created workflows in Airflow to automate the loading and consumption of data. Wrote a Python client library for business users that allows them to interact with dataset results.
Developed an application to perform economic checks (e.g., Sharpe ratio) on dataset results.
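As an illustration of the kind of economic check involved, the following is a minimal sketch (not the actual application code) of an annualized Sharpe ratio computed over a daily return series with Pandas and NumPy:

```python
import numpy as np
import pandas as pd

def sharpe_ratio(returns: pd.Series, risk_free_rate: float = 0.0, periods: int = 252) -> float:
    """Annualized Sharpe ratio: mean excess return over its standard deviation,
    scaled by sqrt(periods). `risk_free_rate` is an annual rate; `periods` is the
    number of return observations per year (252 trading days by default)."""
    excess = returns - risk_free_rate / periods
    return float(np.sqrt(periods) * excess.mean() / excess.std(ddof=1))
```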
Worked on the development of a language that can be used to write forecasting models for CCAR. It is implemented as an internal DSL in pure Python and supports delayed computation.
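A minimal sketch of the delayed-computation idea behind such an internal Python DSL (class names are hypothetical, not the actual implementation): expressions build a graph when written and are only evaluated later against concrete inputs.

```python
class Expr:
    """Base node: arithmetic builds a tree instead of computing immediately."""
    def __add__(self, other): return BinOp(self, other, lambda a, b: a + b)
    def __mul__(self, other): return BinOp(self, other, lambda a, b: a * b)
    def evaluate(self, env):  raise NotImplementedError

class Var(Expr):
    def __init__(self, name): self.name = name
    def evaluate(self, env):  return env[self.name]          # looked up at run time

class Const(Expr):
    def __init__(self, value): self.value = value
    def evaluate(self, env):   return self.value

class BinOp(Expr):
    def __init__(self, left, right, op): self.left, self.right, self.op = left, right, op
    def evaluate(self, env):
        return self.op(self.left.evaluate(env), self.right.evaluate(env))

# A forecast formula is declared once, then evaluated per scenario.
gdp_growth = Var("gdp_growth")
forecast = Const(100.0) * (Const(1.0) + gdp_growth)
print(forecast.evaluate({"gdp_growth": 0.02}))   # 102.0
```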
Developed a web application that runs calculations for CCAR and other risk exercises. Designed and wrote a data access layer that acts as an interface between the UI/APIs and an Oracle database.
Designed and developed a job monitoring application written in Enaml (for UI) and Python, as well as position on-boarding scripts.
Took ownership of a code-coverage tool, including running the tool and debugging and fixing bugs reported by users.
Incorporated Java code coverage into the command-line version of the tool. Developed a Python-based framework for small utilities.
Designed and developed a Python version of the command-line tool that was easier to maintain than the existing one. Wrote a Django web application to view source code from any staging area.
Worked on developing an Android application for stability testing and Key Performance Indicator (KPI) testing.
Created a Java-based Android application and Python test scripts in the System Validation team for stability and Key Performance Indicator (KPI) testing. Debugged issues on the Linux platform with the Android operating system.
Worked on-site at the validation headquarters in Sweden; tasks involved the transfer of ownership of a Python-based automated testing framework.
Our team at AQR computed certain parameters for mutual funds, and these values needed to be validated by checking whether they fell within a standard distribution. This was done manually in Excel: the data came in via a REST API, was extracted using Postman, and was then populated into an Excel file. I decided to implement this in Python because the checks took too long and were prone to manual errors. While gathering requirements, I realized this could be implemented as a framework that other teams could use as well. The framework I wrote was a REST service where the user could specify the check to be applied and how to fetch the data. The framework provided connectors to fetch data from various sources (S3, REST APIs, databases, files), offered basic checks (standard distribution, greater/less than), and exposed base check classes that could be overridden for custom implementations. The framework would apply the checks and store the intermediate and final results, which could then be viewed via another REST call.
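A minimal sketch of the base-check idea (class and field names are hypothetical, not the framework's actual API): each check receives a Pandas Series fetched by a connector and returns a result dictionary that the framework stores.

```python
import numpy as np
import pandas as pd

class BaseCheck:
    """Override `run` to implement a custom check; the framework stores the result."""
    name = "base"

    def run(self, data: pd.Series) -> dict:
        raise NotImplementedError

class StandardDistributionCheck(BaseCheck):
    """Flag values lying more than `n_sigmas` standard deviations from the mean."""
    name = "standard_distribution"

    def __init__(self, n_sigmas: float = 3.0):
        self.n_sigmas = n_sigmas

    def run(self, data: pd.Series) -> dict:
        z = (data - data.mean()) / data.std(ddof=1)      # z-score of each value
        outliers = data[np.abs(z) > self.n_sigmas]
        return {"check": self.name,
                "passed": outliers.empty,
                "outliers": outliers.to_dict()}
```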
In JPMC, a lot of our modelers used Excel for data modeling, but this is neither scalable nor trackable. We decided to change that by writing a platform/app that could be used as an alternative to Excel. We used JupyterLab for the UI, and the back end was implemented in Python (Pandas, NumPy, Xarray); we also provided REST APIs (Django) for users to create, share, and delete projects. JupyterLab would connect to a kernel that already had our library imported, and it also had a tab to view the tabular data (easier for folks who have always used Excel). This library contained functions with names and signatures similar to Excel's, so it was easy for our modelers to adopt. I developed one key feature: pre-loading user data into the Python kernel and assigning it to variable names defined by the user.
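A minimal sketch of that pre-loading idea, assuming a hypothetical kernel-startup hook and manifest format (not the platform's actual code): datasets selected in the UI are loaded and bound to user-chosen variable names inside the running IPython kernel.

```python
import pandas as pd
from IPython import get_ipython

def preload_user_data(manifest: dict) -> None:
    """`manifest` maps variable names to dataset paths chosen in the UI,
    e.g. {"prices": "/data/projects/demo/prices.parquet"} (illustrative)."""
    shell = get_ipython()                                   # the kernel's InteractiveShell
    frames = {name: pd.read_parquet(path) for name, path in manifest.items()}
    shell.push(frames)                                      # inject as top-level variables
    # After this runs at kernel startup, the user can simply refer to `prices`
    # in their notebook without writing any loading code.
```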
The project involved a setup where our clients could host a quiz for their employees and record the responses, with the data stored across different tables. I wrote a framework that would process these results and generate appropriate alerts based on various conditions. I had to figure out how to write optimal SQL to query our Postgres database so that it fetched only the required records rather than all of them. I then processed this data with Pandas and separated it into different sections based on the conditions for generating alerts. These alerts are then sent to the employees via in-app notifications and SMS. I also wrote a Django UI for controlling this process so that new alerts/conditions can be uploaded and clients and their notification channels can be configured.
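A minimal sketch of that fetch-and-bucket step, with hypothetical table, column, and threshold names (not the project's actual schema): the query pulls only rows recorded since the last run, and Pandas groups them into an alert bucket.

```python
import pandas as pd
from sqlalchemy import create_engine, text

engine = create_engine("postgresql+psycopg2://user:pass@host/quizdb")  # placeholder DSN

QUERY = text("""
    SELECT r.employee_id, r.quiz_id, r.score, r.submitted_at
    FROM quiz_responses r
    WHERE r.submitted_at > :last_run          -- fetch only new rows, not the full table
""")

def build_low_score_alerts(last_run):
    df = pd.read_sql(QUERY, engine, params={"last_run": last_run})
    # Example condition: flag employees whose average score falls below 40.
    low_scores = (
        df.groupby("employee_id")["score"]
          .mean()
          .loc[lambda s: s < 40]
          .reset_index()
    )
    return low_scores.to_dict("records")       # handed off to in-app / SMS notifiers
```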
Education
Bachelor of Technology (BTech), Electronics and Communications Engineering