Aditya is a developer with experience building machine learning and statistical models on large-scale datasets using cloud platforms and the latest big data technologies. With master's degrees from IE Business School and IIT (ISM) Dhanbad, Aditya has a solid understanding of data science across a range of business scenarios. He is also a former quantitative researcher who specialized in time series and machine learning-based strategies and risk models in financial markets.
Built time series forecasting models using state-of-the-art deep learning algorithms such as N-HiTS and N-BEATS, which outperformed traditional ARIMA and Holt-Winters exponential smoothing models (see the sketch below).
Built a proprietary trial optimization algorithm to predict the end date of trials, which outperformed all the time series models.
Built ensemble models for demand and sales forecasting.
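As context for the forecasting work above, here is a minimal sketch of fitting N-HiTS and N-BEATS with Nixtla's neuralforecast library; the file name, horizon, and hyperparameters are illustrative assumptions, not the production configuration.

```python
import pandas as pd
from neuralforecast import NeuralForecast
from neuralforecast.models import NHITS, NBEATS

# Long-format frame expected by neuralforecast: columns unique_id, ds, y.
df = pd.read_csv("sales.csv")  # hypothetical demand/sales history

horizon = 12  # forecast 12 periods ahead (illustrative)
models = [
    NHITS(h=horizon, input_size=2 * horizon, max_steps=500),
    NBEATS(h=horizon, input_size=2 * horizon, max_steps=500),
]

nf = NeuralForecast(models=models, freq="M")  # monthly data assumed
nf.fit(df)
forecasts = nf.predict()  # one column of predictions per model
print(forecasts.head())
```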
Developed a BERT-based conversational AI solution based on business requirements.
Converted natural language queries into SQL queries using a BERT-based deep learning architecture.
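The full text-to-SQL system is not reproduced here; as a hedged illustration of one BERT-based component, the sketch below uses Hugging Face transformers to ground a natural language question to the closest schema column via embedding similarity. The question, table, and column names are hypothetical.

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def embed(text: str) -> torch.Tensor:
    # Mean-pool the last hidden states into a single sentence vector.
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        output = model(**inputs)
    return output.last_hidden_state.mean(dim=1).squeeze(0)

question = "how many orders were placed last month"  # hypothetical query
columns = ["order_id", "order_date", "customer_name", "total_amount"]

q_vec = embed(question)
scores = {
    col: torch.cosine_similarity(q_vec, embed(col.replace("_", " ")), dim=0).item()
    for col in columns
}
grounded = max(scores, key=scores.get)  # column most similar to the question
print(f"Grounded column for the generated SQL: {grounded}")
```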
Contributed to significant parts of the back-end flows and took ownership of them.
Extracted various fields from contract PDFs using regex and deep learning models, and optimized the models with TensorRT to increase processing speed.
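As a hedged illustration of the regex side of this extraction work, the sketch below pulls two common contract fields from already-extracted PDF text; the field patterns and sample text are assumptions rather than the production rules, and the deep learning/TensorRT stage is omitted.

```python
import re

# Text as it might come out of a PDF extractor (hypothetical sample).
contract_text = """
This Agreement is made effective as of January 15, 2021, by and
between Acme Corp ("Supplier") and Globex Inc ("Customer").
The total contract value shall not exceed USD 250,000.00.
"""

FIELD_PATTERNS = {
    "effective_date": re.compile(r"effective as of\s+([A-Z][a-z]+ \d{1,2}, \d{4})"),
    "contract_value": re.compile(r"USD\s*([\d,]+(?:\.\d{2})?)"),
}

def extract_fields(text: str) -> dict:
    # Apply each named pattern and keep the first capture group, if any.
    fields = {}
    for name, pattern in FIELD_PATTERNS.items():
        match = pattern.search(text)
        fields[name] = match.group(1) if match else None
    return fields

print(extract_fields(contract_text))
# {'effective_date': 'January 15, 2021', 'contract_value': '250,000.00'}
```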
Deployed the deep learning models to production behind APIs in Docker containers and used AWS and GCP to enable autoscaling.
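A minimal sketch of serving a model behind a REST API of the kind described above, using Flask; the model path, endpoint, and payload shape are illustrative, and the Dockerfile and autoscaling configuration are omitted.

```python
import pickle

from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical pre-trained model serialized with pickle.
with open("model.pkl", "rb") as f:
    model = pickle.load(f)

@app.route("/predict", methods=["POST"])
def predict():
    payload = request.get_json()
    features = [payload["features"]]  # expects {"features": [..numbers..]}
    prediction = model.predict(features)
    return jsonify({"prediction": prediction.tolist()})

if __name__ == "__main__":
    # In production this would run behind a WSGI server inside a Docker container.
    app.run(host="0.0.0.0", port=8080)
```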
Natural Language Processing (NLP)
Generative Pre-trained Transformers (GPT)
Custom BERT
APIs
Python 3
Google Cloud Platform (GCP)
Deep Learning
Amazon Web Services (AWS)
Machine Learning Operations (MLOps)
Flask
REST APIs
Docker
Autoscaling
Futures First
Quantitative Analyst
2013 - 2019 (6 years)
Remote
Performed exploratory data analysis on large-scale financial datasets using Python and derived insights that led to tradable strategies, visualizing the data through Tableau dashboards.
Implemented time series analyses (SARIMA and GARCH) of prices in commodity markets, incorporating CFTC reports and external factors such as currencies.
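A minimal sketch of the kind of SARIMA/GARCH analysis described above, using statsmodels and the arch package; the data source, model orders, and assumed weekly seasonality are illustrative.

```python
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX
from arch import arch_model

# Hypothetical daily commodity price series indexed by date.
prices = pd.read_csv("corn_prices.csv", index_col="date", parse_dates=True)["close"]

# SARIMA on price levels (illustrative order; 5-day trading-week seasonality assumed).
sarima = SARIMAX(prices, order=(1, 1, 1), seasonal_order=(1, 0, 1, 5))
sarima_res = sarima.fit(disp=False)
print(sarima_res.forecast(steps=5))  # five-day-ahead mean forecast

# GARCH(1,1) on percentage returns to model volatility clustering.
returns = 100 * prices.pct_change().dropna()
garch = arch_model(returns, vol="GARCH", p=1, q=1)
garch_res = garch.fit(disp="off")
print(garch_res.forecast(horizon=5).variance.iloc[-1])  # conditional variance path
```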
Developed regression-based mean-reverting strategies in the US and Brazilian fixed-income markets.
Deployed ETL and ML pipelines on GCP.
Performed backtesting and forward testing of strategies by tracking their Sharpe ratios.
Performed hypothesis testing and evaluated strategy risk using Monte Carlo simulations and historical value at risk (VaR).
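As a hedged sketch of the backtesting and risk metrics mentioned above: an annualized Sharpe ratio, a historical VaR, and a Monte Carlo VaR under a normality assumption. The synthetic daily-returns input and parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)
daily_returns = rng.normal(0.0005, 0.01, 252)  # stand-in for a strategy's P&L series

# Annualized Sharpe ratio (risk-free rate assumed zero for simplicity).
sharpe = np.sqrt(252) * daily_returns.mean() / daily_returns.std(ddof=1)

# Historical 95% one-day VaR: the negated 5th percentile of observed returns.
hist_var = -np.percentile(daily_returns, 5)

# Monte Carlo 95% VaR: simulate returns from a fitted normal distribution.
sims = rng.normal(daily_returns.mean(), daily_returns.std(ddof=1), 100_000)
mc_var = -np.percentile(sims, 5)

print(f"Sharpe: {sharpe:.2f}, historical VaR: {hist_var:.4f}, MC VaR: {mc_var:.4f}")
```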
Built natural language pipelines to track news sentiment.
Created a tweet listener that streams tweets from a given list of authors and prepares the data for the decision engine.
Built automated trading capabilities using the Alpaca API.
Developed an end-to-end analysis of a Twitter IPO hypothesis.
Built the decision engine using a random forest regressor that takes a tweet and the stock price and outputs a buy or sell recommendation.
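A minimal sketch of a decision engine along these lines: a scikit-learn RandomForestRegressor scored on a tweet-sentiment feature plus the current price, with the predicted return thresholded into a buy/sell/hold signal. The features, labels, and thresholds are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical training data: [tweet_sentiment in [-1, 1], stock_price].
X = np.column_stack([rng.uniform(-1, 1, 500), rng.uniform(50, 150, 500)])
# Hypothetical label: next-period return, loosely driven by sentiment.
y = 0.02 * X[:, 0] + rng.normal(0, 0.01, 500)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X, y)

def recommend(sentiment: float, price: float) -> str:
    expected_return = model.predict([[sentiment, price]])[0]
    if expected_return > 0.005:   # illustrative thresholds
        return "BUY"
    if expected_return < -0.005:
        return "SELL"
    return "HOLD"

print(recommend(0.8, 102.5))
```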
Competed in IE Business School's startup lab and won the investors' choice award and the most innovative project award.
Developed the whole machine learning pipeline from scratch: a web scraper for pictures, extraction of each picture's properties, and model training on the resulting data.
Served the model using a REST API (Flask) on the website wiselike.pythonanywhere.com.
Performed A/B and hypothesis testing to validate the model.
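A hedged sketch of the kind of A/B hypothesis test used here, comparing success rates between two variants with a two-proportion z-test from statsmodels; the counts are made up.

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical outcomes: successes and trials for variants A (control) and B (model).
successes = [120, 152]
trials = [1000, 1000]

z_stat, p_value = proportions_ztest(successes, trials)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Reject the null: the variants' success rates differ.")
else:
    print("Fail to reject the null at the 5% level.")
```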
Developed a novel 4-DOF (degrees of freedom) solution for the simultaneous localization and mapping (SLAM) of an unmanned aerial vehicle to reduce computation cost and published research on it (ieeexplore.ieee.org/document/6461785).
Combined location data from sources such as LIDAR, proximity sensors, inertial measurement units, and a camera using extended Kalman filters to update the robot's state estimate.
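As a hedged illustration of the sensor-fusion step, here is a single Kalman predict/update cycle on a 1-D position/velocity state with numpy. The real system fused multiple sensors with a full extended (nonlinear) Kalman filter; this linear, single-measurement version only sketches the update equations, and all matrices are illustrative.

```python
import numpy as np

dt = 0.1                      # time step (s), illustrative
x = np.array([0.0, 1.0])      # state: [position, velocity]
P = np.eye(2)                 # state covariance

F = np.array([[1, dt], [0, 1]])  # constant-velocity motion model
Q = 0.01 * np.eye(2)             # process noise
H = np.array([[1.0, 0.0]])       # we observe position only (e.g., a LIDAR range)
R = np.array([[0.5]])            # measurement noise

# Predict.
x = F @ x
P = F @ P @ F.T + Q

# Update with a (hypothetical) position measurement.
z = np.array([0.12])
y = z - H @ x                         # innovation
S = H @ P @ H.T + R                   # innovation covariance
K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
x = x + K @ y
P = (np.eye(2) - K @ H) @ P

print("fused state:", x)
```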
Developed a fuzzy logic-based PID controller for the unmanned aerial vehicle to maintain stability during flight.
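A compact sketch of a PID controller with a fuzzy-style gain schedule: the proportional gain is blended by triangular membership functions over the error magnitude. This illustrates the idea, not the UAV controller itself; all gains and membership points are assumptions.

```python
def triangular(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership: 0 at a and c, 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

class FuzzyPID:
    def __init__(self, kp=1.0, ki=0.1, kd=0.05):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error: float, dt: float) -> float:
        # Fuzzy gain scheduling: blend "small error" and "large error" rules.
        e = abs(error)
        small = triangular(e, -0.5, 0.0, 0.5)   # fires near zero error
        large = triangular(e, 0.3, 1.0, 1.7)    # fires for big errors
        kp = (small * 0.8 + large * 1.6) / max(small + large, 1e-9) * self.kp

        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return kp * error + self.ki * self.integral + self.kd * derivative

controller = FuzzyPID()
print(controller.step(error=0.4, dt=0.01))  # hypothetical attitude error
```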