Mate B.

Senior Software Engineer

Georgia

About Me

Mate is an experienced software engineer with a passion for high-performance solutions. He has expertise in technologies and tools such as Python, FastAPI, AWS, and cloud engineering, and has worked on projects in finance, healthcare, and eCommerce. Mate's strong attention to detail and commitment to quality make him a valuable asset to any team. He thrives in fast-paced environments and stays up to date with the latest technologies to deliver cutting-edge solutions.

Back-end, Data Scraping, Web Scraping, Serverless, OAuth, PyCharm, Python, REST APIs, Linux, Amazon Web Services (AWS), Flask, Docker, PyTest, Elasticsearch, AWS Lambda

Work history

Elysium Health
AWS Back-end Engineer | Python Developer
2022 - 2023 (1 year)
Remote
  • Developed REST APIs using FastAPI, deployed on API Gateway through Mangum.

  • Linked users' Fitbit and Garmin accounts to the service and pulled activity data from those accounts.

  • Built automated test suites for API endpoints, unit tests, and integration tests to ensure code quality and reliability.

  • Implemented continuous integration and continuous deployment (CI/CD) pipelines using tools like CircleCI to automate the build, test, and deployment process.

Back-end, Amazon Web Services (AWS), Serverless, AWS Lambda, OAuth 2, OAuth, Garmin API, Fitbit API, Ruby on Rails (RoR), Node.js, Lambda Functions, Lambda Architecture, Databases, Containerization, PyTest, PyCharm, FastAPI, REST APIs, APIs
Mesh
Senior Software Engineer | Python Developer
2020 - 2022 (2 years)
Remote
  • Led the development of REST API applications from scratch using Python and Flask; ensured adherence to coding standards, best practices, and design patterns.

  • Collaborated closely with the product manager and other stakeholders to gather requirements, define the project scope, and establish priorities for development sprints.

  • Designed and implemented an automated testing strategy that included unit tests, integration tests, and end-to-end tests, ensuring application quality and reliability.

  • Implemented a CI/CD pipeline using CodePipeline, allowing for efficient and automated deployment of applications.

  • Collaborated with the front-end development team to ensure seamless integration of APIs into web and mobile applications, improving user experience and overall performance.

  • Built an entire scraping architecture based on asyncio and aiohttp. Served as a team lead and discussed demos and project improvements regularly with the CEO.
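The asyncio-based scraping architecture above can be sketched as follows. This is a minimal illustration, not the production code: the fetch body is a placeholder where the real system used aiohttp, and the URL list and concurrency limit are hypothetical.

```python
import asyncio

async def fetch(url: str, sem: asyncio.Semaphore) -> str:
    # Placeholder for the real request; the production system used aiohttp:
    #   async with session.get(url) as resp:
    #       return await resp.text()
    async with sem:
        await asyncio.sleep(0)  # stands in for network I/O
        return f"<html for {url}>"

async def scrape_all(urls: list[str], max_concurrency: int = 10) -> list[str]:
    # A semaphore bounds concurrency so thousands of URLs are not opened at once.
    sem = asyncio.Semaphore(max_concurrency)
    return await asyncio.gather(*(fetch(u, sem) for u in urls))

pages = asyncio.run(scrape_all([f"https://example.com/page/{i}" for i in range(5)]))
```

The semaphore pattern is what makes this approach scale: adding URLs does not add threads, only cheap coroutines waiting on the same limit.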

Python, Flask, APIs, REST APIs, API Integration, Linux, Data Scraping, Web Scraping, Software Architecture, Technical Leadership, Cloud, Leadership, Architecture, DevOps, Elasticsearch, Amazon Web Services (AWS), AWS Lambda, Selenium, Early-stage Startups, Bots, CI/CD Pipelines, Test Automation, Back-end, Serverless, OAuth 2, OAuth, Python Dataclasses, Pydantic, API Architecture, API Applications, Celery, Redis, Lambda Functions, Lambda Architecture, Databases, Containerization, PyTest, Docker, PyCharm
MaxinAI
Software Engineer | Python Developer
2019 - 2020 (1 year)
Remote
  • Designed and developed REST APIs for various web applications using Python and Flask. Built automated test suites for API endpoints, unit tests, and integration tests to ensure code quality and reliability.

  • Implemented CI/CD pipelines using tools like Jenkins, CircleCI, or GitLab CI to automate the build, test, and deployment process.

  • Worked with AWS services like EC2, S3, RDS, and Lambda to deploy and scale applications in the cloud. Developed and maintained monitoring and logging solutions using tools like CloudWatch to ensure high availability and performance.

  • Created a system for scraping new and popular events from websites such as Eventbrite and Ticketmaster.

  • Automated a scraping process using cron jobs and created a REST API for calling and managing spiders.

Scrapy, Python, Elasticsearch, Pandas, Cron, Flask, APIs, REST APIs, Amazon Web Services (AWS), AWS Lambda, Selenium, MVC Frameworks, MySQL, CI/CD Pipelines, Back-end, Databases, Containerization, PyTest, Docker, PyCharm
MaxinAI
Software Engineer | Python Developer
2018 - 2019 (1 year)
Remote
  • Designed and developed REST APIs for various web applications using Python and Flask. Built automated test suites for API endpoints, unit tests, and integration tests to ensure code quality and reliability.

  • Implemented CI/CD pipelines using tools like CircleCI or GitLab CI to automate the build, test, and deployment process. Worked with AWS services to deploy and scale applications in the cloud.

  • Developed and maintained monitoring and logging solutions using tools like CloudWatch to ensure high availability and performance. Improved application security by implementing authentication and authorization mechanisms like OAuth 2 or JWT tokens.

  • Contributed to a scraping and parsing system covering the laws of all 50 US states, built with Python's Scrapy.

  • Worked with databases like PostgreSQL, MySQL, or MongoDB to design and implement data models and data access layers for APIs. Collaborated with front-end developers to integrate APIs into web and mobile applications.
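The JWT-based authorization mentioned above can be illustrated with a stdlib-only HS256 sketch. Real services typically use a library such as PyJWT; the secret and claims here are hypothetical, and this sketch omits standard claims like `exp`.

```python
import base64
import hashlib
import hmac
import json

def _b64url(data: bytes) -> str:
    # Base64url without padding, as JWT requires.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(payload: dict, secret: str) -> str:
    # header.payload signed with HMAC-SHA256 (the "HS256" algorithm).
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64url(json.dumps(payload).encode())
    sig = hmac.new(secret.encode(), f"{header}.{body}".encode(), hashlib.sha256).digest()
    return f"{header}.{body}.{_b64url(sig)}"

def verify_jwt(token: str, secret: str) -> bool:
    # Recompute the signature and compare in constant time.
    header, body, sig = token.split(".")
    expected = hmac.new(secret.encode(), f"{header}.{body}".encode(), hashlib.sha256).digest()
    return hmac.compare_digest(_b64url(expected), sig)

token = sign_jwt({"sub": "user-1"}, "s3cret")
```

`hmac.compare_digest` is used instead of `==` to avoid timing side channels when checking the signature.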

Python, Scrapy, Flask, APIs, REST APIs, API Integration, Linux, SQL, Data Scraping, Web Scraping, Cloud, Elasticsearch, Amazon Web Services (AWS), AWS Lambda, MySQL, Back-end, Docker, PyCharm
Divio AG
Python and Django Developer
Present
Remote
  • Refactored and migrated legacy code to a new API, enhancing stability and scalability for a cloud management service provider.

  • Wrote tests for new endpoints and fixed tests for older ones.

  • Helped fix existing bugs in the company's codebase and contributed improvements.

Ekwithree
Python Web Scraper
Present
Remote
  • Created crawlers that crawled over four million websites, collecting data such as about-page content, contact information, and addresses.

  • Stored the data in Elasticsearch and created indexes with custom analyzers and tokenizers to improve search relevance.

  • Ran the crawler on multiple servers to boost performance.
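An index definition in the spirit of the one described above might look like this. The analyzer name, filter choices, and field names are illustrative, not the actual production mapping.

```python
# Hypothetical Elasticsearch settings/mappings for the crawled company data.
index_body = {
    "settings": {
        "analysis": {
            "analyzer": {
                "company_text": {  # custom analyzer name (illustrative)
                    "type": "custom",
                    "tokenizer": "standard",
                    "filter": ["lowercase", "asciifolding"],
                }
            }
        }
    },
    "mappings": {
        "properties": {
            "about_page":    {"type": "text", "analyzer": "company_text"},
            "contact_email": {"type": "keyword"},  # exact-match lookups
            "address":       {"type": "text", "analyzer": "company_text"},
        }
    },
}

# With the elasticsearch-py client, the index would be created roughly like:
#   es.indices.create(index="companies",
#                     settings=index_body["settings"],
#                     mappings=index_body["mappings"])
```

Analyzed `text` fields drive relevance-scored full-text search, while `keyword` fields stay unanalyzed for filters and aggregations.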

MaxinAI
Senior Software Engineer | Python Developer
Present
Remote
  • Designed and developed REST APIs for various web applications using Python and FastAPI. Built automated test suites for API endpoints, unit tests, and integration tests to ensure code quality and reliability.

  • Implemented CI/CD pipelines using tools like CircleCI or GitLab CI to automate the build, test, and deployment process. Worked with AWS services to deploy and scale applications in the cloud.

  • Modified the code architecture and refactored code to match current Python standards. Optimized API performance by implementing caching, load balancing, and distributed computing solutions.

  • Worked with databases like PostgreSQL or MongoDB to design and implement data models and data access layers for APIs. Collaborated with front-end developers to integrate APIs into web and mobile applications.

Python, API Gateways, FastAPI, PostGIS, APIs, REST APIs, API Integration, Linux, SQL, Data Scraping, Web Scraping, Technical Leadership, Cloud, Leadership, DevOps, Elasticsearch, Amazon Web Services (AWS), FFmpeg, AWS Lambda, Video Streaming, MySQL, Early-stage Startups, Discord Bots, CI/CD Pipelines, Test Automation, Back-end, Serverless, OAuth 2, OAuth, Python Dataclasses, Pydantic, API Architecture, API Applications, Redis, SDKs, Lambda Functions, Lambda Architecture, Databases, Containerization, PyTest, Docker, PyCharm
MaxinAI
Software Engineer | Python Developer
Present
Remote
  • Designed a scalable scraping system for social media crawling and scraped data for over 50 million users from Instagram, Facebook, YouTube, and Twitch.

  • Designed and developed REST APIs for various web applications using Python and Flask. Built automated test suites for API endpoints, unit tests, and integration tests to ensure code quality and reliability.

  • Developed an algorithm for identifying the same user across different social media platforms. Created an algorithm for detecting bots across all supported platforms.

  • Implemented CI/CD pipelines using tools like Jenkins, CircleCI, or GitLab CI to automate the build, test, and deployment process.

Python, Scrapy, Flask, REST APIs, APIs, Cloud, Architecture, Software Architecture, MongoDB, Elasticsearch, Neo4j, Amazon Web Services (AWS), AWS Lambda, Microservices, Selenium, MVC Frameworks, MySQL, Bots, CI/CD Pipelines, Test Automation, Django, Back-end, Serverless, OAuth 2, OAuth, Python Dataclasses, API Architecture, API Applications, Celery, Redis, SDKs, pip, Software Packaging, Databases, Containerization, PyTest, Docker, PyCharm
MaxinAI
Software Engineer | Python Developer
Present
Remote
  • Developed REST endpoints for calling machine learning (ML) models on live data for Amazon products' allergen checking.

  • Designed an ML pipeline system for food label validation and compliance that uses several neural network models to receive images of labels and extract required information, such as nutrition facts, allergens, ingredients, and weight.

  • Created a pipeline accuracy evaluation system to measure end-to-end and each step's accuracy.

  • Implemented unit tests and test-driven development.
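The per-step and end-to-end accuracy evaluation described above reduces to something like the following; the function names and the toy OCR data are illustrative only.

```python
def step_accuracy(preds: list, labels: list) -> float:
    # Accuracy of one pipeline stage evaluated in isolation.
    return sum(p == l for p, l in zip(preds, labels)) / len(labels)

def end_to_end_accuracy(per_item_steps: list[list[bool]]) -> float:
    # An item counts as correct only if every stage in its chain succeeded,
    # which is why end-to-end accuracy is at most the weakest stage's accuracy.
    return sum(all(steps) for steps in per_item_steps) / len(per_item_steps)

# Toy evaluation: one OCR stage against ground truth, plus two-stage chains.
ocr_acc = step_accuracy(["milk", "soy", "nuts"], ["milk", "soy", "wheat"])
e2e_acc = end_to_end_accuracy([[True, True], [True, False], [True, True]])
```

Measuring both views per stage makes it clear which step of the pipeline is the accuracy bottleneck.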

Python, Scrapy, Flask, APIs, REST APIs, Cloud, Elasticsearch, MongoDB, Pandas, OCR, Architecture, Software Architecture, Amazon Web Services (AWS), AWS Lambda, Microservices, Selenium, MVC Frameworks, MySQL, Bots, CI/CD Pipelines, Django, Back-end, Databases, Containerization, PyTest, Docker, PyCharm
MaxinAI
Software Engineer | Python Developer
Present
Remote
  • Designed and developed REST APIs for various web applications using Python and Flask. Built automated test suites for API endpoints, unit tests, and integration tests to ensure code quality and reliability.

  • Implemented CI/CD pipelines using tools like Jenkins, CircleCI, or GitLab CI to automate the build, test, and deployment process.

  • Worked with AWS services like EC2, S3, RDS, and Lambda to deploy and scale applications in the cloud. Developed and maintained monitoring and logging solutions using tools like CloudWatch or the ELK stack to ensure high availability and performance.

  • Improved application security by implementing authentication and authorization mechanisms like OAuth 2 or JWT tokens. Worked with databases like PostgreSQL, MySQL, or MongoDB to design and implement data models and data access layers for APIs.

  • Created an MVC project using web2py and raw JavaScript on the front end. Collaborated with front-end developers to integrate APIs into web and mobile applications.

  • Managed both front-end and back-end tasks, supporting later bug fixes.

Python, APIs, REST APIs, Flask, Web2py, PostgreSQL, Elasticsearch, Amazon Web Services (AWS), AWS Lambda, MySQL, Telegram Bots, Back-end, Docker, PyCharm

Portfolio

Mesh - Verifying Professionals

I was part of a team that developed a service for a software company to improve the process of verifying professionals. Our goal was to create a system that would help customers quickly verify their professional licenses without contacting support directly. To achieve this, we developed an on-demand live crawler that crawled over 100 websites, understood customer queries, and returned license information.

To ensure the service was reliable and scalable, we deployed it on a cloud platform and set up a monitoring system to track performance and quickly identify any issues. We also regularly reviewed customer feedback and improved the service's responses based on user input.

We decided to use a REST API to give users easy access to the service, which required me to build a robust back-end system in Python using the FastAPI web framework. We collaborated closely as a team throughout the development process, leveraging cutting-edge technologies to create a high-quality product. To ensure the system was reliable and efficient, I performed extensive testing and evaluation of the data processing pipeline, including tests for edge cases and performance issues.

REST API for Climate-related Project

As a Python developer, I helped create a REST application for a climate-related company that allowed its users to estimate losses in building costs related to climate change, such as flooding and heating. I also played a crucial role in restructuring and refactoring the architecture of the existing code to improve its efficiency and scalability.

To start, I worked closely with the company's product team to understand the requirements and goals of the new application. We decided to use a REST API to give users easy access to the data, which required me to build a robust back-end system in Python using the FastAPI web framework.

After completing the initial development, I began restructuring the existing codebase to improve its efficiency and scalability. This involved identifying areas of the code that were slowing down the application and rewriting them to be more performant. I also simplified the codebase and removed unnecessary dependencies, which made the code more maintainable and easier to work with. Through this process, I significantly improved the application's performance and made it easier to add new features and functionality.

Juststream | Video Streaming Platform

I am the founder of Juststream.live, a serverless video streaming platform built on AWS that is highly scalable and performant and served 500,000 monthly users. The serverless architecture allowed me to focus on building features and functionality rather than managing servers. The platform leveraged AWS services such as Lambda, API Gateway, AWS Elemental MediaConvert, Elastic Load Balancing, and CloudFront to provide users with a highly secure, reliable, low-latency video streaming experience. I also implemented monitoring and logging solutions to keep the platform's performance and health under constant watch. Overall, the project demonstrated the power of serverless architecture and the capabilities of AWS services.

Big Data Collection and Management for a Social Media Platform

As a Python developer, I was tasked with writing a REST API and a big-data crawler to collect information on more than 100 million users from different social media platforms for a marketing research company.

To start, I researched the different social media platforms and identified the data points relevant to the marketing research project. I then wrote a big-data crawler in Python that could collect this data from multiple sources, process it, and store it in a scalable database. To collect the data efficiently and reliably, I designed the crawler to run on multiple servers in parallel so it could handle a large volume of data in a timely manner.

Once the data was collected and processed, I wrote a fast and reliable REST API in Python to let the marketing research company access and analyze the data easily. I deployed the project on AWS using Amazon's Elastic Beanstalk service, which helped ensure it could handle a large volume of traffic and data. I worked closely with the marketing research company to make sure the project met their needs and goals, and I performed testing to verify that the system was reliable and secure.

US Statutes and Laws

As a Python developer, I took part in building a scalable web crawler to collect US statutes and laws from various legal websites. This involved designing a crawler architecture that could handle large amounts of data and execute efficiently.

To begin, I worked with the product team to identify the websites containing the desired legal data, then created a crawler using Python libraries such as Scrapy and Selenium to scrape the web pages and store the data in a format suitable for further processing. To keep the crawler efficient and scalable, I deployed it on AWS, which allowed us to process large amounts of data quickly and reliably.

Once the crawler was complete, I created a REST API in Flask to give authorized users access to the collected data. I designed the API to be fault-tolerant, with data replication and load balancing, so it could handle high traffic without downtime. Throughout the development process, I worked closely with the product team to ensure the final product met their requirements and goals, and I performed testing and monitoring to confirm the system was reliable and secure.

Allergen Checking on Products Using ML

As a Python developer, I built a pipeline that checked Amazon product pages for allergens and other information in the product descriptions. The pipeline covered data extraction, cleaning, preprocessing, and deployment.

I researched which data points to extract from the Amazon product pages and used Python libraries such as Requests, Selenium, and Pandas to extract the data and clean it for further processing. After cleaning the data, I developed a robust system that could efficiently process large datasets of product descriptions to identify the presence of allergens and other key information.

Once the pipeline was developed and tested, I deployed it behind a REST API built with Flask, which allowed users to enter an Amazon product URL and retrieve information on allergens and other details related to the product's ingredients and nutrition facts. Throughout the development process, I worked closely with the product team to ensure the system met their requirements and goals, and I performed testing and monitoring to keep the system reliable and secure.

Education

Bachelor's Degree in Computer Science
Ivane Javakhishvili Tbilisi State University
2015 - 2019 (4 years)