Senior Software Engineer / Technical Lead with 12+ years of experience designing and delivering high-performance, self-healing, scalable systems across FinTech, EdTech, and Web Platforms. Expert in Python, microservices, event-driven architecture, and async APIs. Focused on automation, with a strong track record of reducing operational overhead, improving system throughput, and accelerating delivery cycles. Adept at mentoring teams, guiding cross-functional collaboration, and transforming manual workflows into efficient, automated processes.
Career Goal: Lead a team to deliver scalable and impactful tech products.
Professional Goal: Design and develop large-scale, high-frequency, self-healing systems to solve complex problems.
Personal Goal: Build homelab projects to master networking and bare-metal deployments.
Future Goal: Develop a home LLM to automate coding and manage business logic for personal projects.
Managed and enhanced eCube GUI, a web application for crawling and parsing website data using Python (Django/Flask), and mentored a team of 10 developers on Python best practices, frameworks, and scalable service design.
Designed and implemented a high-performance web crawling system handling 1,000,000+ URLs per day, using RabbitMQ for distributed queuing and Python (Requests + Selenium) for adaptive scraping. The system was self-healing and required near-zero maintenance, monitoring health metrics across Redis, MongoDB, and MySQL.
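As a sketch of the self-healing worker pattern described above (the queue name, backoff parameters, and helper names are illustrative assumptions, not the production code):

```python
import time


def backoff_delays(max_retries=4, base=1.0, cap=30.0):
    """Exponential backoff schedule applied after transient failures:
    1s, 2s, 4s, 8s, ... capped at `cap` seconds."""
    return [min(cap, base * (2 ** attempt)) for attempt in range(max_retries)]


def fetch_with_retry(fetch, url, max_retries=4, sleep=time.sleep):
    """Self-healing fetch: retry with exponential backoff so one
    transient failure does not kill the worker."""
    last_error = None
    for delay in backoff_delays(max_retries):
        try:
            return fetch(url)
        except Exception as exc:  # in practice, catch narrower error types
            last_error = exc
            sleep(delay)
    raise last_error


def run_worker(queue="crawl_urls"):
    """Hypothetical RabbitMQ consumer wiring (requires a live broker,
    so it is defined here but not invoked)."""
    import pika  # RabbitMQ client

    connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
    channel = connection.channel()
    channel.queue_declare(queue=queue, durable=True)
    channel.basic_qos(prefetch_count=1)  # one URL per worker at a time

    def on_message(ch, method, properties, body):
        fetch_with_retry(lambda u: print("crawled", u), body.decode())
        ch.basic_ack(delivery_tag=method.delivery_tag)

    channel.basic_consume(queue=queue, on_message_callback=on_message)
    channel.start_consuming()
```

With per-message acknowledgements and `prefetch_count=1`, RabbitMQ redelivers any URL whose worker dies mid-crawl, which is one common way such a system heals itself.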
Improved throughput from ~20 million data points/week at a ~40% error rate to ~50 million data points/week at ~10%, achieving high-throughput, reliable, and scalable data ingestion.
Developed a standardized crawler–parser template to customize scripts for diverse site structures while bypassing aggressive anti-bot mechanisms.
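The template idea above can be sketched as a shared pipeline with per-site overrides (class and method names here are hypothetical; the stdlib `html.parser` stands in for the production parsing stack):

```python
from html.parser import HTMLParser


class TitleParser(HTMLParser):
    """Tiny stdlib-based extractor used by the template below."""

    def __init__(self):
        super().__init__()
        self._in_title = False
        self.title = None

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title and self.title is None:
            self.title = data.strip()


class SiteTemplate:
    """Crawler-parser template: the pipeline is fixed, while fetch()
    and parse() are the only hooks each site script customizes."""

    def fetch(self, url):
        raise NotImplementedError  # Requests/Selenium in production

    def parse(self, html):
        parser = TitleParser()
        parser.feed(html)
        return {"title": parser.title}

    def crawl(self, url):
        return self.parse(self.fetch(url))


class ExampleSite(SiteTemplate):
    """Hypothetical site-specific script: only fetch() is customized,
    e.g. to rotate headers or drive a browser past anti-bot checks."""

    def fetch(self, url):
        return "<html><head><title>Example</title></head></html>"
```

Keeping the pipeline in the base class means anti-bot tactics and site-specific selectors stay isolated in small subclasses, which is what makes the template cheap to adapt per site.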
Led end-to-end API design and client integration, including requirement gathering, microservices architecture, API documentation, and team guidance for implementation and deployment.
Developed a high-performance API capable of serving 3 million SQL records in a single call (~1,000 records/sec), eliminating the need for chunking or pagination and enabling efficient, large-volume data delivery for client applications.
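One way to serve millions of rows in a single call without pagination is to stream the result set instead of materializing it; a minimal sketch, assuming NDJSON output and using stdlib `sqlite3` in place of the production database (function and table names are illustrative):

```python
import json
import sqlite3


def stream_rows(connection, query, batch_size=10_000):
    """Yield rows one JSON line at a time instead of loading the full
    result set, so a single response can carry millions of records.
    `batch_size` bounds memory via fetchmany(); a server-side cursor
    plays the same role with PostgreSQL/MySQL drivers."""
    cursor = connection.cursor()
    cursor.execute(query)
    columns = [col[0] for col in cursor.description]
    while True:
        rows = cursor.fetchmany(batch_size)
        if not rows:
            break
        for row in rows:
            yield json.dumps(dict(zip(columns, row))) + "\n"
```

A web framework's streaming response (e.g. Flask's generator responses or an async generator in an ASGI app) can wrap this generator directly, so the client starts receiving data before the query finishes draining.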