
Siva Karanam
Verified Expert in Engineering
Software Developer
Kakinada, Andhra Pradesh, India
Toptal member since January 19, 2022
Siva has over seven years of professional experience in software development and solid knowledge of algorithms, data structures, and object-oriented programming (OOP). He is experienced in web development, web crawling/scraping, data cleaning, and data munging, and he has worked through all phases of the software development lifecycle using Agile methodologies. In addition, Siva has experience with MySQL, Microsoft SQL Server, and Consul, and he is familiar with data science and machine learning concepts.
Experience
- Python - 6 years
- Software - 5 years
- Large-scale Web Crawlers - 4 years
- Agile Sprints - 4 years
- Web Development - 4 years
- REST APIs - 4 years
- Data Engineering - 2 years
- MySQL - 2 years
Preferred Environment
PyCharm, Slack, Jira, Confluence, Agile Sprints, macOS, Python, Software, Linux, Databases
The most amazing...
...recognition I've received is the prestigious Rakuten Excellence Award while working at Rakuten.
Work Experience
Back-end Developer
OnCorps, Inc.
- Developed a pipeline to process financial statements and generate meaningful insights.
- Built client-specific Python scripts to find errors in financial statements (a minimal sketch follows this list).
- Automated many manual tasks, reducing the manual effort needed to identify negative cases.
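
A minimal sketch of what such a client-specific validation script might look like. The column names (nav, total_assets, total_liabilities), the rules, the tolerance, and the file names are hypothetical illustrations, not OnCorps' actual checks.

```python
import pandas as pd

# Hypothetical checks for a financial statement export; the columns and
# rules below are illustrative assumptions.
def find_errors(df: pd.DataFrame) -> pd.DataFrame:
    errors = []
    # A negative net asset value is one kind of "negative case" to flag.
    errors.append(df[df["nav"] < 0].assign(error="negative NAV"))
    # Basic accounting identity: assets = liabilities + net assets.
    mismatch = (df["total_assets"] - df["total_liabilities"] - df["nav"]).abs() > 0.01
    errors.append(df[mismatch].assign(error="balance mismatch"))
    return pd.concat(errors, ignore_index=True)

if __name__ == "__main__":
    statements = pd.read_csv("statements.csv")  # hypothetical input file
    find_errors(statements).to_csv("error_report.csv", index=False)
```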
Full-stack Engineer
Silq
- Integrated shipment visibility tracking with third-party carrier APIs, including DHL and UPS (sketched after this list).
- Tracked and fixed many bugs using Jira. Worked with Retool to design new pages using React components and updated the existing forms.
- Researched and identified a rich data source for seaports and airports with valid identifiers, then ingested data for 10,000+ seaports and airports.
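
A hedged sketch of what the carrier integration might look like. The endpoints, auth scheme, and response fields are hypothetical stand-ins; the real DHL and UPS tracking APIs each have their own schemas, which would be normalized in the same way.

```python
import requests

# Hypothetical carrier endpoints; real DHL/UPS tracking APIs differ.
CARRIER_ENDPOINTS = {
    "dhl": "https://api.example.com/dhl/track",
    "ups": "https://api.example.com/ups/track",
}

def fetch_tracking_status(carrier: str, tracking_number: str, api_key: str) -> dict:
    """Fetch a shipment's status and normalize it to one internal shape."""
    resp = requests.get(
        CARRIER_ENDPOINTS[carrier],
        params={"trackingNumber": tracking_number},
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=10,
    )
    resp.raise_for_status()
    data = resp.json()
    # Normalize carrier-specific payloads so downstream code sees one schema.
    return {
        "carrier": carrier,
        "tracking_number": tracking_number,
        "status": data.get("status"),
        "last_location": data.get("location"),
    }
```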
Senior Software Full-stack Engineer
Rakuten
- Built an integrated tool to manage all systems and services from a single platform.
- Automated manual work using Jenkins and Python, saving 130 hours of manual effort per month.
- Created a web application that connects to servers via SSH and runs commands from the browser (sketched below).
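
A minimal sketch of the run-commands-from-a-browser idea using Flask and Paramiko. The route, payload fields, and key path are assumptions; a production version would add authentication, host allow-listing, and command sanitization.

```python
import os

import paramiko
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.post("/run")  # hypothetical route; the browser posts {host, user, command}
def run_command():
    payload = request.get_json()
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(
        payload["host"],
        username=payload["user"],
        key_filename=os.path.expanduser("~/.ssh/id_rsa"),  # illustrative key
    )
    try:
        # Run the command and return stdout/stderr to the browser as JSON.
        _, stdout, stderr = client.exec_command(payload["command"])
        return jsonify(stdout=stdout.read().decode(), stderr=stderr.read().decode())
    finally:
        client.close()

if __name__ == "__main__":
    app.run()
```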
Data and Back-end Engineer
Myntra
- Designed and developed a config-based crawler framework for large-scale web crawling, crawling 20 million records per day (a minimal sketch follows this list).
- Wrote post-processors to convert unstructured data crawled from different sources into structured data and automated the scripts to process millions of records every day.
- Structured and stored the data in databases as needed, then ran models on top of it to derive insights.
- Developed REST APIs to serve data for fashion insights.
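
A minimal sketch of how a config-based Scrapy crawler can work: one JSON config per site drives a generic spider, so onboarding a new source means adding configuration rather than code. The config schema and selectors are hypothetical, not Myntra's actual framework.

```python
import json

import scrapy

# Hypothetical per-site config, e.g.:
# {"start_urls": ["https://example.com/dresses"],
#  "item_selector": "div.product",
#  "fields": {"title": "h2::text", "price": "span.price::text"}}

class ConfigSpider(scrapy.Spider):
    name = "config_spider"

    def __init__(self, config_path="site.json", *args, **kwargs):
        super().__init__(*args, **kwargs)
        with open(config_path) as f:
            self.config = json.load(f)
        self.start_urls = self.config["start_urls"]

    def parse(self, response):
        # Every extracted field is just a CSS selector from the config.
        for node in response.css(self.config["item_selector"]):
            yield {
                field: node.css(selector).get()
                for field, selector in self.config["fields"].items()
            }
```

Run with `scrapy runspider config_spider.py -a config_path=site.json`; reaching 20 million records a day would layer scheduling, deduplication, proxies, and distributed workers on top of a core like this.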
Software Engineer
Headrun Technologies Pvt Ltd
- Developed site-specific web crawlers, both with and without the Scrapy framework, that extract metadata from websites, deliver it in JSON and MySQL formats, and feed it to search databases.
- Worked as a full-stack developer and created a crawler health-tracking tool with JavaScript on the front end and Django, Tastypie, and Nginx on the back end.
- Pulled data from different sources, such as websites, email, PDFs, and the cloud, then stored it in different data stores using models.
- Developed bots using Python and Selenium, incorporating CAPTCHA-solving techniques (e.g., text input and drag-and-drop) to automate large-scale booking processes (a skeleton follows this list).
- Implemented a Django-based stats tool covering the health, status, and rich metadata of over 2,000 web crawlers; it monitored the crawlers and raised notifications on a color-coded heat map.
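
A skeleton of what such a Selenium bot might look like. The URL and element locators are hypothetical, and the CAPTCHA step is reduced to a placeholder hook rather than a working solver.

```python
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.support.ui import WebDriverWait

def book_slot(name: str, slot_id: str) -> None:
    driver = webdriver.Chrome()
    try:
        driver.get("https://example.com/booking")  # illustrative URL
        wait = WebDriverWait(driver, 15)
        # Fill the form once the page has rendered.
        wait.until(EC.presence_of_element_located((By.ID, "name"))).send_keys(name)
        driver.find_element(By.ID, slot_id).click()
        # CAPTCHA handling (text entry, drag-and-drop, etc.) would plug in here.
        driver.find_element(By.ID, "submit").click()
        wait.until(EC.presence_of_element_located((By.CLASS_NAME, "confirmation")))
    finally:
        driver.quit()
```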
Experience
Omnia
Private Cloud Application for an MNC
Market Intelligence Product for eCommerce Company
KEY CONTRIBUTIONS
Data Architecture and Planning:
- Conducted an in-depth analysis of market intelligence requirements and developed a strategic plan for efficient web data extraction.
Scalable Web Scraping Framework:
- Implemented a robust web scraping framework using Python and Scrapy to extract data from eCommerce platforms, competitor websites, and industry news sites.
Configurable Crawling System:
- Engineered a configurable crawling system to adapt to dynamic changes in website structures.
- Implemented regular updates to keep data retrieval accurate.
Data Cleaning and Transformation:
- Applied advanced data cleaning techniques to handle diverse, unstructured data formats.
- Transformed raw data into structured datasets for integration into the company's analytics pipeline (a minimal sketch follows).
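
An illustrative sketch of the cleaning step, assuming raw scraped records with hypothetical title, price, and url fields; the real pipeline's rules are not described in this profile.

```python
import pandas as pd

def clean_records(raw: list[dict]) -> pd.DataFrame:
    """Turn raw scraped records into a structured, deduplicated dataset."""
    df = pd.DataFrame(raw)
    df["title"] = df["title"].str.strip()
    # Prices arrive as free text ("$14.99", "1,299"); keep digits and dots.
    df["price"] = pd.to_numeric(
        df["price"].str.replace(r"[^\d.]", "", regex=True), errors="coerce"
    )
    # Drop unparseable prices and duplicate listings.
    return df.dropna(subset=["price"]).drop_duplicates(subset=["url"])
```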
Alerting Mechanism:
- Implemented an alerting mechanism to notify stakeholders of significant market changes (sketched below).
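
A minimal sketch of one possible alerting rule; the 10% threshold, the price-move trigger, and the Slack webhook are assumptions for illustration, not the product's actual logic.

```python
import requests

SLACK_WEBHOOK = "https://hooks.slack.com/services/..."  # placeholder URL

def alert_on_price_move(product: str, old_price: float, new_price: float) -> None:
    # Hypothetical rule: notify stakeholders when a tracked price moves >10%.
    change = (new_price - old_price) / old_price
    if abs(change) > 0.10:
        requests.post(
            SLACK_WEBHOOK,
            json={"text": f"{product}: price moved {change:+.1%} "
                          f"({old_price} -> {new_price})"},
            timeout=10,
        )
```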
RESULTS
The market intelligence product significantly improved strategic decision-making.
Auction Scraper for Predictions
I was the back-end and data engineer who developed the scraper, collected the data, processed it (converting unstructured data into structured data), and provided it to the data science team to run models on. The scraper has helped our clients consume rich insights and predictions derived from the data.
Skills
Libraries/APIs
React, REST APIs, API Development, Beautiful Soup, Requests, Mypy, Django ORM, Pandas, Interactive Brokers API, SQLAlchemy
Tools
Jenkins, Git, Pylint, GitHub, Confluence, Jira, pytest, Microsoft Excel, Auth0, Crawlera, ELK (Elastic Stack), ChatGPT, Boto3, Apache Airflow
Languages
Python 3, Python 2, Python, HTML, JavaScript, SQL, CSS, TypeScript, GraphQL, XML, Java
Frameworks
Swagger, Flask, Django, Selenium, Scrapy, Django REST Framework, Presto, Angular, Spark, Next.js
Paradigms
Unit Testing, REST, DevOps, Automation, ETL, ETL Implementation & Design, Testing, Microservices, Clean Architecture, B2B
Storage
JSON, MySQL, Databases, PostgreSQL, Data Integration, Apache Hive, Azkaban, Redis, Elasticsearch, Redis Cache, Amazon S3 (AWS S3), MongoDB
Platforms
Linux, macOS, Docker, Amazon Web Services (AWS), Amazon EC2, Apache Kafka, Databricks, Google Cloud Platform (GCP), Azure
Other
Large-scale Web Crawlers, Back-end, APIs, Back-end Development, Data Scraping, Scraping, Data Extraction, Web Scraping, Web Crawlers, Website Data Scraping, Information Retrieval, Agile Sprints, Data Engineering, Web Development, Consul, FastAPI, Software, Full-stack, Architecture, System Design, API Integration, Proxies, eCommerce, CI/CD Pipelines, Minimum Viable Product (MVP), OAuth, Cloud, ETL Tools, UI Automator, Bots, CAPTCHA, Open Source, Jupyter, Containerization, User Experience (UX), Full-stack Development, Web App Development, Containers, Large Language Models (LLMs)