Stevan Popov, Solution Architecture Developer in Belgrade, Serbia

Member since March 12, 2018
Stevan is a Solution Architect with proven experience delivering performant applications, focused mainly on reporting, business intelligence, and REST APIs over high volumes of data. He is experienced in database design and data modeling, as well as distributed data stores, including NoSQL and big data technologies. Stevan is able to communicate and work with team members across several time zones.




Belgrade, Serbia



Preferred Environment

Git, Docker, IntelliJ IDEA, Mac OS

The most amazing...

...project I've worked on is a real-time decision support system using Apache Storm, machine learning components, Hadoop, and a 20+ node MySQL cluster.


  • Solution Architect (via Toptal)

    2019 - PRESENT
    US based Marketing Company
    • Delivered technology and architecture artifacts as part of the Toptal projects team during the Lighting Phase.
    • Designed the architecture and deployments for every aspect of the application with an Agile and CI/CD mindset.
    • Compiled architectural guidelines and deployment procedures.
    • Built scalable and cost-effective infrastructure on AWS.
    • Implemented multi-layered security integrated with both cloud and application components.
    • Integrated Authorize.Net as the payment gateway for the SaaS vertical.
    • Led the development and supported the product stakeholders in growing the data platform into a full-fledged CRM.
    • Advocated architecture best practices and influenced product decisions, preserving the value of deliverables by maintaining overall system extensibility.
    • Modeled time-series projections for performance data leveraging Python ML libraries.
    Technologies: Authorize.Net, Python, Docker, AngularJS, Node.js, AWS
  • Architect (via Toptal)

    2019 - 2019
    Imagined Futures
    • Delivered the reference architecture and high-level diagrams of the product.
    • Defined solution context, component view, and component communication.
    • Specified the proof of concept and phased it into deliverables that bring business value.
    • Conceptualized design principles and technology choices for the use-cases given and future stages of the product.
    • Presented different approaches to data processing, both via DAGs and Kafka topic sequencing.
    Technologies: Kafka Streams, Apache Spark, Docker, PostgreSQL, Python, Java, Spring Boot, Apache Kafka
  • Software Architect

    2017 - 2019
    • Managed a team of freelancers to continuously integrate and deliver new features for the client.
    • Conceptualized a new platform architecture, moving away from a monolithic application to a microservice-based architecture.
    • Wrote POCs to evaluate and adopt a new technology stack.
    • Contributed to both the steering and working groups to keep the business side aware of progress and to facilitate understanding of the functional requirements.
    • Led the development and advised software engineers on best practices.
    • Investigated performance bottlenecks, agreed on SLAs with C-level executives, and helped implement resolutions.
    • Worked with UI/UX designers to deliver a pixel-perfect design and a great user experience matching the client's vision.
    Technologies: CSS, AngularJS, Docker, Symfony, PHP, MySQL
  • Solution Architect/Team Lead

    2017 - 2019
    Empire Sports And Entertainment
    • Designed a live data-warehousing solution integrating several sources, including APIs and DBs.
    • Supported data pipeline development using Hadoop, with both HDFS and AWS S3 for the data lake, using R models for initial data analyses.
    • Delivered a MySQL-based data warehouse using a Kimball snowflake schema.
    • Completed Spark/Hive/HBase integrations.
    • Designed a Hadoop cluster with 10+ nodes in an autoscaling group on AWS.
    Technologies: Java 8, MySQL, Hadoop, Big Data, Python, R, Machine Learning
  • AWS Engineer

    2018 - 2018
    • Set up the architecture for high-volume, near real-time cryptocurrency data collection over exchange APIs.
    • Worked on delivering reusable, scalable infrastructure and a data persistence strategy with AWS S3 and RDS PostgreSQL (Aurora).
    • Delivered the POC plan and structure.
    • Collaborated with stakeholders to define next steps and review current work.
    • Implemented performance monitoring and alerting with the ELK stack.
    • Devised a cryptocurrency API call-limitation strategy to avoid IP blocking.
    Technologies: PostgreSQL, Docker, Python, AWS
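The call-limitation idea above can be sketched as a simple token-bucket rate limiter that spaces out exchange API requests so a per-minute quota is never exceeded. This is a hypothetical illustration, not the project's actual code; the class name and limits are mine.

```python
import time

class TokenBucket:
    """Allows at most `rate` calls per `per` seconds (illustrative sketch)."""

    def __init__(self, rate: int, per: float):
        self.capacity = rate
        self.tokens = float(rate)
        self.fill_rate = rate / per        # tokens replenished per second
        self.last = time.monotonic()

    def acquire(self) -> float:
        """Take one token; return how long the caller should sleep first."""
        now = time.monotonic()
        # Refill tokens based on elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.fill_rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return 0.0
        # Not enough tokens: report how long until one is available.
        wait = (1 - self.tokens) / self.fill_rate
        self.tokens = 0.0
        return wait

# Example: cap at 60 calls per minute before each exchange request.
bucket = TokenBucket(rate=60, per=60.0)
delay = bucket.acquire()   # 0.0 while tokens remain, else a sleep duration
```

In practice, the caller would `time.sleep(delay)` before issuing the HTTP request, keeping the scraper safely under the exchange's published limits.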
  • Solution Architect

    2016 - 2018
    • Headed two projects using Java for backend services, React.js/Angular.js for the front end, Tableau/MicroStrategy as BI tools, container-based architecture (Docker), infrastructure as code on AWS (Terraform), AWS Redshift for the DWH, and Apache Storm for ETL.
    • Evangelized the architectural vision, making engineering groups feel part of the whole picture and able to relate their work to our long-term strategic goals.
    • Collaborated with Engineering to ensure that changes implemented into the system are completed in a strategic manner, and fit the longer term technical vision for the project.
    • Developed performance testing plans and strategies alongside product managers, defining stress, capacity, and performance testing acceptance criteria.
    • Developed a vision for the architecture of our reporting product to ensure it could meet current and future business needs in the most economical and efficient way possible.
    • Collaborated with the development team to build and maintain design artifacts.
    • Analyzed Splunk and ELK data to understand workload distribution for better prioritization of code optimizations in compliance with our performance SLAs and goals.
    • Designed test execution plans, supported development of custom performance and capacity testing frameworks and usage of 3rd party tools.
    • Achieved high availability across the systems.
    • Supported logging and health-check internal library development and strategies.
    • Worked closely with the development team, proposing and driving POCs, investigating tasks, and actively participating in their resolution; ensured that developers were able to realize the architecture and vision of the solution in line with the reference architecture.
    • Ensured that business requirements were accurately translated into the technical vision.
    Technologies: Tableau, Java 8, Hadoop, AWS EMR, Redshift, PostgreSQL, AWS RDS, AWS
  • BI & Data Analyst

    2015 - 2016
    • Established the BI practice in the company.
    • Rolled out the first BI phase using Tableau as the BI tool, HP Vertica as the DWH, and Pentaho Kettle as the ETL tool on AWS.
    • Designed the data pipelines and data-warehouse architecture.
    • Created a complete data model to support a multi-tenant OLAP database.
    • Coordinated with business SMEs to deliver an end-to-end solution.
    • Used Hadoop as the data lake and reporting source for sensor-metric KPIs and monitoring.
    Technologies: Hadoop, MySQL, Pentaho, Vertica, Tableau
  • BI/DWH Developer

    2013 - 2015
    Core Distribution
    • Rolled out a complete BI solution based on the Pentaho BI Stack (EE) and HP Vertica as the DWH.
    • Created 30+ custom dashboards and manual mapping documents, and delivered a performant 20+ TB data warehouse on HP Vertica.
    • Led end-to-end project implementation.
    • Troubleshot and optimized SQL performance.
    Technologies: Vertica, Pentaho
  • Database Developer

    2013 - 2013
    • Created a live DWH, replicating from a MySQL database into the InfiniDB engine (columnar storage) using Flex-Views and triggers.
    • Built a high-availability solution for operational reporting used by 200+ concurrent users, over 15,000 users per day, and over 40,000 executions per day.
    • Configured and optimized MySQL cluster.
    • Implemented partitioning and indexing strategy for optimal read/write operations.
    • Developed data archiving.
    Technologies: InfiniDB, MySQL
  • BI/DWH Developer

    2011 - 2013
    • Rolled out a BI solution based on Microsoft SharePoint and SQL Server, using stored procedures for ETL.
    • Enhanced reporting speed with optimization techniques.
    • Implemented performance and event monitoring of our edge services (typically .NET on IIS), WMI monitoring of servers and equipment (hardware and OS), as well as deep monitoring of SQL services across our MS-SQL federation.
    • Built core data-oriented functionality shared across many areas of the business, existing in SQL Server as an ecosystem of stored procedures and jobs performing as a service-oriented platform.
    • Worked on automated scrubbing, normalization, and quality control, based on a module-oriented ingestion architecture.
    Technologies: Microsoft SQL Server, SharePoint, Microsoft


  • Ticketmaster International Reporting (Development)

    This real-time reporting solution is an operational analytics platform designed to provide operational reporting and analysis to clients. Ticketmaster International Reporting was designed as a replacement for a legacy platform and is built around the MicroStrategy business intelligence platform.

  • TicketIntel decision support system (Development)

    I developed most parts of a decision support system that helps clients dynamically price tickets for their events, providing real-time information and predictions presented in a clean, responsive UI. I also helped implement proprietary models for optimal ticket price proposals.

  • Venuemaster Reporting (Development)

    The project implements a solution covering the reporting requirements of Venuemaster clients in a fashion that matches the desired target architecture for reporting overall. I designed the project, and it runs on technologies such as AWS Redshift, Apache Kafka, Hadoop, Docker, Java 8, and React.js.

  • Airplane Price Scraper (Development)

    Since low-cost airlines don't have an open API (at least not for non-partners), I decided to develop a web scraper using the Java Selenium framework to scrape prices and store them in a database. The logic is held inside the database, running stored procedures on a daily basis and giving me insight into the cheapest return flights from my hometown. For legal reasons, I've decided to use it internally only.


  • Languages

    SQL, T-SQL, PHP, CSS, Python, R, Java, Java 8
  • Frameworks

    AWS EMR, Hadoop, Selenium, Symfony, AngularJS, Spring Boot, Apache Spark
  • Paradigms

    Requirements Analysis, Stress Testing, Agile, Scrum
  • Platforms

    Pentaho, Oracle, Docker, Mac OS, Microsoft, SharePoint, Apache Kafka, Android
  • Storage

    PL/SQL, AWS S3, AWS RDS, AWS DynamoDB, MySQL, Vertica, MongoDB, SQL Server Integration Services (SSIS), Oracle 12c, PostgreSQL, Redshift, Microsoft SQL Server, InfiniDB, SQL Server 2012, SQL Server 2016
  • Other

    Solution Architecture, Big Data Architecture, Software Architecture, Software Architect, Requirements & Specifications, Business Requirements, Performance Analysis, Capacity Planning, Architecture, Tableau Server, Big Data, Lean Development, Pentaho Reports, AWS, Performance, Back-end Performance, Hadoop, XML, Machine Learning, Predictive Modeling
  • Libraries/APIs

    Google API, Facebook API, Authorize.Net, Node.js
  • Tools

    Tableau, Tableau Desktop Pro, Microsoft Power BI, Pentaho Data Integration (Kettle), IntelliJ IDEA, Git, ELK (Elastic Stack), Splunk, Kafka Streams, Docker Compose, Docker Hub, Docker Swarm, Maven, Apache Storm


  • Bachelor's degree in Computer Science
    2009 - 2012
    University of Belgrade, Faculty of Mathematics - Belgrade, Serbia


  • Certified Scrum Product Owner (CSPO)
    APRIL 2020 - APRIL 2022
    Scrum Alliance
  • Certified Scrum Master (CSM)
    MARCH 2020 - MARCH 2022
    Scrum Alliance
