
Predrag Miocinovic

Verified Expert in Engineering

C++ Developer

Honolulu, HI, United States
Toptal Member Since
October 3, 2022

Predrag is a software architect and engineer specializing in C++ for 20+ years. He started his career as an astrophysicist, working on groundbreaking experiments with optical and radio detectors; signal processing; and data management, analysis, and simulations. Leveraging those skills, Predrag transitioned to the financial sector as a lead developer for KRM, the market's most advanced financial risk analysis engine, used by some of the largest financial institutions to manage risk exposures.






Preferred Environment

Visual Studio, C++, SQL

The most amazing...

...feature I've implemented is a customizable bitmap classifier of financial records for Basel III regulatory reporting, using only a few clock cycles per record.

Work Experience

Senior Director of Software Architecture

2019 - 2022
Kamakura Corporation
  • Led the team in redesigning Kamakura Risk Manager (KRM), the company's flagship solution, to fully transition to a cloud, on-demand, dynamically scalable data processing service while maintaining the product's existing competitive advantages.
  • Designed and implemented SSO processing architecture and led the team that transformed the legacy solution into a multi-tier data processing system in secure computing environments.
  • Worked with clients to develop custom financial data processing algorithms.
  • Designed and integrated calculations for new regulatory requirements in the financial industry into the existing risk management solution framework.
  • Designed an API for an existing solution to allow for client-driven proprietary system enhancements.
Technologies: C++, Visual Studio, SQL, Oracle, Windows API, Kerberos, Azure Active Directory, LDAP, Financial Modeling, Finance APIs, XML, XSLT, Mathematics, Quantitative Modeling, Databases, Data Science, Microsoft Excel, Azure, RabbitMQ, Net Stable Funding Ratio (NSFR), Liquidity Coverage Ratio (LCR), Basel IV, ISDA SIMM, Data Modeling, Waterfall Delivery, Cash Flow Analysis, Linear Regression, Distributed Systems, Microsoft SQL Server, Architecture, Object-oriented Design (OOD), Object-oriented Programming (OOP), Asynchronous Programming, APIs, Multithreading, Non-blocking I/O, Low Latency, Windows, Microsoft Visual C++, Solution Architecture, Excel 2010, Excel 2016, Database Architecture, Derivative Pricing, Derivatives, Interest Rate Swaps, Cross-currency Swaps, Swaps, Standard Template Library (STL), Bloomberg, Classification, Agile, IFRS 9, CECL, Subversion (SVN), Database Design, Mathematical Finance, Oracle PL/SQL, T-SQL (Transact-SQL), Basel III, Optimization, Algorithms, Interpreter Design, ODBC, Authentication, Single Sign-on (SSO), Finance, Mathematical Modeling, Banking & Finance

Senior Development Engineer

2006 - 2018
Kamakura Corporation
  • Developed and maintained the analytical engine of KRM, the open-platform financial risk management system.
  • Modeled macroeconomic risk factors and valuations of all types of financial instruments.
  • Implemented calculations of regulatory, accounting standards-based, risk-metric, and client-specific results.
  • Designed and implemented various advanced mathematical models associated with financial risk calculations.
  • Integrated the company's products with various third-party financial risk and data processing APIs.
  • Designed and implemented the company's source code control protocol and backup and recovery system.
  • Devised and introduced license delivery, enforcement, and intellectual property security for the company's software.
Technologies: C++, SQL, Oracle, Sybase, PowerBuilder, IBM Db2, Microsoft SQL Server, Distributed Systems, Markov Chain Monte Carlo (MCMC) Algorithms, Monte Carlo Simulations, Linear Regression, Financial Modeling, Cash Flow Analysis, IFRS 9, CECL, Agile, Waterfall Delivery, TCP/IP, Sockets, Scripting, Encryption, Subversion (SVN), Mathematics, Quantitative Modeling, Databases, Data Science, Cryptography, Microsoft Excel, Excel VBA, Basel II, Basel III, Data Modeling, Finance APIs, Windows API, Object-oriented Design (OOD), Object-oriented Programming (OOP), Asynchronous Programming, APIs, Multithreading, Non-blocking I/O, Low Latency, Windows, Microsoft Visual C++, Excel 2010, Excel 2016, Database Architecture, Derivative Pricing, Derivatives, Interest Rate Swaps, Cross-currency Swaps, Swaps, Standard Template Library (STL), Classification, Visual Studio, Mathematical Finance, Oracle PL/SQL, T-SQL (Transact-SQL), Optimization, Algorithms, Interpreter Design, Monte Carlo, ODBC, Finance, Mathematical Modeling, Banking & Finance

Software Consultant

2011 - 2012
M2 Consulting
  • Developed the graphical display for high-frequency, low-power, analog-to-digital converter modules designed, built, and operated by the physics department at the University of Hawaii under a NASA-funded grant.
  • Implemented on-the-fly calibration conversions from digital readouts to voltage levels.
  • Oversaw the training and integration into field use.
Technologies: Physics, Python, XML, GCC, Makefile, Linux, User Experience (UX), Mathematics, REST, HTML, wxWidgets, C++, Agile, Bash, Object-oriented Design (OOD), Object-oriented Programming (OOP), 2D Visualization, Data Visualization, Visualization, JavaScript

Postdoctoral Fellow

2003 - 2006
University of Hawaiʻi at Mānoa
  • Implemented a system for receiving, safely storing, and pre-processing the highest-priority signal data from a NASA-operated, long-duration balloon flight.
  • Served as the head developer, leading the team in charge of processing a live telemetry data feed and displaying the status online for monitoring, command, and control of the NASA-operated balloon flight.
  • Developed a graphical interface for 3D visualization of dual-polarity antenna cluster signals with the ability to zoom into time and frequency domains of the individual channels.
  • Introduced software for automatic calibration of high-frequency antennas and managed the team in charge of antenna calibration.
Technologies: Python, PostgreSQL, C Shell, Bash, C++, MATLAB, XML, Mathematics, Databases, Data Science, POSIX, Data Modeling, Physics, Scripting, APIs, Asynchronous I/O, Low Latency, wxWidgets, wxPython, MinGW, Database Architecture, Classification, Object-oriented Design (OOD), SQL, Linux, Unix Shell Scripting, HTTP, User Experience (UX), Distributed Systems, GCC, Makefile, LaTeX, Object-oriented Programming (OOP), Asynchronous Programming, HTML, Algorithms, 3D Geometric Analysis, FFTW, Fourier Analysis, Signal Filtering, Digital Filters, Antenna Design, Signal Processing, 3D Visualization, 2D Visualization, Data Visualization, Visualization, 3D, Fortran, Data Processing, Networking, Mathematical Modeling

Postdoctoral Researcher

2002 - 2003
University of California, Berkeley
  • Developed Monte Carlo-based photon-field tracking software for signal simulation and calibration of a large, optically inhomogeneous neutrino telescope funded by the National Science Foundation (NSF).
  • Collaborated on developing a client-server model for on-demand delivery of the resulting photon-field tables for use in a distributed cluster computing environment.
  • Used neural networks and other machine learning techniques to reduce the data volume of the photon-field table while retaining all the relevant timing and intensity information.
Technologies: C, Neural Networks, Monte Carlo Simulations, Unix Clustering, Linux, GNU Debugger (GDB), GCC, Makefile, Data Modeling, Doxygen, C Shell, Bash, Physics, Scripting, Low Latency, Database Architecture, Unix Shell Scripting, Markov Chain Monte Carlo (MCMC) Algorithms, LaTeX, Mathematics, Valgrind, Object-oriented Design (OOD), Object-oriented Programming (OOP), Gprof, Optimization, Algorithms, 2D Visualization, Data Visualization, Visualization, Monte Carlo

Kamakura Risk Manager
Kamakura Risk Manager (KRM) is an integrated risk management engine for the financial industry. It's fully scalable and deployable anywhere from a laptop to the cloud. As a lead designer and developer for 16 years, I worked on all aspects of this product.

• Addressed technology debt by redesigning the modeling library to use object-oriented design to model financial instruments.
• Implemented financial modeling and valuation for vanilla and exotic financial products.
• Designed a system for high-performance multithreaded computation and input/output.
• Expanded capabilities of Monte Carlo-based simulations of macroeconomic risk factors and instrument valuations.
• Enhanced the custom scripting interface for user-defined econometric variables.
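The Monte Carlo simulation of macroeconomic risk factors mentioned above can be illustrated with a minimal sketch. This is not KRM code; the function name, the geometric Brownian motion model, and all parameters are illustrative assumptions.

```cpp
// Minimal sketch: Monte Carlo simulation of one macroeconomic risk factor
// (e.g., an equity index) following geometric Brownian motion.
#include <cmath>
#include <random>
#include <vector>

// Simulate terminal values S_T = S_0 * exp((mu - sigma^2/2)*T + sigma*sqrt(T)*Z)
// using the exact one-step GBM solution; Z ~ N(0, 1).
std::vector<double> simulate_terminal_values(double s0, double mu, double sigma,
                                             double horizon_years, int n_paths,
                                             unsigned seed) {
    std::mt19937_64 gen(seed);
    std::normal_distribution<double> z(0.0, 1.0);
    std::vector<double> terminal(n_paths);
    for (int i = 0; i < n_paths; ++i) {
        terminal[i] = s0 * std::exp((mu - 0.5 * sigma * sigma) * horizon_years
                                    + sigma * std::sqrt(horizon_years) * z(gen));
    }
    return terminal;
}
```

A production engine simulates many correlated factors per time step and revalues every instrument on each path; this sketch shows only the core sampling loop.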

Asset Forecast Optimization Algorithm
Insurers manage assets funded by premiums to guarantee sufficient liquidity for future claim payouts and maximize return on assets invested. A US insurer ($100 billion in AUM), which had analyzed these outcomes mainly in Excel, needed an automated way to visualize the results of its asset reinvestment rules under what-if scenarios in a multi-year forecast.

I formed and led a team to implement an asset reinvestment optimization algorithm in an income forecasting simulation (KRM). This required: 1) implementing an algorithm in C++ to forecast selling and buying of assets following the reinvestment rules and constraints and 2) defining new data structures in an SQL database to configure such rules and constraints and store intermediate results of algorithmic decisions. The results of asset forecasts were integrated into the client’s Power BI framework.

By automating the execution of asset reinvestment rules in existing income forecasts, the client accelerated its processing of what-if scenarios and simplified the configurations needed to drive income forecasts. They could test and analyze the effects of new and modified reinvestment rules more quickly and precisely to select the best-performing assets for their trading strategy.

Optimized Parsing of Recursive Scripts

Kamakura Risk Manager provides a scripting interface that allows clients to extend many of its calculations with their own custom calculations, including the ability to replicate exact calculations used in other risk management solutions for cross-checking and validation.

A large Central European bank (managing €275 billion in assets) migrated from a legacy risk system to KRM, using this scripting interface to replicate the interest rate forecasting logic of its legacy solution for validation in KRM. The client's scripts used recursive logic for rates over a horizon of more than 10 years. When placed in production, several hundred of these scripts took more than an hour to parse and prepare for calculation.

Using Visual Studio profiling tools, I optimized a C++ script parsing algorithm to use more efficient STL-based memory objects, significantly improving the parsing performance. With the new algorithm, original scripts took only seconds to be parsed and prepared for calculation.
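One representative flavor of such an STL-level optimization, shown here as an assumed illustration rather than the actual fix, is replacing a heap-allocated std::string per token with std::string_view slices into one stable buffer, plus pre-sizing the token vector:

```cpp
// Illustrative sketch: tokenize a script into std::string_view slices of
// one stable buffer instead of allocating a std::string per token.
#include <string>
#include <string_view>
#include <vector>

std::vector<std::string_view> tokenize(const std::string& script) {
    std::vector<std::string_view> tokens;
    tokens.reserve(script.size() / 4);  // pre-size to avoid reallocation churn
    std::size_t pos = 0;
    while (pos < script.size()) {
        std::size_t end = script.find(' ', pos);
        if (end == std::string::npos) end = script.size();
        if (end > pos) tokens.emplace_back(script.data() + pos, end - pos);
        pos = end + 1;
    }
    return tokens;  // views are valid as long as `script` outlives them
}
```

For recursive scripts that are re-parsed many times, eliminating per-token allocations turns parsing cost from allocator-bound to memory-bandwidth-bound, which is where profiler-guided rewrites like the one described typically find their gains.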

The bank was able to complete several production runs in the same time it took the original code to prepare for a single run. This allowed for quicker calculation comparison and validation, which led to a faster UAT process and acceptance of the new system.

3D Animated Data Display

A pioneering international scientific project needed a way to visualize data collected by optical sensors embedded in huge volumes of glacial ice. I developed an animated 3D-event viewer that allowed for zooming, rotations, and full customization of the visual display. At the back-end, I implemented a raw data parser that converted flat files into an object-oriented data library to enable easy manipulation and a complete display of all data properties in the application.

Animations and static 3D views of the data helped to guide data analysis and provided easily understood visuals. This allowed the project to popularize and communicate its mission and results among scientific peers, funding agencies (resulting in a multimillion-dollar allocation in the Congressional budget), and the wider community interested in scientific discovery.

Conversion of LIBOR-based Contracts to Risk-free Rates
After the discovery of LIBOR interest rate benchmark manipulation, markets moved to abandon LIBOR in favor of risk-free rates (RFRs), and some LIBOR rates ceased to be published. This is a problem for long-term contracts underwritten with LIBOR benchmarks on the expectation that LIBOR rates would remain available.

To support these contracts, the ISDA published rules to calculate LIBOR-like rates from RFRs. The rules preserve the original LIBOR logic while relying only on RFRs that remain available for future calculations, but they are complex: constructing a replacement benchmark rate must account for global differences in business days and holidays.

An Australian bank with 900 billion AUD in AUM and long-term LIBOR contracts required the ISDA rules for the LIBOR replacement rate when forecasting its long-term balance sheet positions. I implemented a C++-based algorithm that executes the rules, enables efficient recalculation of rates under simulated market shocks to RFRs, and allows for currency-specific rate differences. Despite more complex logic, simulations performed with the new algorithm have minimal speed degradation compared to original LIBOR-rate-based simulations. The client uses it in nightly production runs without workflow modifications.

Dynamically Forecasted Regulatory Ratios
Banks are required to hold sufficient liquidity and funding profiles to survive financial stress by meeting liquidity coverage and net stable funding ratio (LCR/NSFR) thresholds. To calculate these ratios, banks categorize assets based on 40+ criteria and perform specific aggregations and calculations.

In forecasts, banks must account for reinvesting maturing assets by simulating and categorizing new records on their balance sheets to include in LCR/NSFR values. Large banks have hundreds of millions of simulated asset records, so it’s untenable to categorize them across 40+ dimensions via legacy SQL-based workflows.

A major SE Asian bank ($90+ billion in AUM) had process bottlenecks and configuration limitations, thus calculating only current LCR/NSFR values. I used C++ algorithms to replace the SQL-based workflow for these calculations. By keeping the existing SQL categorization setup, the client didn’t have to modify its inputs and seamlessly used the new, faster routines. Thus, millions of simulated records were instantaneously categorized and aggregated, allowing the client to efficiently calculate and predict both current and future LCR/NSFR values and adjust funding and asset selection models to meet mandated limits.
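The bitmap approach referenced here (and in the profile tagline above) can be sketched as follows: each record's qualitative attributes are encoded once as bits, and each LCR/NSFR category is a (mask, expected) pair, so classifying a record costs only a few bitwise operations. The bit names and category layout below are hypothetical, not the actual Basel III rule set.

```cpp
// Illustrative sketch of bitmap-based record categorization: a couple of
// bitwise operations per record instead of an SQL workflow. Criterion
// names and categories are hypothetical.
#include <cstdint>
#include <vector>

enum : std::uint64_t {
    IS_HQLA_LEVEL1    = 1ull << 0,
    IS_CENTRAL_BANK   = 1ull << 1,
    IS_UNENCUMBERED   = 1ull << 2,
    MATURES_UNDER_30D = 1ull << 3,
    // ... one bit per criterion, scaling to 40+ criteria in a uint64_t
};

struct Category {
    std::uint64_t mask;      // which criteria this category inspects
    std::uint64_t expected;  // required values of those criteria
    int id;                  // category identifier for aggregation
};

// Return the first category whose selected bits match the record exactly.
int classify(std::uint64_t record_bits, const std::vector<Category>& cats) {
    for (const auto& c : cats)
        if ((record_bits & c.mask) == c.expected) return c.id;
    return -1;  // uncategorized
}
```

Encoding the existing SQL categorization setup into such masks is what lets the inputs stay unchanged while the per-record classification drops to a few clock cycles.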

CECL Modeling for Non-standard Loans
A US agricultural lender ($14+ billion in loan volume) had to upgrade its expected-credit-loss modeling to comply with the current expected credit loss (CECL) methodology. In simulations, the lender had to calculate expected losses at the record level to match and be attributed to specific loans on its balance sheet.

Agricultural loans are atypical in their structure and in the variability and lumpiness of their payoffs. This lender had detailed models for future loan payoffs and default probabilities, but CECL requires loan cash flows to be adjusted and expected losses calculated from default probabilities in a very specific way that isn't compatible with time-dependent forecasting methods.

To cover the incongruence between the regulatory requirement and modeling methodology, I implemented a new algorithm in the KRM net income simulation, which calculates CECL-adjusted cash flows and expected losses for any type and future timing of loan payoffs without time-consuming recalculation of loan amortization schedules. This allowed the client to use a single production run, greatly reducing the cost of server time and complexity of workflow to forecast both future income based on its models and CECL-based provisions for annual filings.
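The shape of a record-level lifetime expected-loss calculation can be sketched as a discounted sum of (marginal default probability × loss given default × exposure) over the payoff schedule. This is a generic CECL-style formula for illustration; the field names are assumptions, and the actual KRM algorithm described above additionally avoids re-amortizing the loan schedule.

```cpp
// Generic sketch of a record-level CECL lifetime expected-loss sum.
// Field names are illustrative; real payoff schedules come from the
// lender's own models.
#include <cmath>
#include <vector>

struct Period {
    double exposure;     // outstanding balance entering the period
    double marginal_pd;  // probability of default during the period
    double t_years;      // time of the period's end, in years
};

double lifetime_ecl(const std::vector<Period>& schedule,
                    double lgd, double discount_rate) {
    double ecl = 0.0;
    for (const auto& p : schedule)
        ecl += p.marginal_pd * lgd * p.exposure
               * std::exp(-discount_rate * p.t_years);  // continuous discounting
    return ecl;
}
```

Computing this alongside the net income simulation, rather than in a separate run, is what collapses the workflow into the single production run described above.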
1998 - 2001

PhD in Astrophysics

University of California - Berkeley, California, USA

1996 - 1998

Master's Degree in Physics

University of California - Berkeley, California, USA

1992 - 1996

Bachelor's Degree in Physics

Tennessee Technological University - Cookeville, Tennessee, USA


Standard Template Library (STL), Windows API, wxWidgets, ODBC, Sockets, POSIX, FFTW


Visual Studio, Subversion (SVN), Microsoft Visual C++, GCC, MATLAB, LaTeX, Microsoft Excel, Excel 2010, Excel 2016, wxPython, Bloomberg, Mathematica, Makefile, RabbitMQ, Git, GNU Debugger (GDB), Valgrind, Gprof, MinGW


C++, C, Python, SQL, XML, Bash, T-SQL (Transact-SQL), Excel VBA, Perl, Embedded C, XSLT, PowerBuilder, C Shell, CSS, JavaScript, HTML, PostScript, Fortran


Object-oriented Design (OOD), Object-oriented Programming (OOP), Asynchronous Programming, Agile, Data Science, Database Design, REST

Industry Expertise

Banking & Finance


Linux, Unix, Windows, Oracle, Azure


Qt, Realtime


Microsoft SQL Server, Databases, Database Architecture, Azure Active Directory, Sybase, IBM Db2, PostgreSQL, Oracle PL/SQL


Monte Carlo Simulations, Financial Modeling, Markov Chain Monte Carlo (MCMC) Algorithms, Cash Flow Analysis, Waterfall Delivery, Physics, Mathematics, Quantitative Modeling, Multithreading, Non-blocking I/O, Low Latency, Data Architecture, Optimization, Algorithms, Monte Carlo, Finance, Mathematical Modeling, X11, Unix Shell Scripting, Kerberos, Finance APIs, Distributed Systems, CECL, Scripting, Data Modeling, Mathematical Finance, Basel III, Basel II, Liquidity Coverage Ratio (LCR), Net Stable Funding Ratio (NSFR), ISDA SIMM, Architecture, Asynchronous I/O, Solution Architecture, Derivative Pricing, Derivatives, Interest Rate Swaps, Cross-currency Swaps, Swaps, Classification, Interpreter Design, 3D, 3D Geometric Analysis, Fourier Analysis, 3D Visualization, 2D Visualization, Data Visualization, Visualization, Data Processing, Unix Clustering, Linux Kernel, HTTP, User Experience (UX), LDAP, Linear Regression, IFRS 9, TCP/IP, Encryption, Neural Networks, Doxygen, Basel IV, Cryptography, APIs, Animation, Encapsulated Postscript Vector Graphics (EPS), Signal Filtering, Digital Filters, Antenna Design, Signal Processing, Networking, Authentication, Single Sign-on (SSO), Custom Scripting, RF Design
