AI/NLP-based Tool for the Rapid Distribution of Essential COVID-19 Goods
Designed an accelerated 10-week product implementation (during COVID-19), led a team of data scientists, engineers, and developers, presented results to 20+ executive stakeholders, and prepared a successful federal funding submission ($0.5 million).
I ideated and led the development of an AI-enabled solution with a dashboard that displays views of incoming critical cargo and predicts its estimated time of arrival (ETA), enabling fast cargo movements in and out of the Port of Montreal. The solution uses natural language processing (NLP) techniques to identify cargo at the container level.
I then validated the solution with several supply chain stakeholders, who now use it as a single source of information at both tactical and operational levels to track and fast-track critical cargo until it leaves the port.
We reduced dwell time by 83% for critical cargo (from ~three days to less than 12 hours, i.e., the next working day). The 136,000 twenty-foot equivalent units (TEU) analyzed per month represent ~38 million metric tons of goods per year, with an estimated $2.6 billion impact on the Canadian economy.
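As a rough illustration of the container-level logic described above, the sketch below flags containers carrying essential goods from free-text manifest descriptions and derives a container-level ETA. This is a minimal keyword-based stand-in: the actual system used richer NLP models, and the lexicon, field names, and handling-time input here are hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical lexicon of essential-goods terms; the production system
# relied on trained NLP models rather than a fixed keyword list.
CRITICAL_KEYWORDS = {"ppe", "mask", "ventilator", "respirator", "sanitizer", "vaccine"}

@dataclass
class Container:
    container_id: str
    manifest_text: str    # free-text cargo description from the manifest
    vessel_eta: datetime  # carrier-reported vessel arrival time
    unload_hours: float   # predicted hours from berthing to gate-out

def is_critical(manifest_text: str) -> bool:
    """Flag containers whose manifest mentions essential COVID-19 goods."""
    tokens = {t.strip(".,;:").lower() for t in manifest_text.split()}
    return bool(tokens & CRITICAL_KEYWORDS)

def estimated_gate_out(c: Container) -> datetime:
    """Container-level ETA: vessel arrival plus predicted handling time."""
    return c.vessel_eta + timedelta(hours=c.unload_hours)
```

In a dashboard setting, `is_critical` would drive the priority view and `estimated_gate_out` the ETA column for each flagged container.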
Interim CTO | Fintech
Advised on the product strategy, assessed and prioritized the product backlog, helped scale the teams based on development workload, and aided the client in their hiring efforts for the next CTO.
Client: The client was a Panamanian fintech startup building a platform that connects investors with startups seeking investment.
Mandate: I served as a fractional CTO, advising the team on product strategy, on how to scale the technology teams, and on how to increase the velocity of release cycles.
Geolocated (via Map-based SaaS) Retail Strategy for a Confidential Coffee Company
Led strategic and technical activities for a map-based retail project; performed customer segmentation and identified the best locations for retailers; collected and analyzed data (e.g., Nielsen, pedestrian traffic, and transactions per product category).
The project goal was to use data from a confidential coffee company to add new layers of intelligence to an existing geo-located platform. The combination of this and other data sources allowed the retail company to optimize its real estate and store opening strategy based on best potential locations (e.g., higher coffee consumption per postal code, higher pedestrian traffic per area).
I worked with both product strategy and data science teams to combine, analyze, and integrate available data sources. I also prepared and presented a series of strategic recommendations for the retail company.
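A simplified sketch of the location-scoring idea behind these recommendations: normalize each data signal across candidate areas, then rank areas by a weighted combination. The signal names, weights, and postal codes below are illustrative assumptions, not the client's actual data or model.

```python
def min_max_normalize(values):
    """Scale a list of raw values to [0, 1]."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

def score_locations(candidates, weights):
    """
    candidates: dict postal_code -> dict of raw signals
                (e.g., coffee spend, foot traffic)
    weights:    dict signal_name -> weight
    Returns postal codes ranked by weighted normalized score, best first.
    """
    codes = list(candidates)
    scores = {c: 0.0 for c in codes}
    for signal, w in weights.items():
        col = min_max_normalize([candidates[c][signal] for c in codes])
        for c, v in zip(codes, col):
            scores[c] += w * v
    return sorted(codes, key=lambda c: scores[c], reverse=True)
```

Normalizing before weighting keeps signals with very different units (dollars vs. pedestrian counts) on a comparable scale.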
Port Logistics Optimization Tool
Designed the solution and led a team of 10+ business and technical resources, prepared the submission and obtained federal funding ($2.7 million), validated the tool with stakeholders, and managed 20+ CXO and technical-level clients.
The goal of this supply chain project was to develop a "multi-day container flow optimization for port logistics" use case in order to enhance the overall efficiency of the port authority's operations through an AI-powered supply chain solution.
Concretely, the project focused on optimizing rail operations and related dwell times at the Montreal Port Authority, providing shared, advanced visibility to all stakeholders (port, rail, and terminal operators) and enabling data-driven decision-making, prediction, and scenario analyses.
Reducing Installation and Repair Times for Telecommunication Customers
Managed the product backlog with 20+ data scientists/engineers from four different teams; delivered a solution that generated $3 million in annual cost savings based on a 3% improvement in telecom technician efficiency; obtained $200,000 in funding.
Bell Canada employs 5,000+ field technicians to install and repair wireline connections for its B2C customers. A wide range of technologies supports these services (e.g., FTTH, pair bonding), and the complexity of jobs varies greatly. The technicians dispatched to these jobs also fall on a broad spectrum of experience, certifications, and specialization.
This project was a collaboration between Bell, IVADO Labs, Exfo (Bell's supplier of network sensors), and VuPoint Systems (Bell's supplier for satellite field services). The main goal was to create a proof of concept (PoC) to optimize the field install and repair experience by using AI models to assign technicians to the tasks that suit them best.
We built an AI model based on 3TB of data from 5 million interventions over the last three years, with 100+ variables from field, network, and customer operations sources. We then clustered tasks with similar characteristics, predicted technician efficiency at completing those tasks, and derived insights to optimize Bell's systems for technician task assignment.
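The cluster-then-predict-then-assign pipeline can be sketched as follows. This is a deliberately tiny stand-in: the actual model clustered on 100+ variables, whereas here a task's cluster is just its (technology, complexity) pair, and "prediction" is the technician's historical mean efficiency per cluster. All field names and values are hypothetical.

```python
from collections import defaultdict

def cluster_key(task):
    # Simplified clustering: tasks sharing technology and complexity band
    # fall in one cluster; the real model used far richer features.
    return (task["technology"], task["complexity"])

def fit_efficiency(history):
    """history: list of dicts with technology, complexity, technician, efficiency.
    Returns (cluster, technician) -> mean historical efficiency."""
    sums = defaultdict(lambda: [0.0, 0])
    for t in history:
        k = (cluster_key(t), t["technician"])
        sums[k][0] += t["efficiency"]
        sums[k][1] += 1
    return {k: s / n for k, (s, n) in sums.items()}

def assign(task, technicians, model, default=0.5):
    """Pick the technician with the best predicted efficiency on this task's cluster."""
    k = cluster_key(task)
    return max(technicians, key=lambda tech: model.get((k, tech), default))
```

The `default` score handles technicians with no history on a cluster; a production system would fall back to a learned prior rather than a constant.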
Handover Aware Interference Management in LTE Small Cell Networks
Managed and integrated a collaborative project on 4G LTE-Advanced resource management techniques.
We proposed an enhanced inter-cell interference coordination (ICIC) mechanism based on the multi-armed bandit (MAB) approach, which aims to maximize both the throughput of the attached users and their handover (HO) performance, evaluated through 3GPP mobility robustness optimization (MRO) indicators.
To this end, the MAB procedure explicitly considers HO performance when configuring the optimal spectrum split. The simulation results highlight the benefits of the proposed solution: higher throughputs and fewer HO failures. This is particularly the case for high- and medium-speed users, whose HO performance is more sensitive to interference.
Categorical Variable Selection in Risk Modeling for KYC Activities
Led the research and planning activities for a project with AI students and a banking client.
In financial institutions, categorical features appear quite often in credit datasets and in compliance models, for example, features related to clients’ risk profiles.
Traditional feature selection methods (e.g., statistical significance tests, recursive feature elimination, LASSO) do not work well with categorical features, since they may retain some levels of a feature while removing others. The Group Lasso approach has been shown to be more stable in terms of variable selection, but it has shortcomings in predictive performance. Instead, for a given feature, would it be more appropriate to devise a method that aggregates neighboring levels into bins, producing a feature representation space that scales better with the output?
Because there are numerous ways to represent categorical variables and to select which variables matter, we asked: what are the most appropriate methods for improving categorical feature selection?
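The level-aggregation idea can be illustrated with a minimal sketch: estimate the event rate per level, sort levels by rate, and merge neighbors whose rates are within a tolerance, so a high-cardinality feature collapses into a few monotone bins. This is one simple binning heuristic, not the project's actual method; the data and tolerance are illustrative.

```python
from collections import defaultdict

def bin_levels(records, tol=0.05):
    """
    records: list of (level, outcome) pairs, outcome in {0, 1}.
    Sorts levels by event rate and merges neighbours whose rates
    differ by less than `tol`. Returns a list of bins (lists of levels).
    """
    stats = defaultdict(lambda: [0, 0])  # level -> [events, total]
    for level, y in records:
        stats[level][0] += y
        stats[level][1] += 1
    rates = sorted((events / total, level)
                   for level, (events, total) in stats.items())
    bins, current, last_rate = [], [rates[0][1]], rates[0][0]
    for rate, level in rates[1:]:
        if rate - last_rate < tol:
            current.append(level)  # similar risk: same bin
        else:
            bins.append(current)   # rate jump: start a new bin
            current = [level]
        last_rate = rate
    bins.append(current)
    return bins
```

Binned levels can then be treated as a single ordinal feature, avoiding the level-by-level selection instability noted above.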
White Paper for a Blockchain-based Education App
Designed and wrote a white paper for a blockchain-enabled initiative.
I composed a white paper for an ongoing blockchain-enabled initiative called Pocket, a digital wallet and portfolio that captures and stores the holistic evidence of learning (what students create, in any format) and gives students the autonomy to share it securely with employers and further-education institutions.