Customer Churn Debacle
To address customer retention challenges, a large BPO vendor built a dashboard to give a 360-degree customer view across all 40 of its clients.
Everyone on the client team was pleased to see all the information in one place. But when the model predicted an 80% chance of customer churn, the account manager was unimpressed: he had known this six months earlier, and the dashboard did not solve his fundamental problem.
When I stepped in, I analyzed the client's ticket data to identify three things the vendor should do to prevent churn, for a third of what they had already spent.
Our model predicted a 90% chance of retention if those three specific things were completed within 3 months. That resulted in the customer renewing the contract.
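A minimal sketch of the idea, assuming scikit-learn in place of the actual model and hypothetical ticket-derived features (the real feature set is not shown here):

```python
# Hypothetical sketch: a churn classifier trained on ticket-derived account
# features, then used to score a "what if" scenario for retention actions.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier

# Illustrative per-account features aggregated from ticket data.
df = pd.DataFrame({
    "open_tickets":        [3, 12, 1, 8, 15, 2],
    "avg_resolution_days": [2.0, 9.5, 1.5, 7.0, 11.0, 3.0],
    "escalations_90d":     [0, 4, 0, 2, 6, 1],
    "churned":             [0, 1, 0, 1, 1, 0],
})

X, y = df.drop(columns="churned"), df["churned"]
model = GradientBoostingClassifier().fit(X, y)

# Score the account as-is, then under a scenario where the recommended
# fixes (clear escalations, speed up resolution) have been completed.
current  = pd.DataFrame([{"open_tickets": 12, "avg_resolution_days": 9.5, "escalations_90d": 4}])
improved = pd.DataFrame([{"open_tickets": 4,  "avg_resolution_days": 3.0, "escalations_90d": 0}])
print("churn risk now:        ", model.predict_proba(current)[0, 1])
print("churn risk after fixes:", model.predict_proba(improved)[0, 1])
```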
Intelligent Data Pipelines
Naval artillery on battleships generates a large amount of test data that is analyzed by engineers on a base.
Because of the limited bandwidth of satellite networks, the data doesn't reach the base until the ship reaches a port. This causes huge delays in the Navy's ability to deploy new guns.
We used machine learning to decide which test data to download and make the best use of the available bandwidth: the models generate smart summaries of the data on board, and engineers request what they need based on those summaries.
Engineers use a dashboard to manage their data requests within the confines of the bandwidth limits.
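As a rough illustration of the selection step, here is a sketch that ranks files by a model-assigned usefulness score and greedily fills a bandwidth budget; the file names, scores, and greedy heuristic are assumptions, not the actual system:

```python
# Hypothetical sketch: choose which test-data files to send over the
# satellite link by score-per-megabyte until the bandwidth budget is used.
from dataclasses import dataclass

@dataclass
class TestFile:
    name: str
    size_mb: float
    score: float  # usefulness predicted by the summarization/ranking model

def select_downloads(files: list[TestFile], budget_mb: float) -> list[TestFile]:
    """Greedy knapsack-style selection within the bandwidth budget."""
    chosen, used = [], 0.0
    for f in sorted(files, key=lambda f: f.score / f.size_mb, reverse=True):
        if used + f.size_mb <= budget_mb:
            chosen.append(f)
            used += f.size_mb
    return chosen

files = [
    TestFile("gun_mount_vibration.bin", 800, 0.90),
    TestFile("full_sensor_dump.bin", 4000, 0.95),
    TestFile("misfire_event_log.bin", 50, 0.80),
]
for f in select_downloads(files, budget_mb=1000):
    print(f.name)
```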
This reduced the total gun testing time from many months to a few weeks.
Smarter Negotiations
When purchasing $2B worth of hospital supplies every year for 17 hospitals across 3 regions from hundreds of vendors, it is very challenging to negotiate the right prices for thousands of specialized items.
The company wanted to see how much could be saved by negotiating with vendors.
Combining the purchasing data with ECRI benchmark data using Alteryx, Tableau, and machine learning models written in R and Python, the team built intelligent negotiation dashboards to help negotiators. The system ensured that the hospital system paid close to the regional median price for most of its supplies.
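The core benchmark comparison can be sketched as follows, with illustrative prices standing in for the real purchasing and ECRI data:

```python
# Hypothetical sketch: flag items priced above the regional benchmark
# median and estimate the negotiation savings per SKU.
import pandas as pd

purchases = pd.DataFrame({
    "sku":        ["A100", "A100", "B200", "B200"],
    "region":     ["East", "East", "West", "West"],
    "unit_price": [12.50, 13.10, 480.0, 455.0],
    "quantity":   [1000, 800, 40, 25],
})
benchmarks = pd.DataFrame({  # stand-in for ECRI median prices
    "sku":          ["A100", "B200"],
    "region":       ["East", "West"],
    "median_price": [11.90, 430.0],
})

merged = purchases.merge(benchmarks, on=["sku", "region"])
merged["overpay_per_unit"]  = (merged["unit_price"] - merged["median_price"]).clip(lower=0)
merged["potential_savings"] = merged["overpay_per_unit"] * merged["quantity"]
print(merged.groupby("sku")["potential_savings"].sum())
```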
This resulted in $40M in savings, as predicted by the machine learning models.
Marketing a New Drug
A bio-pharmaceutical company needed to segment 10,000 doctors from a pool of a million for a year-long sales outreach program.
CMS data on providers covers 381M visits, 91 specialties, 22 cities, 27M prescriptions, 1.6B medical services, and 1B drug services.
The model needed to learn and adapt based on weekly sales results.
Using a series of deep learning models built on patient demographics, physician performance, sales efficiency, and location statistics, we were able to help the company beat its sales targets.
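The shape of the segmentation loop, sketched with a simple logistic model standing in for the deep learning models and synthetic features standing in for the CMS-derived ones:

```python
# Hypothetical sketch: score a million physicians on predicted sales
# response and take the top 10,000 as the outreach segment.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1_000_000
# Stand-ins for patient demographics, performance, and location features.
X = rng.normal(size=(n, 4))
# Synthetic label: whether past outreach to this physician produced sales.
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(size=n) > 1).astype(int)

model = LogisticRegression().fit(X, y)
scores = model.predict_proba(X)[:, 1]
segment = np.argsort(scores)[::-1][:10_000]  # top 10,000 physician indices
print(segment[:5])
# Each week, append the new sales results to the training data and refit
# so the segment adapts to what is actually selling.
```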
The sales dashboard gave the executives a weekly view of how the new drug was performing.
Hospice CAP Exposure
Careful management of resources is crucial for hospices to avoid exceeding the payment cap.
Almost half of all patients expire within 3 weeks of entering hospice; however, 12 to 15 percent live beyond 6 months. Predicting length of stay as accurately as possible is critical for providing excellent service without the risk of repayments.
We built a machine learning model to predict patients' length of stay, using Provider Statistical & Reimbursement (PS&R) data, the number of Medicare beneficiaries served, and the clinical condition of the patients. Running the analysis to track trends and anticipated costs allowed the hospice to adjust expenses accordingly, significantly reducing the organization's cap exposure risk.
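A minimal sketch of such a model, with illustrative values standing in for the PS&R and clinical inputs:

```python
# Hypothetical sketch: regress hospice length of stay (days) on stand-in
# features for PS&R figures, beneficiaries served, and clinical condition.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

df = pd.DataFrame({
    "beneficiaries_served": [120, 450, 300, 90, 600, 210],
    "psr_reimbursement_k":  [800, 3200, 2100, 500, 4100, 1500],
    "clinical_severity":    [4, 2, 3, 5, 1, 3],  # encoded condition score
    "length_of_stay_days":  [14, 190, 60, 9, 240, 35],
})

X = df.drop(columns="length_of_stay_days")
y = df["length_of_stay_days"]
model = RandomForestRegressor(random_state=0).fit(X, y)

# Predicted stays feed the cap-exposure tracking: expected days times the
# daily rate, compared against the aggregate payment cap.
print(model.predict(X.head(2)))
```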
Provider Data Quality
Provider data quality is a common problem in many healthcare organizations. Maintaining reliable provider data is a time-consuming process, and poor provider data can cause HIPAA violations, missed payments, or delays in care.
A machine learning algorithm that detects anomalies in provider data and identifies the most reliable value for each field saved the provider data governance team hundreds of hours of research time spent fixing erroneous records.
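The consensus step can be sketched like this; the real algorithm weighted sources with a learned model, but the shape is the same (field names and records are made up):

```python
# Hypothetical sketch: for each provider (keyed by NPI), flag fields whose
# values disagree across source records and pick the most frequent value
# as the most reliable one.
import pandas as pd

records = pd.DataFrame({
    "npi":     ["111", "111", "111", "222", "222"],
    "phone":   ["555-0100", "555-0100", "555-0199", "555-0200", "555-0200"],
    "address": ["1 Main St", "1 Main St", "1 Main St", "9 Oak Ave", "9 Oak Av"],
})

def consensus(group: pd.DataFrame) -> pd.Series:
    out = {}
    for col in group.columns:
        counts = group[col].value_counts()
        out[col] = counts.index[0]                # most reliable value
        out[f"{col}_conflict"] = len(counts) > 1  # anomaly flag for review
    return pd.Series(out)

print(records.groupby("npi")[["phone", "address"]].apply(consensus))
```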
Using this algorithm, I reduced the share of duplicate records from 4% to 1% and of inconsistent records from 8% to 2%. The solution also saved time and money for the provider outreach team.
Bankruptcy Auctions
Bankruptcy trustees auction off properties to pay the creditors. My employer wanted to find out whether there was a market for auction functionality in the software used by the trustees.
I analyzed 10 years of bankruptcy data to predict the amount of money that could be recovered from auctioning off properties.
For this, I wrote all the scripts in Python to transform the bankruptcy data into datasets ready for analysis and statistical modeling.
Since bankruptcy laws vary by region, I built separate statistical models for each region, plus an analytics dashboard for the product managers to evaluate which states had the most opportunity and what the product for trustees should look like.
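A sketch of the per-region modeling, with made-up columns and figures:

```python
# Hypothetical sketch: fit a separate recovery-amount regression per
# region, since bankruptcy rules differ across them.
import pandas as pd
from sklearn.linear_model import LinearRegression

df = pd.DataFrame({
    "region":          ["NE", "NE", "NE", "SW", "SW", "SW"],
    "appraised_value": [250_000, 400_000, 150_000, 300_000, 220_000, 500_000],
    "liens":           [50_000, 120_000, 20_000, 90_000, 40_000, 200_000],
    "recovered":       [180_000, 260_000, 120_000, 190_000, 160_000, 280_000],
})

models = {}
for region, grp in df.groupby("region"):
    X = grp[["appraised_value", "liens"]]
    models[region] = LinearRegression().fit(X, grp["recovered"])

# Product managers compare predicted recoveries across regions.
case = pd.DataFrame([[350_000, 60_000]], columns=["appraised_value", "liens"])
print(models["NE"].predict(case))
```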
Ship Maintenance
Estimating the time and cost of US Navy ship maintenance projects has been a research topic for decades. An entire DoD department manages these projects, and many DoD contractors try to solve this problem.
Using deep learning models on historical ship maintenance tickets and project data, I built a simple, cost-effective model that integrated well with the existing DoD applications that manage ship maintenance. Our model was more accurate than competing models, and although it was not highly accurate in absolute terms, the proposal resulted in my employer winning two related projects to build dashboards for ship maintenance engineers.
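A toy version of the estimation idea, with a TF-IDF + linear pipeline standing in for the deep learning models and made-up tickets:

```python
# Hypothetical sketch: estimate maintenance duration (days) from the text
# of historical maintenance tickets.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline

tickets = [
    "replace corroded hull plating port side",
    "overhaul main reduction gear",
    "recalibrate fire control radar",
    "repaint superstructure and replace deck fittings",
]
duration_days = [45, 120, 10, 30]

model = make_pipeline(TfidfVectorizer(), Ridge()).fit(tickets, duration_days)
print(model.predict(["replace hull plating and repaint superstructure"]))
```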