
Apr 22, 2022

Have you ever participated in a fancy analytical project that promised a complete business revolution but ended in disappointment? You are not alone. According to a recent survey by the MIT Sloan Management Review, which contacted more than 3,000 managers, the share of companies that have a strategy for artificial intelligence and are piloting or have deployed analytical solutions grew from 2017 to 2020. Nevertheless, only 11% of organizations reported significant financial results in 2020.

From our experience at LTPlabs, when things do not work out, one of two outcomes is typically observed:

  • An effort has been made to devise, implement, and evaluate the potential impact of a new analytical model. However, after the project ends, business teams do not leverage the model or sustain it in their daily processes. There is a gap between the company’s capacity to produce analytical results and its ability to apply them to business issues;
  • Even though a new analytical model has been successfully devised, implemented, and is currently being used, the results are not as good as initially expected.

Several causes may be behind such failures. Regarding the first outcome, the typical causes of a poor implementation are a lack of interaction and alignment between the data science and business teams during the design and implementation of the analytical model, or a weak connection between the new model and a concrete business problem (in other words, the company found a purpose for the available data instead of leveraging data for a purpose).

Regarding the second outcome, things can be more complicated. It can be the case that the model is not returning the expected results because the data or assumptions changed since the project was executed. More frequently, though, implementations fail because the analytical model is not actually being used to augment decision-making. Business users, accustomed to their old ways of working, manually adjust or override model results, diminishing the impact of the analytically driven recommendations.

Strategies to boost analytical models’ impact

Several strategies can be employed to maximize the chance that the analytical model returns the expected results.

We highlight six prerequisites for success: three to secure right at the start of the project, and three to guarantee sustainable change.

Pay attention to these points while setting up the project

1. Connect the analytical model/project to a concrete use case

Managers often start new projects or make new investments simply because the topic is trendy or the competition is one step ahead. These motivations are not irrelevant, but to ensure that a new analytical model succeeds, it is vital to connect it to a concrete business problem. Anchor the project to a decision to be made or a real need of your business team.


2. Set the proper team to tackle the challenge

Define the required roles to ensure both the efficiency and effectiveness of the project (and, consequently, the success of the new analytical model). Four roles deserve consideration:

  • Data scientists: the ones that have the ownership and required knowledge to develop the analytical model;
  • Business analysts: the ones that establish the bridge between the business needs and the analytical model. They are essential to help define the requirements;
  • Business users: the ones that will use the new decision support system. Even though the business analysts might be capable of identifying the relevant business needs, including the end users is valuable to capture the detailed idiosyncrasies of the process and to foster participation;
  • IT members: the ones that will get the required data to set up the project, and that can later be responsible for developing the application that will host the analytical model.


3. Do not set the bar too high or too low

Define, from the moment the project is set up, what should be expected from the MVP (Minimum Viable Product) version. Do not wait for perfection to launch the new analytical model: some details can only be identified once users start acting on its recommendations, so it is preferable to refine the model afterwards. On the other hand, ensure the results make sense before the model’s recommendations are tested and used, or you risk creating resistance to adoption.

Set the conditions for evolution

1. Define the proper interaction mode

Once the model starts to be used, guarantee that the interaction mode between the human and the decision-support system is clear and the most adequate for the problem at hand. In some cases, the model can be largely autonomous, with the human supervising only deviant cases (e.g., an automatic pricing module that revises the prices of thousands of products/offers daily). On other occasions, the human intervenes more, with the analytical model making recommendations that are reviewed before the final decision (e.g., a prescriptive model suggesting the optimal workforce sizing for a given store or warehouse).

Ensuring this interaction is clear is important for the business teams to know what to expect from the analytical model, and also to set the conditions for its evolution.

During the analysis of the automatic results/recommendations, business users can identify improvement opportunities for the model’s refinement.
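One simple way to make such an interaction mode explicit is to route each recommendation either to automatic application or to human review. The sketch below, loosely based on the pricing example above, is purely illustrative: the `Recommendation` fields, the confidence score, and both thresholds are hypothetical and would need to be adapted to your own model and process.

```python
from dataclasses import dataclass

# Hypothetical thresholds -- tune these to the business problem at hand.
AUTO_APPLY_CONFIDENCE = 0.9   # below this, a human should review the suggestion
MAX_PRICE_CHANGE = 0.15       # price moves larger than 15% count as deviant cases

@dataclass
class Recommendation:
    product_id: str
    current_price: float
    suggested_price: float
    confidence: float  # model's self-reported confidence in [0, 1]

def route(rec: Recommendation) -> str:
    """Decide whether a recommendation is applied automatically
    or queued for review by a business user."""
    relative_change = abs(rec.suggested_price - rec.current_price) / rec.current_price
    if rec.confidence >= AUTO_APPLY_CONFIDENCE and relative_change <= MAX_PRICE_CHANGE:
        return "auto-apply"
    return "human-review"

recs = [
    Recommendation("A", 10.0, 10.5, 0.95),  # small change, high confidence
    Recommendation("B", 10.0, 14.0, 0.97),  # large change -> deviant case
    Recommendation("C", 10.0, 10.2, 0.60),  # low confidence -> review
]
for rec in recs:
    print(rec.product_id, route(rec))
```

Making the routing rule explicit like this tells the business team exactly which decisions the model owns and which ones they own, which is the clarity this section argues for.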


2. Create a culture of continuous improvement

Set the process and governance to continuously improve the analytical model. Define a communication platform to collect new improvement opportunities. Set a moment in the agenda to analyze and prioritize the backlog. Create the conditions for the implementation of new features with agility. Communicate and celebrate new releases with the entire team.


3. Measure both the results and the analytical model utilization

Define a set of concrete KPIs to evaluate the model’s success. Ideally, these should be the same indicators that were set at the beginning of the project and used to evaluate the potential behind the analytical model. Define a moment in the agenda to review these indicators and debrief on additional opportunities. Moreover, in the early days of adoption, measure the utilization of the new decision support system.
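Utilization can be tracked with very simple KPIs, such as the share of recommendations that business users accept, override, or ignore. The sketch below is a minimal illustration; the decision log structure and action labels are hypothetical.

```python
from collections import Counter

# Hypothetical decision log: each entry records what the business user
# did with one of the model's recommendations.
decision_log = [
    {"recommendation_id": 1, "action": "accepted"},
    {"recommendation_id": 2, "action": "accepted"},
    {"recommendation_id": 3, "action": "overridden"},
    {"recommendation_id": 4, "action": "ignored"},
]

def utilization_kpis(log):
    """Compute the share of recommendations accepted as-is,
    manually overridden, or ignored by the business users."""
    counts = Counter(entry["action"] for entry in log)
    total = len(log)
    return {action: counts[action] / total
            for action in ("accepted", "overridden", "ignored")}

print(utilization_kpis(decision_log))
```

A high share of overridden or ignored recommendations in the early days is exactly the adoption warning sign this section describes, and a natural topic for the review moments mentioned above.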

All in all, to maximize the chance of improving your business results with a new analytical model, approach it as a journey.

Define the conditions for the first viable version and make an effort to improve it over time. Look at the analytical model as a lever to help solve concrete business challenges. And define how your team should interact with the model while keeping them motivated to incorporate it into their processes.

By: Daniel Pereira
