How to correct Marketing Mix Models' counterintuitive signs
Both Bayesian and Frequentist approaches have their flaws, but there is a better approach.
Every model has some bias in it; the real question is how much bias is too much.
In MMM, the less bias the better, because MMM is all about accurate causal attribution. Bias means we are straying from the ground truth.
What is the counterintuitive sign problem?
In MMM, there is the problem of counterintuitive signs.
For example, media spend variables should have a positive coefficient, as media spend is believed to impact the KPI positively.
Similarly, the coefficient sign for price variables should be negative, as an increase in price is believed to decrease the KPI.
To overcome these issues, a certain amount of bias has to be added to the model so that it is nudged in the right direction.
Fixing Signs: Bayesian vs Frequentist Approaches
The Bayesian approach to solving the 'wrong' signs problem is to restrict the prior probability distribution itself. Predominantly, a Half-Normal prior is used to indicate that a coefficient should have no negative support.
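As an illustration, a half-normal distribution places all of its density on [0, ∞), so a coefficient drawn from it is non-negative by construction. A minimal SciPy sketch (the `scale` value here is an arbitrary illustrative choice, not a recommended prior setting):

```python
import numpy as np
from scipy import stats

# Half-normal prior: all probability mass lies on [0, inf),
# so a coefficient drawn from it can never be negative.
prior = stats.halfnorm(scale=1.0)

print(prior.pdf(-0.5))       # 0.0 -> negative values have zero density
print(prior.pdf(0.5) > 0)    # True -> positive values are supported

samples = prior.rvs(size=10_000, random_state=0)
print((samples >= 0).all())  # True -> every sampled coefficient is non-negative
```

This is exactly the restriction the article objects to: the sign is guaranteed, but so is a particular distributional shape that the data may not support.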
The Frequentist approach to solving the 'wrong' signs problem is penalized regression: Ridge, LASSO, or GLMNET.
Which method is a lesser evil?
Neither the Bayesian nor the Frequentist approach to fixing signs is ideal, but one is a lesser evil than the other.
The Frequentist approach of fixing signs through penalized regression is the lesser evil. The reason is that in the Bayesian approach you are not only fixing the signs but also imposing a probability distribution that may not be congruent with the data generating process itself.
Assuming a normal distribution for everything is itself a flawed approach (see the post linked in the resources). Assuming a half-normal is even more flawed.
As stated, I am not a big fan of either approach to fixing signs. However, the frequentist approach is 'ok' if it is conditioned on a goodness-of-fit metric.
This is what Robyn seems to have done. While penalized regression causes some problems in model specification, it is effective at correcting signs as long as it is conditioned on a goodness-of-fit metric.
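For reference, one common definition of NRMSE (and, to my understanding, the convention Robyn uses; check Robyn's code for the exact formula) normalizes RMSE by the range of the actuals, making fit comparable across KPIs of different scales. A small sketch:

```python
import numpy as np

def nrmse(y_true, y_pred):
    # RMSE normalized by the range of the actual values.
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
    return rmse / (y_true.max() - y_true.min())

y_true = np.array([10.0, 12.0, 15.0, 20.0])
print(nrmse(y_true, y_true))        # 0.0 -> perfect fit
print(nrmse(y_true, y_true + 1.0))  # 0.1 -> constant error of 1 over a range of 10
```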
How Aryma Labs solves the problem of counterintuitive signs.
We use penalized regression to correct counterintuitive signs only as a last resort. Counterintuitive signs are a symptom of multicollinearity and endogeneity, so we first try to solve those problems and check whether the coefficient signs become correct.
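One standard diagnostic for the multicollinearity step is the Variance Inflation Factor (VIF). A self-contained NumPy sketch on synthetic data (the rule of thumb that a VIF above roughly 5-10 signals trouble is a common convention, not something specific to the source):

```python
import numpy as np

def vif(X):
    # Variance Inflation Factor per column: 1 / (1 - R^2_j),
    # where R^2_j comes from regressing column j on the remaining columns.
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    out = []
    for j in range(p):
        y = X[:, j]
        Z = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        r2 = 1.0 - resid.var() / y.var()
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

rng = np.random.default_rng(0)
x1 = rng.normal(size=500)
x2 = x1 + 0.1 * rng.normal(size=500)  # near-collinear with x1
x3 = rng.normal(size=500)             # independent of the others
X = np.column_stack([x1, x2, x3])
print(vif(X))  # first two VIFs are large, third stays close to 1
```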
If, despite all our efforts, the wrong-signs issue persists, we take an approach similar to Robyn's. But instead of conditioning the sign-fixing process on NRMSE and Decomp RSSD alone, we additionally condition it on KL Divergence and Chebyshev's inequality.
The latter two metrics act as guard rails, ensuring that our models have less bias and stay in alignment with the ground truth.
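A minimal sketch of what such guard rails could look like: KL divergence between two discrete distributions (zero only when they match), and an empirical check of the Chebyshev bound that at most 1/k² of values lie beyond k standard deviations. The function names and the specific checks are illustrative; Aryma Labs' exact procedure is described in the paper listed below:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    # D_KL(P || Q) = sum p * log(p / q); zero iff the distributions match.
    p = np.asarray(p, dtype=float) / np.sum(p)
    q = np.asarray(q, dtype=float) / np.sum(q)
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def beyond_k_sigma_fraction(x, k=2.0):
    # Fraction of values more than k standard deviations from the mean;
    # Chebyshev guarantees this is at most 1 / k^2.
    x = np.asarray(x, dtype=float)
    z = np.abs(x - x.mean()) / x.std()
    return float(np.mean(z > k))

actual = np.array([0.2, 0.3, 0.5])
predicted = np.array([0.25, 0.3, 0.45])
print(kl_divergence(actual, actual))     # 0.0 for identical distributions
print(kl_divergence(actual, predicted))  # small positive divergence

resid = np.random.default_rng(0).normal(size=1000)
frac = beyond_k_sigma_fraction(resid, k=2.0)
print(frac <= 1 / 2.0**2)                # True: within the Chebyshev bound
```

The intuition: a low KL divergence says the model's decomposition stays close to the reference distribution, and the Chebyshev check flags residual behavior with too much mass in the tails.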
Resources:
Research Paper: Investigating MMMs business error through KL Divergence and Chebyshev's Inequality
Why people go wrong with Normal Distribution.
Normal distribution misunderstanding
Thanks for reading.
For consulting and help with MMM implementation, Click here
Stay tuned for more articles on MMM.