Predictive Analytics

September 26, 2010

In the white paper “Seven Reasons You Need Predictive Analytics Today”, Eric Siegel (President of Prediction Impact, Inc. and Chair of Predictive Analytics World) states:

Predictive analytics has come of age as a core enterprise practice necessary to sustain competitive advantage. This technology enacts a wholly new phase of enterprise evolution by applying organizational learning, which empowers the business to grow by deploying a unique form of data-driven risk management across multiple fronts. This white paper reveals seven strategic objectives that can be attained to their full potential only by employing predictive analytics, namely Compete, Grow, Enforce, Improve, Satisfy, Learn, and Act.

1. Compete – Secure the Most Powerful and Unique Competitive Stronghold

A predictive model distinguishes the microsegments of customers who choose your company from those who defer or defect to a competitor. In this way, your organization identifies exactly where your competitor falls short, its weaknesses.

2. Grow – Increase Sales and Retain Customers Competitively

Each customer is predictively scored for sales-related behavior such as purchases, responses, churn and clicks. The scores then drive enterprise operations across marketing, sales, customer care and website behavior. In this way, predictive analytics delivers its unique competitive advantage to a range of customer-facing activity.
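To make the scoring idea concrete, here is a minimal sketch in Python. The feature names and weights are hypothetical and hand-set purely for illustration; an actual predictive model would learn such weights from historical customer data.

```python
import math

# Hypothetical, hand-set weights for illustration only; a real model
# would learn these from historical customer behavior.
WEIGHTS = {"months_inactive": 0.8, "support_tickets": 0.5, "tenure_years": -0.6}
BIAS = -1.0

def churn_score(customer):
    """Logistic score in [0, 1]: higher means more likely to churn."""
    z = BIAS + sum(WEIGHTS[f] * customer[f] for f in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

customers = [
    {"id": "A", "months_inactive": 4, "support_tickets": 3, "tenure_years": 1},
    {"id": "B", "months_inactive": 0, "support_tickets": 0, "tenure_years": 5},
]

for c in customers:
    score = churn_score(c)
    # The score drives the action taken with that customer.
    action = "retention offer" if score > 0.5 else "standard marketing"
    print(c["id"], round(score, 2), action)
```

The key point is the last step: the score is not a report to be read but a number that routes each customer to a concrete action.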

3. Enforce – Maintain Business Integrity by Managing Fraud

Scoring and ranking transactions with a predictive model leverages the organization’s recorded experience with fraud to dramatically boost fraud detection. […] more fraud is detected, and more losses are prevented or recouped.
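The “scoring and ranking” step can be sketched as follows. The scores and labels here are made up for illustration; in practice the scores would come from a model trained on the organization’s recorded fraud cases.

```python
# Hypothetical fraud scores (as if produced by a model trained on labeled
# fraud history), paired with the true label found on later investigation.
transactions = [
    ("t1", 0.92, True), ("t2", 0.10, False), ("t3", 0.85, True),
    ("t4", 0.05, False), ("t5", 0.40, False), ("t6", 0.77, True),
]

# Rank by score and investigate only the top 3 — the scarce resource
# (investigator time) is spent where fraud is most likely.
ranked = sorted(transactions, key=lambda t: t[1], reverse=True)
reviewed = ranked[:3]
caught = sum(1 for _, _, is_fraud in reviewed if is_fraud)
print(f"fraud caught in top 3 reviews: {caught} of 3")  # all 3 frauds
```

Reviewing the same number of transactions at random would catch far less fraud, which is exactly the boost in detection the white paper describes.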

4. Improve – Advance Your Core Business Capacity Competitively

Predictive analytics improves product manufacturing, testing and repair in many ways. For example, during production, faulty items are detected on the assembly line.

5. Satisfy – Meet Today’s Escalating Consumer Expectations

Predictive analytics is an explicit selling point to the end consumer, […] with predictive analytics, the consumer gets better stuff for less, more easily and more reliably.

6. Learn – Employ Today’s Most Advanced Analytics

The capacity for predictive analytics to learn from experience is what renders this technology predictive, distinguishing it from other business intelligence and analytics techniques.

7. Act – Render Business Intelligence and Analytics Truly Actionable

[…] predictive analytics is specifically designed to generate conclusive action imperatives. Each customer’s predictive score drives action to be taken with that customer. In this way, predictive analytics is by design the most actionable form of business intelligence.

To read the full white paper click here.


Data Mine Games

September 20, 2010

In this tutorial Christian Thurau (a post-doctoral researcher at Fraunhofer IAIS, St. Augustin in Germany) talks about the theory of data mining techniques such as matrix factorization, soft and hard clustering, and principal component analysis, and explains their application to games. The video covers World of Warcraft as well as some shooter games.

Microsoft SQL Server

September 19, 2010

In this video Scott Golightly shows how to use the Microsoft SQL Server data mining wizard. The video is about 20 minutes long and easy to follow.
In the “How Do I?” library you can find many more video tutorials on different topics, e.g. “How Do I: Optimize SQL Server Integration Services?” or “How Do I: Render reports to a wide range of formats?”


September 18, 2010

I have not been writing for some time now, but with a good excuse: I have been on holiday in Istanbul, Turkey 🙂

Worst practices in business forecasting

September 3, 2010

Recently I discovered a truly great online magazine called Analytics, which covers the application of mathematics, operations research, and statistics to business decisions (to register for free click here). This post is an excursion into forecasting, summarizing the article “Worst Practices in Business Forecasting” by Michael Gilliland and Udo Sglavo, published in the July/August issue of Analytics.

Here is a summary of the eight worst practices described by the authors:

1. Overly complex and politicized forecasting process

Identifying the steps and participants that do not make the forecast measurably better is crucial to an efficient process. The authors recommend a method called Forecast Value Added (FVA) analysis, which identifies this process waste.
Another common shortcoming is that contributors tend to bias the forecast in favor of their own special interests. Such biases enter through human touch points, even though the results should come from an objective, scientific process.
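The core of FVA analysis is simple arithmetic: each step of the process is compared against the step before it (and ultimately against a naive forecast) to see whether it reduced error. A minimal sketch, with made-up numbers purely for illustration:

```python
def mape(actual, forecast):
    """Mean absolute percentage error, in percent."""
    return sum(abs(a - f) / a for a, f in zip(actual, forecast)) / len(actual) * 100

# Hypothetical demand history and three forecasting stages.
actual        = [100, 110, 105, 120]
naive         = [ 95, 100, 110, 105]   # last period's actual carried forward
stat_forecast = [102, 108, 104, 118]   # output of the statistical model
final         = [ 90, 125, 100, 130]   # after judgmental overrides

# FVA = error of the previous stage minus error of this stage.
fva_model    = mape(actual, naive) - mape(actual, stat_forecast)
fva_override = mape(actual, stat_forecast) - mape(actual, final)
print(f"FVA of model vs naive:     {fva_model:+.1f} points")
print(f"FVA of overrides vs model: {fva_override:+.1f} points")
```

In this fabricated example the statistical model adds value over the naive baseline (positive FVA), while the judgmental overrides destroy it (negative FVA) — exactly the kind of process waste and human-touch-point bias the authors warn about.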

2. Selecting model solely on “fit to history”

The aim of forecasting is to find a model appropriate for predicting future behavior, not to fit a model to history. A model overfitted to randomness in past behavior is seldom a suitable forecasting model.

3. Assuming model fit = forecast accuracy

In general, a model's fit to history does not indicate how accurate its future forecasts will be. Indeed, forecast accuracy is commonly worse than the fit to history would suggest.
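Points 2 and 3 can be demonstrated in a few lines. The numbers below are fabricated: demand is a stable level plus noise, an "overfit" model memorizes the history exactly, and a simple model just forecasts the historical mean.

```python
def mae(actual, forecast):
    """Mean absolute error."""
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)

# Hypothetical demand: a stable level (~11 units) plus random noise.
history = [12, 9, 14, 8, 13, 10]
future  = [11, 12, 10, 11, 12, 10]

# "Overfit" model: memorizes the history, wiggles and all, and replays it.
overfit_forecast = history
# Simple model: forecasts the historical mean.
level = sum(history) / len(history)
simple_forecast = [level] * len(future)

print("in-sample fit (overfit):", mae(history, overfit_forecast))  # perfect: 0.0
print("out-of-sample (overfit):", mae(future, overfit_forecast))
print("in-sample fit (simple): ", mae(history, simple_forecast))
print("out-of-sample (simple): ", mae(future, simple_forecast))
```

The overfit model has a perfect fit to history yet forecasts the future worse than the simple model, because it has modeled the noise rather than the signal.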

4. Inappropriate accuracy expectations

No matter how hard we try to build an appropriate model, our forecast will always be limited by the nature of the behavior we try to predict. If the behavior is unstructured or unstable, we are often better off using a naive forecasting model (e.g., simply carrying the last observed value forward) as a baseline.

5. Inappropriate performance objectives

Consider forecasting the outcome of a fair coin toss over a large number of trials: any method will be correct only about 50 percent of the time, so an objective of 60 percent accuracy is simply not achievable.

6. Perils of industry benchmarks

Industry benchmarks for forecasting performance should be treated with skepticism, for the following reasons:

1) Can you trust the benchmark data?
2) Is measurement consistent across the respondents (e.g., time frame, metric)?
3) Is the comparison to the benchmark even relevant?
4) How forecastable are the benchmark companies’ demand patterns?

Instead of relying on a benchmark, employ the naive model as your baseline and steadily try to improve the process against it.

7. Adding variation to demand

Prediction accuracy depends heavily on demand volatility, so the objective should be to reduce that volatility. Unfortunately, in reality most companies add more volatility to their products' demand, which makes efficient forecasting even more difficult.
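The link between volatility and forecastability is easy to see with two simulated demand series. Both have the same average level; the only difference is the amount of noise — and the noisier series is mechanically harder to forecast, even with the same method:

```python
import random

random.seed(7)  # fixed seed for reproducibility

def naive_error(series):
    """MAE of the naive forecast (each value predicted by its predecessor)."""
    errs = [abs(series[i] - series[i - 1]) for i in range(1, len(series))]
    return sum(errs) / len(errs)

# Two hypothetical demand series with the same average level (100)
# but different volatility.
stable   = [random.gauss(100, 5)  for _ in range(1000)]
volatile = [random.gauss(100, 25) for _ in range(1000)]

print(f"naive-forecast MAE, stable demand:   {naive_error(stable):.1f}")
print(f"naive-forecast MAE, volatile demand: {naive_error(volatile):.1f}")
```

Anything that injects volatility into demand directly inflates the floor on forecast error, which is why reducing volatility often pays off more than refining the model.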

8. New product forecasting

For new products there is no historical data to rely on for prediction. Hence, the forecast might be based on the judgment of the product manager, who is biased toward the product's success. Another approach is forecasting by analogy (using similar products), but the analyst must be careful not to use data from successful products only. Whichever method is applied, the most crucial point is to be aware of the uncertainty of the outcome. A more reliable alternative is a structured-analogy approach, which helps assess the range of historical outcomes.