This Saturday, the Wallabies take on the Lions in the decider for this year's test series between Australia's and Britain's national rugby teams. As well as excessive amounts of mud and testosterone, the games also offer up some interesting lessons in a field rather removed from rugger biffo: analytics and business intelligence.
Picture: Scott Barbour/Getty Images
Ahead of the series, Accenture was commissioned by the Australian Rugby Union to help develop a new series of analytic measures around team performance in each game. Sports viewers are used to the idea of in-depth statistics during a match, but the ARU wanted to create new metrics and encourage sharing them via social media. The tools Accenture produced included substitution analysis (judging the potential effectiveness of subbing in particular players) and comparisons of kick accuracy.
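To make the idea concrete, here is a minimal sketch of a kick-accuracy comparison of the kind described above. The record format, field names and players are hypothetical assumptions for illustration; the article does not detail Accenture's actual metrics or data model.

```python
# Hypothetical sketch: per-player kick accuracy from a list of kick records.
# The field names ("player", "successful") are illustrative assumptions.
from collections import defaultdict

def kick_accuracy(kicks):
    """Return the successful-kick rate per player."""
    attempts = defaultdict(int)
    successes = defaultdict(int)
    for kick in kicks:
        attempts[kick["player"]] += 1
        if kick["successful"]:
            successes[kick["player"]] += 1
    return {p: successes[p] / attempts[p] for p in attempts}

sample = [
    {"player": "Player A", "successful": True},
    {"player": "Player A", "successful": False},
    {"player": "Player B", "successful": True},
]
print(kick_accuracy(sample))  # {'Player A': 0.5, 'Player B': 1.0}
```

A real version would break accuracy down further (by field position, kick type and so on) before feeding it to a visualisation or a social media share card.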
While the subject matter might be unique, the approach was the same as for any business analytics task: identifying the potential data sources and the best way they could be presented. "The process we've gone through for the ARU was anchored in an objective of trying to understand the engagement of fans," Accenture analytics lead Michael Pain told Lifehacker. "We were able to leverage some work that had been undertaken in the UK about what data sources could be found, and how we could make that more digestible."

Inevitably, some fans have objected to particular player selections, but Pain said that was expected. "In some sense, that's exactly what we're trying to highlight. Maybe the fan 'gut feel' wouldn't have been the right decision. We have an audit trail and an algorithm and a rational basis for publishing the conclusions that we do."
A key part of the strategy was designing the right visualisation — something Pain says is equally important for general business analytics. "One of the reasons you use visualisation is when you're trying to identify exceptions or patterns across a large amount of data. Even in sales data, the hotspots, the patterns, the exceptions and opportunities can be seen a lot more effectively with visualisation."
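The "exceptions" Pain refers to can be sketched with a simple outlier check — a hypothetical example with made-up sales figures, flagging values more than two standard deviations from the mean. Real tooling would plot these on a chart rather than print them.

```python
# Illustrative sketch: flag unusual values in a series of sales figures.
# The data and the two-standard-deviation threshold are assumptions.
from statistics import mean, stdev

def find_exceptions(values, threshold=2.0):
    """Return values more than `threshold` standard deviations from the mean."""
    m, s = mean(values), stdev(values)
    return [v for v in values if abs(v - m) > threshold * s]

weekly_sales = [100, 102, 98, 101, 250, 99, 97, 103]
print(find_exceptions(weekly_sales))  # [250]
```

A visualisation makes the same point instantly — the spike is obvious on a chart — which is Pain's argument for using it on large data sets.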
What lessons did Accenture learn that could apply to other analytics processes? "One of the critical things is always getting the data quality high," Pain said. "As in the enterprise environment, when you're receiving data feeds from a large number of systems, unfortunately they can often develop faults and you do have to intervene."
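The kind of intervention Pain describes usually starts with validating incoming feed records before they reach a dashboard. The sketch below is a hypothetical example — the field names and rules are assumptions, not Accenture's actual pipeline.

```python
# Hypothetical data-quality check on incoming feed records.
# Required fields are illustrative assumptions.
def validate_record(record, required=("match_id", "player", "event")):
    """Return a list of problems found in one feed record (empty = clean)."""
    problems = []
    for field in required:
        if field not in record or record[field] in (None, ""):
            problems.append(f"missing field: {field}")
    return problems

good = {"match_id": 1, "player": "Player A", "event": "kick"}
bad = {"match_id": 1, "player": ""}
print(validate_record(good))  # []
print(validate_record(bad))   # ['missing field: player', 'missing field: event']
```

Flagging faulty records as they arrive, rather than after they corrupt a published metric, is what makes manual intervention feasible during a live match.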
Testing is also essential: "We had several dry runs to make sure it was working correctly."
"The other one was making the visualisation graphics digestible. We could have added a lot more complexity to those if we had chosen to, and we wrestled with making them meaningful but not too complicated."