Archive for the ‘News’ Category

Man Utd’s Chevrolet deal pushes Premier League shirt values to £191m

Monday, July 28th, 2014

By Alex Miller and Nick Harris                                                            

28 July 2014

Manchester United’s £47 million-per-year shirt sponsorship deal with Chevrolet has helped boost the combined shirt sponsorship income of the Premier League’s 20 clubs to a record £191.35m for the 2014-15 season.

United’s world-record deal was agreed and signed in 2012 on terms of $559m over seven years, starting with the coming season. The £47m-a-year figure is calculated at exchange rates at the time of the deal. Chevy replace United’s previous deal with Aon, worth ‘only’ £20m a year.

The Old Trafford club separately have an ongoing £17m-a-year sponsorship deal with Aon that includes their training kit and sponsorship of the Carrington training ground. That is not included in the 20 clubs’ total in the first graphic below.

Three other clubs in the top flight in both seasons have more valuable shirt sponsorship deals in 2014-15 than last season: Swansea’s deal with GWFX has risen by £2m to £4m, Everton’s with Chang has gone up £1.3m to £5.3m, and Hull are making a bit more money from 12Bet than they did from Cash Converters.

The 20 Premier League clubs combined have added £23.6m to last year’s combined total of £165.75m, with only Tottenham and West Brom seeing a dip in their deals.

It is worth noting that the ‘big six’ clubs between them account for 79 per cent of the value of the 20 deals – £151m of £191m. And United’s deal by itself is worth more than the 14 smallest deals combined. But even most of the smaller clubs now have multi-million-pound-per-year deals, evidence of the value of having global reach via the PL shop window.

[Graphic: PL shirt sponsors 2014-15]

Spurs’ sponsorship arrangement last season saw AIA sponsor the club’s shirts in domestic games and Aurasma in the Europa League, for a total of £19m. That deal has been replaced with a simpler £16m-a-year deal with AIA.

Zoopla decided not to extend their sponsorship deal with West Brom following former striker Nicolas Anelka’s controversial quenelle goal celebration last season. The club has signed a new deal with Intuit QuickBooks worth around £300,000 a year less.

The overall figure was also reduced because relegated clubs Fulham, Norwich and Cardiff earned a combined total of £6.5m from their shirt sponsorships last season, while promoted sides QPR, Burnley and Leicester will pull in a lower combined total of £4.5m, a drop of £2m.

The global reach of the Premier League is reconfirmed with an increasing number of foreign-based companies continuing to adorn club shirts. This season 14 shirts feature overseas-based companies, compared to 10 last season. Companies from the United Arab Emirates, South Korea, the US, the Philippines, South Africa, Thailand and China are represented.

Despite a football-wide ban on betting, gambling companies continue to feature prominently in the Premier League. This season four shirts feature gambling companies: those at Aston Villa (Dafabet.com), Stoke (Bet365), Hull (12Bet) and Burnley (Fun88) – compared to three last season.

The total value of the Premier League’s shirt deals has almost doubled in five seasons. In 2010-11 (details here) they were worth £100.45m; this coming season represents a 90.5% rise since then.
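For the record, the 90.5% figure can be verified directly from the two totals quoted in this article; a quick sketch:

```python
# Combined PL shirt deal values, £m, from the article
old, new = 100.45, 191.35

# Percentage rise from 2010-11 to 2014-15
rise_pct = (new - old) / old * 100
print(round(rise_pct, 1))  # 90.5, matching the article's figure
```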

[Graphic: PL shirt deals five years]

The season-by-season deals for other years can be seen here: 2011-12 sponsorships; 2012-13 sponsorships; 2013-14 sponsorships.


Follow Alex on Twitter @AlexMiller73


More on Chelsea / Man Utd / Arsenal (or search for anything else in box at top right)

More from Alex Miller

Follow SPORTINGINTELLIGENCE on Twitter

 


Unlucky: 13 Premier League clubs hike ticket prices. City cheapest, Arsenal most costly.

Tuesday, July 22nd, 2014

*

By Alex Miller                                                                   Follow Alex on Twitter @AlexMiller73

22 July 2014

Despite Premier League clubs collecting massively enhanced payments from the latest TV deals, worth £5.5 billion over the three years from 2013 to 2016, the benefits are not reaching fans: 13 sides have raised season ticket prices this season.

With clubs looking to maximise revenues to compete in the Premier League within the constraints of Financial Fair Play (FFP), fans are once again paying increased prices.

The biggest price hikes have been made by Burnley and QPR, both promoted for the coming season; Burnley have raised the price of their cheapest season ticket by a whopping 47%. Substantial rises have also been announced at Stoke and Hull.

The average ‘entry level’ (lowest price) adult season ticket in the Premier League is now £526, up 6.5 per cent; while the average top-price season ticket is now £870, up 7 per cent.

The table below contains pricing for standard, adult, non-concessionary season tickets, as provided by clubs. A whole range of other prices are or were available in various early-bird and concessionary packages. For full details of each club’s pricing policies (which fill at least a page per club), please visit their websites.

[Graphic: PL tix 14-15 prices]

Burnley chief executive Lee Hoos defended the increases, pointing out that fans will be able to watch Premier League football this season and that £100 from this year’s season ticket price will go towards 2015-16 season tickets when fans renew.

Hoos said: “We have already announced that the early bird renewal prices will again be frozen this year for the 2015/2016 season when they go on sale.

“If a full price season ticket is purchased before the start of the season, not only will the purchaser qualify for the early bird renewal price, they will also be credited with £100 towards the purchase of a 2015-2016 season ticket.”

With the Premier League acknowledged as the most lucrative league in the world, only two clubs have lowered prices this season. Football for some fans in the North East has come down after Newcastle and Sunderland announced marginal price cuts.

Arsenal announced three per cent price rises in line with inflation, taking their most expensive season ticket prices above the £2,000 mark – the costliest in the league. Arsenal also have the most expensive low-cost season ticket, priced at £1,014, which, with the exception of Tottenham and Chelsea, is more costly than every other club’s most expensive season ticket.

An Arsenal spokesman explained that the priciest match-day tickets, at £127, were limited to 100 seats. He added: “For next season, the club will once again be operating a categorised ticket policy, with matches at Emirates Stadium being graded A, B or C.

“This initiative, introduced in 2013, has enabled the club to offer 90,000 cheaper match tickets from £25 as part of its drive to provide more tickets at affordable prices.”

Despite winning the league last term, Manchester City have frozen the cost of their cheapest season tickets at £299 – the cheapest in the Premier League – although the club have announced a 10% rise on the most expensive season tickets at the Etihad to £860.

A Football Supporters’ Federation (FSF) spokesman told Sportingintelligence: “Nine out of 10 fans already think they are paying too much for tickets and these figures only back that point of view.

“Clubs are swimming in cash and the last media deal was worth £5bn. The huge increase would have been enough for clubs to let every fan in for free and they would have been no worse off.

“Top-flight clubs need to think long-term and cut prices. Never mind all the clever PR strategies clubs come out with – nothing would earn goodwill like dropping prices.

“The FSF will lead a march on Premier League and Football League HQ demanding ‘Affordable Football For All’ on Thursday 14th August. We would encourage all fans to join us.”


Man United and Liverpool remain top TV draws despite 2013-14 without trophies

Monday, July 21st, 2014

By Nick Harris

21 July 2014

With almost four weeks still to go before the start of the 2014-15 Premier League season, the PR battle for TV viewers in the host country of the world’s most-watched domestic football league is well underway. Sky Sports will again have the lion’s share of the live games in the UK, having paid £2.28 billion for a majority of the UK live rights for the 2013-16 period. And BT Sport will again be trying to woo a portion of the pay-TV market, having paid £738m for their own share of matches.

The wider strategic aims of both companies and an exploration of how these deals are focussed on far more than showing football matches can be read here and here. (It’s all about the long-term triple-play market, now becoming the quad-play market.) Suffice to say, those who still read old-fashioned newspapers in the UK will have noticed the upsurge in full-page adverts for the two broadcasters over recent days.

One thing that remains true regardless of who is showing the games is that certain teams have mass appeal and will guarantee relatively big audiences, whoever they are playing, while the majority of teams will attract far fewer viewers. Sportingintelligence has looked at the viewing figures for every single live Premier League match screened in the UK last season and the conclusions are clear and unambiguous:

1: Manchester United and Liverpool are the big draws. United’s average audience was 1.4m people per match and Liverpool’s was 1.31m. This is in-home TV viewing but clearly indicative of popularity. No other team averaged as many as 1.2m people per game.

2: Sky remains the main player in audience terms, although with a 20-year head start this is to be expected. Sky have somewhere north of 11 million pay-TV customers and more than 7 million of them subscribe to the sports channels. Their average PL audience last season was 1.2 million people per live game. BT Sport’s channels can now be accessed in around 5 million homes, either directly from BT on TV or via apps, or via other means, for example as a bolt-on to Sky or part of a Virgin package. BT’s average PL match audience last season was 562,000 people per match.

3: While these figures will no doubt seem extremely small to many observers, they demonstrate two things; first that pay-TV is not about ratings primarily but about getting viewers to pay for premium exclusive content they cannot get elsewhere; second that having paid for exclusive content, those viewers will want to see the ‘biggest’ and ‘best’ teams most within their subscription packages. That is why, for example, Manchester United were among the most-shown teams (25 of their 38 PL games were live on TV), as were Liverpool (28 of 38), Chelsea (25), Arsenal (25) and Manchester City (25). Less ‘attractive’ teams were shown much less often. Fulham and Cardiff were on TV just eight times each, with Norwich, Hull and West Brom just nine times each.

The first graphic below (click to enlarge) shows the TV audiences, in thousands, for each of the 154 live PL matches shown on UK TV in the 2013-14 season. It is self-explanatory and shows that the big teams were shown most but also got the biggest audiences.

Note that the most-watched matches were all contested between two ‘big’ clubs: the 2.7m who watched Liverpool v Chelsea; the 2.462m who watched United v Arsenal; the 2.1m who watched Chelsea v United; the 2m who watched Liverpool v City and the 2m who watched City v United.

[Graphic: PL TV 13-14 grid]

The second graphic – the far right-hand side of it – ranks the 20 clubs from last season in terms of average audience per match, with United at No1 and Liverpool at No2 and so on. Again, click to enlarge it.

It also provides details of every club’s TV audience for home and away games, and by channel. You can see that BT showed the big clubs the most, but only screened one game all season featuring each of Norwich, Sunderland, Stoke and West Brom.

Using United as an example of how to read the table, they were shown seven times on BT Sport (four times at home, average 740,000 viewers; three times away, average 686,000) and 18 times on Sky (seven times at home, average 1.833m viewers; 11 times away, average 1.567m viewers). Their 25 games were seen by 35,093,000 people at an average of 1.4m per game.

Note that Liverpool were shown away three times on BT and 14 times on Sky, so 17 of their 19 away games were live on TV. That means 17 games of disruption for the travelling Liverpool fan, where disruption is defined as a game not being played at the traditional 3pm kick-off. Yet even more controversial is the way the TV money is shared out, as we’ll deal with in a moment.

[Graphic: PL TV 13-14 full breakdown]

Money is often cited as being the root of modern football’s evils. The Premier League, say many detractors, is obsessed with cash and to hell with the rest of the game. And certainly there is a whole load of money – primarily TV money – now sloshing around the top division of English football, whereas the rest make do on (relative) scraps.

But in fact the way in which the Premier League’s TV money is divided up is much more democratic than in most other leagues. See this article for the precise way in which the Premier League divided the TV cash from 2013-14. The ratio between the top earners, Liverpool (£97.5m), and the bottom earners, Cardiff (£62m), was 1.57 to 1.

The comparable figure in the German Bundesliga is 2 to 1; that means Germany is less fair. In France it is 3.2 to 1, less fair again; in Italy 4.2 to 1; and in Spain, where Real Madrid and Barcelona basically take most of the cash between them, it is a whopping 11.3 to 1.
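The comparison above boils down to one number per league: top earner divided by bottom earner. A quick sketch, where only the Premier League ratio is derived from club figures (Liverpool £97.5m, Cardiff £62m) and the rest are the quoted ratios:

```python
# Top-to-bottom TV income ratios; lower = more equal sharing
ratios = {
    "Premier League": 97.5 / 62,  # ~1.57 to 1, from the 2013-14 payments
    "Bundesliga": 2.0,
    "Ligue 1": 3.2,
    "Serie A": 4.2,
    "La Liga": 11.3,
}

# Rank leagues from most to least equal
for league, r in sorted(ratios.items(), key=lambda kv: kv[1]):
    print(f"{league}: {r:.2f} to 1")
```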

However.

What if the Premier League cash were divided not according to relatively democratic principles, based on equal shares, TV appearances and a sum based on performance, but solely on how many viewers each team attracted? What would the division be like then?

Sportingintelligence has calculated that Liverpool would have got £74m more last season than they actually did, while United would have got £70m extra. Chelsea, Arsenal, Manchester City and Tottenham would also have earned tens of millions more, and everyone else a lot less.

The methodology was as follows: we re-allocated the £1.563 billion pot of cash on the basis of just two metrics: performance and TV viewers. The performance element remained the same as the normal method of distribution, or around £1.2m per place in the table. And the rest was split on total TV viewers. So Liverpool, with 36.74 million TV viewers, got £148.9m at £4.05m per million viewers. And every other club got £4.05m per million viewers to give the totals in the table below.
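The reallocation described above can be sketched as follows. This is a hedged illustration: only the per-place rate (£1.2m), the per-million-viewers rate (£4.05m) and Liverpool’s viewer total (36.74 million) come from the article; the helper names are my own.

```python
PER_PLACE = 1.2  # performance element, £m per league place

def performance_share(position, clubs=20):
    """£1.2m per place: assumed 20th earns £1.2m, 1st earns £24m."""
    return PER_PLACE * (clubs + 1 - position)

def viewer_share(viewers_millions, rate=4.05):
    """Remainder of the pot split at £4.05m per million total TV viewers."""
    return viewers_millions * rate

# Liverpool: 36.74 million total TV viewers last season
print(round(viewer_share(36.74), 1))  # 148.8, close to the article's ~£148.9m
```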

There is an inevitable argument that United and Liverpool got more viewers because they were shown more times. Absolutely. But they were shown more because that’s what people want to watch. The numbers back it up.

The graphic below is indicative of how unequal football could become; the status quo, at least in revenue share, is actually ‘fairer’ than most other football leagues.

[Graphic: PL TV 13-14 cash theoretical on viewers]

More from Nick Harris


Germany: deserving, obvious World Cup winners (almost nobody predicted)

Wednesday, July 16th, 2014

*

So the World Cup is over, Germany are fitting champions, Lionel Messi couldn’t add the ultimate title to his glittering CV and the host nation is left to ponder what might have been. So who could have forecast this? Actually, a huge variety of ‘experts’, theorists, modelers and systems tried to predict the outcome of the tournament, from Goldman Sachs to boffin statistical organisations. In his latest post for Sportingintelligence, and as a wrap-up of an ongoing evaluation of rates of success (click HERE for Part 1 and background, HERE for Part 2 and HERE for Part 3), Roger Pielke Jr announces the winners and losers from the forecasting game.

Follow Roger on Twitter: @RogerPielkeJR and on his blog


By Roger Pielke Jr.

16 July 2014

Another World Cup is now history, and with it my World Cup prediction evaluation exercise. As a reminder, this exercise is based on rankings made before the tournament started with the details of the evaluation explained here.

So to the results. Overall, Andrew Yuan, whose predictions were popularised by The Economist, took first place, beating FIFA’s rankings by a single match. Of course, it is no surprise those two were so close as Yuan and the FIFA rankings had 60 of 63 identical match predictions.

After FIFA there is a three-way tie for the bronze medal, with Bloomberg, Elo rankings and Hassan and Jimenez sharing the third step of the podium. Of note is that the latter was produced four months ago, well before the national team rosters were even announced.

The full table is as follows:

[Graphic: Pielke FINAL WC]

One of the leaders after the group stage of the tournament, Danske Bank, performed the worst in the knockout portion of the World Cup, and slipped from the podium. By contrast, the worst performer during the group stage (Financial Times) was joint first during the knockout matches. With these methods, past performance is apparently not a good predictor of future performance.

None of the other methods outperformed the naive baseline based on Transfermarkt player values that I assembled prior to the tournament. Three methods actually under-performed that naive baseline. Were you to pick one of these methods (other than FIFA or Transfermarkt) at random prior to the tournament, you would have had a 10 per cent chance of beating FIFA and a 50 per cent chance of beating Transfermarkt.

The table above also shows how each method performed in the knockout portion of the tournament, in anticipating advancement from the group stage, and in anticipating the finalists. Interestingly, the overall winner was one of only two methods which failed to anticipate one of the finalists.

No method anticipated both Germany and Argentina in the final, and no method picked Germany to win it all. This website’s editor considered other models to predict the winner before the tournament, and made a personal forecast of an Argentina-Germany final, but he picked the wrong winner.

Here are some more general lessons to take from the prediction exercise:

1: Prediction evaluation is highly sensitive to the methodology employed. For instance, were the evaluation method to award a three-game “bonus” to any method that anticipated a finalist, Andrew Yuan would fall from first place to sixth. The weighting of results can consequently dramatically change the evaluation rankings.

In any prediction evaluation it is therefore important to settle upon an evaluation methodology in advance of the data actually coming in. It is also important to keep separate the roles of predictor and evaluator. It is obviously very easy to “game” an evaluation to look more favorable to a particular prediction method, simply by choosing a convenient evaluation metric. Be cautious with anyone who offers you both a prediction and an evaluation of their prediction, especially after the fact.

2: Beating a simple baseline is very difficult. We might debate how “naive” the FIFA rankings or Transfermarkt valuations actually are in practice. But both clearly outperformed more sophisticated approaches. The only method which actually outperformed FIFA was one which successfully picked two of the three matches on which they differed across the entire tournament. Was that luck or skill? None of the other 10 methods added any value beyond the FIFA rankings. Should they have even bothered?

Even though outperforming a naive baseline over a tournament is difficult, that does not take away from the entertainment value of predictions. For instance, FiveThirtyEight performed poorly according to the evaluation methods here, but nonetheless offered stimulating commentary throughout the tournament, in part based on its predictions.

3: Ultimately, we can never know with certainty how good a predictive methodology actually is in practice. Some systems that we wish to predict have closed boundaries, such as a deck of 52 cards. We can develop probabilistic predictions of poker hands with great certainty. In the real world, we can sometimes (but not often) accumulate enough experience to generate predictions of open systems that also have great certainty, like the daily weather forecast.

But other systems are not subject to repeated predictions and/or are so open as to defeat efforts to bound them. The World Cup, and sporting events in general, typically fall into these categories. Arguably, so too does much of the human experience. Perhaps baseball, with its many repeated events over a short time period, might be considered more like a weather forecast than a World Cup.

Ultimately, making good decisions depends on understanding the difference between skill and luck, even if we can never fully separate the two. A prediction evaluation exercise can help us to quantify aspects of our ignorance and lead to questions about what it is that we really know.

Ultimately, the answers to these questions cannot be resolved empirically.

After this exercise, there is one thing we all know for sure. Germany are world champions, despite being overlooked by the predictions. I hope you enjoyed this exercise over the past month. I’ll be doing similar exercises in the future and welcome your suggestions. Get in touch via Twitter or via my blog, details below.


Roger Pielke Jr. is a professor of environmental studies at the University of Colorado, where he also directs its Center for Science and Technology Policy Research. He studies, teaches and writes about science, innovation, politics and sports. He has written for The New York Times, The Guardian, FiveThirtyEight, and The Wall Street Journal, among many other places. He is thrilled to join Sportingintelligence as a regular contributor. Follow Roger on Twitter: @RogerPielkeJR and on his blog


More on this site mentioning the World Cup


Bankers and bookies oust FIFA as best bets for World Cup forecasts

Tuesday, June 24th, 2014

*

This post was updated on 27 June, UK time; an earlier version of the same story is below.

Yesterday’s games at the World Cup mean the group games (48 of 48) have been completed. There have been expected victories for some nations, big upsets for others – Adios Spain! Bye-bye England! – and more goals than most fans would have expected. So who could have forecast this? Actually, a huge variety of ‘experts’, forecasters, theorists, modelers and systems have tried to predict the outcome of this tournament, from Goldman Sachs to boffin statistical organisations. In his latest post for Sportingintelligence, and as part of an ongoing evaluation of rates of success (click HERE for Part 1 and background, and HERE for Part 2), Roger Pielke Jr sorts the best from the rest.

Follow Roger on Twitter: @RogerPielkeJR and on his blog


By Roger Pielke Jr.

27 June 2014

The group stage is over, and after 48 matches we can declare a winner in the first part of the World Cup prediction evaluation exercise. We made a ‘naive’ prediction ourselves based on the financial value of the squads; and we’re comparing this to 11 other predictions made by parties ranging from bankers and bookies to boffins and FIFA rankings.

Congratulations to Danske Bank and Andrew Yuan who take joint first, each picking 32 matches correctly and 11 of the 16 teams which advanced. The Elo ratings and Bloomberg also picked 11 of the 16 teams to advance but fell one short overall, picking 31 matches.

[Graphic: Pielke 48]

The FIFA rankings fall to fifth despite having only one match picked differently from Yuan, illustrating the fine edge to predictive success. Hassan and Jimenez tied the FIFA rankings, despite producing their forecast last February. The pre-group stage odds from Betfair.com are next, followed closely by Infostrada.

Each of the seven methods discussed so far showed skill in that they outperformed the naive baseline based on the estimated transfer market value of each of the teams. Still, the naive baseline was just four games out of first, but anticipated only eight of the 16 teams moving on. Eight is also the number of teams advancing in Brazil that had advanced in 2010 in South Africa, which could have been used as another naive baseline.

Three predictions win the “why bother?” award by under-performing the naive baseline – FiveThirtyEight, which only picked seven of the advancing squads, Goldman Sachs and the FT. The latter was included in the evaluation despite not being proposed as a forecasting tool. The other two don’t have that excuse for their underperformance.

The main lesson that I’d suggest taking from the exercise thus far is that it is very difficult to generate predictions that can outperform a fairly simple baseline approach. It is even more difficult to outperform the existing ratings systems of FIFA and Elo. All 10 methods were just one match away from underperforming the FIFA rankings. Ultimately, most of these prediction methods are consequently of certain entertainment value, but uncertain value in their prognostications.

Of course separating luck from skill is not possible in such an exercise. The strong performance of ranking systems in 2014 was in part due to the low number of upsets (7 vs. 14 in 2010) and draws (9 vs. 14 in 2010).

Consider that in 2006 and 2010 the FIFA rankings would have correctly predicted 26 and 20 matches (of 48) respectively in the group stages (this data sent courtesy @roddycampbell).

So was Danske Bank lucky and Goldman Sachs unlucky? Or was the former actually a more skilled forecaster? These are all good questions for the pub as the data do not provide answers.

We are now in a position to set the stage for part 2 of the prediction evaluation. Before I describe how I have chosen to evaluate the matches for this phase of the contest, let me remind you that there are many different ways to structure such an evaluation. I don’t think that there is any single best way; however, it is important to be clear about procedure before evaluating. You don’t want to find yourself setting up the rules for evaluating a prediction after the fact, especially if you are the one offering predictions.

* I use each method’s overall ranking of the teams presented before the tournament began. Several forecasters are providing updated predictions as the tournament unfolds, and the betting odds obviously change.

* If no such ranking was provided I use instead the ranked probability to advance from the group stage.

* As before, I convert probabilistic predictions into deterministic forecasts. There are obviously no draws in the knockout stage.

* I will generate a prediction for each method for each match. In other words, there will be a total of 15 matches predicted by each method over the knockout stage, regardless of how they do in each round.

* At the end of the tournament I will provide a ranking for predictions in the knock-out stage as well as an overall ranking based on both the group stage and the knock-out stage predictions.
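The conversion rule in the bullets above can be sketched as follows: a probabilistic prediction becomes a deterministic pick by taking the likelier side (there are no draws in the knockout stage). Team names and probabilities here are hypothetical.

```python
def pick_winner(match_probs):
    """Return the side with the higher win probability."""
    return max(match_probs, key=match_probs.get)

# Hypothetical probabilistic forecasts for two knockout ties
assert pick_winner({"Brazil": 0.62, "Chile": 0.38}) == "Brazil"
assert pick_winner({"Germany": 0.55, "Algeria": 0.45}) == "Germany"
```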

For the upcoming round of 16 matches, every method is in agreement on six of the matches, with the favorites as unanimous selections: Brazil, Argentina, France, Germany, Greece and the Netherlands. A majority favour Colombia and Belgium, but Uruguay and the USA get a few nods.


By Roger Pielke Jr.

24 June 2014

We are fast approaching the end of the group stages, and the battle for the top of the prediction league table is tight. Five approaches are within one game of the lead after 36 of the 48 matches have been played.

For detailed explanation on the predictors, follow the links above, but to summarise: we made a ‘naive’ prediction ourselves based on the financial value of the squads, and we’re comparing this to 11 other predictions made by parties ranging from bankers and bookies to boffins and FIFA rankings. Those FIFA rankings have held sway … until now.

Sitting alone at the top is Danske Bank, which has the most games picked correctly overall. Among the leaders at the halfway point were the FIFA rankings and Andrew Yuan, which I noted had 47 of 48 match predictions in common.

Here is the table after 36 games.

[Graphic: Pielke Part 3 table]

Yuan took the one match on which they split (Mexico-Croatia), ensuring that the FIFA rankings cannot finish first. The Naive Baseline has had a good run, overtaking four of the methods, and now trails the two rankings, FIFA and Elo, by just one game.

I’ve added an additional method of ranking the predictions, according to the number of countries picked to advance from the group stage. All methods have already slipped from perfection, with only five approaches correctly picking three of the four teams so far to advance. The others, including Danske Bank at the top of the table, have only two of the four. It just goes to show that prediction evaluation is highly sensitive to the metrics of assessment that are used.

Looking ahead, all methods have picked France, Argentina, Germany, Belgium and Russia to advance. But no method has picked Costa Rica, and only two have the USA. On Friday I’ll provide a summary of the group stage of the competition and set the table for the knockout stage.



Picking World Cup winners? After 12 games, FIFA rankings beating eminent thinkers

Monday, June 16th, 2014

*

The final whistle in the Germany-Portugal match in Group G in Salvador marked the end of the 12th game of the 2014 World Cup, and thus a quarter of the group stage is complete. There have been expected victories for some, big upsets for others and more goals per game at this stage than any World Cup since the 1950s. Who could have forecast this? Actually, a huge variety of ‘experts’, forecasters, theorists, modelers and systems have tried to predict the outcome of this tournament. In his debut post for Sportingintelligence, and as part of an ongoing evaluation of rates of success, Roger Pielke Jr sorts the best from the rest. 

Follow Roger on Twitter: @RogerPielkeJR and on his blog

By Roger Pielke Jr.

16 June 2014

Prognosticators have been hard at work generating pre-tournament predictions of who will advance and who will win. But which prediction is the best? Is it the one who picks the winner? Or is it the one which best anticipates the knock-out round seedings? How can we tell?

I will be evaluating 11 predictions over the course of the World Cup, starting with a league table after 12 games, in a moment. But suffice to say that after a dozen games, FIFA’s ranking system is proving as good as any other indicator of success, while some eminent thinkers are faring less well.

The 11 under consideration are:

To evaluate the different predictions, I am going to quantify the “skill” of each forecast. It is important to understand that forecast evaluation can be done, literally, in an infinite number of ways. Methodological choices must be made and different approaches may lead to different results. Below I’ll spell out the choices that I’ve made and provide links to all the data.

A first thing to understand is that “skill” is a technical term which refers to how much a forecast improves upon what is called a “naive baseline,” another technical term. (I went into more detail on this at FiveThirtyEight earlier this spring). A naive baseline is essentially a simple prediction. For example, in forecast evaluation meteorologists use climatology as a naive baseline and mutual fund managers use the S&P 500 Index. The choice of which naive baseline to use can be the subject of debate, not least because it can set a low or a high bar for showing skill.

The naive baseline I have chosen to use in this exercise is the transfer market value of the 23-man World Cup teams from Transfermarkt.com. In an ideal world I would use the current club team salaries of each player in the tournament, but these just aren’t publicly available. So I’m using the next best thing.

So for example, Lionel Messi, who plays his club team football at Barcelona and his international football for Argentina, is the world’s most valuable player. His rights have never been sold, as he has been with Barcelona since he was a child, yet he’s estimated to have a transfer market value of more than $200 million. By contrast all 23 men on the USA World Cup squad have a combined estimated value of $100 million. (I have all these data by player and team if you have any questions about them — they are pretty interesting on their own.)

Here then are the estimated transfer values of each World Cup team:

SqV for RP predictions analysis

.

In using these numbers, my naive assumption is that the higher-valued team will beat a lower-valued team. As a method of forecasting that leaves a lot to be desired, obviously, as fans of Moneyball will no doubt understand. There is some evidence to suggest that, across sports leagues, football offers the greatest chance for an underdog to win a match. So in principle, a forecaster using a more sophisticated method should be able to beat this naive baseline.
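As a sketch, the baseline reduces to a one-line comparison. The squad values below are illustrative placeholders rather than the actual Transfermarkt figures, apart from the USA's reported combined value of roughly $100m:

```python
# Naive baseline: the more expensively assembled squad wins.
# Values ($m) are illustrative placeholders, except USA's reported ~$100m;
# the real exercise uses 23-man squad values from Transfermarkt.com.
SQUAD_VALUES = {"Spain": 900, "Germany": 780, "Ghana": 150, "USA": 100}

def naive_pick(team_a: str, team_b: str) -> str:
    """Predict the winner of a match as the higher-valued squad."""
    return team_a if SQUAD_VALUES[team_a] >= SQUAD_VALUES[team_b] else team_b

print(naive_pick("Ghana", "USA"))  # the costlier squad is picked
```

Any metric that ranks teams by a single number can be slotted in the same way.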

Here is what the naive baseline (based on the team rosters as of June 5) predicts for the Group Stages of the tournament: The final four will see Brazil vs Germany and Spain vs Argentina. Spain wins the tournament, beating most everyone’s favorite Brazil. The USA does not get out of the group stage, but England does. All eight of the top valued teams make it into the final eight.

While this naive baseline is just logic and assumptions, work done by “Soccernomics” authors Stefan Szymanski and Simon Kuper indicates that a football team’s payroll tends to predict where it winds up every year in the league table. Payrolls aren’t the same thing as transfer fees, of course, but they are related. Unfortunately, as mentioned above, individual player salaries are not available for most soccer leagues around the world (MLS is a notable exception).

The predictions are not all expressed apples to apples. So to place them on a comparable basis I have made the following choices:

  • A team with a higher probability of advancing from the group is assumed to beat a team with lower probability.
  • If no group stage advancement probability is given I use the probability of winning the overall tournament in the same manner.
  • This means that I have converted probabilities into deterministic forecasts. (There are of course far more sophisticated approaches to probabilistic forecast evaluation.)
  • No draws are predicted, as no teams in the group stages have identical probabilities.
  • The units here, in the group stage at least, will simply be games predicted correctly. No weightings.

Other choices could of course be made. These are designed to balance simplicity and transparency with a level playing field for the evaluation. Just as is the case with respect to the value of having a diversity of predictions, having a diversity of approaches to forecast evaluation would be instructive. No claim is made here that this is the only or best approach (laying the groundwork here for identifying eventual winners and losers).
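Under those choices, scoring a forecaster is just a count of games called correctly, compared with the same count for the naive baseline. A rough sketch, with match results and picks invented purely for illustration:

```python
# Score deterministic forecasts by games predicted correctly (no weightings),
# then compare to the naive baseline. All picks and results here are invented.
def correct_count(picks: dict, results: dict) -> int:
    """picks/results map a match id to the predicted/actual winner."""
    return sum(1 for match, winner in results.items() if picks.get(match) == winner)

results  = {1: "Brazil", 2: "Netherlands", 3: "Italy"}
baseline = {1: "Brazil", 2: "Spain", 3: "Italy"}        # naive: pricier squad wins
model    = {1: "Brazil", 2: "Netherlands", 3: "Italy"}  # a hypothetical forecaster

margin = correct_count(model, results) - correct_count(baseline, results)
print(margin)  # 1: this forecaster is one game ahead of the baseline
```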

With all that as background, below then are the predictions in one table (click on it for a bigger view). The yellow cells indicate the teams that the naive baseline sees advancing to the knockout stages, and the green shows the same for each of the 11 predictions. The numbers show the team rankings according to each prediction.

(Click to enlarge, article continues below)

Pielke 10 - start-out predictions

I will be tracking the performance of the 11 predictions against the naive baseline as the tournament unfolds, scoring them in a league table.

After 12 matches, the first league table is below. It is early still in the tournament, but there already is a bit of spread developing among the predictions. Five of the 11 are running ahead of the naive baseline, and four are trailing. But it is only one game in either direction, so I’d hesitate in saying anything much at this point. As the tournament progresses I expect we will see greater divergence. Stay tuned.

Accuracy after 12 games

.

Roger Pielke Jr. is a professor of environmental studies at the University of Colorado, where he also directs its Center for Science and Technology Policy Research. He studies, teaches and writes about science, innovation, politics and sports. He has written for The New York Times, The Guardian, FiveThirtyEight, and The Wall Street Journal among many other places. He is thrilled to join Sportingintelligence as a regular contributor. Follow Roger on Twitter: @RogerPielkeJR and on his blog

.

More on this site mentioning the World Cup

Follow SPORTINGINTELLIGENCE on Twitter


And the World Cup winners will be … Brazil. Or Argentina. Or Spain. Or Germany

Monday, June 9th, 2014

By Nick Harris

9 June 2014

With the World Cup in Brazil just a few days from kicking-off, five-times winners Brazil are the bookmakers’ favourites on home turf, with twice-winners Argentina second favourites, followed by holders Spain and three-times winners Germany. Then the chasing pack comprises Belgium, France, Italy, Uruguay, Portugal, England, the Netherlands, Colombia and Chile. And the rest, in betting terms, can pretty much forget it.

Anyone who follows football will have views on their own favourites, for their own reasons. A purely personal and non-objective view is that Argentina will win in a final against Germany, after that pair have defeated Spain and Brazil respectively in the semi-finals.

But what about objectivity, where attempts are made to predict the outcome of the World Cup based on nothing but pure numbers? A whole range of models are out there, from Goldman Sachs, to Nate Silver at 538, to this simulator, to the Bloomberg analytical tool, to a complex set of criteria used by Rachel Riley from Countdown.

Will any of them be completely right? Of course not. But it’s a game of opinions and models are a debate-provoking distraction. So in an attempt to look at different objective reasons why certain nations might do well this summer, Sportingintelligence has considered how the World Cup would pan out if all the games went according to A: Fifa rankings; B: player wages; C: the value of squads; D: The Wisdom of Crowds (ie: an opinion poll); E: Pedigree at prior tournaments staged in the Americas; and F: a total of these totals.

Methodology

For all the models, the criteria have been applied on a match-by-match basis, starting in the group stages and ending with the knockout stages detailed below. So, for example, when using Fifa’s rankings, Brazil (ranked No3) win the opening game against Croatia (No18), Mexico (No20) beat lower-ranked Cameroon on Friday, and so on.

But using the players’ wages as a guide, as derived from the data that underpins the GSSS 2014 and similar work from partner researchers, Brazil still beat Croatia because their players earn more, while Cameroon beat Mexico on the same metric, and not vice versa as when using rankings.

Every match in the group stage is predicted on the relevant criteria to produce latter stages that look like this first graphic, using the first two methods. In both these cases, Spain beat Germany in the final, with those teams having beaten Argentina and Brazil respectively.
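Played out in code, each model is just a different number attached to each team. A minimal sketch using the Fifa ranks quoted above (Cameroon's rank is a placeholder standing in for "lower-ranked"):

```python
# 'Play' fixtures by a single metric: the better-ranked team advances.
# Brazil (3), Croatia (18) and Mexico (20) are the ranks cited in the text;
# Cameroon's 56 is a placeholder for 'lower-ranked'.
FIFA_RANK = {"Brazil": 3, "Croatia": 18, "Mexico": 20, "Cameroon": 56}

def play_fixtures(fixtures, rank):
    # Lower rank number = stronger team under this metric.
    return [a if rank[a] < rank[b] else b for a, b in fixtures]

print(play_fixtures([("Brazil", "Croatia"), ("Mexico", "Cameroon")], FIFA_RANK))
# ['Brazil', 'Mexico']
```

Swapping in wages or squad values for the rank dictionary runs the other models unchanged.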

Article continues below.

WC predictions, rank & pay

.

Next we considered the values of the squads. How do you get an accurate idea of what each squad of players is worth? We took data from two sources to get a neutral aggregate view on this. The first was the Football Observatory’s new annual review, which attempts to make objective player valuations on a wide variety of criteria. Using their data we came up with an average player value per World Cup team, with Argentina being the most valuable team, then France, Brazil, Spain and so on down to Iran in 32nd place.

The second was a piece of research by The Score, who in turn used data derived from www.valor.com.br and www.transfermarkt.com. They had Brazil as the most valuable team, then Spain, Argentina, Germany and so on down to Honduras in 32nd place. We took both sets of data from the Observatory and the Score and created a combined index that ranked the value of the squads from Brazil, Argentina and Spain as the three most valuable down to Costa Rica, Iran and Honduras as the least valuable.
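One plausible way to build such a combined index (the article does not spell out the exact method, so this averaging approach is an assumption) is to average each team's position across the two rankings. A sketch with a four-team toy ordering:

```python
# Combine two squad-value rankings by averaging each team's rank position.
# The four-team orderings are toy data, and the averaging method is an
# assumption: the article doesn't specify how the two sources were merged.
observatory = ["Argentina", "France", "Brazil", "Spain"]   # most to least valuable
the_score   = ["Brazil", "Spain", "Argentina", "France"]

def combined_index(*rankings):
    teams = rankings[0]
    avg_rank = {t: sum(r.index(t) for r in rankings) / len(rankings) for t in teams}
    return sorted(teams, key=lambda t: avg_rank[t])

print(combined_index(observatory, the_score))
```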

Again, each game was played out with the best team (most valuable in this case) winning. The results are in the graphic below, followed by the results based on the ‘Wisdom of Crowds’, ie an opinion poll. This was a simple, one-question poll here, conducted last night and this morning, and we took into consideration the first 500 responses. Again, each game was ‘played’ based on the Wisdom of Crowds, and the results of that are below too.

Article continues below

WC predictions SqV and wisdom.

The fifth model took into account performances at the seven previous World Cup tournaments staged in the Americas. The World Cups of 1930, staged and won by Uruguay, of 1950, staged by Brazil and won by Uruguay, of 1962, staged by Chile and won by Brazil, of 1970, staged by Mexico and won by Brazil, of 1978, staged and won by Argentina, of 1986, staged by Mexico and won by Argentina, and of 1994, staged by the USA and won by Brazil, have all been won by South American teams.

Whether that is down to chance, culture, home advantage, grass, climate or any mixture of multiple elements is beside the point. What’s certain is the statistical phenomenon of South American teams always winning World Cups in the Americas, seven from seven so far.

So we looked at the results of those seven tournaments and ranked this summer’s nations based on how they (or their respective forebears) had performed in previous tournaments in the Americas, and using those rankings, ‘played’ each game of this summer to take on board this historical ‘home’ and ‘regional’ bias.

This is how a summer World Cup based on previous World Cups in the Americas would play out.

Article continues below.

WC in Americas

.

So who will win the World Cup?

Nobody knows, which is the beauty of it. But if you made a composite metric of all the above, then it’s Brazil to beat Argentina in the final, with Spain and Germany reaching the semi-finals, and Uruguay, France, England and Belgium making the quarters.

Vamos!

WC total of totals

 

More from Nick Harris


Obtain the full Global Sports Salaries report (left) FREE by clicking this sentence to send an email. (Write GSSS 2014 in the subject line, and your name / organisation in the email.) NB: reports are being emailed on an individual basis so please be patient if you don’t get it immediately

 


OFFICIAL: Premier League 2013-14 had biggest top-flight crowds for 64 years

Monday, June 2nd, 2014

By Alex Miller

2 June 2014

The Premier League 2013-14 season attracted the highest average gates in top-division football in England in 64 years, Sportingintelligence can reveal. The average number of tickets sold per game was 36,695 last season – the highest figure in the Premier League era and the biggest in England’s top division since 1949-50.

Despite a number of fan-led protests against the rising cost of watching matches last season, a total of 13,944,100 fans attended the 380 matches, a two per cent increase on the previous campaign. The previous attendance record in the Premier League era came in the 2007-08 season, when average crowds reached 36,149.

Last season also produced a record seat occupancy rate figure of 95.9 per cent, according to official Premier League figures. The previous highest seat occupancy rate in the Premier League era was 95.3 per cent during the 2012-13 season.

Article continues below

PL 64-year high crowds

.

The last time average gates in the top division were as buoyant was in 1949-50, when 17.3m watched 462 matches at an average of 37,400 per game. Since then the top flight has been reduced from 22 teams to 20, meaning there are only 380 Premier League games each season.
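The averages can be checked directly from the totals quoted above:

```python
# Average gate = total tickets sold / matches played, using the article's figures.
avg_2013_14 = 13_944_100 / 380      # 20-club Premier League
avg_1949_50 = 17_300_000 / 462      # 22-club top flight; '17.3m' is rounded

print(round(avg_2013_14))  # 36695
print(round(avg_1949_50))  # 37446, reported above rounded down to 37,400
```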

The Premier League has rolled out a number of programmes including the Away Supporters Initiative to boost attendance and the quality of the matchday experience for fans. A Premier League spokesman said: ‘Attendances are of the utmost importance to the Premier League.’

Here are the club by club attendances for 2013-14.

Article continues below

PL 13-14 by club

 

.

The Premier League is the second best attended football league in the world in average gates, behind only the Bundesliga, which averages more than 40,000 fans per game.

If you wanted to create a Premier League with the maximum possible attendance potential based on current available capacity at clubs around the country, it would include both Sheffield clubs, Leeds, Middlesbrough, Derby, MK Dons and Blackburn, as below. That make-up of clubs, at full capacity, would attract 15.8m people in a season at an average of 41,613 fans per game.

PL 'best' in crowd terms

 


More from Alex Miller


Where the money went: Liverpool top Premier League prize cash in 2013-14

Wednesday, May 14th, 2014

By Sportingintelligence

14 May 2014

Liverpool were pipped to the Premier League title by Manchester City on Sunday but official figures released today show they ended the season at No1 in the cash stakes, measured by League ‘prize’ money.

The Merseyside giants earned £97,544,336 from Premier League funds for their 2013-14 campaign, pushing champions Manchester City into second place. City earned £96,578,329. Liverpool earned more than City because 28 of their 38 league games were screened live on TV as opposed to only 25 of City’s and TV appearances are one factor considered.

The top five earners were completed by Chelsea (£94.1m), Arsenal (£92.9m) and Tottenham (£89.7m). Last season’s champions and top earners Manchester United were seventh in this season’s Premier League but sixth highest earners with £89.2m.

At the other end of the table, bottom placed Cardiff City earned £62,082,302 for a campaign in which they ended up relegated to the Championship. Astonishingly this was £1.2m more than United pocketed in 2012-13 for winning the title – with the huge increases this season a result of the lucrative new TV deals now in place.

Article continues below

PL cash 2013-14 OFFICIAL by rank

Sky and BT Sport are paying £3.018 billion between them to show Premier League matches live in the UK across three seasons from 2013 to 2016 inclusive. Foreign broadcasters around the world are paying another £2.3 billion combined, on top, for the same period.

The Premier League also earns money from the sales of highlights (on Match of the Day) and near-live rights (on Sky) and brings in further sums from commercial deals like the one with headline sponsor Barclays.

All that cash goes into one big pot and the sums announced today are the hard cash: eye-watering rewards for the clubs.

Every club gets an ‘equal’ share of £52,198,111, which derives from domestic TV income (£21,631,444 per club), overseas income (£26,295,817) and commercial income (£4,270,850).

Every club then gets another sum depending on league position, worth £1,236,083 per place in the table: from £1,236,083 for bottom-placed Cardiff up to £24,721,660 for champions City.

Payments for 2012-13 / Payments for 2011-12 / Payments for 2010-11 / Payments for 2009-10

Each club also gets a variable amount depending on how many times they were shown live on Sky or BT this season. Every club got a minimum of £8.6m from this pot, even if they were shown as rarely as Stoke (just seven live televised games). Liverpool, shown 28 times, got £21.9m in these ‘facility fees’.
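Those three components are enough to reconstruct a club's central payment approximately. A sketch: the facility fee below uses the article's rounded £21.9m for Liverpool, so the total lands near, rather than exactly on, the official £97,544,336:

```python
# Central payment = equal share + merit award by final position + facility fees.
EQUAL_SHARE = 21_631_444 + 26_295_817 + 4_270_850   # domestic + overseas + commercial
MERIT_PER_PLACE = 1_236_083

def central_payment(position: int, facility_fees: int) -> int:
    """position 1 = champions (20 merit shares), position 20 = bottom (1 share)."""
    return EQUAL_SHARE + (21 - position) * MERIT_PER_PLACE + facility_fees

# Liverpool: 2nd place, ~£21.9m facility fees -> close to the reported £97.5m
print(f"£{central_payment(2, 21_900_000):,}")  # £97,583,688
```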

This season’s rewards are a huge leap on last season’s TV income, with Liverpool, for example, earning £42.7m more this season than last.

Clubs have three main revenue streams: match day income (from tickets, corporate dining etcetera), media income (of which the payments listed are the largest but not the only part) and commercial income (from kit deals, sponsorship, merchandise, tours and so on).

RELATED STORY: It’s the economy, stupid! How money fuels glory in the Premier League

The ratio in central earnings between Liverpool at the top and Cardiff at the bottom in 2013-14 is 1.57 to 1.

This is a much lower ratio – and therefore ‘fairer’ split of TV money – than occurs in Europe’s other major leagues.

In Spain’s top division, where Barcelona and Real Madrid take the lion’s share of the TV cash because they do their own deals and don’t sell rights collectively, the equivalent ratio is around 11.3 to 1.

In Italy’s Serie A, the ratio is about 4 to 1, in France’s Ligue 1 it is about 3.7 to 1, and in the German Bundesliga it is 2 to 1. In the Champions League, the equivalent ratio between the winner and the team finishing ‘bottom’ (ie a final-stage qualifier before the group) is 30 to 1.
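The ratio is simple to reproduce from the figures above:

```python
# Top-to-bottom ratio of central Premier League payments, 2013-14.
liverpool, cardiff = 97_544_336, 62_082_302
ratio = liverpool / cardiff
print(f"{ratio:.2f} to 1")  # 1.57 to 1, versus roughly 11.3 to 1 in La Liga
```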

.

More articles mentioning salaries in sport


Sportingintelligence home page


Winning in global sport: Often about the money, money, money ….. (but not always)

Monday, May 5th, 2014

By Nick Harris

5 May 2014

It was self-evident in the 2012-13 Premier League that Queens Park Rangers provided their owner with the worst value for money. He spent many tens of millions on buying players and many tens of millions more on paying their wages. And they were ignominiously relegated anyway.

But precisely how badly did they do in terms of resources and performance? And which team did best?

What does this tell us about the relationship between pay and performance in the Premier League? And how does that compare to other football leagues, and indeed other major sports? These are questions Sportingintelligence tries to address using the Global Salary Survey (click for details), as reported here and here.

This article scratches the surface with an overview of an answer for eight leagues: the Premier League, La Liga, Bundesliga, Serie A, NFL, NHL, NBA and MLB.

Given that QPR had average first-team pay of £2.1m per player last season and earned 25 points, we can quantify that they spent £85,704 per man per point. That was the worst value by far in the Premier League, with Everton (£27,397 per man per point) giving the best value.
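The metric itself is a straightforward division. A sketch: the quoted £85,704 implies an unrounded average wage of about £2.14m, so the rounded £2.1m input gives a slightly different figure:

```python
# Cost per man per point = average first-team pay / league points won.
def cost_per_point(avg_pay: float, points: int) -> float:
    return avg_pay / points

print(f"£{cost_per_point(2_100_000, 25):,.0f}")   # £84,000 from the rounded inputs
print(f"£{cost_per_point(2_142_600, 25):,.0f}")   # £85,704, the implied unrounded pay
```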

This is how the rest of the division performed:

Article continues below

Prem CPPPP

The relationship between pay and performance in the Premier League has been close for many years. This linked article from two years ago goes into more detail, while this one from December 2012 used economic performance to predict QPR’s demise.

The more you spend on wages, the better you do, and vice versa, all other things being equal. And therein lies the beauty of sport. High-spenders can often find ways to cock things up, and low spenders can punch above their weight.

This next graphic plots the wage spending last season, 1 to 20, against finish position 1 to 20, and also depicts the salary spread graphically. You can see at a glance how QPR under-achieved and Everton punched above their weight. You can also see (right-hand chart) how below the top eight spenders, there is a large degree of similarity in pay among the rest of the clubs. What looks like random finishing positions in the left-hand chart are perhaps explained by there not being especially significant variations in wages in the first place.

Article continues below

PL pvp 2012-13

 

.

The following six graphics are the equivalent information for La Liga, Serie A and the Bundesliga.

Below them, we move on to the four major North American leagues.

La Liga pvp 2012-13

La Liga CPPPP

Serie A pvp 2012-13 season

Serie A CPPPP

Bundesliga 2012-13 pvp

Bundesliga CPPPP

 

.

North American major sports

There has long been a perception that North American sports are somehow ‘fairer’ because of drafts and wage caps.

But by using the same unique metric as we use across the 15 leagues in seven sports in 12 countries in the GSSS 2014, it is clear that there remains a relationship between pay and performance.

The situation for the NFL, NBA, NHL and MLB is summarised in graphic form below and is self-explanatory.

But a summary of the relationship is thus:

  • In the NFL 2013 season, which ended with the 2014 Super Bowl, six of the 10 best-paid teams were among the 12 play-off teams, and only one of the 10 worst-paid teams was there.
  • Both the Super Bowl teams, the Seahawks and Broncos, were in the top four best-paid teams, the winning Seahawks at No2.
  • In the NBA 2013-14 season (ongoing at the time of writing), seven of the 10 best paid teams were among the 16 play-off teams and only three of the 10 worst paid teams.
  • At the NBA Conference semi-finals stage, five of the eight teams involved are among the 10 best paid teams, none are from the worst paid 10 teams.
  • In the NHL 2013-14 season (ongoing at the time of writing), eight of the 10 best paid teams were among the 16 play-off teams and only three of the 10 worst paid teams.
  • At the NHL Conference semi-finals stage, six of the eight teams involved are among the 10 best paid teams, none are from the worst paid 10 teams.
  • The Major League Baseball season is only in its early stages, but already there are patterns of better results for better paid teams. Those ranked 1-10 in the pay rankings have an average win % of 0.543 (at the time of writing). Those 11-20 have an average win % of 0.504; and those ranked 21-30 have an average win % of 0.456.
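The MLB comparison boils down to bucketing teams by payroll rank and averaging win percentages. A sketch with invented standings, not the actual 2014 figures:

```python
# Average win % by payroll tercile (payroll ranks 1-10, 11-20, 21-30).
# The (payroll_rank, win_pct) pairs are invented, not the 2014 standings.
def tercile_win_pct(teams):
    """teams: iterable of (payroll_rank, win_pct), rank 1 = best paid."""
    groups = {"1-10": [], "11-20": [], "21-30": []}
    for rank, pct in teams:
        key = "1-10" if rank <= 10 else "11-20" if rank <= 20 else "21-30"
        groups[key].append(pct)
    return {k: round(sum(v) / len(v), 3) for k, v in groups.items() if v}

sample = [(1, 0.600), (5, 0.520), (12, 0.500), (18, 0.480), (25, 0.470), (30, 0.440)]
print(tercile_win_pct(sample))
```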

Details at the foot of this article about how to obtain a free copy of the Global Salaries Report that forms the basis of this analysis.

Click to enlarge any of these graphics

.

NFL pvp 2013-14

NBA pvp 2013-14

NHL pvp 2013-14

MLB pvp 2014

 

.

More from Nick Harris
