Archive for the ‘News’ Category

Everton’s failings against Arsenal and Co undermine top-four credentials

Friday, August 22nd, 2014

By Brian Sears

22 August 2014

Everton are widely and rightly expected to finish in the top seven places in the Premier League this season, as they have done for the past four seasons and in eight of the last 10 seasons. They are stable, consistent, have a talented (and likeable) manager in Roberto Martinez, and a squad with plenty of youthful potential, including multiple home-grown players.

But only once in 22 completed seasons of the Premier League era – the ‘monied era’, during which they have lagged behind financially – have they finished in the top four, and that was in 2004-05. In fact that was the only time they have finished in the top four since the 1980s. One significant reason for that is their poor record against the ‘bigger’ teams, specifically those, like themselves, who are ‘ever present’ in the Premier League era: Manchester United, Arsenal, Chelsea, Tottenham, Liverpool and Aston Villa.

And as Everton prepare to host Arsenal at Goodison Park on Saturday, it is notable that, since 1992, Everton have lost twice as many home Premier League games against Arsenal (10 of them) as they have won (five). There have also been seven PL draws for Everton v Arsenal at Goodison.

Everton have also lost more times at home in the era against Liverpool than they’ve won (nine defeats, seven wins), and likewise against Tottenham (7-5), Chelsea (8-7) and Manchester United (14-5).

Away from home, Everton’s record against those clubs has been even worse; of 110 games against those clubs on their grounds, Everton have won just nine in total in the PL era, and lost 65. Their full record against all the current PL clubs in the PL era is in the first graphic below.

It also contains the stats for just the past 10 seasons, and the past five. The bad news is that Everton have become weaker over the years against Liverpool, although slightly better (and only slightly, in relative terms) against Tottenham and Chelsea, and slightly better still against Manchester United. But they are still some way from parity, and against Arsenal they are still taking less than a point per meeting on average. That is why games against the ‘big’ clubs are arguably the best barometer of how Everton might fare in a season. The first such test of this campaign comes this weekend.

Article continues below (click on graphic to enlarge)

Everton PL v big boys


H-away the lads: Toon visit Villa with history on their side 

Newcastle lost their opening Premier League game of the season and Aston Villa won at Stoke, but Newcastle will visit Villa Park for Saturday’s lunchtime kick-off with history on their side.

Whether home or away, Newcastle have won more Premier League points from Aston Villa than from any other club in total. At home, Newcastle have won 44 points in 20 PL games; away they have won 30 points in 20 games. Of the clubs Newcastle have played 10 or more times in the PL era, they have a better record – even in points-per-game terms – against Villa than against any club at home bar Southampton, and any club away bar Sunderland.

The full record is in this graphic; article continues below

 NUFC rec v current PL clubs

Newcastle have won as many as 13 times against Villa at home, and have won eight times in the PL at Villa Park – more often than Villa have beaten them on their own turf (six times). And this is not a story of only ancient history. Since Newcastle returned to the Premier League four seasons ago, they have played Villa eight times in the league, winning five, drawing two and losing just one.

And another thing …

This is the 23rd season of Premier League football and only once before have there been more away wins on the opening weekend of matches than this season’s six, and that was in the second season, when there were 22 teams, 11 games, and seven away wins. Five seasons ago there were six away wins on opening weekend but that was distinguished by being the only opening weekend without a single draw.

The present crop of six away wins follows on from last season, when the rate of away wins over the whole season was at record Premier League levels. The majority of the away wins came from the clubs finishing in the top seven: Arsenal had 11, with 10 each for Manchester City, Liverpool, Chelsea, Tottenham and Manchester United, and eight for Everton.

Perhaps the ‘big seven’ becoming increasingly dominant over ‘the rest’ is the reason. We may find out more as this season wears on.

Aways in PL to 22.8.14


More on Man City / Liverpool / Arsenal / Chelsea (or search for anything else in box at top right)

More from Sears




Survival chances up but ‘at least one of Leicester, Burnley, QPR down’

Friday, August 15th, 2014

By Brian Sears

15 August 2014

The Premier League is about to start its 23rd season, the 23rd season since England’s top division was rebranded ‘Premier League’ and effectively broke away from the rest of English football. The 23rd season of the ‘money era’ when TV riches have poured into the game in ever-increasing amounts. The 23rd season when the difference between the ‘haves’ of the Premier League and the ‘have nots’ of the rest of English football is especially pronounced.

That is the context for what is to follow: an assessment of the chances that this season’s promoted clubs – Leicester City, Burnley and QPR – will be able to survive this coming campaign without being immediately relegated.

To move quickly and bleakly to the point, it is statistically highly probable that at least one of those clubs will be relegated immediately, and likely that more than one of them will go down. On the precedent of the past 22 years, it would be no surprise if both QPR and Leicester went down.

Why? (And remember, this is not personal; this is simply what the statistics from previous years tell us.) As the first graphic shows, 65 promoted teams have played in the previous 22 Premier League seasons, three each year except 1995-96, when only Middlesbrough and Bolton came up as the PL was slimmed down. Of those 65 teams, 28 have been immediately relegated, or 43 per cent. On that average we would expect more than one promoted team to go down each season.
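For readers who want to see the arithmetic, here is a minimal sketch using only the figures quoted above (65 promoted clubs over 22 seasons, 28 immediately relegated):

```python
# Quick check of the relegation arithmetic quoted above
# (figures taken from the article).

promoted = 65    # promoted teams in the first 22 PL seasons
relegated = 28   # of those, sent straight back down
seasons = 22

rate = relegated / promoted
per_season = relegated / seasons
print(f"immediate-relegation rate: {rate:.0%}")            # ~43%
print(f"relegated promoted clubs per season: {per_season:.2f}")
```

The per-season figure comes out above one, which is why, on the historical average, more than one of any season’s three promoted clubs would be expected to go down.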

Actually, in each of the past six seasons only one of the three promoted clubs has gone down (except 2011-12, when none did), so we might argue that surviving is getting easier. But in seven seasons at least two of the three promoted clubs have gone down, and in 1997-98 all three did.

It is undoubtedly harder for a promoted team to ‘thrive’ in the sense of coming up and challenging for the top spots; such a concept is laughable now. Yet as recently as 1993-94, promoted Newcastle finished third, as did promoted Forest the following season, while promoted Ipswich were fifth in 2000-01. All of that seems ancient history. Of the last 21 clubs promoted, only one has finished its first season back in the top half of the table, let alone near the European places: Birmingham, ninth in 2009-10.

Statistically, the promoted club most likely to go down is the one that came up through the play-offs (a 55 per cent relegation chance), then the team promoted as champions (41 per cent), then the second-tier runners-up (33 per cent). On that basis, play-off winners QPR are most likely to fall, then Leicester, then Burnley. But of course sport is not so simple, or so easily predicted.

Promoted teams are particularly susceptible to relegation because, in general, they arrive with poorer squads and fewer financial resources than the existing Premier League teams who have been fattened on the PL TV riches.

Next we’ll consider the chances of the promoted clubs getting off to good starts this weekend, but first, here are the fates of all the previous promoted clubs in their first seasons in the Premier League after promotion.

Article continues below

PL promoted teams in 1st season


All three promoted sides – Foxes, Clarets and R’s – have experienced the Premier League before: Leicester for as many as eight seasons, although the most recent was as long ago as 2003-04; QPR for six seasons, as recently as the season before last; and Burnley for just one, in 2009-10.

What chance that any of the promoted trio will win on their returns to the PL? Leicester host Everton on Saturday as QPR host Hull while Burnley must wait until Monday to host title favourites Chelsea.

The 65 promoted sides of the past 22 seasons have only known 13 opening day victories (one in five) and there have been 16 draws alongside the 36 first-day defeats. The list of those 13 wins to encourage Leicester, Burnley and QPR is below.

Statistically, then, each of them has just that one-in-five chance on the historical average – although the bookies rate Burnley at closer to one in ten, with QPR likelier winners than Leicester, and none of the three strongly fancied.

Article continues below

PL promoted 1 in 5 od wins

Leicester have had three promotions to the Premier League before this one. In their debut game in 1994 they lost 1-3 at home to Newcastle, but then they improved their opening-game results, drawing away at Sunderland in 1996 and at home to Southampton in 2003.

In their one previous Premier League promotion Burnley lost 0-2 at Stoke in 2009 and QPR lost 0-4 at home to Bolton in 2011.

At least the fixture list has been kind to all three promoted clubs, giving them home fixtures.

Leicester’s hosting of Everton is statistically intriguing. Here is their full Premier League history, and the significance of it is detailed below.

Article continues below

LCFC v EFC in PL full record


Which brings us to ….

Sears stat of week 15.8.14


Leicester fans will be less keen to be reminded that they allowed Bolton the biggest ever opening-day win of any side promoted to the Premier League. That was on 18 August 2001, and the five Bolton goals were netted by Kevin Nolan and Per Frandsen (two goals each) and Michael Ricketts.


More on Man City / Liverpool / Arsenal / Chelsea (or search for anything else in box at top right)

More from Sears



Arsenal’s unbroken post-war top-flight tenure puts them top of 15 ‘deserved’ PL teams

Friday, August 8th, 2014

By Brian Sears

8 August 2014

With the 2014-15 English football season about to begin, and with every fan still at the stage where they can dream that this will be the year, the notion of where a club ‘deserves’ to be is again a topic of relevance.

Up and down the country there will be supporters who will assure you that their team ‘should’ be in the top four, or top six, or top division, or top two divisions. And there will be others who will assure you their team does not ‘belong’ in the lowly place where they currently reside.

It is uncontroversial enough to say that fans of Manchester United (and City), Arsenal, Liverpool and Chelsea will believe they should have an excellent shot at finishing in the Premier League’s top four. Some Everton fans will probably argue that too, as will some of Spurs’.

And there will be plenty from other clubs who swear their club’s ‘rightful’ place is in the Premier League. Leeds fans will most likely be loudest on this subject (with some justification). Those who support Blackburn and Wolves, both Sheffield clubs, Middlesbrough and Derby will also make claims.

Lower down the divisions, Coventry fans will tell you they really shouldn’t be in the third tier, and lower still, Portsmouth, twice champions of England, can argue that really, all things being equal, they should be at least two divisions higher.

Before we use one method to explore where clubs ‘should’ be playing, it is worth looking at a historic league table from 40 years ago, the 1974-75 season, below. It is notable not only for the absence of Manchester United, and for the presence of the likes of Carlisle and Luton, but also for how many of the same names will contest the forthcoming 2014-15 Premier League: twelve of the clubs in that 1974-75 top division will be there.

Article continues below

1974-75 First Division final table, England

First Div 1974-75


The very essence of the English league pyramid system is that clubs can go up and down. Any team can aspire to move from the non-league to the top division, and perhaps even Europe. Wigan in recent decades proved they could make just such a move.

And as was described in some detail on this website last year – link to the relevant article here – participating in the Premier League is on the verge of being a ‘majority experience’ for the professional football clubs of England. The ‘breakaway’ league, which began in 1992-93, has now featured 46 of the current 92 clubs of England’s top four divisions for at least one season each. The fact that half of all clubs have tasted the top division, even in this ‘monied era’ of the past 22 years, shows that upward mobility remains possible.

Yet most clubs remain fairly ‘stable’ in where they play their football. There is a certain order of dominance where the ‘big’ clubs tend to play high up, and achieve titles and cup wins, and the ‘small’ clubs play lower down, only now and again punching above their level.

In an attempt to measure this, and also highlight which clubs might justifiably show they are currently punching ‘above their weight’, or are temporarily below where ‘they should be’, we have looked at the post-war experiences of all 92 current clubs, specifically which division each club has played in for each of the completed 68 post-war seasons. (Post-war is used simply because it is one unbroken stretch of football history).

We have allocated each club four ‘pedigree points’ for each season spent in the top tier of English football since 1946-47 (the Premier League now, the old First Division before it), three points for each season in the second tier (the Championship now, previously the First Division, originally the Second Division), two points for each third-tier season and one point for each fourth-tier season.
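As a sketch of how such a tally could be computed – the tier-by-season histories would come from each club’s records, and the one worked example below is the case the piece states exactly (Arsenal’s unbroken top-flight run):

```python
# 'Pedigree points' as described above: 4 per top-tier season,
# 3 per second-tier, 2 per third-tier, 1 per fourth-tier, across
# the 68 completed post-war seasons.

POINTS_BY_TIER = {1: 4, 2: 3, 3: 2, 4: 1}

def pedigree_points(tiers):
    """tiers: one entry per season played, each a tier number 1-4."""
    return sum(POINTS_BY_TIER[t] for t in tiers)

# Arsenal: 68 unbroken top-flight seasons, the maximum possible score.
print(pedigree_points([1] * 68))  # 68 x 4 = 272
```

Any club’s score is then just its seasons weighted by tier, which is why an unbroken top-flight tenure tops the table.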

Arsenal, with an unbroken run of 68 years in the top division, have most points, followed by Manchester United, Everton, Tottenham, Liverpool, Chelsea, Aston Villa and Manchester City. Using this measurement, those eight clubs are the eight clubs with the highest post-war ‘pedigree’.

And how indicative of success is such ‘pedigree’? Well, seven of those eight finished in the Premier League’s top eight last season, Villa being the only ones who did not. So on a broad level, such ‘pedigree’ is indeed indicative.

In the first table below, we show how 15 of the 20 clubs for the 2014-15 Premier League season ‘deserve’ by their post-war ‘pedigree’ to be in the top division. This same graphic shows how Burnley, QPR, Crystal Palace, Hull and Swansea are all punching above their historical pedigree to be there – a commendable thing for those clubs.

Article continues below

PL 2014-15 pedigree


The graphic shows where each club finished in the league (1 to 92) last season; where their ‘pedigree’ says they ‘should’ be; and the difference. A difference in single digits is really neither here nor there. That Palace, Hull and Swansea finished 23, 28 and 36 places above their ‘natural’ post-war level is especially commendable.

Looking at the 2014-15 Championship (below), the post-war pedigrees of the clubs suggest that Leeds, Wolves, Forest, Middlesbrough and Birmingham should be in the Premier League. Accordingly, their finishing positions last season were respectively 22, 30, 15, 15 and 22 places worse than they ‘should’ have been. At the other end of the Championship, the likes of Bournemouth and Wigan are punching well above their historic ‘weight’ just to be in the second tier this coming season, let alone any higher.

Article continues below

Champ 2014-14 pedigree


Moving down further, we see that Coventry and Sheffield United are the two teams clearly below their ‘level’ in League One, and that Portsmouth, Luton and Plymouth are below where they should be in League Two. It would be no surprise whatsoever if multiple clubs currently above or below where they ‘should’ be ‘correct’ that via promotion or relegation this coming season; in fact it would be a surprise if at least a handful of them did not move.

Article continues below

League One 14-15 pedigree

League Two 2014-15 pedigree


It goes without saying there are multiple ways you can measure where a club ‘should’ be playing. The exercise above is just one method. This website has previously considered how a ‘deserved’ Premier League might be made up using multiple different factors, such as trophies or ground capacity. Link here.

Equally, one might average out the finishing positions of all the clubs over the 68 years. In that respect you would find Manchester United have a higher average finishing position than any other club: 5th on average over 68 seasons. Arsenal are next best in 6th, then Liverpool in 7th, Tottenham in 10th, Everton in 11th, Chelsea in 12th, Aston Villa in 14th and Manchester City in 15th. The same eight clubs, in other words, who have the best ‘pedigrees’ by division.

You could consider 100 years, or 130-plus back to the start of league football in England, or the 22 years of the Premier League. You would find all sorts of patterns. And yet nothing will tell you, definitively and for sure, what will happen this season. Which is the beauty of the game. Today, anything can still happen.


More on Man City / Liverpool / Arsenal / Chelsea (or search for anything else in box at top right)

More from Sears



Measuring the ‘Tiger effect’ – doubling of Tour prizes, billions into players’ pockets

Wednesday, August 6th, 2014

By Roger Pielke Jr

6 August 2014 

With the final Major of the golf season starting on Thursday at Valhalla Golf Club in Louisville, Kentucky, most of the talk in anticipation of the PGA Championship is about a player who almost certainly has no chance of winning, even if he were to play. I’m of course referring to Tiger Woods.

Woods reinjured his back last week at the WGC-Bridgestone Invitational, leading to questions about his future – not just this week, but as a professional golfer. With Tiger on everyone’s mind, I thought it worth taking a look at his impact on the game, specifically his role in boosting purses and the corresponding financial benefits to his peers.

From 1990 to 1996 the total purses on the PGA Tour increased from $82 million to $101 million, a respectable increase of about 3.4% per year. (All data in this post are adjusted to constant 2014 dollars to eliminate the effects of inflation.) Tiger burst onto the scene as a professional in 1996, winning two of the eight events that he entered.

Before the Masters this year, Phil Mickelson explained what Tiger’s success and corresponding fame did to the game:


“Look at what he’s doing for the game the last 17 years he’s played as a professional. It’s been incredible… I remember when I was an amateur and I won my first tournament in Tucson in 1991, the entire purse was $1 million, first place was $180,000, and Steve [Loy, my agent] and I would sit down and say, ‘I wonder if in my lifetime, probably not in my career, we would play for a $1 million first-place check.’

“[Now] it’s every week. It’s unbelievable the growth of this game. And Tiger has been the instigator. He’s been the one that’s really propelled and driven the bus because he’s brought increased ratings, increased sponsors, increased interest and we have all benefited, but nobody has benefited more than I have, and we’re all appreciative. That’s why we miss him so much; we all know what he’s meant to the game.”

Tiger dollars


The numbers bear out Mickelson’s observations. By 2008 purses totaled $292 million, an increase of 9.3% per year since Tiger joined the Tour. This jump in the growth of prize money, from 3.4% a year before Tiger joined the Tour to 9.3% in the years after, can be called the “Tiger Woods effect”. I was curious what financial impact the “Tiger Woods effect” had on his peers, so I looked at the data.
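For the curious, here is how those annual rates fall out of the purse totals as compound annual growth. The totals are the ones quoted above, in millions of constant 2014 dollars; the year spans (six and twelve) are inferred from the dates given, and rounding in the published totals may shift the result by a tenth of a point:

```python
def cagr(start, end, years):
    """Compound annual growth rate between two totals."""
    return (end / start) ** (1 / years) - 1

pre_tiger = cagr(82, 101, 6)     # 1990 -> 1996 purse growth
tiger_era = cagr(101, 292, 12)   # 1996 -> 2008 purse growth
print(f"pre-Tiger: {pre_tiger:.1%}, Tiger era: {tiger_era:.1%}")
```

The second rate comes out at roughly 9.3% a year, nearly three times the pre-1996 rate.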

The results are astonishing. Tiger effectively more than doubled the prize money for every other golfer, adding billions of dollars to fellow players’ pockets. How can we demonstrate this?

Here is what I did. I considered all players who earned a pay cheque on the Tour in 2013 and calculated their total earnings from 1997 to 2008 (176 players). I then calculated what each would have earned had purses continued to grow at the earlier, pre-Tiger rate. Subtracting that figure from what they actually earned leaves a residual attributable to the “Tiger Woods effect”.
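A minimal sketch of that counterfactual, under one simplifying assumption: a player’s winnings scale with the size of the purses, so if purses had grown at the pre-Tiger rate instead, each year’s earnings shrink by the ratio of the two growth paths. The growth rates are the ones quoted in this piece; the sample earnings stream is invented purely for illustration:

```python
# Counterfactual earnings sketch for the "Tiger Woods effect".
PRE_TIGER = 0.034   # annual purse growth, 1990-1996
TIGER_ERA = 0.093   # annual purse growth, 1996-2008

def tiger_effect(earnings_by_year):
    """earnings_by_year: {year: actual earnings}, years 1997-2008.

    Returns actual earnings minus what the same purse shares would
    have paid had purses kept growing at the pre-Tiger rate.
    """
    counterfactual = sum(
        earned * ((1 + PRE_TIGER) / (1 + TIGER_ERA)) ** (year - 1996)
        for year, earned in earnings_by_year.items()
    )
    return sum(earnings_by_year.values()) - counterfactual

# Hypothetical player earning a flat $1m a year, 1997-2008:
effect = tiger_effect({y: 1_000_000 for y in range(1997, 2009)})
print(f"${effect:,.0f} attributable to the Tiger Woods effect")
```

The exact residual depends on how a player’s earnings were spread across the period – later-career earnings carry a bigger Tiger premium, since the two growth paths diverge further each year.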

Other assumptions are of course possible, but the overall conclusion would be much the same: Tiger’s peers have benefited enormously from his successes, even while Woods himself took home almost $100 million in prize money over that period.

Looking at the data, Mickelson is almost right. He has benefited more than anyone except Vijay Singh from the “Tiger Woods effect”. Singh earned an extra $36 million over his career thanks to Tiger, and Phil an extra $29 million. (This is the PGA Tour alone.) Here is a table with the top 10; a full list appears at the end of this post.

Article continues below

Tiger effect

Further evidence for the “Tiger Woods effect” can be seen in the fact that since Woods’ infamous car crash in 2009, and subsequent loss of form, purses have decreased by 2.3% per year. It was a remarkable run, but one that now appears to be over.

It is important to point out that these numbers for the 176 players on the 2013 money list represent just a portion of the overall PGA Tour prize money from 1997 to 2008.

Those 176 golfers earned about $1.7 billion over that time period with about $867 million due to the “Tiger Woods effect.” In other words, slightly more than half the prize money was down to the ‘Tiger effect’. Overall, however, there was about $3.1 billion in total prize money won over that period, meaning that the overall Tiger Woods effect Tour-wide was more than $1.6 billion. This does not even begin to consider the possible knock-on effects on increased prize money in the other major international golf associations. So even if we were to ascribe only a fraction of the improved fortunes of golfers from 1997 to 2008 to the “Tiger Woods effect” it would still be a very, very large number.

Here is the list of the other players, not in the graphic above, who benefited from the “Tiger Woods effect” from 1997 to 2008 on the PGA Tour. It’s safe to say that Tiger will never again have to buy a round at the 19th hole.


Player / 1997-2008 Earnings / Due to the “Tiger Woods Effect”

Stuart Appleby $27,069,938 / $14,076,368

Kenny Perry $26,961,363 / $14,019,909

Scott Verplank $25,897,096 / $13,466,490

Chris DiMarco $24,968,127 / $12,983,426

Retief Goosen $23,663,124 / $12,304,824

Robert Allenby $23,332,671 / $12,132,989

Adam Scott $22,762,323 / $11,836,408

K.J. Choi $22,369,711 / $11,632,250

Jerry Kelly $21,514,784 / $11,187,687

Rory Sabbatini $21,140,214 / $10,992,911

Steve Flesch $20,956,948 / $10,897,613

Chad Campbell $19,450,954 / $10,114,496

Geoff Ogilvy $19,085,741 / $9,924,585

Tom Lehman $18,710,749 / $9,729,589

Stephen Ames $18,673,342 / $9,710,138

Bob Estes $18,317,272 / $9,524,982

Tim Herron $18,111,067 / $9,417,755

Charles Howell III $17,872,120 / $9,293,502

Steve Stricker $17,777,975 / $9,244,547

David Duval $17,736,622 / $9,223,044

Jesper Parnevik $17,212,977 / $8,950,748

Billy Mayfair $16,745,084 / $8,707,443

Frank Lickliter II $16,524,062 / $8,592,512

Jeff Maggert $15,643,691 / $8,134,719

Kevin Sutherland $15,304,258 / $7,958,214

Luke Donald $14,999,283 / $7,799,627

Fred Couples $14,936,589 / $7,767,027

Joe Durant $14,573,984 / $7,578,472

Zach Johnson $14,355,856 / $7,465,045

Woody Austin $14,243,436 / $7,406,587

John Rollins $14,162,658 / $7,364,582

Rod Pampling $13,992,920 / $7,276,318

Tim Clark $13,640,723 / $7,093,176

Carl Pettersson $13,373,838 / $6,954,396

Jose Maria Olazabal $13,253,510 / $6,891,825

Chris Riley $12,430,834 / $6,464,034

Padraig Harrington $12,427,442 / $6,462,270

Bart Bryant $11,864,046 / $6,169,304

Lee Janzen $11,752,711 / $6,111,410

Scott McCarron $11,741,850 / $6,105,762

Billy Andrade $11,344,069 / $5,898,916

Duffy Waldorf $10,896,552 / $5,666,207

Peter Lonard $10,843,814 / $5,638,784

Ben Crane $10,843,207 / $5,638,468

Heath Slocum $10,816,834 / $5,624,754

Jonathan Byrd $10,805,101 / $5,618,652

Brian Gay $10,688,178 / $5,557,853

Aaron Baddeley $10,653,603 / $5,539,874

J.J. Henry $10,057,300 / $5,229,796

Skip Kendall $9,914,401 / $5,155,489

Tim Petrovic $9,662,516 / $5,024,508

Pat Perez $9,622,105 / $5,003,495

Lucas Glover $9,567,037 / $4,974,859

Ben Curtis $9,448,947 / $4,913,453

Glen Day $9,441,199 / $4,909,424

Joe Ogilvie $9,433,871 / $4,905,613

Trevor Immelman $9,242,956 / $4,806,337

Justin Rose $9,210,276 / $4,789,344

Sean O’Hair $8,994,327 / $4,677,050

Camilo Villegas $8,895,967 / $4,625,903

Bernhard Langer $8,807,491 / $4,579,895

John Senden $8,807,100 / $4,579,692

John Daly $8,688,582 / $4,518,062

Hunter Mahan $8,587,849 / $4,465,681

Matt Kuchar $8,530,993 / $4,436,116

Bo Van Pelt $8,446,441 / $4,392,149

Vaughn Taylor $8,307,598 / $4,319,951

Dean Wilson $8,276,009 / $4,303,524

D.J. Trahan $8,268,906 / $4,299,831

Cameron Beckman $8,249,619 / $4,289,802

Brandt Jobe $7,865,093 / $4,089,848

Ted Purdy $7,698,420 / $4,003,178

Robert Gamez $7,629,100 / $3,967,132

Mark O’Meara $7,550,361 / $3,926,188

Ryan Palmer $6,915,552 / $3,596,087

Ryuji Imada $6,819,511 / $3,546,146

Jason Bohn $6,421,360 / $3,339,107

Ian Poulter $6,143,533 / $3,194,637

Neal Lancaster $6,084,437 / $3,163,907

Paul Stankowski $6,051,261 / $3,146,656

Todd Hamilton $6,025,489 / $3,133,254

Mark Wilson $5,743,377 / $2,986,556

Boo Weekley $5,654,770 / $2,940,481

Charlie Wi $5,529,715 / $2,875,452

Kent Jones $5,352,222 / $2,783,156

Nick Watney $5,175,327 / $2,691,170

Kevin Na $5,116,818 / $2,660,745

Nick O’Hern $5,071,805 / $2,637,339

J.B. Holmes $5,044,949 / $2,623,373

Ken Duke $4,959,224 / $2,578,796

Greg Chalmers $4,882,436 / $2,538,866

Brandt Snedeker $4,844,096 / $2,518,930

Troy Matteson $4,807,491 / $2,499,895

Lee Westwood $4,722,506 / $2,455,703

Bubba Watson $4,696,308 / $2,442,080

Brian Davis $4,607,788 / $2,396,050

Nathan Green $4,503,907 / $2,342,031

Ryan Moore $4,358,163 / $2,266,245

Charley Hoffman $4,209,967 / $2,189,183

David Frost $4,153,357 / $2,159,746

Kevin Streelman $3,814,208 / $1,983,388

Greg Owen $3,614,521 / $1,879,551

Steve Marino $3,590,828 / $1,867,230

Angel Cabrera $3,511,173 / $1,825,810

Tag Ridings $3,457,862 / $1,798,088

Marco Dawson $3,446,424 / $1,792,140

Bill Haas $3,194,965 / $1,661,382

John Mallinger $3,189,475 / $1,658,527

George McNeill $3,163,681 / $1,645,114

Dicky Pride $3,151,821 / $1,638,947

Russ Cochran $3,056,177 / $1,589,212

Michael Letzig $3,029,926 / $1,575,561

Robert Garrigus $2,855,665 / $1,484,946

Jeff Overton $2,764,765 / $1,437,678

Johnson Wagner $2,686,510 / $1,396,985

Tom Watson $2,681,860 / $1,394,567

Jeff Gove $2,612,659 / $1,358,583

Arjun Atwal $2,536,872 / $1,319,173

James Driscoll $2,374,530 / $1,234,756

Nicholas Thompson $2,328,498 / $1,210,819

Sandy Lyle $2,276,029 / $1,183,535

Andres Romero $2,233,902 / $1,161,629

John Merrick $2,149,254 / $1,117,612

Henrik Stenson $2,131,978 / $1,108,628

Hank Kuehne $2,002,238 / $1,041,164

Kevin Stadler $1,997,796 / $1,038,854

Michael Bradley $1,971,492 / $1,025,176

Dustin Johnson $1,936,659 / $1,007,063

Wes Short, Jr. $1,856,870 / $965,572

Darron Stiles $1,619,522 / $842,151

Chez Reavie $1,562,513 / $812,507

Russell Knox $1,537,423 / $799,460

Paul Casey $1,484,065 / $771,714

Shawn Stefani $1,456,317 / $757,285

Marc Turnesa $1,438,968 / $748,263

Jason Dufner $1,345,346 / $699,580

Chris Stroud $1,278,222 / $664,675

Alexandre Rocha $1,274,125 / $662,545

Joey Snyder III $1,259,266 / $654,818

Brendan Steele $1,213,240 / $630,885

Andre Stolz $1,121,887 / $583,381

Tom Gillis $1,049,304 / $545,638

Doug LaBelle II $1,045,187 / $543,497

D.A. Points $946,902 / $492,389

Martin Laird $937,646 / $487,576

Justin Bolli $852,046 / $443,064

Matt Jones $839,520 / $436,550

Jason Day $830,316 / $431,764

Jimmy Walker $669,188 / $347,978

Peter Hanson $668,180 / $347,454

Will Claxton $650,806 / $338,419

Tommy Gainey $608,304 / $316,318

Brendon de Jonge $502,416 / $261,256

Y.E. Yang $499,586 / $259,785

Graeme McDowell $413,570 / $215,057

Andres Gonzales $360,427 / $187,422

Bryce Molder $289,416 / $150,496

Jin Park $245,723 / $127,776

David Hearn $235,525 / $122,473

Russell Henley $225,910 / $117,473

Steven Bowditch $164,890 / $85,743

Robert Karlsson $152,054 / $79,068

Brendon Todd $66,520 / $34,590

Troy Kelly $54,483 / $28,331

Steve LeBrun $19,348 / $10,061


Roger Pielke Jr. is a professor of environmental studies at the University of Colorado, where he also directs its Center for Science and Technology Policy Research. He studies, teaches and writes about science, innovation, politics and sports. He has written for The New York Times, The Guardian, FiveThirtyEight and The Wall Street Journal, among many other places. He is thrilled to join Sportingintelligence as a regular contributor. Follow Roger on Twitter: @RogerPielkeJR and on his blog


More on this site mentioning sport and money



Man Utd’s Chevrolet deal pushes Premier League shirt values to £191m

Monday, July 28th, 2014

By Alex Miller and Nick Harris

28 July 2014

Manchester United’s £47 million-per-year shirt sponsorship deal with Chevrolet has helped boost the combined shirt sponsorship income of the Premier League’s 20 clubs to a record £191.35m for the 2014-15 season.

United’s world-record deal was agreed and signed in 2012 on terms of $559m over seven years, starting with the coming season. The £47m a year is calculated at exchange rates at the time of the deal. Chevrolet replace United’s previous shirt deal with Aon, worth ‘only’ £20m a year.

The Old Trafford club separately have an ongoing £17m-a-year sponsorship deal with Aon that includes their training kit and sponsorship of the Carrington training ground. That is not included in the 20 clubs’ total in the first graphic below.

Among clubs in the top flight in both seasons, three others have more valuable shirt sponsorship deals in 2014-15 than last season: Swansea’s deal with GWFX has risen by £2m to £4m, Everton’s with Chang has gone up £1.3m to £5.3m, and Hull are making a bit more money from 12Bet than they did from Cash Converters.

The 20 Premier League clubs combined have added £23.6m to last year’s combined total of £165.75m, with only Tottenham and West Brom seeing a dip in their deals.

It is worth noting that the ‘big six’ clubs between them account for 79 per cent of the value of the 20 deals – £151m of the £191m. And United’s deal by itself is worth more than the 14 smallest deals combined. But even most of the smaller clubs now have multi-million-pound-per-year deals, evidence of the value of having global reach via the PL shop window.

Article continues below

PL shirt sponsors 2014-15


Spurs’ sponsorship arrangement last season saw AIA sponsor the club’s shirts in domestic games and Aurasma in the Europa League, for a total of £19m. That has been replaced with a simpler £16m-a-year deal with AIA.

Zoopla decided not to extend their sponsorship deal with West Brom following former striker Nicolas Anelka’s controversial quenelle goal celebration last season. The club has signed a new deal with Intuit QuickBooks worth around £300,000 a year less.

The overall figure was also reduced as relegated clubs Fulham, Norwich and Cardiff earned a combined total of £6.5m from their shirt sponsorships last season, while promoted sides QPR, Burnley and Leicester will pull in a lower combined total of £4.5m, a drop of £2m.

The global reach of the Premier League is reconfirmed with an increasing number of foreign-based companies continuing to adorn club shirts. This season 14 shirts feature overseas-based companies, compared to 10 last season. Companies from the United Arab Emirates, South Korea, the US, the Philippines, South Africa, Thailand and China are represented.

Despite a football-wide betting ban on players and officials, gambling companies continue to feature prominently in the Premier League. This season four shirts feature gambling companies: those at Aston Villa, Stoke (Bet365), Hull (12Bet) and Burnley (Fun88) – compared to three last season.

The total value of the Premier League’s shirt deals has almost doubled in five seasons. In 2010-11 (details here) they were worth £100.45m; this coming season represents a 90.5% rise since then.
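As a sanity check, the quoted rise can be recomputed from the two season totals given in the article; a minimal sketch:

```python
# Combined Premier League shirt-deal values quoted in the article, in £m.
value_2010_11 = 100.45
value_2014_15 = 191.35

# Percentage rise over the five seasons.
rise = (value_2014_15 / value_2010_11 - 1) * 100
print(f"{rise:.1f}%")  # prints "90.5%"
```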


PL shirt deals five years


The season-by-season deals for other years can be seen here: 2011-12 sponsorships; 2012-13 sponsorships; 2013-14 sponsorships.


Follow Alex on Twitter @AlexMiller73






Unlucky: 13 Premier League clubs hike ticket prices. City cheapest, Arsenal most costly.

Tuesday, July 22nd, 2014


By Alex Miller. Follow Alex on Twitter @AlexMiller73

22 July 2014

Despite Premier League clubs collecting massively enhanced payments last season, this season and next from the latest TV deals, worth £5.5 billion over three years from 2013-16, the benefits are not reaching fans, with 13 sides raising season ticket prices this season.

With clubs looking to maximise revenues to be able to compete in the Premier League within the constraints of FFP, fans are once again paying increased prices.

The biggest price hikes have been made by Burnley and QPR, both promoted for the coming season; Burnley have raised their cheapest season ticket price by a whopping 47%. Substantial rises have also been announced at Stoke and Hull.

The average ‘entry level’ (lowest price) adult season ticket in the Premier League is now £526, up 6.5 per cent, while the average top-price season ticket is now £870, up 7 per cent.

The table below contains pricing for standard, adult, non-concessionary season tickets, as provided by clubs. A whole range of other prices are or were available in various early bird and concessionary packages. For full details of each club’s pricing policies (which per club fill at least a page), do please feel free to visit their websites.


PL tix 14-15 prices

Burnley chief executive Lee Hoos defended the increases, pointing out that fans will be able to watch Premier League football this season and that £100 from this year’s season ticket price will go towards 2015-16 season tickets when fans renew.

Hoos said: “We have already announced that the early bird renewal prices will again be frozen this year for the 2015/2016 season when they go on sale.

“If a full price season ticket is purchased before the start of the season, not only will the purchaser qualify for the early bird renewal price, they will also be credited with £100 towards the purchase of a 2015-2016 season ticket.”

With the Premier League acknowledged as the most lucrative league in the world, only two clubs have lowered prices this season. The cost of watching for some fans in the North East has come down after Newcastle and Sunderland announced marginal price cuts.

Arsenal announced three per cent price rises in line with inflation, taking their most expensive season ticket above the £2,000 mark – the costliest in the league. Arsenal also have the most expensive ‘entry level’ season ticket, priced at £1,014, which, with the exception of Tottenham and Chelsea, is more costly than every other club’s most expensive season ticket.

An Arsenal spokesman explained that the priciest match-day tickets, at £127, were limited to 100 seats. He added: “For next season, the club will once again be operating a categorised ticket policy, with matches at Emirates Stadium being graded A, B or C.

“This initiative, introduced in 2013, has enabled the club to offer 90,000 cheaper match tickets from £25 as part of its drive to provide more tickets at affordable prices.”

Despite winning the league last term, Manchester City have frozen the cost of their cheapest season tickets at £299 – the cheapest in the Premier League – although the club have announced a 10% rise on the most expensive season tickets at the Etihad to £860.

A Football Supporters’ Federation (FSF) spokesman told Sportingintelligence: “Nine out of 10 fans already think they are paying too much for tickets and these figures only back that point of view.

“Clubs are swimming in cash and the last media deal was worth £5bn. The huge increase would have been enough for clubs to let every fan in for free and they would have been no worse off.

“Top-flight clubs need to think long-term and cut prices. Never mind all the clever PR strategies clubs come out with – nothing would earn goodwill like dropping prices.

“The FSF will lead a march on Premier League and Football League HQ demanding ‘Affordable Football For All’ on Thursday 14th August. We would encourage all fans to join us.”





Man United and Liverpool remain top TV draws despite 2013-14 without trophies

Monday, July 21st, 2014

By Nick Harris

21 July 2014

With almost four weeks still to go before the start of the 2014-15 Premier League season, the PR battle for TV viewers in the host country of the world’s most-watched domestic football league is well underway. Sky Sports will again have the lion’s share of the live games in the UK, having paid £2.28 billion for a majority of the UK live rights for the 2013-16 period. And BT Sport will again be trying to woo a portion of the pay-TV market, having paid £738m for their own share of matches.

The wider strategic aims of both companies and an exploration of how these deals are focussed on far more than showing football matches can be read here and here. (It’s all about the long-term triple-play market, now becoming the quad-play market.) Suffice to say, for those who still read old-fashioned newspapers in the UK, you will have noticed the upsurge in full-page adverts for the two broadcasters over recent days.

One thing that remains true regardless of who is showing the games is that certain teams have mass appeal and will guarantee relatively big audiences, whoever they are playing, while the majority of teams will attract far fewer viewers. Sportingintelligence has looked at the viewing figures for every single live Premier League match screened in the UK last season and the conclusions are clear and unambiguous:

1: Manchester United and Liverpool are the big draws. United’s average audience was 1.4m people per match and Liverpool’s was 1.31m. This is in-home TV viewing but clearly indicative of popularity. No other team averaged as many as 1.2m people per game.

2: Sky remains the main player in audience terms, although with a 20-year head start this is to be expected. Sky have somewhere north of 11 million pay-TV customers and more than 7 million of them subscribe to the sports channels. Their average PL audience last season was 1.2 million people per live game. BT Sport’s channels can now be accessed in around 5 million homes, either directly from BT on TV or via apps, or via other means, for example as a bolt-on to Sky or part of a Virgin package. BT’s average PL match audience last season was 562,000 people per match.

3: While these figures will no doubt seem extremely small to many observers, they demonstrate two things: first, that pay-TV is not primarily about ratings but about getting viewers to pay for premium exclusive content they cannot get elsewhere; second, that having paid for exclusive content, those viewers most want to see the ‘biggest’ and ‘best’ teams within their subscription packages. That is why, for example, Manchester United were among the most-shown teams (25 of their 38 PL games were live on TV), as were Liverpool (28 of 38), Chelsea (25), Arsenal (25) and Manchester City (25). Less ‘attractive’ teams were shown much less often. Fulham and Cardiff were on TV just eight times each, with Norwich, Hull and West Brom just nine times each.

The first graphic below (click to enlarge) shows the TV audiences, in thousands, for each of the 154 live PL matches shown on UK TV in the 2013-14 season. It is self-explanatory and shows that the big teams were shown most but also got the biggest audiences.

Note that the most-watched matches were all contested between two ‘big’ clubs: the 2.7m who watched Liverpool v Chelsea; the 2.462m who watched United v Arsenal; the 2.1m who watched Chelsea v United; the 2m who watched Liverpool v City and the 2m who watched City v United.


PL TV 13-14 grid


The second graphic – the far right-hand side of it – ranks the 20 clubs from last season in terms of average audience per match, with United at No1 and Liverpool at No2 and so on. Again, click to enlarge it.

It also provides details of every club’s TV audience for home and away games, and by channel. You can see that BT showed the big clubs the most, but only screened one game all season featuring each of Norwich, Sunderland, Stoke and West Brom.

Using United as an example of how to read the table, they were shown seven times on BT Sport (four times at home, average 740,000 viewers; three times away, average 686,000) and 18 times on Sky (seven times at home, average 1.833m viewers; 11 times away, average 1.567m). Their 25 games were seen by a cumulative 35,093,000 people, an average of 1.4m per game.
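The per-game averages can be cross-checked against the season totals. A minimal sketch using the United splits quoted above (audiences in thousands of viewers; a small rounding difference against the published 35,093,000 total is expected, since the per-channel averages are themselves rounded):

```python
# (number of games, average audience in thousands) for each of United's splits.
splits = [
    (4, 740),    # BT Sport, home
    (3, 686),    # BT Sport, away
    (7, 1833),   # Sky, home
    (11, 1567),  # Sky, away
]

games = sum(n for n, _ in splits)
total = sum(n * avg for n, avg in splits)   # thousands of viewers
print(games, total, round(total / games))   # prints "25 35086 1403"
```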

Note that Liverpool were shown away three times on BT and 14 times on Sky, so 17 of their 19 away games were live on TV. That means 17 games of disruption for the travelling Liverpool fan, where disruption is defined as a game not being played at the traditional 3pm kick-off. Yet even more controversial is the way the TV money is shared out, as we’ll deal with in a moment.


PL TV 13-14 full breakdown


Money is often cited as being the root of modern football’s evils. The Premier League, say many detractors, is obsessed with cash and to hell with the rest of the game. And certainly there is a whole load of money – primarily TV money – now sloshing around the top division of English football, whereas the rest make do on (relative) scraps.

But in fact the way in which the Premier League’s TV money is divided up is much more democratic than in most leagues. See this article for the precise way in which the Premier League divided the TV cash from 2013-14. The ratio between the top earners, Liverpool (£97.5m), and the bottom earners, Cardiff (£62m), was 1.57 to 1.

The comparable figure in the German Bundesliga is 2 to 1, meaning Germany is less equal. In France it is 3.2 to 1, less equal again; in Italy it is 4.2 to 1; and in Spain, where Real Madrid and Barcelona basically take most of the cash between them, it is a whopping 11.3 to 1.
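Those ratios are simple to recompute where the underlying figures are given; a sketch taking the Premier League value from the Liverpool and Cardiff payouts above, and the other leagues’ ratios as reported:

```python
# Top-to-bottom TV revenue ratios; the PL value is derived, the rest quoted.
pl_ratio = round(97.5 / 62, 2)   # Liverpool (£97.5m) vs Cardiff (£62m)

ratios = {
    "Premier League": pl_ratio,  # 1.57
    "Bundesliga": 2.0,
    "France": 3.2,
    "Italy": 4.2,
    "Spain": 11.3,
}
for league, r in sorted(ratios.items(), key=lambda kv: kv[1]):
    print(f"{league}: {r} to 1")
```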


What if the Premier League cash were divided not according to relatively democratic principles, based on equal shares, TV appearances and a sum based on performance, but solely on how many viewers each team attracted? What would the division be like then?

Sportingintelligence has calculated that Liverpool would have got £74m more last season than they actually did, while United would have got £70m extra. Chelsea, Arsenal, Manchester City and Tottenham would also have earned tens of millions more, and everyone else a lot less.

The methodology was as follows: we re-allocated the £1.563 billion pot of cash on the basis of just two metrics: performance and TV viewers. The performance element remained the same as the normal method of distribution, or around £1.2m per place in the table. And the rest was split on total TV viewers. So Liverpool, with 36.74 million TV viewers, got £148.9m at £4.05m per million viewers. And every other club got £4.05m per million viewers to give the totals in the table below.
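That allocation can be sketched in a few lines. This is not Sportingintelligence’s actual model, just a minimal reconstruction from the numbers quoted above: a performance element of roughly £1.2m per league place, plus £4.05m per million viewers. Because the quoted per-million rate is itself rounded, the Liverpool viewer element lands at £148.8m rather than the £148.9m in the text.

```python
RATE_PER_MILLION = 4.05   # £m per million TV viewers, as quoted
PER_PLACE = 1.2           # £m per league place, approximate (assumed reading)

def hypothetical_payout(viewers_m: float, places_above_bottom: int) -> float:
    """Viewer element plus performance element, in £m."""
    return viewers_m * RATE_PER_MILLION + places_above_bottom * PER_PLACE

# Liverpool: 36.74m total TV viewers, finished 2nd (19 places above bottom).
print(round(36.74 * RATE_PER_MILLION, 1))   # viewer element: 148.8
```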

There is an inevitable argument that United and Liverpool got more viewers because they were shown more times. Absolutely. But they were shown more because that’s what people want to watch. The numbers back it up.

The graphic below is indicative of how unequal football could become; the status quo, at least in revenue share, is actually ‘fairer’ than most other football leagues.

PL TV 13-14 cash theoretical on viewers





Germany: deserving, obvious World Cup winners (almost nobody predicted)

Wednesday, July 16th, 2014


So the World Cup is over, Germany are fitting champions, Lionel Messi couldn’t add the ultimate title to his glittering CV and the host nation is left to ponder what might have been. So who could have forecast this? Actually, a huge variety of ‘experts’, theorists, modelers and systems tried to predict the outcome of the tournament, from Goldman Sachs to boffin statistical organisations. In his latest post for Sportingintelligence, and as a wrap-up of an ongoing evaluation of rates of success (click HERE for Part 1 and background, HERE for Part 2 and HERE for Part 3), Roger Pielke Jr announces the winners and losers from the forecasting game.

Follow Roger on Twitter: @RogerPielkeJR and on his blog


By Roger Pielke Jr.

16 July 2014

Another World Cup is now history, and with it my World Cup prediction evaluation exercise. As a reminder, this exercise is based on rankings made before the tournament started with the details of the evaluation explained here.

So to the results. Overall, Andrew Yuan, whose predictions were popularised by The Economist, took first place, beating FIFA’s rankings by a single match. Of course, it is no surprise those two were so close as Yuan and the FIFA rankings had 60 of 63 identical match predictions.

After FIFA there is a three-way tie for the bronze medal, with Bloomberg, Elo rankings and Hassan and Jimenez sharing the third step of the podium. Of note is that the latter was produced four months ago, well before the national team rosters were even announced.

The full table is as follows, and article continues below:



One of the leaders after the group stage of the tournament, Danske Bank, performed the worst in the knockout portion of the World Cup, and slipped from the podium. By contrast, the worst performer during the group stage (Financial Times) was joint first during the knockout matches. With these methods, past performance is apparently not a good predictor of future performance.

None of the other methods outperformed the naive baseline based on TransferMarkt player values that I assembled prior to the tournament. Three methods actually under-performed that naive baseline. Were you to pick one of these methods (other than FIFA or Transfermarkt) at random prior to the tournament, you would have had a 10 per cent chance of beating FIFA and a 50 per cent chance of beating Transfermarkt.

The table above also shows how each method performed in the knockout portion of the tournament, in anticipating advancement from the group stage, and in anticipating the finalists. Interestingly, the overall winner was one of only two methods which failed to anticipate one of the finalists.

No method anticipated both Germany and Argentina in the final, and no method picked Germany to win it all. This website’s editor considered other models to predict the winner before the tournament, and made a personal forecast of an Argentina-Germany final, but he picked the wrong winner.

Here are some more general lessons to take from the prediction exercise:

1: Prediction evaluation is highly sensitive to the methodology employed. For instance, were the evaluation method to award a three-game “bonus” to any method that anticipated a finalist, Andrew Yuan would fall from first place to sixth. The weighting of results can consequently dramatically change the evaluation rankings.

In any prediction evaluation it is therefore important to settle upon an evaluation methodology in advance of the data actually coming in. It is also important to keep separate the roles of predictor and evaluator. It is obviously very easy to “game” an evaluation to look more favorable to a particular prediction method, simply by choosing a convenient evaluation metric. Be cautious with anyone who offers you both a prediction and an evaluation of their prediction, especially after the fact.

2: Beating a simple baseline is very difficult. We might debate how “naive” the FIFA rankings or Transfermarkt valuations actually are in practice. But both clearly outperformed more sophisticated approaches. The only method which actually outperformed FIFA was one which successfully picked two of the three matches on which it differed from the FIFA rankings across the entire tournament. Was that luck or skill? None of the other 10 methods added any value beyond the FIFA rankings. Should they have even bothered?

Even though outperforming a naive baseline over a tournament is difficult, that does not take away from the entertainment value of predictions. For instance, FiveThirtyEight performed poorly according to the evaluation methods here, but nonetheless offered stimulating commentary throughout the tournament, in part based on its predictions.

3: Ultimately, we can never know with certainty how good a predictive methodology actually is in practice. Some systems that we wish to predict have closed boundaries, such as a deck of 52 cards. We can develop probabilistic predictions of poker hands with great certainty. In the real world, we can sometimes (but not often) accumulate enough experience to generate predictions of open systems that also have great certainty, like the daily weather forecast.

But other systems are not subject to repeated predictions and/or are so open as to defeat efforts to bound them. The World Cup, and sporting events in general, typically fall into these categories. Arguably, so too does much of the human experience. Perhaps baseball, with its many repeated events over a short time period, might be considered more like a weather forecast than a World Cup.

Ultimately, making good decisions depends on understanding the difference between skill and luck, even if we can never fully separate the two. A prediction evaluation exercise can help us to quantify aspects of our ignorance and lead to questions about what it is that we really know.

Ultimately, the answers to these questions cannot be resolved empirically.

After this exercise, there is one thing we all know for sure: Germany are world champions, despite being overlooked by the predictions. I hope you enjoyed this exercise over the past month. I’ll be doing similar exercises in the future and welcome your suggestions. Get in touch via Twitter or via my blog, details below.


Roger Pielke Jr. is a professor of environmental studies at the University of Colorado, where he also directs its Center for Science and Technology Policy Research. He studies, teaches and writes about science, innovation, politics and sports. He has written for The New York Times, The Guardian, FiveThirtyEight, and The Wall Street Journal, among many other places. He is thrilled to join Sportingintelligence as a regular contributor. Follow Roger on Twitter: @RogerPielkeJR and on his blog






Bankers and bookies oust FIFA as best bets for World Cup forecasts

Tuesday, June 24th, 2014


This post was updated on 27 June, UK time; an earlier version of the same story is below

Yesterday’s games at the World Cup mean the group games (48 of 48) have been completed. There have been expected victories for some nations, big upsets for others – Adios Spain! Bye-bye England! – and more goals than most fans would have expected. So who could have forecast this? Actually, a huge variety of ‘experts’, forecasters, theorists, modelers and systems have tried to predict the outcome of this tournament, from Goldman Sachs to boffin statistical organisations. In his latest post for Sportingintelligence, and as part of an ongoing evaluation of rates of success (click HERE for Part 1 and background and click HERE for Part 2), Roger Pielke Jr sorts the best from the rest.

Follow Roger on Twitter: @RogerPielkeJR and on his blog


By Roger Pielke Jr.

27 June 2014

The group stage is over, and after 48 matches we can declare a winner in the first part of the World Cup prediction evaluation exercise. We made a ‘naive’ prediction ourselves based on the financial value of the squads; and we’re comparing this to 11 other predictions made by parties ranging from bankers and bookies to boffins and FIFA rankings.

Congratulations to Danske Bank and Andrew Yuan who take joint first, each picking 32 matches correctly and 11 of the 16 teams which advanced. The Elo ratings and Bloomberg also picked 11 of the 16 teams to advance but fell one short overall, picking 31 matches.


Pielke 48


The FIFA rankings fall to fifth despite having only one match picked differently from Yuan, illustrating the fine margins of predictive success. Hassan and Jimenez tied the FIFA rankings, despite producing their forecast last February. The pre-group-stage betting odds come next, followed closely by Infostrada.

Each of the seven methods discussed so far showed skill in that they outperformed the naïve baseline based on the estimated transfer market value of each of the teams. Still, the naïve baseline was just four games out of first, but anticipated only eight of the 16 teams moving on. That is the same as the number of teams advancing in Brazil who had also advanced in 2010 in South Africa, which could have been used as another naïve baseline.

Three predictions win the “why bother?” award by under-performing the naïve baseline – 538, which only picked seven of the advancing squads, Goldman Sachs and the FT. The latter was included in the evaluation despite not being proposed as a forecasting tool. The other two don’t have that excuse for their underperformance.

The main lesson that I’d suggest taking from the exercise thus far is that it is very difficult to generate predictions that can outperform a fairly simple baseline approach. It is even more difficult to outperform the existing ratings systems of FIFA and Elo. All 10 methods were just one match away from underperforming the FIFA rankings. Ultimately, most of these prediction methods are consequently of certain entertainment value, but uncertain value in their prognostications.

Of course separating luck from skill is not possible in such an exercise. The strong performance of ranking systems in 2014 was in part due to the low number of upsets (7 vs. 14 in 2010) and draws (9 vs. 14 in 2010).

Consider that in 2006 and 2010 the FIFA rankings would have correctly predicted 26 and 20 matches (of 48) respectively in the group stages (this data sent courtesy @roddycampbell).

So was Danske Bank lucky and Goldman Sachs unlucky? Or was the former actually a more skilled forecaster? These are all good questions for the pub as the data do not provide answers.

We are now in a position to set the stage for part 2 of the prediction evaluation. Before I describe how I have chosen to evaluate the matches for this phase of the contest, let me remind you that there are many different ways to structure such an evaluation. I don’t think that there is any single best way; however, it is important to be clear about procedure before evaluating. You don’t want to find yourself setting up the rules for evaluating a prediction after the fact, especially if you are the one offering the predictions.

* I use each method’s overall ranking of the teams presented before the tournament began. Several forecasters are providing updated predictions as the tournament unfolds, and the betting odds obviously change.

* If no such ranking was provided I use instead the ranked probability to advance from the group stage.

* As before, I convert probabilistic predictions into deterministic forecasts. There are obviously no draws in the knockout stage.

* I will generate a prediction for each method for each match. In other words, there will be a total of 15 matches predicted by each method over the knockout stage, regardless of how their picks fare in each round.

* At the end of the tournament I will provide a ranking for predictions in the knock-out stage as well as an overall ranking based on both the group stage and the knock-out stage predictions.
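Taken together, the rules above amount to a simple procedure: convert each method’s pre-tournament ranking into a deterministic pick for every knockout match, then count correct picks. A minimal sketch; the rankings and results below are illustrative, not the actual data:

```python
# Deterministic knockout evaluation: lower rank number = stronger team.
def predict(ranking: dict, team_a: str, team_b: str) -> str:
    """Pick the better-ranked team to win (no draws in the knockout stage)."""
    return team_a if ranking[team_a] < ranking[team_b] else team_b

def score(ranking: dict, results: list) -> int:
    """Count matches where the deterministic pick matched the actual winner."""
    return sum(predict(ranking, a, b) == winner for a, b, winner in results)

# Illustrative pre-tournament ranking and match results.
ranking = {"Brazil": 1, "Germany": 2, "Netherlands": 3, "Chile": 15}
results = [("Brazil", "Chile", "Brazil"), ("Germany", "Brazil", "Germany")]
print(score(ranking, results))  # 1 correct: Brazil over Chile, but not Germany over Brazil
```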

For the upcoming round of 16 matches, every method is in agreement on six of the matches, with the favorites as unanimous selections: Brazil, Colombia, France, Germany, Greece and the Netherlands. A majority favour Colombia and Belgium, but Uruguay and the USA get a few nods.


By Roger Pielke Jr.

24 June 2014

We are fast approaching the end of the group stages, and the battle for the top of the prediction league table is tight. Five approaches are within one game of the lead after 36 of the 48 matches have been played.

For detailed explanation of the predictors, follow the links above, but to summarise: we made a ‘naive’ prediction ourselves based on the financial value of the squads, and we’re comparing this to 11 other predictions made by parties ranging from bankers and bookies to boffins and FIFA rankings. Those FIFA rankings have held sway … until now.

Sitting alone at the top is Danske Bank, which has the most games picked correctly overall. Among the leaders at the halfway point were the FIFA rankings and Andrew Yuan, which I noted had 47 out of 48 match predictions in common.

Here is the table after 36 games.


Pielke Part 3 table

Yuan took the one match that they split (Mexico-Croatia), ensuring that the FIFA rankings cannot finish first. The Naive Baseline has had a good run, overtaking four of the methods, and now trails the two ranking systems, FIFA and Elo, by just one game.

I’ve added an additional method of ranking the predictions, according to the number of countries picked to advance from the group stage. All methods have already slipped from perfection, with only five approaches correctly picking 3 of the 4 teams so far to advance. The others, including Danske Bank at the top of the table, only have 2 of the 4. It just goes to show that prediction evaluation is highly sensitive to the metrics of assessment that are used.

Looking ahead, all methods have picked France, Argentina, Germany, Belgium and Russia to advance. But no method has picked Costa Rica, and only two have the USA. On Friday I’ll provide a summary of the group stage of the competition and set the table for the knockout stage.








Picking World Cup winners? After 12 games, FIFA rankings beating eminent thinkers

Monday, June 16th, 2014


The final whistle in the Germany-Portugal match in Group G in Salvador marked the end of the 12th game of the 2014 World Cup, and thus a quarter of the group stage is complete. There have been expected victories for some, big upsets for others and more goals per game at this stage than any World Cup since the 1950s. Who could have forecast this? Actually, a huge variety of ‘experts’, forecasters, theorists, modelers and systems have tried to predict the outcome of this tournament. In his debut post for Sportingintelligence, and as part of an ongoing evaluation of rates of success, Roger Pielke Jr sorts the best from the rest. 

Follow Roger on Twitter: @RogerPielkeJR and on his blog

By Roger Pielke Jr.

16 June 2014

Prognosticators have been hard at work generating pre-tournament predictions of who will advance and who will win. But which prediction is the best? Is it the one which picks the winner? Or is it the one which best anticipates the knock-out round seedings? How can we tell?

I will be evaluating 11 predictions over the course of the World Cup, starting with a league table after 12 games, in a moment. But suffice to say that after a dozen games, FIFA’s ranking system is proving as good as any other indicator of success, while some eminent thinkers are faring less well.

The 11 under consideration are:

To evaluate the different predictions, I am going to quantify the “skill” of each forecast. It is important to understand that forecast evaluation can be done, literally, in an infinite number of ways. Methodological choices must be made and different approaches may lead to different results. Below I’ll spell out the choices that I’ve made and provide links to all the data.

A first thing to understand is that “skill” is a technical term which refers to how much a forecast improves upon what is called a “naive baseline,” another technical term. (I went into more detail on this at FiveThirtyEight earlier this spring). A naive baseline is essentially a simple prediction. For example, in forecast evaluation meteorologists use climatology as a naive baseline and mutual fund managers use the S&P 500 Index. The choice of which naive baseline to use can be the subject of debate, not least because it can set a low or a high bar for showing skill.

The naive baseline I have chosen to use in this exercise is the transfer market value of the 23-man World Cup squads, from Transfermarkt. In an ideal world I would use the current club salaries of each player in the tournament, but these just aren’t publicly available. So I’m using the next best thing.

So for example, Lionel Messi, who plays his club team football at Barcelona and his international football for Argentina, is the world’s most valuable player. His rights have never been sold, as he has been with Barcelona since he was a child, yet he’s estimated to have a transfer market value of more than $200 million. By contrast all 23 men on the USA World Cup squad have a combined estimated value of $100 million. (I have all these data by player and team if you have any questions about them — they are pretty interesting on their own.)

Here then are the estimated transfer values of each World Cup team:

SqV for RP predictions analysis


In using these numbers, my naive assumption is that the higher-valued team will beat the lower-valued team. As a method of forecasting that leaves a lot to be desired, obviously, as fans of Moneyball will no doubt understand. There is some evidence to suggest that, across sports leagues, football offers the greatest chance for an underdog to win a match. So in principle, a forecaster using a more sophisticated method should be able to beat this naive baseline.

Here is what the naive baseline (based on the team rosters as of June 5) predicts for the tournament: the final four will see Brazil v Germany and Spain v Argentina. Spain wins the tournament, beating almost everyone’s favourite, Brazil. The USA does not get out of the group stage, but England does. All eight of the top-valued teams make it into the final eight.

While this naive baseline is just logic and assumptions, work done by “Soccernomics” authors Stefan Szymanski and Simon Kuper indicates that a football team’s payroll tends to predict where it finishes in the league table each year. Payrolls aren’t the same thing as transfer fees, of course, but they are related. Unfortunately, as mentioned above, individual player salaries are not available for most football leagues around the world (MLS is a notable exception).

The predictions are not all expressed in comparable terms. So to place them on an apples-to-apples basis I have made the following choices:

  • A team with a higher probability of advancing from the group is assumed to beat a team with lower probability.
  • If no group stage advancement probability is given I use the probability of winning the overall tournament in the same manner.
  • This means that I have converted probabilities into deterministic forecasts. (There are of course far more sophisticated approaches to probabilistic forecast evaluation.)
  • No draws are predicted, as no two teams in the group stages are assigned identical probabilities.
  • The units here, in the group stage at least, will simply be games predicted correctly. No weightings.

Other choices could of course be made. These are designed to balance simplicity and transparency with a level playing field for the evaluation. Just as is the case with respect to the value of having a diversity of predictions, having a diversity of approaches to forecast evaluation would be instructive. No claim is made here that this is the only or best approach (laying the groundwork here for identifying eventual winners and losers).

With all that as background, below then are the predictions in one table (click on it for a bigger view). The yellow cells indicate the teams that the naive baseline sees advancing to the knockout stages, and the green shows the same for each of the 11 predictions. The numbers show the team rankings according to each prediction.

(Click to enlarge, article continues below)

Pielke 10 - start-out predictions

I will be tracking the performance of the 11 predictions against the naive baseline as the tournament unfolds, scoring them in a league table.
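The league-table scoring is simply one point per group-stage match predicted correctly, tallied per forecaster. A minimal sketch, with made-up results and picks:

```python
# Actual match outcomes (hypothetical): match -> winner.
results = {("ESP", "NED"): "NED", ("BRA", "CRO"): "BRA", ("ENG", "ITA"): "ITA"}

# Each forecaster's deterministic picks for the same matches (also hypothetical).
predictions = {
    "naive baseline": {("ESP", "NED"): "ESP", ("BRA", "CRO"): "BRA", ("ENG", "ITA"): "ENG"},
    "forecaster A":   {("ESP", "NED"): "NED", ("BRA", "CRO"): "BRA", ("ENG", "ITA"): "ITA"},
}

# One point per correct pick; no weightings.
table = {
    name: sum(pick == results[match] for match, pick in picks.items())
    for name, picks in predictions.items()
}

for name, pts in sorted(table.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {pts}")
```

Running this prints forecaster A ahead of the baseline, 3 correct to 1, which is the shape the real league table below takes as matches accumulate.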

After 12 matches, the first league table is below. It is still early in the tournament, but a bit of spread is already developing among the predictions. Five of the 11 are running ahead of the naive baseline, and four are trailing. But it is only one game in either direction, so I’d hesitate to say much at this point. As the tournament progresses I expect we will see greater divergence. Stay tuned.

Accuracy after 12 games


Roger Pielke Jr. is a professor of environmental studies at the University of Colorado, where he also directs its Center for Science and Technology Policy Research. He studies, teaches and writes about science, innovation, politics and sports. He has written for The New York Times, The Guardian, FiveThirtyEight, and The Wall Street Journal among many other places. He is thrilled to join Sportingintelligence as a regular contributor. Follow Roger on Twitter: @RogerPielkeJR and on his blog.

