BSI Results: September 2015

Well, you all know how this is going to go, don’t you? Or do you? Let’s find out!

The last Boro’ Satisfaction Index was conducted in April 2015, just as Boro’ were on the verge of securing a playoff place and a genuine shot at promotion for the fourth season out of the last six (the only exceptions being the second and third seasons in League One). This poll was conducted with Boro’ ten games into the latest League Two season, and after a significant change of direction: the club parted ways with manager Graham Westley and his team and handed responsibilities to Teddy Sheringham. Along with that, a significant proportion of the previous playing staff departed, and new players were recruited (or promoted from youth squads).

In this survey fans were asked the same four questions as before – how satisfied they were with the players, the manager (split between on and off the pitch) and the club itself. This time a further question was also introduced, to see whether there were any material splits in sentiment according to how many games fans had attended, as a possible proxy for devotion.

I’m going to cut to the chase and look at the bigger picture to open up the article, and then delve into each of the questions to see what we can find out from some of the finer details. To begin with, here’s the final chart I posted back in April, showing how Net Satisfaction had tracked over the course of the season:

Boro’ Satisfaction Index, April 2015

To offer a quick reminder of what this chart shows, each point represents the proportion of supporters returning a positive response, minus the proportion of supporters returning a negative response. So if 40% were positive, 30% were neutral and 30% were negative, the Net Satisfaction would be 10% (40%-30%). The highest possible score would be 100% and the lowest possible would be -100%.
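If it helps to see that calculation written out, here’s a minimal sketch in Python – the function name and the figures fed into it are purely illustrative, not part of how the survey was actually processed:

```python
def net_satisfaction(positive, neutral, negative):
    """Net Satisfaction: the share of positive responses minus the share of negative ones."""
    total = positive + neutral + negative
    return 100 * (positive - negative) / total

# The worked example above: 40% positive, 30% neutral, 30% negative
print(net_satisfaction(40, 30, 30))  # -> 10.0
```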

As you can see, we closed out the last campaign with high levels of satisfaction across the board that would be very hard to sustain, unless Boro’ had won the playoffs and had a great run in League One this season. But, in any case, let’s extend the chart out to September and see what happens:

Boro’ Satisfaction Index, September 2015

Nothing short of a nose-dive. From positive Net Satisfaction across the board, all metrics now show negative Net Satisfaction; in fact, three of the four metrics recorded lower scores than at any stage of the previous season, and the fourth is almost level with its previous lowest.

With much of the recent sentiment across supporter media echoing these findings, I’m sure that none of this will be a huge surprise to many people, but in some respects it’s the scale of the shift that is so remarkable. This is, after all, the same club we’re talking about. It may help to focus on just two data points on that chart – December 2014 and September 2015. The former date was soon after Boro’ had exited the FA Cup at the hands of Maidstone in a first-round replay, and represented the previous low on all four measures.

Net Satisfaction April 2015 vs September 2015

Comparing back with December 2014, there is a shared sentiment with regard to the players and the manager – the difference in Net Satisfaction is only a handful of percentage points, which, given the sample sizes we’re dealing with, is unlikely to be significant. However, there is one area in which a material difference does remain, and that’s the Club’s scores. Back in December there was a strong swell of support for the club’s actions, with those who were dissatisfied far outweighed by those who were happy with how things were being run, in spite of feelings about the players and manager at the time. Fast-forward to today, however, and a very different story is told: for the first time since these polls have been conducted, there are more detractors than advocates.

Let’s delve into the responses to that question a little further – here’s the chart showing the proportion of responses at each level, where 1 represents those most dissatisfied and 5 those most satisfied:

Boro’ Satisfaction Index, September 2015

In the 2014/15 season the average response to this question was consistently around 3.6, rising to 4.2 by April. The equivalent figure is now just 2.6, with the average fan marginally dissatisfied and almost nobody responding that they were “extremely satisfied”.

There wasn’t a further question in the survey to capture the reasons behind the answers, so there is no immediate, firm conclusion to draw as to why this figure has dropped so much, but there are some probable candidates, not least the change in managerial make-up.

The next three charts turn our attention to the other three elements of satisfaction, and there’s a large degree of commonality between the distributions of the responses provided:

Boro’ Satisfaction Index, September 2015

Boro’ Satisfaction Index, September 2015

In both of the above questions, more of those surveyed answered a ‘1’ than answered a ‘3’ or higher, demonstrating some serious levels of dissatisfaction with both the players and with the manager’s actions in the areas he can influence on the pitch. There is, though, a minor difference when it comes to Teddy’s off-the-field exploits, demonstrated below:

Boro’ Satisfaction Index, September 2015

In this case, over a third of respondents appear to be willing to give him additional benefit of the doubt, perhaps related to his limited experience, as they returned either a neutral or positive score. However, the average score returned is still only 2.2, a mark lower than any recorded by Graham Westley last season.

One feature of the survey changed this time around with the introduction of an additional question: as well as looking for levels of satisfaction, I also asked how many games our respondents had attended. Just over half of them answered that they had been to three-quarters of the games or more, and three-quarters of those who completed the survey had attended at least half of the games. Only a tiny contribution came from those who hadn’t been to any of the games.

With a reasonably even 46:54 split between those attending half the games or less and those attending three-quarters or more, I set out the table below, which picks out the average score awarded for each of the four satisfaction metrics:

Average satisfaction score, broken down by attendance rate and element

It’s pretty obvious, then, that there’s not much to separate the most devoted fan from the one who has other matters keeping them away from games on occasion, with no more than 0.1 splitting their average scores for three of the four metrics. The fourth, however, is more interesting, and comes back to the question where we already saw different behaviour – how the manager is considered to be doing off the field. Digging into that question in finer detail shows how the fanbase splits, again by average score according to rate of attendance:

Average score for Satisfaction with manager off the pitch, broken down by attendance rate

There is a clear difference in the average scores depending upon how often a fan attends the games, with a lower score awarded the more games a fan can get to.
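For transparency, producing that breakdown is nothing more exotic than grouping the responses by attendance band and averaging the scores for each band. A rough sketch of how that could be done, with invented column names and values standing in for the real survey export:

```python
import pandas as pd

# Illustrative data only – the column names and scores are invented, not real responses.
responses = pd.DataFrame({
    "attendance":        ["3/4 or more", "half or less", "3/4 or more", "half or less"],
    "players":           [2, 2, 1, 2],
    "manager_on_pitch":  [1, 2, 2, 2],
    "manager_off_pitch": [2, 3, 1, 3],
    "club":              [3, 3, 2, 3],
})

# Average score per satisfaction metric, split by attendance band.
print(responses.groupby("attendance").mean().round(1))
```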

What could be behind this? Are those who attend more frequently also the most attached to Graham Westley’s time with the club? Have they seen more, leaving them better informed? Or do those with less of their time and money invested in the club’s fortunes hold less extreme views on the matter? Certainly, that doesn’t appear to be the case in other areas. As always, thoughts are welcome, and I’ll look to see whether that distinction persists as time goes by.

Thanks once again to those who took part and made this the most responded-to survey so far – it’s really appreciated, and the more contributions there are, the more we can trust the findings produced. If there are any other perspectives you’d like to see captured here, by all means let me know via the usual channels.

Polling Method
Each question asked for a response on a 1-5 scale, where 1 represented “extremely dissatisfied” and 5 “extremely satisfied”. The intervening points were deliberately not labelled, but on an odd-numbered Likert scale we can assume responses of 2 to be leaning towards dissatisfied, 3s to be neutral and 4s to be leaning towards satisfied.
Respondent anonymity was preserved throughout, and every effort was made to prevent misuse by requiring a Google Account in order to take part, with each account able to respond only once per poll.
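To make the link between the raw 1–5 responses and the figures quoted above explicit, here’s a small illustrative helper – the sample scores are invented, not drawn from the actual responses:

```python
def summarise(scores):
    """Average score and Net Satisfaction from a list of 1-5 responses.
    Responses of 1-2 count as dissatisfied, 3 as neutral, 4-5 as satisfied."""
    positive = sum(1 for s in scores if s >= 4)
    negative = sum(1 for s in scores if s <= 2)
    average = sum(scores) / len(scores)
    net = 100 * (positive - negative) / len(scores)
    return round(average, 1), round(net, 1)

# Invented example responses
print(summarise([1, 2, 2, 3, 4, 1, 2]))  # -> (2.1, -57.1)
```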

Poll summary details
Poll dates: 29 September – 1 October, 2015
Number of respondents: 72