How to objectively best measure development?

Kante

This is a longer post so apologies upfront (and congrats to those who make it through).

I just saw this excellent article in the Athletic, ranking the MLS Academies - https://theathletic.com/516093/2018/09/11/parchman-ranking-all-23-mls-academies/

Got me thinking. Lots of $, time and energy from families go into academy soccer, but the question “is it all worth it?” nags at most parents. Is my boy getting better? Is it going to mean anything after high school?

Ideally, there should be a simple, objective way to measure if a team or a player is improving, staying the same or getting worse.

Came up with something that I think makes sense as a first step: measuring team progress. Would love to hear comments, good or bad. (Please keep it constructive.)

At the end of this post, I’ll walk thru 05 LAFC’s 2017-18 season as a prototype/example of how these metrics apply to a real team. (btw, not picking on LAFC, it’s just that as the #1 team in the country and recent Concacaf champs, they are a best practice example to look at.)

If there’s interest in a look at other teams, send me a note directly or post to this thread, and I can follow up with additional posts.

Here are the two metrics I came up with:

1) Goal Scoring % (GSP)

This would be calculated as: Goals scored by a team in a particular game as a % of average goals allowed by that opponent. Each game result would then be charted over time, and an upwards trend line would be indicative of team goal scoring improvement/development.

For example, if the opponent allows, on average, two goals, and your team scores three goals, then your goals scored as % of average goals allowed would be 150%.

Then, later in the season, your team plays an opponent that has only allowed, on average, one goal per game, and your team scores two goals. Then your goals scored as % of average goals allowed for that game would be 200%.

And this progress – going from 150% earlier to 200% later – would represent an improvement/positive development over time in your team’s goal scoring ability.
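For anyone who wants to reproduce this in a script or spreadsheet, here’s a minimal sketch of the GSP calculation in Python (the function name is mine, not an official stat):

```python
def goal_scoring_pct(goals_scored, opp_avg_goals_allowed):
    """Goals scored in one game as a % of the opponent's average goals allowed per game."""
    return 100.0 * goals_scored / opp_avg_goals_allowed

# Opponent allows 2 goals/game on average, we score 3
print(goal_scoring_pct(3, 2))  # 150.0
# Later: opponent allows 1 goal/game on average, we score 2
print(goal_scoring_pct(2, 1))  # 200.0
```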

Charting this over 25-30 games would then provide enough data to indicate team improvement or decline over the course of the season, which would imply development (or lack of it) relative to their peer group.

However, if the team roster changes in a significant way at some point in the season, you also have to look at the before/after impact of the roster change. For example, if a team recruits a star goal scorer mid-season (it happens) and then its goal scoring % improves, that improvement reasonably should not be attributed to development.

So pretty straightforward, I think.

2) Goals Allowed % (GAP)

This is very similar to GSP and would be calculated as: Goals allowed by a team in a particular game as a % of average goals scored by that opponent. Each instance would be charted over time, and a downwards trend line would be indicative of defensive improvement.

For example, if the opponent scores, on average, four goals each game, and your team only allows them to score three goals, then your goals allowed as % of average goals scored would be 75% – that is, 25% below what that opponent typically scores.

If your team, later in the season, played an opponent that only scores, on average, two goals per game, and your team allows one goal, then your goals allowed as % of average goals scored would be 50% – half of what that opponent typically scores.

And this would, because it happened later in the season, represent an improvement/positive development over time in your team’s defending ability.
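As with GSP, this is easy to compute. A minimal Python sketch (the function name is mine; I express the result as a straight ratio of goals allowed to the opponent’s average, so values below 100% mean better-than-average defending):

```python
def goals_allowed_pct(goals_allowed, opp_avg_goals_scored):
    """Goals allowed in one game as a % of the opponent's average goals scored per game."""
    return 100.0 * goals_allowed / opp_avg_goals_scored

# Opponent averages 4 goals/game, we hold them to 3 -> 75% of their usual output
print(goals_allowed_pct(3, 4))  # 75.0
# Opponent averages 2 goals/game, we allow 1 -> 50%
print(goals_allowed_pct(1, 2))  # 50.0
```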

Again, this assumes the team roster remains constant over the course of the season.

So, again, pretty straightforward, I think.

So, how did LAFC do in 2017-18?

Here is LAFC Goals Scored as % of Opponent’s Average Goals Allowed charted over their 23-game group play season. (LAFC was scheduled to play a final 24th game against FC Golden State, but the game was cancelled.)

[chart: LAFC Goal Scoring % by game, 2017-18 season]

LAFC Goal Scoring % started strong with six goals scored against LAUFA in their 1st game but declined over the next several games and bottomed out in their 10th game where LAFC only scored three goals against the Pateadores. (see the first dashed red trend line)

After that point, LAFC turned their goal scoring around but also added a very good, new forward - starting in Game 11. (see the dashed green trend line) This forward ended up leading the SoCal group in goals scored per 70 minutes by the end of the 2017-18 season.

LAFC Goal Scoring % peaked in Game 18 where they scored 11 goals against LA Galaxy San Diego. Then, again, declined thru the end of the season. (see the second dashed red trend line)

One note on this is that LAFC closed out games 19 thru 23 with five clean sheets in a row, and the decline in Goal Scoring % may have been related to a shift in training emphasis to defending.

Overall, from the beginning of the season to the end, even with the ups and downs, LAFC had a high, constant level of relative Goal Scoring %, scoring more than double the average number of goals allowed by any team they played, but did not show improvement over the course of the season.

Here is LAFC Goals Allowed as % of Opponent’s Average Goals Scored:

[chart: LAFC Goals Allowed % by game, 2017-18 season]


For most of the season, LAFC had an impressive defense: they allowed an average of just .61 goals per game, managed 14 shutouts across 23 group play games, and, through the first half of the season, held opponents to an average of about 20% of the goals those opponents would typically have scored.

After game 10, there seemed to be the start of a conscious effort to maintain clean sheets, but that focus hit a bump in the road during a tough late-season stretch when LAFC had consecutive games against LAUFA, LA Galaxy and San Diego Surf. After that, LAFC closed out the season with five clean sheets in a row.

Overall, from the beginning of the season to the end, despite three late season hiccups, LAFC improved significantly on the defensive side. (see the green dashed trend line).

I used this method to look at my son’s team last year, and it helped bring more sense to some confusing outcomes and trends. For example, my son’s team won a game late in the season but allowed the other team to score well above their season average.

Using this method, it was clear that the result was just part of a decline in defensive effectiveness over the course of the season.

The club wasn’t too interested in the insights but understanding what was happening, and starting to look at why it was happening, was helpful for our son.
 

Not to put you to more work, but is there a comparative chart or table you could put SoCal DA teams into that gives us an idea what your concept looks like on paper?
 
Wow, really applaud your efforts – you must love the game & stats to put in this much effort & data.

One of the things we have always looked at is how well my player & his team are doing toward the later part of the season compared to when they started. There are other stats – passes connected, combinations on scores, tackles, set-play efficiency, shots, etc. – that can be used besides the game results, but individual development can be difficult to nail down.

Each club is supposed to complete evals twice a season that should highlight areas as at playing standard, above standard, national team level, etc., so the first thing I tell my player is to compare – but you should already know – and we talk those over.

Over a number of years, we've normally seen the more well-financed & stacked teams start off hot as others are playing catch-up and ramping up training, tactics, rosters, etc.
 
On the evals, are those supposed to be provided to the player? Our club did evals but it was very high level, and the eval was verbal only
 
Just like European academies, MLS academies should measure the success of development by their ability to produce 2-4 players for their senior pro squad. All other parameters are not important.
 

That would be almost a zero-sum game, and with only 1 MLS academy after u15 in SoCal, not much help to most. Some former academy players make the senior team only to sit on the bench or go down to USL 2 to get playing time.
 

Written evals are given to the players, along with an individual talk with the coach(es) and sometimes the directors. Several sections, as I recall: offense, defense, tactics, mental, teamwork, etc.
 
Got it, thx. And is there an evaluation scale from USSDA, or just the coach's criteria?
 

A USSDA-supplied document & scale. He was surprised that first season when they were given out just before the winter break. His previous club gave out yearly evals, but those weren't nearly as detailed and were kind of sugar-coated by comparison.

What, I'm not national team level on defense or with both feet yet? Nope, I guess not – keep working, son. That actually motivated him a bunch: in the 2nd half of the season he racked up good stats with his off foot and made a point to remind the coaches. He has a folder of his evals, and I still tell him to look them over occasionally and make sure he's always working to improve.
 
got it. thx. that's awesome.
 

Below is the ranking of u13/05 SoCal group teams’ offensive improvement/decline over the course of the 2017-18 season.

Couple of caveats. Many of the improvements over time were due to adding players to the roster (shocking, I know), and where that clearly was the case, I called it out.

For example, when, after game 8, SD Surf added two new forwards, who each averaged about one goal per game, it was probably not a coincidence that they had just lost to LAFC 0-3 in game 7.

Where teams appear to have improved organically, i.e. their players developed into better offensive players over the course of the season, I tried to call that out as well.

As a reminder, the metric I’m using for evaluating goal scoring effectiveness is calculated as the # of goals a team scores in a particular game divided by the average number of goals per game the opposing team allowed over the course of the season.

It is a relative metric, so if a team is able to score one goal against LAFC that counts more than if that team is able to score two goals against a lesser team.

The point being that, rather than measure who is the best offensive team, the goal with this metric is to measure who is the most improved offensive team.
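To turn the per-game values into a single improvement number, one option (my assumption, not necessarily how the charts here were built) is the slope of a least-squares trend line across the season: positive means improving, negative means declining. A small self-contained Python sketch with made-up GSP values:

```python
def trend_slope(values):
    """Least-squares slope of a per-game metric across a season.

    For GSP, a positive slope means offensive improvement; for GAP
    (where lower is better), a negative slope means defensive improvement.
    """
    n = len(values)
    xs = range(1, n + 1)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

# Hypothetical per-game GSP values, for illustration only
gsp = [150, 140, 160, 170, 165, 180, 190, 200]
print(round(trend_slope(gsp), 2))  # 7.8 -> improving roughly 7.8 points per game
```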

With 24 games to look at last season, there were some pretty clear trends. I also included a summary chart and table.

If anyone is interested in more info about specific teams, let me know. If it would be helpful, I can take a couple of minutes to send over or post specific team data charted over the course of the season (i.e. similar to what I did for LAFC). Probably more informative than the summary below but doing 13 charts with commentary is above my current pay grade.
_______________

Here’s the data:

#1 Most Offensively Improved Team: Strikers – Organic development w/ +65% improvement
Definitely the most improved team from beginning of season to end, with almost entirely same roster the whole season. They added one player in the last half of the season, but he averaged about 40 minutes per game and only had one goal.

#2: San Diego Surf – Roster adds w/ +43% improvement
Strong offensively to start with, SD Surf improved their offense by adding two players - each of whom was good for about one goal scored per game - after game 8. For context, SD Surf lost to LAFC 0-3 in game 7.

#3: LA Galaxy – Organic development w/ +26% improvement
After a slow start, the LA Galaxy improved almost every game from game 6 to game 17 - but had tragedy strike their team in April and took a downturn after that. LAG was on track prior to April to be the most improved team offensively in 2017-18.

#4: Nomads – Roster adds w/ +20% improvement
After barely scoring in the first part of the season, the Nomads got better after game 11 but also added several players who helped offensively.

#5: Arsenal – Organic development and roster adds w/ +16% improvement
After the toughest beginning of season schedule of any team - SD Surf, LA Galaxy and LAFC - Arsenal improved gradually over the season. They also added a forward from FC Golden State about mid-season.

#6: LAUFA – Organic development w/ +8% improvement
LAUFA progress was up and down. They started strong against LAFC in the first game of the season, scoring the most goals LAFC would allow in 2017-18. LAUFA then dipped but had fairly steady improvement from game 4 thru game 14, after which they took another downturn. LAUFA also picked up several players thru the season but those players don’t appear to have driven offensive improvement.

#7: LA Galaxy San Diego w/ -5% decline despite roster adds
LAGSD’s goal scoring % was essentially flat for the year, even after adding a new forward late in the season.

#8: Real SoCal w/ -15% decline but no roster adds
Real SoCal started strong but faded after several 2005 players started playing up with the 04/u14 2017-18 team.

#9: FC Golden State w/ -21% decline with one lost player to Arsenal
Although their numbers look almost identical to Albion’s below, FC Golden State is a different story: a slow but steady decline in goal scoring % over the course of the season. To be fair, they did lose one player to Arsenal, but they had played him mostly at defense (Arsenal moved him up to forward).

#10: Albion w/ -21% decline but no roster adds
Albion's goal scoring % declined from the very beginning of the season, bottoming out around game 10. After that point, they recovered and had a nice improvement, but not enough to dig themselves out of the start-of-season hole.

#11: LAFC w/ -23% decline despite adding one key forward
LAFC offensive effectiveness declined (albeit from a very high start) from the start of the season until game 11 when they added a key forward. They had another late drop towards the end of the season but that may have been due to a shift in focus towards defending.

#12: Pateadores w/ -36% decline

The Pats were up and down at the beginning of the season, bottomed out in game 8, had a significant improvement from game 9 to game 15 (a streak which included a 1-3 loss to LAFC), but then dropped at the end of the season for a net decline. Of note, the Pats did have a 3-5 loss to LAUFA in their last game, which represented an uptick against the overall downward trend.

#13: Santa Barbara SC w/ -55% decline
Santa Barbara SC started strong, dropped a bit, had significant improvement from game 4 thru game 19 but ended with a collapse in the last five games of the season.

Here's the chart:
[chart: summary of offensive improvement/decline by team, 2017-18]

Here's the table:
[table: offensive improvement/decline rankings by team, 2017-18]
 
Wow interesting

The way I thought about team improvement (or not) was to compare stats such as the amount of points in the first half of the season versus the 2nd half.

For example, Team #1:
1st half: 38 pts / 14 games = 2.71 ppg
2nd half: 36 pts / 16 games = 2.25 ppg
Total: 74 pts / 30 games = 2.47 ppg; diff = -0.46 ppg

Team #2:
1st half: 16 pts / 14 games = 1.14 ppg
2nd half: 38 pts / 16 games = 2.38 ppg
Total: 54 pts / 30 games = 1.80 ppg; diff = +1.23 ppg

Both teams qualified for the playoffs, but the higher seeds didn't fare well – only 2 of the top 8 seeds made the quarters in u18, for example – so the regular season doesn't always carry over to the post season.
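That half-season split is easy to script, too. A quick Python sketch of the comparison (the function name is mine), using Team #1's numbers from above:

```python
def ppg_split(first_half_pts, first_half_games, second_half_pts, second_half_games):
    """Points-per-game in each half of the season, plus the 2nd-half-minus-1st-half change."""
    first = first_half_pts / first_half_games
    second = second_half_pts / second_half_games
    return round(first, 2), round(second, 2), round(second - first, 2)

# Team #1: 38 pts in 14 games, then 36 pts in 16 games
print(ppg_split(38, 14, 36, 16))  # (2.71, 2.25, -0.46)
```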
 
I applaud you for the effort in trying to bring statistics to this. One of the points that you mention is that there could be many reasons why a team performs a certain way throughout a year. The easiest thing to point at is injuries. How would you quantify the impact of a team's number 10 versus their 4th sub? I think it would be next to impossible. Some teams could easily handle an injury or two, while other teams could be significantly impacted.

Second is the sample size. Given the number of variables across all of the teams, you need a significantly larger sample size before drawing any conclusions about the effectiveness of a team or its training.
 