How do leagues stack up?

It's pretty clear. If your girl is not playing in ECNL, or your boy is not playing in MLS Next, you should be looking really closely at the local NPL teams. Cut out the extra costs and travel to play the same level of competition as all the other letter leagues. Spend the extra time you have on the weekend getting in individual training instead of riding in a car or flying. Keep trying for the ECNL or MLS team while you play NPL; don't be a sucker thinking there is value in playing teams in other counties/states that are the same as your local teams.
 
I think we're saying the same thing.

With interplay between leagues, everything works with the current model.

Without interplay between leagues, some leagues won't be able to rank as highly as others.

Another way to look at it is that a team not in the highest league could win for 4-5 years straight + not rank very highly. Especially if they never play outside of their league.

Personally I believe leagues will allow interplay less and less over time either directly or indirectly + clubs will be ok with it. It's just easier to state that you're the best vs proving it all the time in games.

We are not saying the same thing at all, and I still think you are completely misunderstanding. There is plenty of overlap in all of the leagues, including ECNL and GA, to show that one is several goals stronger. It isn't because of banding, it isn't because of mathematical idiosyncrasies; it's because the teams in one league are better than the teams in the other league, by a very significant margin. Any lesser league can reach for any excuse for why their individual team's rating is lower, or why the league as a whole is lower, but there's really nothing to it but excuses that don't hold any water.

It would be a better world, perhaps, if everyone played each other directly - but this isn't the world we live in. Playoffs are the method by which those within a sanctioned league can show they are better than others, and tournaments that attract the top teams from multiple leagues are a glimpse into how it could be. From a ratings standpoint, it's this type of interplay that confirms and readjusts the ratings from team to team, age to age, league to league, etc.
 
It's the top 10 teams of each league. Previous lists have used the average of the entire league. In some cases the results are the same, and in others they are significantly different. Leagues with a huge variation from their top teams to their bottom teams are helped more in this view than leagues with a much smaller spread. What it shows is that the top teams in the top leagues are pretty close to each other - more so than the averages of each league. So if the top team in one league were to compete against a top team in another, it would be a good game.

What it does do, is illuminate which leagues consider themselves top leagues, but their best teams aren't great.

For all lists like this, it would be even better if you could drill down to see the 10 teams chosen, and confirm that they are in the same league that you believe them to be in. There is certainly some weirdness, as applied to NorCal.
Limiting it to the top 10 would favor a larger league and punish the smallest. For instance, GA seems to have only 9 teams, so you've got the best and the lowest ranked team in GA, versus only considering the top half of ECRL and ECNL. Not sure how many teams the other leagues have, but if they have 15 or more, it would skew them upwards as opposed to having 10 or less. Taking a cursory look at the soccer rankings app's ranking of Calif teams, it seems that the best teams are all ECNL. The next group that shows up most is GA and then ECRL, but they are pretty close. DPL even seems to have far more teams higher up in the rankings than NPL, so that seems hard to square with what this new listing provides.
 
What is the difference between NPL and National League PRO?
NL PRO takes top teams from other states that compete under the USYS umbrella and has their own national championship series from it. It consists of top qualifying teams from California from NPL, Elite 64, CSL Premier, and California Regional League to compete in USYS National Championships. NL PRO is the top flight of this championship series. The non E64 teams go through a regional playoff (ie Far West Regionals) on their pathway to the national championships.
 
You can't really base it on team names alone. Many NPL teams do not include "NPL" in their team names. In fact, one ("So Cal Blues Call DPL Fly" in SR) is even listed as a DPL team by its name in SR. But if you look at the sources for that team, you will see a lot of "NPL" results.

Look here (GotSport) for a list of actual SoCal NPL teams per the SoCal NPL Fall League standings.
 
ETA: Posting G09 NPL Standings to keep the data set consistent. But it still holds true; many NPL teams do not include "NPL" in their team names. GotSport
 
I think Mark is probably pulling his data from the source where the games are being recorded. So if it's GotSport, then he's able to parse out which league the data is from and make his rankings accordingly. What I was trying to get at is that, with NL PRO for example, many of these teams have crossover within these leagues if you're not ECNL, GA, DPL, or ECRL.

Elite 64, NPL, SoCal, and CRL/Coast all participate in USYS Nationals, which also crosses into NL PRO. So do these results have duplicate team entries skewing the data set?
 
Statistics is the science of producing unreliable facts from reliable data.

In this case a specific interpretation was presented. There are probably 10+ different ways to present the data that would have implied varying results. That's the benefit of being the one that compiles and presents what's available.
 

I think you missed my point of having duplicated teams in the leagues mentioned. Specifically NL PRO. No need to mansplain how stats work brother…
 
I saw it as well + I agree with you. It's just not worth getting upset about. Anyone involved in youth soccer looked at that graphic + instantly discounted it because it doesn't align with generally held perceptions.

Like I said before, the person that collects the data gets to present it however they want.
 
You're right, there are any number of ways to present the differences between leagues. One common way, used before, is taking the average rating of all teams assigned to each league. This helped leagues that had a limited top-to-bottom spread, and hurt leagues that had a large spread. For example, an average NPL team is somewhere near the midpack of NPL 2 (assuming ECNL-RL, NPL1, NPL2, NPL3), while the average MLS N team is somewhere in the middle of MLS N. The top RL team vs. the average NPL 2 team would be expected to be a blowout, while the top MLS N team vs. the midpack MLS N team would be expected to win, but not by nearly the difference implied.

Another way to compare is to ask: how do the top teams in League A stack up against the top teams in League B? One way to answer that is choosing the top 10 teams per league. Of course, if a league only has 9 teams total, that means taking all of them, and their average would look pretty weak, since the "top 9" includes the bottom teams as well as the top ones. Some would say a league that has fewer than 10 teams state-wide is already weak - but that's a different discussion. A potential fix might be to take the top 10% of teams rather than the top 10, but for a league that has 300 teams, that means comparing 30 teams against another league where the top 10% is less than 1 team. Top 10 seems like the better choice of the two, given only those 2 options - but in either option, the disfavored league would complain about the results.
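For what it's worth, the two selection rules being debated (top 10 vs. top 10%) are easy to sketch. The ratings below are invented purely to show the mechanics, including how a 9-team league is forced to count its weakest teams:

```python
def top_n_avg(ratings, n=10):
    """Average of a league's n highest-rated teams (all of them if fewer than n)."""
    top = sorted(ratings, reverse=True)[:n]
    return sum(top) / len(top)

def top_pct_avg(ratings, pct=0.10):
    """Average of a league's top pct of teams (always at least one team)."""
    k = max(1, int(len(ratings) * pct))
    top = sorted(ratings, reverse=True)[:k]
    return sum(top) / k

# A 9-team league: "top 10" is forced to include its weakest teams.
league_a = [78, 75, 74, 73, 72, 71, 70, 69, 68]
# A 30-team league: "top 10" skims only its best third.
league_b = [80, 77, 76, 74, 73, 72, 71, 70, 69, 68] + [60] * 20

print(top_n_avg(league_a), top_n_avg(league_b))
print(top_pct_avg(league_a), top_pct_avg(league_b))
```

Under the top-10% rule the 300-team problem appears immediately: `int(300 * 0.10)` is 30 teams, while a 9-team league contributes exactly one.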

One thing this type of view does show pretty clearly is that the top teams in several leagues are much closer together than the earlier comparison of average teams per league suggested. It is answering a different question - "How would one top team do against another?" - rather than "How would an average team of League A stack up against an average team of League B?". Both questions have value; maybe the first is a better way to compare leagues, maybe some disagree.

Drilling into the league descriptions to see which teams went where would end at least some of the questions about what each league represents, and confirm that the team/league association jibes with local understanding. It's quite possible that the underlying data has issues, especially where the particular league is harder to identify - and while it would be good to know how it was done, it would also be good to just know the top 10 teams, so there would be inherent validation of the data, and therefore of much of the results.

All of this becomes irrelevant when comparing one specific team against another specific team, where their ratings can be compared directly and a win probability between the two can be computed and displayed. That will always be more accurate in predicting which team might best the other on the day than comparing the leagues they play in to make the same type of inference.
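On the ratings side, a standard way (no claim that this is SR's actual formula) to turn a direct rating comparison into a displayed head-to-head probability is an Elo-style logistic curve:

```python
def win_probability(rating_a, rating_b, scale=400):
    """Elo-style expected score for team A against team B.
    The 400-point scale is the classic chess convention, used
    here purely to illustrate the shape of the curve."""
    return 1 / (1 + 10 ** ((rating_b - rating_a) / scale))

# Equal ratings give a coin flip; a 200-point edge gives roughly 76%.
print(win_probability(1500, 1500))               # 0.5
print(round(win_probability(1700, 1500), 2))     # 0.76
```

The point of the curve is that the probability comes from the specific pair of ratings, not from which league label each team happens to carry.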
 
I manually pulled the top 9 G09 GA teams in SoCal (that is, all GA teams in SoCal) and the top 9 G09 NPL teams in SoCal. The average GA rank was slightly lower (better) than the average NPL rank. Which would seemingly contradict the graphic, even though they were quite close.

But then I realized that I accidentally pulled the G10 data.

I might pull the #s for G09 tomorrow and try again.
 
Thanks. Wow, a whole extra layer of confusion I didn't know about.
 

There is no separation between NorCal and SoCal. If you are trying to recreate the graphic, you need to pull all 2009G GA teams in California, which shows at least 14 with recent games looking at names alone, and take the top 10. You also need to take the top 10 NPL teams state-wide. Using the same methodology on a subset doesn't show much, and is misleading relative to what you are trying to say (choosing the top 10 of a subset of 9 will be incorrect on its face).
 
I don't see the "Call DPL fly" team you are talking about, but when you take out the well-known teams, there isn't enough misidentification to have skewed things that far. So unless there's some significant league crossover situation that I'm unaware of, as mentioned above, the data needs to be drilled into, because it doesn't match what is shown on SR.
 
He's looking at the 2010 teams instead of the 2009s. Here's that team, with DPL in their name, though the game history is entirely SoCal league at the NPL level.

[screenshot: So Cal Blues Fly team page and game history]

Identifying what league teams are in isn't an exact science; for one thing, some play in multiple leagues over time. Sure, MLS teams are probably pretty easy to identify as MLS, and perhaps ECNL as well, but beyond that there has to be some set of algorithms/rules to determine which league best fits. Using the name alone is quick and easy (though often incorrect), while digging into each team (10 teams by however many leagues is the end result, a small fraction of the number of teams that would need to be validated) probably isn't scalable.
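A minimal sketch of the two approaches just described (the team names, marker list, and game data here are all made up): trust an unambiguous league string in the name first, and otherwise count which league the team's games were actually recorded under:

```python
from collections import Counter

def infer_league(team_name, game_leagues, name_markers=("MLS Next", "ECNL")):
    """Guess a team's league: trust an unambiguous marker in the name first,
    otherwise fall back to the most common league in its game history.
    All names, markers, and data here are illustrative."""
    lowered = team_name.lower()
    for marker in name_markers:
        if marker.lower() in lowered:
            return marker
    if game_leagues:
        return Counter(game_leagues).most_common(1)[0][0]
    return "unknown"

# A team with "DPL" in its name whose game history is almost all NPL:
print(infer_league("So Cal Blues DPL Fly", ["NPL"] * 8 + ["DPL"]))  # NPL
```

The game-history fallback is exactly what catches the "DPL in the name, NPL in the results" case; name matching alone would file that team under the wrong league.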
 
In that team's specific case ("So Cal Blues Call DPL Fly 2010G"), that team is only named that because they registered for December 2023's Coronado Holiday Cup as "So Call Blues SC Blues DPL Fly 2010". Prior to that data source, this team would have been named So Cal Blues NPL Fly, and probably will be again once the new games come in from the spring season. SR is typically going to take the most recent data source as the proper name, and if anyone has concerns about their naming (assuming they are assigned to the right club), they should take it up with the organizer of the event that is posting the schedules/results on GotSport and everywhere else. If in this case it truly is a separate team, someone should remove those results from that team and the name will correct itself - but as far as I can tell there is no DPL team, and this is probably the same team entity.
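The "most recent data source wins" naming rule described above is simple to express; the dates below are hypothetical, and the names come from the example in this thread:

```python
from datetime import date

# Each event that posts a team's schedule also supplies a name for it.
# "Most recent data source wins", per the rule described above
# (dates are hypothetical):
sources = [
    (date(2023, 9, 10), "So Cal Blues NPL Fly"),
    (date(2023, 12, 16), "So Call Blues SC Blues DPL Fly 2010"),
]

# The name attached to the latest-dated source becomes the display name.
current_name = max(sources, key=lambda s: s[0])[1]
print(current_name)
```

Once a newer source arrives with the old name, `max` picks it up again, which is why the name would "correct itself" in the spring.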
 

I see that I failed to note that there are GA teams in NorCal. So when you look at the top 10 GA teams in all of CA, the 10th best team is ranked 57th in SR.
The top 10 ECRL teams are all within the top 46 on SR.
There are only 6 other teams in that range that are not ECNL/GA/ECRL/DPL, mostly towards the bottom of that range, so how do NL PRO and NPL both end up higher than GA and ECRL? Like was mentioned above, it could only happen if there are ECNL, ECRL, GA, or DPL teams that also play in NPL or NL PRO and the analysis is counting them there, whereas the SR ranking is not.
 
Right, in that knowing which teams went into each league would either validate the league rankings, or confirm that there is a problem with the selections and/or math. For the top 50 teams in California 2009G, a quick look (and best guess in a few cases) shows this:

ECNL: 21
ECRL: 10
GA: 7
DPL: 8
E64: 2
NPL: 1
Coast: 1

Rank it however one likes, but in this view it goes ECNL>RL>DPL>GA>E64>NPL>Coast

All of this is roughly in the same order as the rankings as well, though if the averages of GA/DPL were taken, GA would probably show as higher than DPL, as their 7 teams in the top 50 are generally higher than DPL's 8 teams in the top 50. I don't see how NL Pro fits in here, and it would be interesting to see the teams that were chosen in the data set, which would probably answer most of the questions that people have.
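A quick sanity check that the hand count above totals 50 and sorts into the stated ECNL>RL>DPL>GA>E64>NPL>Coast order:

```python
from collections import Counter

# League labels for the top-50 California 2009G teams, per the hand count above.
top50 = (["ECNL"] * 21 + ["ECRL"] * 10 + ["DPL"] * 8 + ["GA"] * 7
         + ["E64"] * 2 + ["NPL"] + ["Coast"])

counts = Counter(top50)
for league, n in counts.most_common():
    print(league, n)
```

This only counts appearances in the top 50, which is why GA vs. DPL can flip depending on whether you count teams or average their ranks.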
 
It looks like USA Sports Statistics just posted the same chart, but for national rankings rather than just California. It uses the same methodology, choosing the top 10 teams in each league, rather than the older methodology of taking the average of the entire league. The age used is the same 2009 group. It will likely trigger the same questions about NPL vs. ECNL-RL, National League Pro, etc.

[chart: Top National Leagues Fall 2023]
 