
Another Attempt at Predicting DCI



Before I left for tour, I was discussing a formula/method I had developed to predict BOA scores, with strong results over the past few years. If you don't recall that, I don't really expect you to, but I wanted to share this year's results, along with my own analysis of its errors and a few (occasionally funny) predictions the model spat back at me.

A Brief Description

-----------------------------------------------------

The formula takes three main things into account when it attempts to generate scores for World Championships.

1. Growth Patterns

A corps' growth patterns over the last 4 years are modeled, with scores represented as percentages of the World Championship score in relation to days remaining on tour. The most recent year is weighted 50%, the year before at 25%, and both 3 and 4 years ago at 12.5% each. This gives a solid basis for a corps' general growth while also allowing upward momentum from the most recent years to be modeled in.
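As a rough sketch, the year-weighting described above can be written as a weighted average. The function and variable names below are my own for illustration, not the actual model's; I'm assuming each season's history has already been reduced to "score as a percentage of that year's championship score" at the relevant days-remaining mark.

```python
# Weights for the last 4 seasons, most recent first, per the scheme above.
YEAR_WEIGHTS = [0.50, 0.25, 0.125, 0.125]

def blended_growth(pct_of_finals_by_year):
    """Weighted average of score-as-percent-of-finals across 4 seasons.

    pct_of_finals_by_year: list of 4 floats, most recent season first.
    """
    return sum(w * p for w, p in zip(YEAR_WEIGHTS, pct_of_finals_by_year))

# Example: a corps that historically sits around 92-95% of its eventual
# finals score at this point in the season (numbers are made up).
pct = [0.95, 0.94, 0.93, 0.92]
print(round(blended_growth(pct), 5))  # 0.94125
```

The heavier weight on the most recent season is what lets an improving corps' trajectory pull its prediction upward.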

2. Spreads

Scores on their own are only a fragment of the end result, however. A key portion of the prediction comes from the attempt to cancel out "easy" or "hard" judging. Looking back at where corps usually score at that particular point in the season, scores are readjusted to meet where the groups should be at that point in time. This is a known issue: DCI competitive fields are often small enough that if enough groups at a show are genuinely outperforming previous years, their scores get dragged down too far by an apparent "easy" judge.
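One minimal way to sketch that panel adjustment (my own hypothetical construction, not the actual formula): shift every score at a show by the gap between the show's average and where those same corps historically sit on that day of the season.

```python
def adjust_for_panel(show_scores, historical_expectations):
    """Remove an estimated panel bias from one show's scores.

    show_scores / historical_expectations: {corps_name: score} dicts
    covering the same corps. The bias is the mean amount by which the
    panel scored above (or below) historical expectation.
    """
    corps = show_scores.keys()
    bias = sum(show_scores[c] - historical_expectations[c]
               for c in corps) / len(show_scores)
    return {c: show_scores[c] - bias for c in corps}

# A panel running 1.5 points "hot" relative to history:
adjusted = adjust_for_panel({"A": 80.0, "B": 75.0},
                            {"A": 78.0, "B": 74.0})
print(adjusted)  # {'A': 78.5, 'B': 73.5}
```

This also makes the failure mode from the paragraph above concrete: if corps A and B are both genuinely ahead of their historical pace, the shared bias term still subtracts the same amount from everyone.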

3. Judges

Finals week judges tend to reward groups that they have seen before slightly more. Though often amounting to a tenth or less, this slight bump is modeled in.

It then combines these 3 parts to generate a full set of finals week scores. As a corps performs more, its scores from across the season are averaged. (This is a definite, huge issue carried over from the BOA formula. There, judging is erratic enough that upward trends can't be fully trusted; here in DCI, with the tour format, more recent scores should be weighted higher to ensure that trends outside the norm are accounted for, e.g. Crown's uncharacteristic surge this year.)
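The difference between the current flat average and the recency weighting proposed above can be sketched like this (the decay constant is my own guess, purely for illustration):

```python
def season_average(scores):
    """Current behavior: every show counts equally."""
    return sum(scores) / len(scores)

def recency_weighted(scores, decay=0.8):
    """Proposed fix: newer shows count more (scores oldest -> newest)."""
    weights = [decay ** (len(scores) - 1 - i) for i in range(len(scores))]
    return sum(w * s for w, s in zip(weights, scores)) / sum(weights)

# A late-season surge, like Crown's this year (numbers invented):
season = [88.0, 89.5, 91.0, 93.5]
print(season_average(season))                 # 90.5
print(round(recency_weighted(season), 3))     # higher, closer to the surge
```

The flat average sits well below the most recent show, while the weighted version tracks the upswing, which is exactly the behavior the DCI tour format calls for.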

Results

-----------------------------------------------------------------

Open Class Prelims

---------------------------

1. Santa Clara Vanguard Cadets 81.000

2. Blue Devils B 80.15

3. 7th Regiment 77.05

4. Genesis 76.7

5. Legends 76.05

6. Spartans 75.925

7. Music City 73.325

8. Gold 72.9

9. Colt Cadets 68.45

10. River City Rhythm 66.45

11. Raiders 65.7

12. Les Stentors 61.8

13. Racine Scouts 53.875

14. Blue Saints 51.725

INT Jubal 66.925

If they had attended (groups with enough information to generate scores):

(9) Golden Empire 71.125

(9) Louisiana Stars 69.625

(10) Guardians 67.675

(11) Watchmen 66.275

(12) Impulse 65.55

(12) Thunder 61.875

(13) Blue Devils C 55.05

(14) Incognito 52.575

Open Class Finals

--------------------------

1. Santa Clara Vanguard Cadets 81.925

2. Blue Devils B 81.35

3. Genesis 77.25

4. 7th Regiment 77.15

5. Spartans 76.275

6. Legends 76.025

7. Music City 73.5

8. Gold 73.45

9. Raiders 65.875

10. Colt Cadets 65.525

11. River City Rhythm 63.35

12. Les Stentors 61.8

If they had attended (groups with enough information to generate scores):

(9) Golden Empire 72.2

(9) Louisiana Stars 70.25

(9) Guardians 69.425

(9) Watchmen 67.05

(9) Impulse 66.3

(12) Thunder 62.95

World Class Prelims

-------------------------------

1. Blue Devils 99.375

2. Cadets 99.3

3. Bluecoats 98.425

4. Carolina Crown 97.75

5. Santa Clara Vanguard 97.35

6. Cavaliers 92.875

7. Blue Knights 92.2

8. Phantom Regiment 91.425

9. Madison Scouts 90.075

10. Blue Stars 89.2

11. Crossmen 87.8

12. Boston Crusaders 87.65

13. Troopers 85.15

14. Colts 85.075

15. Academy 83.95

16. Spirit of Atlanta 82.275

17. Pacific Crest 79.875

18. Mandarins 79.325

19. Oregon Crusaders 78.725

20. Santa Clara Vanguard Cadets 78.525

21. Blue Devils B 78.15

22. Jersey Surf 73.95

23. Cascades 73.425

24. Genesis 72.875

25. Legends 72.85

26. 7th Regiment 72.35

27. Spartans 72.275

28. Music City 71.275

29. Pioneer 69.875

30. Gold 67.325

31. Jubal 65.125

32. Colt Cadets 62.55

33. Raiders 62.5

34. Gita Surosowan Banten 61.9

35. Chien Kuo 61.3

36. River City Rhythm 60.7

37. Les Stentors 59.15

38. Blue Saints 50.625

39. Racine Scouts 48.775

If they had attended (groups with enough information to generate scores):

(30) Columbians 69.225

(31) Golden Empire 66.425

(31) Louisiana Stars 65.675

(32) Guardians 62.875

(35) Watchmen 61.325

(37) Southwind 60.275

(38) Impulse 58.425

(38) Heat Wave 56.95

(38) Thunder 55.75

(38) Eruption 55.125

(38) Blue Devils C 53.6

(39) Incognito 48.95

World Class Semis

-------------------------------

1. Blue Devils 99.2

2. Cadets 98.825

3. Bluecoats 97.675

4. Carolina Crown 97.65

5. Santa Clara Vanguard 96.85

6. Blue Knights 92.225

7. Cavaliers 92.2

8. Phantom Regiment 91.1

9. Madison Scouts 89

10. Blue Stars 88.2

11. Boston Crusaders 87.35

12. Crossmen 87.325

13. Troopers 84.85

14. Colts 83.8

15. Academy 83.7

16. Spirit of Atlanta 81.475

17. Pacific Crest 79.675

18. Mandarins 79.25

19. Blue Devils B 79.15

20. Santa Clara Vanguard Cadets 78.95

21. Oregon Crusaders 78.825

22. Jersey Surf 73.65

23. Cascades 73.6

24. Genesis 72.625

25. Legends 72.525

World Class Finals

-----------------------------

1. Blue Devils 98.85

2. Cadets 98.275

3. Bluecoats 97.675

4. Carolina Crown 97.3

5. Santa Clara Vanguard 96.675

6. Cavaliers 92.2

7. Blue Knights 91.675

8. Phantom Regiment 91.05

9. Madison Scouts 88.625

10. Blue Stars 87.95

11. Boston Crusaders 87.05

12. Crossmen 86.7

Reflections

-------------------------

This was pretty bad. Actually, that's a bit of an understatement. The model got all 12 finalists correct (take that, machine learning!), but it didn't see the success I'd hoped for after the run it had with BOA scores this year. At least I know where to start fixing it. The plan is a step-by-step process, making minor adjustments until the desired results are achieved. The main fix will be weighting more recent predicted scores higher to account for moves outside the trend, instead of simply plugging forward as this version attempts to do.

Tidbits

-----------------------

BD would win the Open Class Championship with an incredible 109.375

Despite missing the cutoff for semis, Spartans were predicted in their proper 23rd place had they made it.

Comments? Questions? Feel like reiterating just how bad this was? Let me know and I'll try to respond to you quickly.

Edited by gloriousgoo

This was developed for a calculus project my junior year of high school. I know very little of actual statistics (mostly self-taught). This was a massive trial and error project I started and just let it get out of hand. The initial project was just growth curve derivatives and I just kept working on it in my free time until it became decent enough to post in public during the BOA season last year.

So to answer your question: None/I'm not sure.


Is this meant to take into account all scores prior to the relevant event?

If that's right, I'm surprised that you have the top Prelims scores so high, since all season long, scores were lower than they have been in recent years.

But I guess that explains why your top Finals scores are lower than your top Semifinals scores, which in turn are lower than your top Prelims scores. Usually the opposite happens, as seen in the following list for the last five years (exceptions in boldface): for the top six positions, the score from night to night during championships has gone down just four times out of sixty chances:

2011

1. 97.350 > 97.800 > 98.350

2. 96.850 > 97.300 > 97.800

3. 96.450 > 96.900 > 96.850

4. 94.800 > 95.650 > 95.300

5. 94.150 > 93.950 > 95.050

6. 91.850 > 92.200 > 92.200

2012

1. 97.550 > 98.200 > 98.700

2. 96.000 > 97.450 > 97.650

3. 94.600 > 95.700 > 96.550

4. 94.050 > 94.100 > 95.050

5. 93.200 > 93.700 > 94.450

6. 92.250 > 92.800 > 92.550

2013

1. 97.200 > 98.250 > 98.300

2. 97.050 > 97.600 > 98.050

3. 96.100 > 96.850 > 96.950

4. 95.500 > 96.500 > 96.850

5. 92.900 > 93.050(t) > 93.350

6. 92.350 > 93.050(t) > 93.250

2014

1. 98.550 > 98.950 > 99.650

2. 96.925 > 96.700 > 97.175

3. 95.800 > 96.550 > 96.875

4. 94.450 > 95.650 > 96.075

5. 94.750 > 95.300 > 95.675

6. 93.000 > 93.275 > 93.675

2015

1. 96.475 > 96.775 > 97.650

2. 95.850 > 96.625 > 97.075 (ignoring Thursday penalty)

3. 95.475 > 95.775 > 96.925

4. 95.425 > 94.500 > 95.900

5. 93.300 > 93.425 > 93.850

6. 90.550 > 91.150 > 91.850

Now there is some movement by corps from position to position across those five years, but you can reasonably expect the score for any placement to improve each night, and you probably should tweak your model accordingly so that it doesn't do this:

Glorious Model 2015

1. 99.375 > 99.200 > 98.850

2. 99.300 > 98.825 > 98.275

3. 98.425 > 97.675 > 97.675

4. 97.750 > 97.650 > 97.300

5. 97.350 > 96.850 > 96.675

6. 92.875 > 92.225 > 92.200
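A cheap way to build that expectation into the model would be a sanity check counting how often a predicted placement's score fails to rise from night to night. This is my own sketch, using the model's 2015 top six from the table above:

```python
# The model's predicted top-six scores for each night (from the table above).
prelims = [99.375, 99.300, 98.425, 97.750, 97.350, 92.875]
semis   = [99.200, 98.825, 97.675, 97.650, 96.850, 92.225]
finals  = [98.850, 98.275, 97.675, 97.300, 96.675, 92.200]

def night_to_night_drops(*nights):
    """Count placement/night pairs where the score fails to improve."""
    drops = 0
    for earlier, later in zip(nights, nights[1:]):
        drops += sum(1 for a, b in zip(earlier, later) if b < a)
    return drops

print(night_to_night_drops(prelims, semis, finals))  # 11
```

Eleven non-improvements out of twelve chances, against roughly four out of sixty in the real data above; flagging (or penalizing) that discrepancy during fitting would push the model toward realistic night-to-night growth.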

But this is still a fascinating exercise. Thanks for sharing it!

Finally, I can't resist a comment on this:

BOA ... There, judging is erratic enough where upward trends can't be fully trusted

That's saying a mouthful. I only started paying serious attention to BOA last year, and was shocked to see that one band from this area received a lower score at Grand Nationals Prelims on Nov. 14 than they had at a regional on Sep. 20. I saw them live near the beginning and the end of their season and can say confidently that they had shown significant improvement over that time. The lowest-scoring DCI corps do tend to see a notable drop at Prelims in August, but not back to what they scored in June!
