
Inside the BCME: Behind the Scenes

Mike Lawson • Archives • October 11, 2011

Three years ago, researchers at The Institute for Educational Research & Public Service, an affiliate of the University of Kansas, took over the back end of the NAMM Foundation’s Best Communities for Music Education (BCME) project. Dr. Karin Chang-Rios, Dr. Becky Eason, and project coordinator Stephani Howarter, along with two graduate assistants, are responsible for constructing the application form and then processing the materials submitted by communities and schools hoping to be recognized for musical excellence. These researchers crunch the numbers and analyze the data, working closely with the NAMM Foundation to ensure that the chosen criteria and benchmarks accurately reflect the qualities that NAMM hopes to recognize, and that communities of all shapes and sizes have a chance to be included in the final tally of America’s best music education communities.

To learn a little bit more about what happens behind the scenes of the BCME, how the application questions were created and how the results are tabulated, SBO recently spoke with the three researchers, who graciously shed light on what some might consider to be one of the more mysterious aspects of this project.

School Band & Orchestra: How did you become involved in the BCME?

Dr. Becky Eason: We bid for and won this project three years ago. In doing so, we inherited a set of questions for the application that were already developed. We found a textbook where there had been some research on the critical elements of music education, and used that as a basis to determine the categories and qualities that we were looking for. Then we brought in our team of experts from the music education department here at KU – a choral music specialist, a general music specialist, and band specialists at both the elementary and high school levels – to sit down with us and, based on their years of experience working in music education, rank the importance of those categories. We then used that information to set the weighting of each category, which we use to formulate an overall score.

SBO: Music programs throughout the country are extraordinarily diverse, usually tailored to the needs of each specific community in which they’re located. Given that, what are some of the common threads that lead them to be considered a “Best Community for Music Education”?

Dr. Eason: We’re looking for school districts where music is thriving. One criterion that ends up being really important is that these are communities that support music in their schools. You hear that their concerts are sold out, and it’s not just parents; it’s a whole community supporting them. Another criterion is the opportunity to take individual instruction, so that the kids can grow as musicians beyond what is offered during the school day.

SBO: Do the results of music competitions weigh heavily into the designation?

Dr. Eason: We ask about whether or not groups have performed in state or regional competitions, and we ask about how groups might be recognized on those platforms, but it’s not a heavily weighted criterion.

Dr. Karin Chang-Rios: There are a large number of elements that are related to infrastructure. Do the schools and districts have infrastructure that supports music education? And then there are some quality elements that we look at, including staff or support from administration – those are quality indicators. Also, are they aligned with standards? More than just achievement, we look at the structures that are in place that might support and facilitate a quality music program.

SBO: What percentage of applications is selected?

Dr. Eason: This year, 56.7 percent of applications were selected. We don’t make the final call on where the line is; the NAMM Foundation does that. Part of their desire this year was that during these really troubling financial times, when band programs are getting their funding slashed left and right, people would still feel like they’re being rewarded for the good work they’re doing. Our stroke might have been a little broader than it was in previous years because we wanted to make sure that even if your program is fighting for its life, there is someone out there saying, “See! Yes, you are doing great work!”

SBO: In the time that you all have been running the statistical end of it, have you noticed any trends in the applications that you received?

Dr. Eason: We had more applications this year than in the past, yes. There was a frustrating trend that we had to combat when we first started, which was that school districts thought that the way to show us their level of support for their program was to inundate us with multiple applications, which was maddening! So the numbers were higher before we took over the process, but that was because we had so many redundant applications. One district sent us 56 applications. Because of that, it’s hard to compare the trend data on the raw numbers; this is the first year that we didn’t have a significant number of duplicate applications.

Stephani Howarter: This year it seemed like funding was a major issue for many of the applicants. There was also a follow-up question about impact and, surprisingly, even though funding was a major issue, most people were saying that the financial woes weren’t directly impacting the program in a significant way. When I compared it to last year, funding was more of an issue this year, but it didn’t have a huge impact.

SBO: Funding isn’t heavily weighted in the rubric, which is interesting, because so much of the battle for music educators revolves around that particular topic.

Dr. Eason: We are really sensitive to not punishing districts based on size. Smaller districts, by necessity, not only have less money overall, they also have less money per student. They don’t have the resources to offer six different band options. We didn’t want that to come into play.

Dr. Chang-Rios: I would say that the conceptual framework for the scoring system isn’t necessarily static. We are constantly reviewing the items and ensuring that they are valid, so that they capture a diverse range of communities.

SBO: Is there anything you wish more people knew about this process?

Dr. Eason: First of all, it’s important that people understand that this is a very carefully vetted process. These questions were not randomly selected; we brought in experts to prepare these questions with us and they’re based on a conceptual framework. It is a quite legitimate and valid process; it’s not as if NAMM is asking for random information and then picking the people they like.

The other piece that would be useful for people to know is that it really needs to be a qualified district representative who fills out the survey. We ask for some very specific information that your average high school choral director isn’t going to know; it pretty much has to be a district administrator, or done in conjunction with a district office to gather all of the information we need. It would probably be very frustrating if you sat down without all of those pieces of information.

Dr. Chang-Rios: We used to have just one application for both schools and districts, but we realized that there might be a really great school within a district, but the district isn’t interested in applying. We’ve adapted it so that schools can apply on their own now.

SBO: Last year was the first year that happened, correct?

Dr. Eason: Yes, that makes a huge difference, and it diminishes the huge burden of trying to gather all of that data if you really just want to talk about your school.

SBO: Who made the decision to recognize individual schools?

Dr. Chang-Rios: There were some applicants from individual schools that first year, but they weren’t doing very well on the application process. We saw that it was in large part because of their size. As Becky said, we didn’t want to punish them because they didn’t have great funding or couldn’t offer a ton of different choices. So we separated them out and realized that we really needed two scoring systems, one for schools and one for districts.

SBO: Do you think it will become more prevalent for individual schools to apply, as opposed to districts?

Dr. Eason: We don’t have enough data points to know if we have a trend in that direction; we’ll just have to give it a few more years to see which way the numbers go. It’s a dynamic process, so we’ll look again before we release the survey in January to see how well different groups fared, and whether there’s anything we need to do to the questions themselves, or to how they’re weighted and scored, to make sure that everyone is getting a fair shot in the process.

SBO: How long does the whole application review process take on your end?

Ms. Howarter: There is a lot of cleaning of the data, and the scoring rubric is actually somewhat complicated because sometimes some elements of the data are missing.

Dr. Eason: There is a lot of back and forth for about a month after the application deadline closes and before the results are announced.

Dr. Chang-Rios: It’s really a two-part process. First we report out the raw scores, and then the iterative part is talking with the NAMM folks and determining the appropriate cut-off point – who makes the list and who isn’t quite up to par. For example, this year, applicants had to score at least 50 percent in every category. That was the minimum threshold for each of these components. That’s a big part of the process, because you don’t want to put anyone on the list who doesn’t meet the minimum requirements. The overall process really is a holistic look at these programs.

Breaking Down the BCME Scoring System

According to the researchers at the Institute for Educational Research & Public Service, nine categories are used to calculate the BCME designation. Here they are, with their maximum raw point values, followed by the weighted formula for producing an overall score:

  • Music for all – 500 points
  • Support from administrators – 800 points
  • Scheduling – 400 points
  • Opportunity – 800 points
  • Qualified faculty – 400 points
  • Standards – 1,100 points
  • Community partnerships – 400 points
  • Funding – 500 points
  • Technology – 400 points
  • TOTAL (weighted maximum) – 13,400 points

BCME weighted formula = (3*Music for all) + (3*Support from Administrators) + (3*Scheduling) + (3*Opportunity) + (3*Qualified Faculty) + (3*Standards) + (2*Community Partnerships) + (1*Funding) + (.25*Technology).
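The weighted formula above can be sketched in code. The category names, maximum point values, and weights come directly from the breakdown above; the function name and structure are illustrative. Plugging in the maximum raw score for each category reproduces the 13,400-point weighted total.

```python
# Weights per category, as given in the BCME weighted formula.
WEIGHTS = {
    "Music for all": 3,
    "Support from administrators": 3,
    "Scheduling": 3,
    "Opportunity": 3,
    "Qualified faculty": 3,
    "Standards": 3,
    "Community partnerships": 2,
    "Funding": 1,
    "Technology": 0.25,
}

# Maximum raw points per category, from the list above.
MAX_RAW = {
    "Music for all": 500,
    "Support from administrators": 800,
    "Scheduling": 400,
    "Opportunity": 800,
    "Qualified faculty": 400,
    "Standards": 1100,
    "Community partnerships": 400,
    "Funding": 500,
    "Technology": 400,
}

def bcme_score(raw_scores):
    """Overall BCME score: the weighted sum of the nine category scores."""
    return sum(WEIGHTS[cat] * raw_scores[cat] for cat in WEIGHTS)

# A perfect application earns 13,400 weighted points.
print(bcme_score(MAX_RAW))  # 13400.0
```

This also shows why the listed "TOTAL" exceeds the sum of the raw category points (5,300): the total reflects the maximum after each category is multiplied by its weight.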
