The annual American Fitness Index is well-intentioned, but conflicting metrics dilute its focus.
The American College of Sports Medicine (ACSM) has released its sixth annual American Fitness Index (AFI), which “evaluates the infrastructure, community assets and policies that encourage healthy and fit lifestyles in the 50 most populous metro areas in the United States.”
I’m going to review their study and tell you what they did right, and where the ACSM could improve its analysis. In fact, I’ll show you how, in their effort to make a great study, they actually ended up making their index much weaker.
First, here’s a selection from the ranking of America’s Fittest Cities, with Minneapolis snagging the top spot for the third year in a row. Check out their web site for details about each metro, or download the complete report.
1. Minneapolis-St. Paul
2. Washington, D.C.
3. Portland, Ore.
4. San Francisco
9. Hartford, Conn.
10. San Jose
12. Salt Lake City
14. San Diego
15. Raleigh, N.C.
18. Virginia Beach
20. Richmond, Va.
24. New York City
28. Kansas City, Mo.
29. Los Angeles
30. Columbus, Ohio
31. St. Louis
35. Riverside, Calif.
38. New Orleans
39. Las Vegas
41. Birmingham, Ala.
48. San Antonio
50. Oklahoma City
This looks like a good ranking of the nation’s fittest large metros, based on my 30 years of rating and ranking places. And it illustrates a situation I like to summarize as “wealth equals health.” Looking at the ACSM list, the top-ranked fittest places are generally those with higher incomes, and because income and educational attainment are closely correlated, those places also have a more educated population. Doing a simple Pearson correlation analysis comparing each place’s ranking with its household income, we find an r-squared value of .625, which is generally classified as a strong association.
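If you want to see how an r-squared figure like that is computed, here is a minimal sketch in Python. The income values below are hypothetical placeholders (the report’s actual figures aren’t reprinted here), so the output illustrates the method, not the .625 result.

```python
# Sketch of the r-squared calculation described above. The income values
# are hypothetical placeholders, not the actual AFI or Census data.
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

afi_rank = [1, 2, 3, 4, 5, 6]        # AFI rank (1 = fittest)
income_k = [78, 75, 72, 70, 62, 58]  # median household income, $k (hypothetical)

r = pearson_r(afi_rank, income_k)
print(round(r ** 2, 3))  # r-squared; note r itself is negative, since a
                         # better (lower) rank goes with a higher income
```

Squaring r throws away the sign, which is why studies of this type usually report the direction of the association separately.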
The ACSM made the choice to rank metro areas (instead of cities), which was a smart decision. Metropolitan areas include the central city and the surrounding counties, including the suburbs. If you just analyze the cities, you’re often excluding most of the population living in and around a city.
A key part of every study is the choice of the factors or metrics to analyze and score to calculate the rankings. The ACSM Fittest Cities does a great job in choosing their metrics, which fall into two categories – Personal Health Indicators and Community Indicators.
There are 14 Personal Health Indicators, which include measures of physical activity, smoking, obesity, chronic medical conditions, and mortality. Most of these come from the CDC’s Behavioral Risk Factor Surveillance System (BRFSS), an outstanding resource I’ve used many times in my studies.
The ACSM included 16 metrics as Community Health Indicators, most of which are counts of recreational facilities (dog parks, swimming pools, tennis courts) from an annual study by the Trust for Public Land. There’s a problem here: the TPL report only lists these resources for the largest cities in the United States, not for the metropolitan areas, and a city does not always represent the assets of the entire metro area. For example, the city of Atlanta has a population of about 430,000, but the Atlanta metro area sprawls over 29 counties with a population of 5.5 million. Combining measures at the city and metro level is shaky at best.
Too Many Measures
But my biggest complaint is that the ACSM went overboard in including too many health, wellness, and fitness measures in their study. It’s easy to lose focus when a study has an excessive number of factors, because either they are redundant (and you can eliminate some of them) or they conflict with each other and introduce noise into the analysis.
In the ACSM study of Fittest Cities, they are actually measuring two distinctly different city characteristics – the health of an area’s residents and the resources available in the community for healthy living. In an ideal world, the two would be in sync with each other… an area’s residents would use its abundant resources to become more fit and healthy. But either this reasoning is flawed, or its measurement and analysis are.
For example, Cincinnati has the highest score for community resources, ranking #1, but ranks #38 for the personal health of its residents. Pittsburgh ranks #5 for community resources and #29 for personal health, and Austin ranks #28 for community resources but #6 for personal health. New Orleans ranks dead last for personal health (sorry, bad choice of words) but just misses a top-ten ranking (at #11) for community resources. Overall, the r-squared correlation between the Personal Health and Community Resources categories is only .499, showing that their relationship is not as strong as it should be to justify combining them into an overall ranking.
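The divergence shows up starkly even in just the four example metros cited above. This quick sketch correlates their community-resources rank against their personal-health rank (four cherry-picked data points, not the full 50-metro dataset behind the .499 figure, so the number is illustrative only):

```python
# Rank agreement for the four example metros named above (community-resources
# rank vs. personal-health rank). These are cherry-picked illustrations, not
# the full 50-metro dataset behind the overall .499 r-squared.
from math import sqrt

metros = {
    "Cincinnati":  (1, 38),
    "Pittsburgh":  (5, 29),
    "Austin":      (28, 6),
    "New Orleans": (11, 50),
}
community = [c for c, p in metros.values()]
personal = [p for c, p in metros.values()]

n = len(metros)
mc, mp = sum(community) / n, sum(personal) / n
cov = sum((c - mc) * (p - mp) for c, p in zip(community, personal))
r = cov / (sqrt(sum((c - mc) ** 2 for c in community))
           * sqrt(sum((p - mp) ** 2 for p in personal)))
print(round(r, 3))  # negative: for these four metros the two
                    # ranks actually move in opposite directions
```

For these four metros the correlation is actually negative, which is exactly the kind of conflict that injects noise into a combined index.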
I suggest that the ACSM refine its study to either measure the health of an area’s residents or assess its recreational and fitness resources, because the two are not the same. In fact, “fitness” is not the same as “health” or “wellness,” and some distinction should be made among these categories.
An Annual Study?
The ACSM is publishing these rankings on an annual basis, and it is too much to expect any substantive changes from year to year. In my experience, I’ve found that data of this type rarely changes significantly over a period of less than three years, which is part of the reason Minneapolis has been the top-ranked metro for the last three years. The idea behind an annual measure of health and resources is commendable, but don’t expect the rankings to vary much.