Money Magazine’s Best Places to Live 2011

Rating the Rankings

By Bert Sperling, Sperling’s BestPlaces

August 17, 2011

Money magazine’s Best Places to Live – 2011

http://money.cnn.com/magazines/moneymag/bplive/2011

Published August 15, 2011

The Bottom Line:

Grade:  B-

Money magazine presents its annual list of 100 best places.  Though the title is “Best Places to Live,” it’s really focused on small towns for upper-middle-class families.  To that end, the magazine presents generally solid choices, with rosy, upbeat descriptions.  Money’s analysis is direct and simple, but it lacks nuance and fails to consider several key aspects of livability.

(see the detailed ratings)

Pros:
Good selection of upper-middle-class family towns (suburbs, actually)
100 cities ranked (instead of only ten)
Full list presented (no slideshow)
Photo of each town, with a short descriptive text
New web tool to compare cities

Cons:
Chosen cities represent unaffordable dreams for many
Fails to consider health and transportation resources
Vague methodology
Occasional inaccuracies
Misleading title

Overview

I always look forward to Money magazine’s annual list of Best Places to Live.  It’s probably because I was asked to create their flagship feature (way back in 1987), and I continued to perform the rankings and contribute to the annual feature for over 20 years.  So to some extent, this study is my baby.

Over the years, Money’s iconic cover feature has been relegated to a few print pages and for better or worse, now lives mostly as a web feature on their CNN/Money site.

Here are their top ten small towns for 2011.  (Money deserves kudos for resisting the current trend of burying the rankings in a slideshow and holding the reader hostage.)

1. Louisville, CO
2. Milton, MA
3. Solon, OH
4. Leesburg, VA
5. Papillion, NE
6. Hanover, NH
7. Liberty, MO
8. Middleton, WI
9. Mukilteo, WA
10. Chanhassen, MN

(click here to view the complete list of 100 cities)

This year, Money’s editors look at the best small towns in the United States (they change the focus from year to year).  And their choice for #1 best small town?  Louisville, Colorado, which I really can’t argue with, considering that Louisville was the top pick of our 2006 best-selling book, “Best Places to Raise Your Family” (Bert Sperling and Peter Sander, Wiley).  Money’s list also includes other towns and suburbs that have appeared in my previous Best Places lists, so I know they’ve made some solid choices.

Unlike some previous Money articles, which provided a full ranking of places, both top and bottom, Money now reveals only the top 100 choices.  I don’t have any criticism of this year’s list, other than that the cities are overwhelmingly white and wealthy.  Money is not going to help you discover any edgy, gritty small-town bargains.

Money editors don’t present any negative aspects to life in any of their choices, other than local homes may be expensive.  Descriptions of the cities are relentlessly upbeat, without any shades of gray.

A number of cities on Money’s 2011 list have been recycled from previous years.  Actually, I regard this somewhat as a positive since it demonstrates a consistency regarding their focus and analysis.

One thing Money does well is thoroughly vet their final list of 100 cities through phone calls, interviews, or on-site visits.  No doubt part of this diligence is an attempt to avoid the embarrassments of previous years, such as choosing Wexford, PA as one of their “Great American Towns” (#28) despite the fact that Wexford is not actually a town at all, just a post office serving parts of suburban Pittsburgh.  This is one of the problems with relying too heavily on raw data, which in this case is provided by OnBoard Informatics.

Behind the Numbers

• It wasn’t easy to find, but I finally located a description of Money’s methodology on their site: http://money.cnn.com/magazines/moneymag/bplive/2011/faq
The description of their analysis raises some interesting questions.

• Starting with a list of cities from 8,500 to 50,000 population, Money uses a filtering process to winnow the list down to the final 100.  I understand the cap of 50,000, but why exclude towns smaller than 8,500 residents?  (Fun Fact – there are over 35,000 U.S. towns and places with fewer than 8,500 residents.)

• Regarding the population, a couple of towns snuck onto the list that would have missed the cut had their populations been reported correctly.  According to U.S. Census figures for 2010, Sharon, MA has a population of 5,862 (not 17,500) and Portland, CT has 5,823 residents (not 9,600).

• When I took a closer look at the locations of the cities and towns comprising Money’s list of best places, I found that 98 of the 100 places are part of a larger metropolitan area (the only two truly independent small towns are Hanover, NH and Columbus, NE).  Because a standard demographic definition of a “suburb” is any place in a metro area outside of the central city (or cities), one could say a more accurate title for the Money list would be “Best Suburbs for Families,” but that doesn’t quite have the same ring as “Best Places to Live.”

• One of Money’s primary filters is to “screen out places with median family income more than 200% or less than 85% of the state median.”  I can understand excluding the overly wealthy cities on the basis that they are probably unaffordable, but why reject cities that are a little less well-off?  Could one logically conclude that Money believes a city can’t be a great place to live if its residents have a below-average income?
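The two published screens (the 8,500–50,000 population band and the 85%–200% income band) can be sketched in a few lines.  This is my own reconstruction from the FAQ, not Money’s actual code, and the sample towns and dollar figures below are hypothetical:

```python
# Sketch of Money's stated screening process (a reconstruction from their FAQ,
# not their actual code).  The population band (8,500-50,000) and income band
# (85%-200% of the state median) are from the published methodology; the
# sample data is hypothetical.

def passes_screens(town):
    """Return True if a town survives both published filters."""
    pop_ok = 8_500 <= town["population"] <= 50_000
    ratio = town["median_family_income"] / town["state_median_income"]
    income_ok = 0.85 <= ratio <= 2.00
    return pop_ok and income_ok

# Hypothetical examples:
towns = [
    {"name": "A", "population": 20_000,
     "median_family_income": 90_000, "state_median_income": 70_000},   # ratio ~1.29, passes
    {"name": "B", "population": 5_862,
     "median_family_income": 120_000, "state_median_income": 70_000},  # too small, fails
    {"name": "C", "population": 30_000,
     "median_family_income": 50_000, "state_median_income": 70_000},   # ratio ~0.71, fails
]

survivors = [t["name"] for t in towns if passes_screens(t)]
print(survivors)  # ['A']
```

Note how town C is dropped purely for having a below-average income, which is exactly the question raised above.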

• A filtering process only goes so far; the final cities must then be ranked from the small pool of chosen places.  There is no clue as to how the Money editors ranked the finalists from 1 to 100.

• An overwhelming deficiency of Money’s analysis is the absence of any criteria regarding health care.  There is no accounting of either health indicators or health resources (other than air quality), and there is no measure of mass transit options or local walkability.

• There are two issues which most concern the American public today: jobs and housing.  Money’s analysis makes no mention of examining the current unemployment rate, which is the single best indicator of current economic stress (not job growth).  Nor was there any attempt to measure the local impact of our nation’s recent housing meltdown.  A community cannot be judged livable without plentiful jobs and an affordable, stable housing market.

• Among the few “arts and leisure activities” considered are bars.  I guess there’s nothing like a nearby watering hole when family life gets a little too idyllic for Mom and Dad.

• Money lists separate indices for both “Air Quality” and “Air Pollution.”  Wouldn’t these be considered two sides of the same coin, essentially measuring the same thing?  If so, how can a city earn different scores for each?

• In the data provided for each city, a number of statistics are missing in key categories such as housing prices.  There are also some outright errors, such as listing Chicago’s sales tax as 6.25% (they wish… it’s currently 10.25%).  And what does “Clear days” mean?  (San Diego has only 40 “clear days” each year?)  When errors like these creep in, it casts a pall over the whole effort.  And how were they able to accurately analyze the cities without a complete set of data?

• The final list is heavily weighted toward the Midwest (34 of the final 100 cities), while the West Coast is underrepresented, with only eight cities total from California, Oregon, Washington, Alaska, and Hawaii.  This might make more sense if the focus of the study were affordability, instead of Best Places.