An article today in the NYT states that a number of (for the most part) liberal arts colleges are no longer going to participate in U.S. News and World Report's annual college rankings. Furthermore, the consortium of these schools is working on a ranking system of its own.
At first, I applauded the desire not to be subjected to arbitrary outside judgment. For example, I once taught two sections of Principles of Microeconomics (not at Harvard) with exactly 49 students each. I thought this was a little bizarre, and when I inquired I was told that things were done this way because U.S. News and World Report has a category for the number of classes with fewer than 50 students. (As a side note, I doubt the students would have had a less positive educational experience had there been 98 students in one class. I do believe that there are some subjects and courses in which a smaller class with a discussion format is specifically desirable, but introductory microeconomics is not one of those courses.) Therefore, at least in some cases, the rankings create inefficiency because they cause nonproductive effort to be exerted on things like arbitrarily capping class size. I would guess that this is not the only margin on which the rankings induce an inefficient use of resources, and I am trying to think of a way to analyze rigorously the degree to which the rankings alter schools' priorities and whether those changes are productive or unproductive.
However, the more I thought about this, the less positive I was toward the decision. First, wouldn't any ranking system designed by schools that, for whatever reason, found it suboptimal to participate in the current system be inherently biased in favor of those schools? U.S. News can at least say that it is an independent entity with little incentive to favor any school or set of schools. I would certainly take the results of any in-house ranking system with a grain of salt. Second, isn't all publicity good publicity? Maybe I am not representative, but if I were using the U.S. News rankings as a guide, I would either fail to realize that a non-mentioned school existed at all or conclude (being on the losing side of asymmetric information) that the school refused to participate because it wouldn't have ranked well anyway. (So much for "no news is good news".) My assumption regarding a non-mentioned school's quality would likely be worse than what an actual ranking would have indicated. If potential students are like me, then refusing to participate doesn't appear to be optimal, though it may give the school administrators some utility from being able to issue a "so there".
Apparently (but not shockingly) I am not representative. Alex Brown, Colin Camerer and Dan Lovallo have done research showing that "cold opening" a movie (i.e. not showing it to critics before it is released) can, for some types of movies, increase box office revenue (compared to a counterfactual of a reviewed movie with similar attributes). Economic theory would suggest that if a movie is not reviewed prior to opening, there must be a reason, since the review generates publicity for the movie if nothing else. Therefore, iterated thinking would lead an individual to believe that any cold-opened movie was a lemon and to stay away. However, this type of iterated thinking doesn't always seem to happen in the real world, and it seems that in some cases providing no information can be preferable to providing negative information. I suppose we'll have to see how those liberal arts colleges do in coming years.
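For readers who want to see the iterated-thinking argument spelled out, here is a minimal sketch of the standard information-unraveling logic, with entirely made-up quality scores (1 through 10) standing in for schools or movies. The point is just that, if the audience is fully rational, withholding information iterates down until only the worst type stays silent:

```python
# A toy version of the unraveling argument. Qualities 1..10 are
# hypothetical; nothing here comes from the actual rankings data.
qualities = list(range(1, 11))

silent = set(qualities)  # start: every school withholds its ranking
while True:
    # a rational reader's guess about a silent school: the average
    # quality among those still silent
    belief = sum(silent) / len(silent)
    # any school whose true quality beats that belief would rather disclose
    disclosers = {q for q in silent if q > belief}
    if not disclosers:
        break
    silent -= disclosers

print(sorted(silent))  # only the lowest quality remains silent
```

With fully rational readers the loop unravels all the way down, so silence is as damning as the worst possible ranking. The Brown–Camerer–Lovallo result described above is interesting precisely because real audiences apparently stop this iteration early.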