At the end of 2013, our first Drugs on Campus report revealed which American colleges had the most drug- and alcohol-related arrests per 1,000 students between 2009 and 2011. Now, with 2012’s data just released by the Office of Postsecondary Education, we crunch the numbers once more to update the rankings and, more importantly, explore the following question: Do more on-campus drug arrests indicate that a college has a bigger problem with drugs, or a better-than-average approach to tackling them?

Every year, all colleges and universities that receive Title IV funding from the government submit certain crime figures to a database maintained by the Office of Postsecondary Education. Because it takes time to collect and clean up this data from the more than 7,000 institutions that submit it, the OPE’s database of drug and alcohol arrests (as well as the other incidents it logs, such as fires and hate crimes) is always a year behind. The stats for 2012 have just been released. These fresh numbers are therefore the most current reflection of drug and alcohol arrests on college campuses available, and they are naturally begging to be crunched. Before we look at them, though, it’s worth briefly summarizing the approach we took in the first Drugs on Campus report and why its results proved somewhat controversial: not because of their accuracy, but because of what they may or may not imply about the colleges at the top of the lists.

The Story So Far

Part one of Drugs on Campus focused on one particular measure of drug and alcohol activity: arrests on college campuses. The OPE’s database also logs drug and drinking arrests that occur in the areas surrounding colleges (off-campus), as well as those that happen strictly within residential halls. The on-campus arrest metric our first report looked at includes residential hall arrests but excludes off-campus ones, and as such seemed a good general measure of illicit drug and drinking activity.

This time around, we’ve used the same figures once more (on-campus arrests), so we are comparing like-for-like with previous years’ stats. We’ve also only included institutions that have at least 5,000 enrolled students and residential halls on their grounds (again, just like last time).

The map and rankings below show the 50 colleges that had the most on-campus drug arrests in 2012, as well as how much their ranks have fallen or risen since 2011.


What changes in drug arrest rates occurred between 2011 and 2012?

A lot changed between 2011 and 2012. In fact, the only colleges that maintained their exact ranks for on-campus drug arrests across both years were SUNY Oneonta and SUNY Oswego. But perhaps that’s to be expected. After all, even the slightest difference in the total number of arrests for a college can shift its rank up or down a few spots. However, it seems more than that has happened.

  • One-third of the colleges are newly in the top 50 for drug arrests in 2012

  • One-third have dropped places since 2011 (had fewer drug arrests per capita)

  • About two-thirds of the top 50 have risen since 2011 (had more drug arrests per capita)

Perhaps the most intriguing additions to the top 50 in 2012 are those that are not just new to the list (having not appeared in 2011’s top 50 at all) but have also risen by triple digits in rank to get there – North Carolina at Pembroke, for example. This college ranked 186th in the country for on-campus drug arrests in 2011, but in 2012 it rocketed to 5th (a spot occupied in 2011 by UW Oshkosh, which has since dropped to 33rd).

How did North Carolina at Pembroke make such a dramatic leap up the table? Well, the simple answer is that it had a lot more on-campus drug arrests per capita in 2012 than 2011. In 2011 it had 1.91 per 1,000 students, while in 2012 it had almost six times as many with 11.32. The actual numbers of drug arrests for each year were 12 in 2011 and 71 in 2012 (with a negligible difference in student population size). That means the campus went from having one drug arrest per month in 2011 to just over one a week in 2012 – a significant difference. We looked at North Carolina at Pembroke’s annual crime report from 2013 in an attempt to find out what caused the increase.
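As a quick sanity check on the arithmetic above, here is a minimal sketch of the per-capita calculation. Note that the enrollment figure of roughly 6,270 students is an assumption, back-calculated from the published rates rather than taken from the report itself:

```python
def per_1000(arrests, enrollment):
    """Arrests per 1,000 enrolled students."""
    return arrests / enrollment * 1000

# Assumed UNC Pembroke enrollment, derived from 71 arrests at 11.32 per 1,000
enrollment = 6272

rate_2011 = per_1000(12, enrollment)
rate_2012 = per_1000(71, enrollment)

print(round(rate_2011, 2))              # 1.91 per 1,000 students
print(round(rate_2012, 2))              # 11.32 per 1,000 students
print(round(rate_2012 / rate_2011, 1))  # 5.9 -- almost six times as many
```

Because enrollment barely changed between the two years, the ratio of the rates is essentially the ratio of the raw arrest counts (71 / 12).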

One potential clue can be seen in the disciplinary referrals for drug violations that took place specifically in student residence halls: they were twice as high in 2012 as in 2011. There were also nine illegal weapons incidents in residence halls, whereas in 2011 there were none. These numbers suggest that whatever caused the jump in drug arrests (which pushed the campus so much higher in the rankings than last time) wasn’t a product of non-student crime activity.


Now here’s a key point and perhaps the central theme of this installment of Drugs on Campus: Does that mean that North Carolina at Pembroke (and other colleges whose ranks have risen) has suffered a decline since 2011 (because there have been more drug arrests), or seen an improvement (because fewer people have slipped through the net for drug offenses)? We’ve marked colleges that have moved closer to the No. 1 spot with red arrows in the table above, implying that their shift is a negative one caused by having more drug arrests. But maybe the arrows for the colleges closer to the top spot should be green, because they’ve done a finer job of catching drug users and dealers.

Perhaps a better way to articulate this distinction (which is by no means a minor one) is by quoting Petra Roter, who is Vice Chancellor of Student Affairs at UW Oshkosh (a college that came first for alcohol arrests and fifth for drug arrests in 2011). Here is some of what Roter had to say in a response to Drugs on Campus part one and how her college placed for the two aforementioned measures:

“While there are serious issues with the analysis, the report might be better framed as a ranking of responsibility,” the statement said. “It is a demonstration of how seriously UW Oshkosh and other institutions in Wisconsin and around the country are taking a national problem [and] how well they are responding to dangerous behaviors and crimes in their campus communities. Serious questions could be raised about universities and colleges reporting few to no arrests, citations, and referrals.”

Vice Chancellor for Student Affairs Petra Roter, as quoted by UW Oshkosh’s student newspaper Advance-Titan, February 13, 2014

This makes absolute sense. Not necessarily the part about there being serious issues with our first report (we’ll get to that shortly), but certainly Roter’s comment on colleges that report few or no arrests. After all, there are only three main circumstances that could result in a college reporting few or no drug arrests to the Office of Postsecondary Education’s database. They are:

  1. The college has had its fair share of drug activity, but the police have done a poor job of arresting the culprits. Hence, when they report their numbers, they are surprisingly low.
  2. The college has had plenty of drug activity and lots of drug arrests, but has done a poor or negligent job of reporting its numbers in full (seems unlikely, but it’s possible).
  3. The college has had very little drug activity and therefore not many arrests (despite being diligent in their policing efforts). Their numbers are therefore rightfully low.

It could, of course, be the case that a particular college’s numbers result from a mix of the above. Perhaps UW Oshkosh actually has only marginally more drug and alcohol activity on its campus than average, but does such an excellent job of policing it that its numbers make it seem to deserve the nickname ‘UW Sloshkosh’ (one sometimes used by the college’s students on social media).

The question that demands an answer, then, is: How much of each college’s rank can be attributed to excellent policing of relatively low levels of crime, as opposed to average policing of legitimately high levels of crime?

Common sense suggests two things. One is that for a police force to make a lot of arrests, there still have to be a lot of arrestable offenses occurring (not necessarily more than average, but still more than none). The other is that some colleges inevitably must have worse problems with underage drinking and illicit drug-taking than others. And unless the worst ones are systematically being negligent in how they police and report drug activity on their grounds, they will presumably show up somewhere near the top of the rankings for per capita on-campus arrests.

This certainly does not settle the question of whether UW Oshkosh and other high-ranking colleges deserve to be called the ‘druggiest’ in the nation. In truth it’s probably the case that some actually are pretty druggy, while others are a bit druggy but have a particular method of making arrests that gives the impression that they are worse than they actually are.

On that note, let’s examine the fresh figures for 2012’s on-campus alcohol arrests. They may shed more light on this central question of whether high-ranking colleges are sub- or above-par in tackling drug problems.


Comparing 2011 and 2012 state rankings for drug and alcohol arrests on college campuses


Based on the above rankings, it appears as though there have been more changes in position at the state level for drug arrests than there have been for alcohol arrests – echoing what we saw earlier in the college-level rankings from 2011 to 2012. One way to quantify this particular fact is to count how many states appear in the top ten in 2011 and 2012 for each category of arrest: drugs and alcohol.

Seven of the same states appear on the top ten lists for most drug arrests in 2011 and 2012, while only five of the same states appear on the bottom ten lists for the same years. For alcohol arrests, however, eight states appear on the top ten lists for 2011 and 2012, and six show up in the bottom for both years.
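The year-over-year overlap described above is just a set intersection. Here is a tiny sketch of the counting method; the state names are placeholders, not the actual rankings:

```python
# Hypothetical top-ten lists for two consecutive years (real state names omitted)
top10_drugs_2011 = {"StateA", "StateB", "StateC", "StateD", "StateE",
                    "StateF", "StateG", "StateH", "StateI", "StateJ"}
top10_drugs_2012 = {"StateA", "StateB", "StateC", "StateD", "StateE",
                    "StateF", "StateG", "StateX", "StateY", "StateZ"}

# States that appear on both years' lists
carryover = top10_drugs_2011 & top10_drugs_2012
print(len(carryover))  # 7, matching the drug-arrest overlap quoted above
```

A higher carryover count (like the eight seen for alcohol arrests) means the rankings are more stable from one year to the next.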

There’s clearly some annual fluctuation happening in drug and alcohol arrest rates per capita, both at the level of individual campuses and on a larger scale by state. Let’s examine how big this fluctuation has been over the last ten years for a set of campuses that have consistently appeared on the top ten lists for drug arrests for the last three years (2012, 2011, and 2010). Have they just been having a bad run recently (or using a great approach to making arrests!), or have they ranked highly for the entire decade?

Comparing some of the top ranking colleges for on-campus drug arrests over the last decade


Although most of the colleges above show high rates of drug arrests for the majority of the last decade (relative to other colleges selected at random), it’s also clear that the year-on-year fluctuations can be dramatic. Despite the differences, though, certain colleges (like those above) do have consistently high rates of drug arrests. What’s more, the per capita rates of arrests for drug violations haven’t, for the most part, been dropping over the last decade (again, based just on the seven selected above). Colorado Boulder saw a sharp increase in drug arrests starting in 2009 and continuing until 2011. So did SUNY Oswego. UW Oshkosh is possibly the most intriguing of the seven, though, because while the other six show very rough upward trends (that is, more per capita arrests year-on-year), Oshkosh had a significant low from 2006 to 2008. Its drug arrest rate then climbed back to its 2002 level before settling down in 2012 to about the same level as in 2005. In other words, it’s pretty much all over the place.

Does this mean that UW Oshkosh has been altering its policing policy every year and therefore having varying levels of success in making drug arrests (and then reporting them to the OPE database)? Probably not. It seems more likely that something else has been causing the fluctuations, like shifts in the college’s drug culture. Or, even more likely, that the varying arrest rate has nothing to do with the college in and of itself, but rather the surrounding area and even the state as a whole. After all, why would a new batch of students at the start of a school year bring a sudden influx of drug abuse, if they have been enrolled on the same basis as students from previous years and from similarly diverse parts of the country? In actual fact, it’s probably the constants that are more likely to cause the drifting figures, like the policies of the geographically-fixed police force and the socioeconomic conditions surrounding the city and state. Perhaps even the availability of drugs in any particular year could have an effect, which would create a special kind of market force of its own.

Another potentially illuminating way of judging whether a college is actually rife with more than its fair share of drug crime is to look at how much crime of other types occurs on its campus. While high levels of burglaries and sex offenses, for example, could certainly still be attributed to a more vigilant police force, it seems possible that they might also hint at a deeper problem a college has with criminal activity of all types on its grounds.


All eight lists above are based on the number of on-campus incidents per 1,000 students in 2012. Only the drug and alcohol lists relate specifically to arrests, however. The crimes from other categories may or may not have resulted in arrests – rather, they were reported as criminal incidents that happened on the respective college campuses in 2012. They should, therefore, serve as useful reference points for judging whether the colleges in the drug and alcohol lists have general problems with crimes of many types, or mainly issues with drinking and drugs.

It turns out that most of the colleges on the drug and alcohol lists do not appear on any of the other crime categories’ top tens. The same ‘parent institutions’ do (like SUNY, which is in the top ten for drug and alcohol arrests, as well as illegal weapons possession), but not specific campuses that belong to those institutions. There is, however, more overlap between the six ‘other’ categories.

  • Alabama State University - #3 Burglaries, #4 Assaults
  • New Mexico State University Dona Ana - #10 Burglaries, #6 Arson
  • UMass Dartmouth - #1 Arson, #3 Assaults
  • Grambling State University - #4 Burglaries, #3 Illegal Weapons Possession, #1 Assaults

Grambling State University is the only campus that appears on three of the top ten lists. But equally surprising is not only the appearance of Stanford University on the Burglary and Forcible Sex Offenses lists, but the presence of six private universities on the Sex Offenses top ten, four of which are Ivy League schools. None of the other crime categories has as many Ivy League schools in its top ten. So why might an Ivy League school apparently have no drug or alcohol arrests but enough sex offenses to place it in the top ten nationwide?

It might be due to under-reporting of certain types of crimes that took place but did not make it into the OPE database categories to which they perhaps rightfully belong. For example, Princeton (No. 1 for per capita forcible sex offenses) reported zero on-campus alcohol arrests in 2012, but did report 28 ‘judicial referrals’ to a university committee located within the office of the dean of undergraduate students. Similarly, there were seven on-campus arrests for drug violations, yet 45 referrals to the same committee. The same appears to be the case for other Ivy League schools: they report few to no arrests for alcohol or drug offenses, but in many cases a high number of incidents that receive university disciplinary action. For instance, Columbia University reported no alcohol arrests in 2012, yet the school had 136 cases of disciplinary action for that infraction. This may be a method certain colleges use to keep their students from receiving criminal records.

Even when it comes to forcible sex offenses, which are presumably harder to keep out of the OPE’s database, there have been cases of Ivy League schools getting in trouble for improper reporting of incidents. In 2013, Yale University was fined $165,000 for “very serious and numerous” violations of the Clery Act by the U.S. Department of Education. The violations were related to sex offenses that Yale failed to report, plus not appropriately defining crime statistic areas.


While 2012’s crime data confirmed certain colleges’ high per capita rates of drug and alcohol arrests, it’s still unclear whether higher levels of arrests for these types of crimes indicate a systemic problem with drugs and alcohol on campuses, or an effective and uncompromising approach to stamping such issues out.

Part one of Drugs on Campus showed some correlation between state-level illicit drug use amongst college-age individuals and arrest rates on campuses, which suggests that there may be some link between a college’s geographical location and its levels of drug and under-age drinking crimes. However, comparing top ten colleges across multiple crime categories like burglaries and sex offenses to drug and alcohol top ten rankings doesn’t show an obvious link between general on-campus crime levels and drug and alcohol arrests. This suggests that other factors influence which crimes are submitted to the OPE database, the main one of which is probably a varying attitude toward which incidents should lead to bona fide arrests and criminal records, and which are instead classed as ‘judicial referrals.’

What’s certainly clear is that when the OPE states on its welcome page that “valid comparisons of campus statistics are possible only with study and analysis of the conditions affecting each institution,” it’s not kidding. Innumerable factors appear to contribute to a college’s specific drug and alcohol culture. We intend to continue exploring what they are in future Drugs on Campus reports.


1) The Campus Safety and Security Data Analysis Cutting Tool -
2) Drugs on Campus, Part One -
3) UNCP’s Annual Security and Fire Safety Report 2013 - cached version here
4) UW Oshkosh Student Newspaper - Infamous ‘Sloshkosh’ article
5) The Princeton Packet - October 3rd 2013 article
6) US News & World Report - May 16th 2013 article