Doctoral Attainment by Wittenberg Graduates

December 21st, 2012

In a unique and interesting data project, the National Science Foundation (NSF) tracks doctoral completions and connects them to the institution granting the original baccalaureate degree. Between 1996 and 2005, Wittenberg produced 4,270 baccalaureate degrees. Between 2001 and 2010, Wittenberg graduates earned 212 research degrees (PhD, ScD and EdD). (Professional degrees are not included in the project.)

Wittenberg is a member of the Higher Education Data Sharing Consortium (HEDS). The research staff at HEDS constructed a model attempting to predict doctoral production among a sample of 1,279 schools. By extension, the model was used to identify “over-performers”, i.e., those schools whose graduates earned more doctoral degrees than expected.

Using a nonlinear regression analysis, with SAT and ACT scores for entering first-year students as the main predictor, the HEDS model predicted that Wittenberg graduates would earn 2.18 doctoral degrees per 100 baccalaureate degrees granted. In fact, our grads earned 4.96 doctoral degrees per 100 baccalaureate degrees. Among the 1,279 schools, this result placed Wittenberg in 40th position, 1.91 standard deviations above the predicted value. A high proportion of the variation among schools in doctoral completion was explained by the model.
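The arithmetic behind these figures is easy to check. A minimal sketch (the residual standard deviation is not reported in the HEDS paper, so the value below is back-derived from the reported z-score and is my assumption):

```python
# Reported figures from the NSF / HEDS analysis above.
bacc_degrees = 4270      # baccalaureates granted, 1996-2005
research_degrees = 212   # PhD, ScD and EdD earned, 2001-2010

# Actual doctoral completions per 100 baccalaureate degrees.
actual_rate = research_degrees / bacc_degrees * 100    # ~4.96

# HEDS model prediction, and the residual standard deviation
# implied by the reported 1.91-standard-deviation result.
predicted_rate = 2.18
z_score = 1.91
implied_sd = (actual_rate - predicted_rate) / z_score  # ~1.46 degrees per 100
```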

What is interesting is that by the method used in the paper, we outperformed many large research institutions.

How Diverse is the Wittenberg Student Body?

June 25th, 2012

Measuring diversity can be tricky. We could, for example, calculate the percent of whites (or blacks) in a campus population and conclude that the higher the percent, the less diverse is the campus population.

Economists use a Hirschman-Herfindahl Index (HHI) to look at concentration in an industry. The Justice Department and the Federal Trade Commission use the HHI in antitrust cases. The measure is pretty simple – calculate the sum of the squared percentage market shares. If an industry is monopolized by one firm, the HHI is 10,000. If there are 100 firms, each with a 1% market share, the HHI is 100. So the HHI is bounded below by zero and above by 10,000, creating a simple index number of concentration.

IPEDS allows us to put our students into 18 race / ethnicity boxes, 9 for each gender. If all 18 categories held an equal diversity “share” (100/18, or about 5.6% each), the diversity HHI for a college would be roughly 556. If a 2,000-student campus had 500 black men, 500 black women, 500 white men and 500 white women, the HHI would be 2,500.
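The examples above can be verified in a few lines of code; a minimal sketch (the function name is mine):

```python
def hhi(shares):
    """Hirschman-Herfindahl Index: the sum of squared percentage shares.

    `shares` are percentages summing to 100, so the result ranges from
    near 0 (many tiny shares) up to 10,000 (a single monopoly share).
    """
    return sum(s ** 2 for s in shares)

hhi([100])            # one firm with a 100% share: 10,000
hhi([1] * 100)        # 100 firms at 1% each: 100
hhi([100 / 18] * 18)  # 18 equal diversity categories: ~556
hhi([25] * 4)         # four equal campus groups of 500 each: 2,500
```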

The attached spreadsheet, using 2010 data, calculates HHIs for about 100 institutions in an IPEDS category of “like” institutions. (I have further sorted the list down based on my own bias as to the definition of a “like” institution.)

diversity062512

Alma and Hope are least diverse, and Earlham and Swarthmore are most diverse by this measure. Wittenberg is in the 35th position of 53 on the list – toward the less diverse end. While we are reasonably well represented in the number of black students on campus, our Asian and Hispanic numbers are low compared to some on the list.

2012 May Grad HEDS Senior Survey

May 29th, 2012

During the month of April, a survey was conducted asking our May graduates to assess their experiences and to indicate their plans. Of the 408 graduates who received the survey, 210 responded. Question #22 reads as follows:

“The list below contains some abilities and types of knowledge that may be developed in a bachelor’s degree program. Please indicate the extent to which each capacity was enhanced by your undergraduate experiences.”

26 areas are assessed and the results are below.

 grad_survey_may2012_q22

Self-reported results are not true measures of achievement, but the results indicate a fairly high level of satisfaction among 2012 Wittenberg graduates. Later this summer, we will be able to compare ourselves to other HEDS institutions. I will also release other results as the summer progresses.

Who Are Our Peers?

February 4th, 2012

Every college wants to know who its peers are. Discussions about the peer group often bog down in confusion around the peer / aspirant distinction. One way to address the question is by listing some factors (endowment, faculty salaries, board scores, etc.) all could agree on. We are “like” those institutions if we are close to them on a reasonable statistical measure of closeness.

I looked at 383 institutions from the U.S. that could be broadly considered similar under a definition provided by IPEDS. I gathered data for all 383 on full time enrollment, endowment per student, net tuition revenue per student, the six year graduation rate, faculty salaries, tuition and fees, retention, the student to faculty ratio and ACT 25th and 75th percentile measures. I then constructed what economists call a “gravity” model, measuring distance from Wittenberg on those 10 factors for each of the other 382.
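I did not spell out the formula above, but one plausible way to read the “gravity” measure is a Euclidean distance computed over the 10 factors after each is standardized, so that endowment dollars and ACT points sit on comparable scales. A sketch under that assumption (the function names are mine):

```python
import math

def standardize(values):
    """Convert one factor's raw values to z-scores so factors are comparable."""
    mean = sum(values) / len(values)
    sd = (sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5
    return [(v - mean) / sd for v in values]

def closeness_ranking(data, target):
    """Rank every school by Euclidean distance from `target`.

    `data` maps school name -> list of factor values, in the same order
    for every school (enrollment, endowment per student, and so on).
    """
    schools = list(data)
    n_factors = len(data[target])
    # Standardize one factor (column) at a time across all schools.
    cols = [standardize([data[s][j] for s in schools]) for j in range(n_factors)]
    z = {s: [cols[j][i] for j in range(n_factors)] for i, s in enumerate(schools)}
    return sorted((math.dist(z[s], z[target]), s)
                  for s in schools if s != target)
```

The sort puts the most similar school first, which is how a list like the one below would be produced.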

Here are the top 22 on the closeness measure:

Lake Forest College **
Roanoke College **
McDaniel College
Juniata College **
Knox College **
Birmingham Southern College **
Moravian College
Washington & Jefferson College **
Saint Anselm College
Ohio Wesleyan University **
Assumption College
Susquehanna University **
Central College
Emmanuel College
Luther College **
Hartwick College
Cornell College
Mount St. Mary’s University
Hiram College **
Saint Johns University
Marietta College **
Allegheny College **

 

The highlighting (**) represents my musing… I had never heard of McDaniel College (my apologies to our friends out there). The others not highlighted seem to be outside our geographic area.

So if I were the peer group czar, I would choose our peer group by picking the first 12 colleges on the list that pass the laugh test. A general approach would be to start at the top and use a consensus method: Lake Forest (most like us on the gravity measure) is either in or out by consensus vote, and so on down the list.

Who is at the bottom of the list, using the above described model? (These schools are apparently not like us.)

Vassar College
Trinity University
Hamilton College
Claremont McKenna College
Bowdoin College
Wellesley College
Berea College
Cooper Union for the Advancement of Science and Art
Grinnell College
Williams College
Swarthmore College
Amherst College
Pomona College

How would you pick an aspirant group, using this method? Drop down to, say, the 100th position on the list, slice off the next 25 or so names, remove those who are unlike us because they are “lesser” institutions, and pick the dozen or so you would really like to be.

Perhaps a different list of factors, or unequal weighting of the 10 factors used, would generate a different peer list. But the bloodless approach I have described here removes the impressionistic factors often used to construct peer lists.

Liberal Arts at the Brink?

August 29th, 2011

A provocative new book of the same title by Victor E. Ferrall, Jr. is not always fun to read. Ty Buckman has arranged a discussion of the book for FA11, and the book is also being read by some members of our board of directors at the suggestion of David Boyle, our board chair.

Ferrall’s thesis is that our beloved institutional type is in danger of slipping away into vocational education or disappearing altogether. His logic is relentless, and he says we need to get our act together before it is too late. What we offer is too costly, no longer sufficiently in demand (or poorly understood and explained), or structurally inefficient.

Ferrall is not a crackpot. As past president of Beloit College, he has thought about the plight of liberal education a lot. He loves liberal education and wants it to survive into the future. In other words, he is like you and me.

One especially provocative suggestion of the book is that we need to collaborate rather than compete with other liberal arts institutions. What would this mean? Consider our sister institution down the road – Capital University. At least with respect to departmental structure, we are pretty similar. We both have Music, Math, Economics, etc. A boring but critically important ECON 101 concept – economies of scale – reveals the problem Ferrall takes up.

Here is a thought experiment I have contemplated recently. What if we “merged” with Capital but managed somehow to keep our separate identities? Lest you think that farfetched, what were once two Lutheran seminaries are now one. In another venue, school consolidation at the K-12 level has proceeded at a relentless pace for 50 years.

We could teach Music here and it would be discontinued at Capital. They could teach Economics or French and we would discontinue those departments. Students and faculty might seamlessly (remember, I am an economist, and we love frictionless planes, too!) go back and forth between Columbus and Springfield. Remember, the idea is to exploit economies of scale and eliminate duplication.

Whether or not a “merger” is too wild an idea to consider, Ferrall thinks we need to reconsider our business model and look for kinds of innovation we would not have considered in earlier times – all in the cause of saving what we love.

Please join the discussion led by Dar Brooks Hedstrom and me, upcoming in September. Look for a detailed announcement soon.

Assessment and Accountability

July 8th, 2011

Colleges and universities are finally getting more serious about how to assess learning outcomes. When I began teaching, assessment meant giving exams and doling out grades. Professors were independent authorities, standing above any other arbiter.

High, middle and elementary schools have been involved in assessment for some time now. Do you remember the Iowa exams? Now, the teaching careers of “lower” education faculty are staked on scores amassed by their minions.

Some college faculty and administrators see this new emphasis as a good thing, while some see it as an unwelcome intrusion by outsiders who know little about higher education. Possibly, these critics say, it is an attempt to replicate No Child Left Behind at the college level.

We do not do much direct assessment of learning outcomes in higher education, and that is true at Wittenberg also. The Physics department is an exception. This summer, Jeremiah Williams is involved in a grant funded effort to improve skill assessment at the course level.

Our assessment tends to be indirect. What is meant by indirect assessment? The IDEA forms administered in our classes are indirect assessment instruments. They ask students to rate their understanding of course goals instead of actually measuring those outcomes. We tout our NSSE results as proof of student engagement and rely on some studies connecting engagement to learning outcomes. But again, we cannot really say that NSSE scores directly measure learning outcomes.

There are ways to measure directly some essential learning outcomes. The GRE subject exams measure endpoint assessment in fields like ECON. While an improvement over indirect assessment, they cannot really measure value added unless they are accompanied by some pre-test. The Collegiate Learning Assessment (CLA) is another direct assessment instrument, designed to assess skills not tied to particular disciplines, but rather the kinds of skills we hope to see gained in our general education curriculum.
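The value-added point can be made concrete: a gain score requires both a pre-test and a post-test. A minimal sketch (a simple mean gain, not any particular instrument's scoring method, and the function name is mine):

```python
def mean_gain(pre_scores, post_scores):
    """Average post-test score minus average pre-test score for a cohort."""
    return sum(post_scores) / len(post_scores) - sum(pre_scores) / len(pre_scores)

# A post-test alone tells us where students ended up; only the pre/post
# difference speaks to what the program added.
mean_gain([50, 60], [70, 80])  # gain of 20 points
```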

The CLA has been in the news in connection with the book Academically Adrift. That book suggests that large numbers of college students are leaving with little or no progress on key skills we expect college students to have gained.

How should we respond to the call for assessment? We can ignore it and let our mission be shaped by others, or we can get busy and administer the CLA, the GRE or other instruments we believe measure and document our students’ outcomes. In this era of accountability, our students, their parents and accreditors will demand it.

Grade Inflation, Revisited (properly, this time)

June 17th, 2011

Earlier, I posted two items on grades at Witt. One dealt with how hard (at a point in time) particular departments grade compared to each other. Another dealt with how grades at Witt compare to grades at other schools, also at a point in time. It included the phrase “grade inflation” in the title, a really bad mistake for an economist! Why? Inflation deals with change over time.

Doug Andrews posted a comment with a question about grade inflation. He provided the pre-2000 data he had gathered in his capacity as a Phi Beta Kappa adviser. It represents the cumulative mean GPA for grads at the point of graduation. I have assembled data on the post-2002 mean cumulative GPA for grads.

Click on this image for a visual.

grade_inflation

It seems clear to me that grades have crept up over time.
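Since inflation is about change over time, the natural one-number summary of a chart like this is the least-squares slope of mean GPA on graduation year. A sketch (the GPA values in the usage line are illustrative, not Doug's or my actual figures):

```python
def trend_slope(years, gpas):
    """Least-squares slope: estimated change in mean GPA per year."""
    n = len(years)
    mx = sum(years) / n
    my = sum(gpas) / n
    return (sum((x - mx) * (y - my) for x, y in zip(years, gpas))
            / sum((x - mx) ** 2 for x in years))

# Illustrative numbers only -- not the actual Wittenberg series:
trend_slope([1995, 2000, 2005, 2010], [3.05, 3.12, 3.20, 3.28])
# about 0.015 GPA points per year
```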

Do Witt Faculty Have the “Right” Teaching / Research Balance?

June 8th, 2011

If you listen to the pundits in the popular press, faculty are selfishly concerned about their research and are neglectful of teaching. Is this true at Wittenberg?

87 faculty responded to the HERI Faculty Survey conducted in the SP11 semester. While the results are preliminary and do not yet allow us to compare ourselves to other colleges, we can get an early look at the views of our faculty on the teaching / research balance question. Question 9 asks

“Personally, how important to you is _______?”

 
 
Importance of Research      Teaching: Very important   Teaching: Essential   Total
Not important                           1                       4               5
Somewhat important                      6                      27              33
Very important                          4                      25              29
Essential                               4                      16              20
Total                                  15                      72              87

All 87 respondents say teaching is “essential” or “very important”, and none say it is “not important” or “somewhat important”. The distribution of responses on the importance of research suggests to me that our faculty value research, but they see it as less important than teaching, our primary mission.
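The marginals in the table are easy to recompute from the cell counts; a minimal sketch:

```python
# Cell counts from Question 9: rows are the importance of research,
# columns are the importance of teaching.
counts = {
    "Not important":      {"Very important": 1, "Essential": 4},
    "Somewhat important": {"Very important": 6, "Essential": 27},
    "Very important":     {"Very important": 4, "Essential": 25},
    "Essential":          {"Very important": 4, "Essential": 16},
}

row_totals = {r: sum(cells.values()) for r, cells in counts.items()}
grand_total = sum(row_totals.values())  # 87 respondents
# Faculty rating research "very important" or "essential":
research_high = row_totals["Very important"] + row_totals["Essential"]  # 49
share_high = research_high / grand_total  # about 56%
```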

Applicant Career Goals vs. Graduation Major

June 8th, 2011

We ask students what their career goals are when they apply to Wittenberg. Of course, we track our students as they graduate. How well do applicant intent and graduation major match up?

This table gives some interesting insights. (It is big and can’t really be replicated inside this window.)

The data come from the 2005 entering cohort and look at their May 2009 graduation results. Here is how to interpret the table for one group of students – those who declared a career interest in Business and graduated with a Management degree:

40 listed Business as their career goal (go to the end of the row labelled BUSINESS)

42 from the 2005 cohort actually graduated in May 2009 with a MGT major (go to the bottom of the MGT column)

24 who declared BUSI graduated in MGT (go to the BUSI / MGT crosshairs)

The data could be used to assess the “drift” away from intended field, “attraction” to a major, and “market share” gain and loss.
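Using the three numbers in the Business / Management example, those measures might be computed like this (the metric definitions below are my own reading of “drift”, “attraction”, and “market share”, not formulas from the table itself):

```python
def flow_metrics(declared, graduated, stayed):
    """Summaries for one career-goal / major pair in the crosstab.

    declared  -- students who listed the career goal on application (row total)
    graduated -- students who finished with the matching major (column total)
    stayed    -- students in both groups (the cell at the crosshairs)
    """
    drift = (declared - stayed) / declared         # share who left the intended field
    attraction = (graduated - stayed) / graduated  # share of majors drawn from elsewhere
    net_share = graduated - declared               # net gain (+) or loss (-) of students
    return drift, attraction, net_share

flow_metrics(40, 42, 24)  # BUSINESS -> MGT: drift 0.40, attraction ~0.43, net +2
```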

Test Score Optional (TSO)

June 7th, 2011

We implemented a TSO policy in 2008. Students could choose not to report their SAT and ACT scores to us. A number of unexpected things happened. First, not as many applicants chose TSO as many had predicted. Second, the people you would expect to choose TSO did not always do so, while some chose it and you wonder why they did. Also, our TSO average and the average for all tests did not differ much. (A number of students chose to send scores and later chose TSO.)

The following histogram (using FA11 depositor data from June 1) illustrates my observations. The vertical distance between the red and black horizontal bars at a given ACT score shows the number who chose TSO under the ACT test. (SAT results show a similar pattern.)

(Click on the graph to enlarge it.)

You might surmise that the averages did not change much as a result of TSO, and that would be correct. The TSO average was 25.7 vs. 24.9 for the average of all scores we could get. (Similarly, for the SAT, the averages were 1137 vs. 1119.)

Of the 597 June 1, 2011 depositors, 83 chose to drop their scores and another 82 never reported them to us in the first place.