Assembly members mull tool for evaluating administrators

By Mary Jo Frank

Recommendations for the evaluation of administrative officers, including a draft of an instrument for use by faculty in evaluating deans and other high-level administrators, will be fine-tuned and brought back to Senate Assembly.

In presenting the recommendations of the faculty’s Academic Affairs Committee to the Assembly last Monday, Chemistry Prof. Thomas M. Dunn said he doesn’t believe the evaluation process will lead to “administrative bashing,” adding that such bashing would not benefit anyone.

The Academic Affairs Committee recommends that the Assembly adopt the five-page draft instrument in principle and that a standing committee be set up to administer circulation of the instrument to appropriate faculty, to correlate the returns, and to communicate the results to the faculty, the Assembly and the administrators who were evaluated.

The draft instrument, similar to the Center for Research on Learning and Teaching (CRLT) evaluation form used by students to evaluate faculty, allows faculty to assess their dean’s performance in five areas: leadership, faculty and program development, fairness and ethics, communication, and administration. The five-point respondent scale represents a continuum of performance, from excellent to very poor. Respondents also are asked to indicate whether they have adequate information, probably have adequate information or lack adequate information to respond to each item.

The Academic Affairs Committee also recommends that:

—The standing committee prepare a review schedule so that each dean and Executive Officer is evaluated biennially, with the first review to take place no later than fall term 1993.

—The standing committee report back to the Assembly after each evaluation cycle so that its operation and value can be adequately discussed.

—Individual written comments on the evaluations be read by the committee but remain the property of the administrative officer being evaluated, to whom all reply forms will be given after they have been scored.

—The report to the faculty and the Assembly present only the average of responses to individual questions in each of the five categories.

—The raw data be preserved for the term of the office of the administrative officers or for a period of five years, whichever is longer, so that the effectiveness and consistency of the process can be professionally evaluated.

Assembly members raised a number of questions about the proposed draft.

Roy Penchansky, professor of health services management and policy, said he thinks the evaluation instrument would be more useful if respondents were asked to identify themselves by faculty rank, gender, and racial or ethnic origin.

Some faculty members also asked if a low response rate could skew the results negatively.

Senate Advisory Committee on University Affairs Chair Ejner Jensen said the faculty could decide not to publish the results if the response rate were below a certain percentage.

Dunn said he believes faculty will want to participate in the evaluation and that they, like students, will highlight good performance.

Provost Gilbert R. Whitaker Jr. said that one reason the CRLT evaluation form has a high response rate is that it is delivered in class. He also said that when the student evaluation form was introduced, results were not published until the tests were determined to be statistically valid.

Dunn said he is not concerned about the validity of the evaluation form because of the extensive involvement of the Institute for Social Research in its development.

Some faculty members wanted the Assembly to adopt the recommendations and draft instrument last Monday—a motion Jensen characterized as “unnecessary and premature.” The motion was defeated.

Dunn said he and the Academic Affairs Committee welcome faculty comments on the proposed evaluation form as they work to refine it.
