Campus Climate Study
Equity Study Committee Report March 2001
Table 1: Faculty Response Rate by Sex, Ethnicity and Unit
Table 2: Unclassified Staff Response Rate by Sex, Ethnicity and Unit
Table 3: Faculty: Comparison of Mean Responses for Male – Female and Minority – White Faculty
Table 4: Female Faculty: Mean Responses in Descending Order of Perceived Inequity
Table 5: Faculty of Color: Mean Item Responses in Descending Order of Perceived Inequity
Table 6: Faculty: Top Five Issues of Concern and Five Issues of Least Concern
Table 7: Unclassified Staff Mean Item Responses by Sex and Race/Ethnicity
Table 8: Unclassified Staff: Female Item Means in Descending Order of Perceived Inequity
Table 9: Minority Unclassified Staff Item Means in Descending Order of Perceived Inequity
Responses to the Equity Study
There were two versions of the survey: one for faculty and one for other unclassified staff. The Staff Survey was sent to all unclassified staff on the Lawrence and Edwards campuses listed in the Human Resources System in February 2000, excluding lecturers and instructors; only unclassified staff with 12-month appointments who did not hold tenure status received surveys. The Faculty Survey was sent to all tenure-track or tenured faculty in the Human Resources System in February 2000 with either nine- or 12-month appointments, including administrators.
Returned surveys were scanned, and the data were provided in MS Excel format by the KU Office of Institutional Research and Planning. After some minor cleaning, the data were formatted as a labeled SAS dataset.

The answers to the survey questions were scored with values 1 through 5, which presented some methodological choices. One question was whether response 1, "don't know/don't care," should be included in the analysis. Early on, a majority of the committee elected to treat those responses as missing data. As a result, responses 2 through 5, representing "no inequity," "mild inequity," "moderate inequity," and "severe inequity," were included in the analysis. Although various more advanced statistical methods were considered for characterizing these categorical answers, the committee concluded that a simple average of the responses was sufficient to reveal the most important findings. This approach had the major advantage of simplicity, in the sense that the results would be most accessible to a broad audience. Respondents who did not indicate their race, sex, or employment unit were excluded from the analysis. We calculated the average, or mean, score for each survey question for various categories of respondents.

The number of respondents who did not declare themselves to be white is extremely small, complicating comparisons of the experience of racial groups. For analysis purposes, faculty of color were aggregated into one category, and comparisons of white versus all other respondents were made. A few faculty and unclassified staff classified themselves as "other" and were included in the faculty or unclassified staff of color category.
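The scoring rule described above can be sketched in Python (the committee worked in SAS; the data and names below are hypothetical, for illustration only): code 1 is dropped as missing, and the item mean is taken over the substantive codes 2 through 5 within each respondent group.

```python
from statistics import mean

# Hypothetical response records for one survey item: (group label, raw code).
# Code 1 = "don't know/don't care" is treated as missing, per the committee's
# decision; codes 2-5 run from "no inequity" to "severe inequity".
responses = [
    ("female", 1), ("female", 3), ("female", 4), ("female", 5),
    ("male",   2), ("male",   2), ("male",   1), ("male",   3),
]

def group_means(records):
    """Average the substantive responses (codes 2-5) within each group."""
    by_group = {}
    for group, score in records:
        if score == 1:                      # missing: excluded from the mean
            continue
        by_group.setdefault(group, []).append(score)
    return {g: mean(vals) for g, vals in by_group.items()}

print(group_means(responses))
```

With the toy data above, the "female" mean is computed from three responses (the code-1 record is ignored), mirroring how "don't know/don't care" answers were excluded before averaging.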
The SAS GLM procedure was used to calculate significance tests. According to the SAS documentation, the results of GLM for a dichotomous classification (male vs female) are equivalent to a t-test, while the results for a variable with two or more categories are equivalent to a one-way analysis of variance.
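The equivalence noted in the SAS documentation can be checked numerically: for a dichotomous factor, the one-way ANOVA F statistic equals the square of the pooled-variance t statistic. The sketch below (not the committee's SAS code; the scores are hypothetical) implements both from first principles.

```python
from statistics import mean

# Hypothetical item scores for two groups (e.g. male vs female).
a = [2, 3, 4, 5]
b = [3, 4, 4, 5]

def pooled_t(x, y):
    """Equal-variance two-sample t statistic."""
    nx, ny = len(x), len(y)
    ssx = sum((v - mean(x)) ** 2 for v in x)
    ssy = sum((v - mean(y)) ** 2 for v in y)
    sp2 = (ssx + ssy) / (nx + ny - 2)               # pooled variance
    return (mean(x) - mean(y)) / (sp2 * (1 / nx + 1 / ny)) ** 0.5

def oneway_f(groups):
    """One-way ANOVA F statistic for any number of groups."""
    grand = mean([v for g in groups for v in g])
    ssb = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ssw = sum((v - mean(g)) ** 2 for g in groups for v in g)
    msb = ssb / (len(groups) - 1)                   # between-group mean square
    msw = ssw / (sum(len(g) for g in groups) - len(groups))  # within-group
    return msb / msw

t = pooled_t(a, b)
f = oneway_f([a, b])
print(round(f, 6), round(t ** 2, 6))   # the two values agree: F = t**2
```

This is why a GLM run on a two-level classification reproduces a t-test, while three or more levels give the general one-way analysis of variance.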
Open-ended comments for each question were read and placed into categories based on the central theme of the comment, and some sub-themes were identified. Responses were entered into Word and Excel tables so that comments could be sorted by theme, sex, and race/ethnicity.
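The sorting step can be sketched in Python (the committee used Word and Excel tables; the coded comments and field names below are hypothetical): each comment carries its assigned theme and demographic codes, and a multi-key sort orders them for review.

```python
# Hypothetical coded comments: assigned theme, sex, race/ethnicity, and text.
comments = [
    {"theme": "salary",   "sex": "F", "ethnicity": "White", "text": "..."},
    {"theme": "workload", "sex": "M", "ethnicity": "Black", "text": "..."},
    {"theme": "salary",   "sex": "F", "ethnicity": "Asian", "text": "..."},
]

# Sort by theme, then sex, then race/ethnicity, mirroring the table sort.
ordered = sorted(comments, key=lambda c: (c["theme"], c["sex"], c["ethnicity"]))
for c in ordered:
    print(c["theme"], c["sex"], c["ethnicity"])
```

Grouping all comments under a theme this way lets reviewers read, for example, every salary-related comment from female respondents together.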