Opinion

JENNIFER CALFAS

EDITOR IN CHIEF

AARICA MARSH and DEREK WOLFE

EDITORIAL PAGE EDITORS

LEV FACHER

MANAGING EDITOR

420 Maynard St.

Ann Arbor, MI 48109

tothedaily@michigandaily.com

Edited and managed by students at the University of Michigan since 1890.

Unsigned editorials reflect the official position of the Daily’s editorial board.

All other signed articles and illustrations represent solely the views of their authors.

The Michigan Daily — michigandaily.com
4 — Tuesday, November 10, 2015

Claire Bryan, Regan Detwiler, Ben Keller, Payton Luokkala, Aarica Marsh, Anna Polumbo-Levy, Melissa Scholke, Michael Schramm, Stephanie Trierweiler, Mary Kate Winn, Derek Wolfe

EDITORIAL BOARD MEMBERS

Course evals were created for students

As we rapidly approach the University’s bicentennial, Central Student Government is taking the chance to look back at the historically significant changes that University students have inspired in higher education for our country over the years. One of those major changes was the collection and release of course evaluation data.

The history of course evaluations at the University is undeniable. Course evaluations were established for students, by students, in 1969. Anyone who indicates otherwise is spreading a false narrative.

What’s more, Prof. Tim McKay provided members of the Senate Advisory Committee on University Affairs, and subsequently CSG, with a 1969 student viewpoint in The Michigan Daily that clearly explains the true history of course evaluations. Students, with the University’s support, established the Association for Course Evaluation, which offered students access to course evaluation data and counseling to help their peers obtain a reliable student perspective on courses in their selection process.

After seeing the success of the ACE office, every single department in LSA — then referred to as the Literary College — requested copies of the results, which were then used for promotion and tenure decisions. Individual professors also began approaching the ACE office hoping to obtain the student feedback for their own benefit, and their graduate student instructors’ benefit. The history here is unquestionable; students brought this service and offered it for the betterment of all, and now students are unable to access that very service.

On Oct. 26, the Senate Advisory Committee on University Affairs voted to continue denying students access to course evaluation data, in a shortsighted and disappointing decision that does not reflect the original purpose of course evaluations. Instead, the committee has attempted to indefinitely postpone student access to this data by calling for a new instrument with no clear timeline. While we appreciate the Faculty Senate’s willingness to collaborate moving forward, this vote represents a roadblock to ensuring informed academic decision-making by students.

The University currently stands at odds with its peer institutions because it doesn’t provide course evaluation data to its students. Harvard University; Yale University; Princeton University; the University of California, Berkeley; Pennsylvania State University; the University of Virginia; New York University; Stanford University; the University of Chicago; the Massachusetts Institute of Technology; the California Institute of Technology; Columbia University; Duke University; the University of California, Los Angeles; and countless other institutions are all successfully providing their students with course evaluation data, while our own administration is not. Once a leader in course evaluation, the University has since fallen far behind our country’s other top schools.

The CSG Executive Board has never wanted this to be a battle between professors and students, and we have worked hard to make that abundantly clear. Our goal is to share the true narrative surrounding course evaluation data and its historical significance, which clarifies why we have course evaluation data in the first place.

We are focused on ensuring that students ultimately have access to course evaluation data. The current evaluation instrument certainly has room for improvement, and the CSG Executive Board will readily participate in the proposed University-wide committee to review the current evaluation instrument. However, we will not accept an outcome that does not give students access to the data that was, without question, originally collected to support students’ course selection decisions. That being said, it’s also unrealistic for us to enter a course evaluation review process and expect that a radically different instrument will somehow satisfy all involved.

For that reason, we will stand resolute in ensuring full student access to course evaluation data by the time of course selection for Fall 2016 courses, whether it is provided by the University or a student organization.

We fully recognize the role that students must play in this. Ever since the school transitioned from paper to electronic course evaluations, student feedback has dropped notably. Even still, according to a report given to SACUA ahead of its vote, the decline in response rates has had no statistically significant impact on the evaluations. That being said, it’s important to note that course evaluation data will only be helpful for future students and faculty if current students take it seriously, but it is equally important to recognize that students need to be incentivized to take course evaluations seriously.

What incentivizes students to legitimately care about and be invested in course evaluations today? Currently, nothing tangible. What would incentivize students to do so? Giving class time for students to fill them out or requiring students to fill out their course evaluations before they receive access to the data are two options. These are solutions that CSG is ready to support, but using statistically insignificant decreases in student participation in course evaluations to argue against their release is disingenuous.

Moving forward, CSG President Cooper Charlton will meet this week with representatives of SACUA and University administrators to discuss our next steps regarding course evaluation data. At the meeting on Wednesday evening, we will maintain our commitment to the release of course evaluation data to the students of the University, signifying a return to the original purpose of course evaluations. Additionally, we will call for the University to release the data to students through academic advisors for the current course selection period, and for full access to the data by the selection period for the fall of 2016.

CSG President Cooper Charlton; Anushka Sarkar, CSG chief programming officer; and Sean Pitt, CSG chief of staff, on behalf of the CSG Executive Board.

Protecting political expression

VICTORIA NOBLE

Despite being a historically liberal institution, the University fails to include the prohibition of political or intellectual discrimination in its anti-discrimination policy. Across campus, the lack of institutional protection for unpopular opinions — including conservative views — can have a chilling effect on class discussion and student expression.

In all academic fields, discussion is critical to understanding the nuance of claims presented about the world. In some disciplines, like public policy — which strives to educate students not only on current policies but also on how to change them — understanding multiple angles of divisive political issues and working with people who hold opposing viewpoints is essential.

Discussions in my Ford School of Public Policy classes regularly prompt dialogue on contentious political issues like minimum wage laws, detention of terror suspects and the far right’s influence in Congress. Here, disagreements are political, and discussions often reveal ideology if students participate honestly. Group discussions can become more heated than average if one or two students disagree with the rest of their peers, and students may avoid expressing views that might conflict with the opinions they believe their professors hold.

Despite the opportunity for disagreement, faculty members require students to participate in discussion, and give participation considerable weight toward final grades. Most Public Policy professors with whom I have come into contact have been amenable to a diverse range of opinions. But when students can’t be sure how accepting a professor will be, they may elect not to express their views at all.

Susan Collins, dean of the Ford School, told me in an interview that most policy schools across the country tend to have greater liberal representation, and past surveys of Ford students show that relatively few self-identify as Republicans. Despite this, Collins believes that Ford students should be exposed to a wide range of viewpoints.

“I feel very strongly, and this is a view shared widely around the building,” Collins said, “that as a policy school, it’s really essential that people hear and understand and grapple with a range of perspectives, but in particular political perspectives.”

Even still, Collins said that students have told her of their discomfort in sharing their views in class when they didn’t think any of their classmates shared them.

As a Republican in the Ford School, I can certainly relate to these students. One of my policy classes spends a considerable amount of time in small group discussions, where, in my experience, the majority of group members have tended to share similar views on topics, and I’ve tended to disagree. Sharing my opinion with a group of people who believe the opposite can be intimidating, and has been met, on occasion, with sarcastic, less-than-flattering remarks.

I tend to be outgoing and outspoken, and probably more willing than average to share my thoughts with those who might disagree. If I were afraid of sharing my opinions with liberals, I probably wouldn’t have identified myself as a Republican in the Daily so many times over the past two years. But when my peers don’t take my opinions seriously or make negative comments, it makes me think twice before choosing to participate in class discussion again. If I have occasionally felt too uncomfortable to share my opinion in class, I can only imagine how my Republican peers might feel.

In most classes, faculty members seem to do their best to encourage dissenting opinions and highlight multiple sides of arguments. This isn’t just inclusive; it’s good pedagogy. To the extent that the Ford School aspires to produce effective public servants, it should strive to ensure its students can effectively communicate in institutional environments with far greater intellectual diversity than the Public Policy School itself. It can’t do that if only one side of every issue is afforded serious consideration.

Despite its expressed openness to ideas from all areas of the political spectrum, it’s clear that the school has a long way to go if it truly wants to become as tolerant of different ideologies as it strives to be. A solid first step toward this goal doesn’t have to be Ford-specific: Adding political and intellectual discrimination to the University-wide nondiscrimination policy could go a long way toward assuring students that their views will be respected.

Should the University continue to neglect this issue, the Public Policy School should draft an anti-discrimination policy of its own that specifically prohibits discrimination based on political expression or ideology, and strengthens existing measures to promote an inclusive environment conducive to active, vocal participation from all students, regardless of background or identity.

But making the Ford School as diverse in political thought as the government bodies many Ford students want to work in after graduation will require much more than a one-off policy change.

The Ford School should increase representation from political conservatives and Republicans. Better outreach to right-leaning student organizations will help students who don’t already have a network of Public Policy students to tell them about the program or help them through the application process.

Even more importantly, the Public Policy School needs to become a place that can credibly claim to prospective students that it accepts a wide range of viewpoints. This might include framing class discussions to highlight both sides of an argument, or assigning a paper in one of the required classes that asks students to argue a policy position they don’t support, prompting them to give serious consideration to an opposing point of view. The school might also encourage faculty to avoid party-line generalizations that tempt students to think of Republicans and Democrats as homogenous groups without ideological variation, pitted against each other without room for compromise.

This isn’t to say that conservative students can’t find a place at the Ford School, or benefit from the excellent classes it offers. On the contrary, I think that students holding underrepresented ideologies have a unique role to play in making the Ford School even better than it already is. By creating an environment in which students challenge each other to develop the best possible policy solutions, the Ford School can more effectively produce leaders who know how to work through the gridlock that plagues government today.

I doubt that this can happen in classes where the viewpoints expressed are frequently the same, either because there are too few students who disagree, or because the ones who disagree don’t feel comfortable expressing their views. Progress on this issue will benefit not only campus conservatives but also the Public Policy School itself, by helping it provide a more comprehensive education for all of its students.

— Victoria Noble can be reached at vjnoble@umich.edu.


“Netflix and chill” could start getting a lot harder

New testing plan a good (M-)STEP

Obama initiative will help fix Michigan exam

FROM THE DAILY

A solution to the problems with standardized testing practices has recently bubbled up. On Oct. 24, the Obama administration admitted it had pushed too far in holding state schools accountable based on their students’ standardized test scores. In light of this acknowledgment, the administration called on Congress to include in the Elementary and Secondary Education Act — a bill previously termed No Child Left Behind — specific measures that address the overemphasis on standardized testing. Two different versions of the ESEA were reauthorized in July by both the House and the Senate, and the fate of the bill lies in the two chambers’ ability to compromise.

As part of Obama’s Testing Action Plan, which accompanied this announcement, the Department of Education will release, by January 2016, a guidance plan for all states and districts detailing how to assess which standardized test practices will be fair, valid and efficient. These developments are especially compelling in the context of the disagreements surrounding the Michigan Student Test of Educational Progress, the new state-required standardized test that replaced the 44-year-old MEAP. The recently released and abysmally low M-STEP results indicate there are significant issues with the exam. To set Michigan students up for successful academic futures, the Michigan Legislature should use the results of this year’s M-STEP, combined with the guidelines set in the Testing Action Plan, to revise this new state-required test, and other state legislatures should follow suit.

Unlike the MEAP, the M-STEP is administered in the spring, which is advantageous to students because it tests them on material they just learned, rather than on material they learned three months earlier, in the previous spring. Another promising attribute is that the M-STEP aimed to incorporate Common Core standards by allowing schools to conduct the test online and including short-answer questions, in contrast to the MEAP, which only used multiple choice. These factors should have combined to create a “better” standardized test, but as Michigan students’ unbelievably low test scores reveal, this new test is very far from perfect.

In a phone interview with The Michigan Daily, Pamela Davis-Kean, professor of psychology and education at the University, said she takes the results with a grain of salt. Davis-Kean explained that by the time the federal government had made clear in 2014 the Common Core standards that state tests had to meet, there was little time for the state to actually create the new test. Also concerning is that the M-STEP removed time restrictions on students taking the test. Most students took eight to 11 hours to complete the test, and some took even longer. This large range inherently causes unreliability in the data.

“In general, no matter what the results are, they’re not going to be indicative of anything, because it wasn’t pretested — it didn’t go through the usual validations that most tests go through,” Davis-Kean said.

Guidelines explained in the Testing Action Plan could help solve some of the problems faced in creating and administering the M-STEP. The plan states that $403 million would go toward creating state assessments that align with college- and career-readiness standards. A separate $25 million will go to projects that help states develop new assessment models that would allow them to “address pressing needs they have identified for developing and implementing their assessments.” Now, with the resources of both more time and more money, the Michigan Department of Education is better prepared to improve the state-required test than it was at the beginning of the 2014-15 school year.

Calling for the state to place a cap on the amount of time districts and schools can spend testing, which the Testing Action Plan does, would discourage redundancy in the different tests schools are currently administering, and would therefore serve as a good first step toward greater efficiency. By limiting the time spent on standardized tests, the Obama administration will allow teachers to offer more content-based instruction that hones skills such as reading comprehension, critical thinking and rhetorical analysis.

Rather than requiring students to spend an excessive number of hours taking a myriad of standardized tests, a renewed emphasis on class content will give students a greater breadth of knowledge and skills than teaching to a standardized test allows.

The plan makes several other much-needed calls to action aimed at state governments, local districts and individual schools. One important aspect of the Testing Action Plan is that it encourages states to release test results in a timely manner and to make clear to parents, students, teachers and administrators what these results can be used for. More timely access to statewide and individual results can help students and teachers make changes to curricula and individualized teaching strategies for specific student needs.

In addition, the Testing Action Plan emphasizes flexibility in teacher preparation for these tests. The plan states, “as in other areas, we believe that student learning as measured by assessment results should be a part, not the sole determinant, of determining the quality of a particular program.” Instead of implying that educators should be “teaching to a test,” the plan seems to suggest that teachers’ performance should not be assessed solely based upon their students’ testing performance. After over a decade of overemphasizing the importance of student test scores in teacher evaluations, this is a refreshing and critical point for the government to concede.

The results of the M-STEP demonstrate that the Testing Action Plan is a long-overdue call for states not only to place a cap on the amount of classroom time schools require students to spend on testing, but also to create a state-administered test that is more efficient and effective in measuring student achievement. The ideal test would encompass Common Core standards and would go through extensive preparatory procedures before being given to students, so as to avoid the mistakes of the M-STEP. Streamlined testing will produce useful data and allow teachers to do what’s more important — focus on teaching content in the classroom.


