Friday, June 21, 2013
Re: SETs under attack again. . .
**************************************
ABSTRACT: POD’s Nira Hativa in her post “SETs under attack again. . .” at http://bit.ly/11GDlpH wrote (paraphrasing): “How would you respond to the recent article in Psychology Today, ‘Do the Best Professors Get the Worst Ratings? Do students give low ratings to teachers who instill deep learning?’ [Kornell (2013)] at http://bit.ly/190OXdi?”
On the basis of: (a) “Appearances Can Be Deceiving: Instructor Fluency Increases Perceptions of Learning Without Increasing Actual Learning” [Carpenter et al. (2013)] at http://bit.ly/18Ruqvl, and (b) “Does Professor Quality Matter? Evidence from Random Assignment of Students to Professors” [Carrell & West (2010)] at http://bit.ly/KtOnHp, Kornell concluded that “Student Evaluations Are of Questionable Value.”
Kornell's conclusion is consistent with Linda Nilson’s (2013a) post at http://bit.ly/1asQQRL, stating that “Carrell and West’s article is just one of several recent studies showing that student ratings are no longer positively related to student learning [as found by Cohen, 1981 at http://bit.ly/11TYGjV].”
NO LONGER? In “Re: Problems with Student Evaluations: Is Assessment the Remedy?” [Hake (2002a)] at http://bit.ly/hjt9ll I wrote (paraphrasing): “Neither Cohen nor any other SET champion has countered the fatal objection of McKeachie (1987) at http://bit.ly/18Ou9t6 that the evidence for the validity of SETs as gauges of the cognitive impact of courses rests for the most part on measures of students' lower-level thinking as exhibited in course grades or exams.”
Nevertheless, the tired debate on SETs continues unabated. A search of the archives of AERA-D, ASSESS, EvalTalk, Phys-L, PhysLrnR, POD, & STLHE-L for “student evaluations” yielded 762 hits on 5 April 2002 and 1,987 hits on 17 June 2013. Has the tenor of the debate changed over those 11 years? POD’s Ed Nuhfer (2013) at http://bit.ly/15lk3IC thinks so, pointing out that his 1990 opinion “SETs Are Direct Measures of Student Satisfaction,” once regarded as heretical, now appears to be mainstream.
Hativa's post stimulated (at last count) 112 POD posts, none of them mentioning pre/post testing with Concept Inventories http://bit.ly/dARkDY (developed through arduous qualitative and quantitative research by disciplinary experts) as a way to gauge students’ higher-order learning, even though such testing has been under way in physics education for almost three decades. In this post I reference 19 failed attempts to inform academia that “SETs Are Not Valid Measures of Students' Higher-Order Learning.”
**************************************
To access the complete 69 kB post please click on http://yhoo.it/19WKjA5 .
Richard Hake, Emeritus Professor of Physics, Indiana University
Links to Articles: http://bit.ly/a6M5y0
Links to Socratic Dialogue Inducing (SDI) Labs: http://bit.ly/9nGd3M
Academia: http://bit.ly/a8ixxm
Blog: http://bit.ly/9yGsXh
GooglePlus: http://bit.ly/KwZ6mE
Google Scholar: http://bit.ly/Wz2FP3
Twitter: http://bit.ly/juvd52
Facebook: http://on.fb.me/XI7EKm
LinkedIn: http://linkd.in/14uycpW
“The defenders of the SET process are generally found in the colleges of education, in the national teachers' unions, and among those who consult in the area. Their positive attitude toward SETs is compatible with a holistic environment that consists of positive research findings, currently accepted educational philosophy, and a communication system largely centered within their own academic disciplines. They are confident enough in their positive conclusions to dismiss negative findings as ‘myths’ (Aleamoni, 1999; Marsh & Roche, 2000) and to wonder why negative comments continue to be found in the literature (Theall & Franklin, 2001). Because instructional-related research is the province of their occupation, it would be expected that the majority of the research on SETs would come from those in educational disciplines.”
- Dennis Clayson (2009)
“Physics educators have led the way in developing and using objective tests to compare student learning gains in different types of courses, and chemists, biologists, and others are now developing similar instruments. These tests provide convincing evidence that students assimilate new knowledge more effectively in courses including active, inquiry-based, and collaborative learning, assisted by information technology, than in traditional courses.”
- William Wood & James Gentile (2003)
REFERENCES [URLs shortened by http://bit.ly/ and accessed on 21 June 2013.]
Hake, R.R. 2013. “Re: SETs under attack again. . .,” online on the OPEN! Net-Gold archives at http://yhoo.it/19WKjA5. Post of 20 Jun 2013 15:41:08-0700 to AERA-L and Net-Gold. The abstract and link to the complete post are being distributed to various discussion lists.
Saturday, March 19, 2011
Re: Interactive Engagement Typically Lowers Student Evaluations of Teaching?
The abstract reads:
**********************************************************
ABSTRACT: PhysLrnR’s Bill Goffe wrote (paraphrasing): “I thought I recalled reading here that interactive engagement typically lowers student evaluations of teaching, but I've not been able to find any such claims in the literature.”
Goffe's post initiated a 17-post thread (as of 19 March 2011, 15:47-0700) accessible at http://bit.ly/i9zBsd to those who take a few minutes to subscribe to PhysLrnR at http://bit.ly/beuikb.
Bill may have overlooked my post “Re: What if students learn better in a course they don't like?” [Hake (2006)]. Therein I wrote (condensing and paraphrasing):
“When I first started teaching an introductory physics course I followed the example of teaching-award-winning faculty and taught in a traditional manner: passive student lectures, lots of exciting demos, algorithmic problem exams, recipe labs, and a relatively easy final exam. I was gratified to receive a Student Evaluation of Teaching (SET) evaluation point average EPA = 3.38 [B plus on a scale of 1 - 4] for ‘overall evaluation of professor.’ Had I continued using traditional methods and giving easy exams I would doubtless have risen to become the U.S. Secretary of Education, or at least President of Indiana University.
Unfortunately for my academic career, I gradually caught on to the fact that students’ conceptual understanding of physics was not substantively increased by traditional pedagogy. I converted to the ‘Arons Advocated Method’ http://bit.ly/boeQQt of ‘interactive engagement.’ This resulted in average normalized gains g(ave) on the ‘Force Concept Inventory’ that ranged from 0.54 to 0.65, as compared to the g(ave) of about 0.2 typically obtained in traditional introductory mechanics courses.
But my EPAs for ‘overall evaluation of professor’ sometimes dipped as low as 1.67 (C-), and never returned to the 3.38 high that I had garnered by using traditional ineffective methods. My department chair and his executive committee, convinced by the likes of Peter Cohen (1981, 1990) that SETs are valid measures of the cognitive impact of introductory courses, took a very dim view of both my teaching and my educational activities.”
**********************************************************
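For readers unfamiliar with it, the average normalized gain g(ave) quoted above is the standard Hake (1998) measure: the class-average post-test score minus the pre-test score, divided by the maximum possible gain. A minimal sketch in Python (the score values below are illustrative, not data from the post):

```python
def normalized_gain(pre_pct: float, post_pct: float) -> float:
    """Hake's normalized gain g = (post - pre) / (100 - pre),
    with scores as class-average percentages (0-100)."""
    if not (0 <= pre_pct < 100) or not (0 <= post_pct <= 100):
        raise ValueError("scores must be percentages, with pre < 100")
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# Illustrative values: a traditional course vs. an interactive-engagement course.
traditional = normalized_gain(pre_pct=40.0, post_pct=52.0)  # g = 0.2
interactive = normalized_gain(pre_pct=40.0, post_pct=76.0)  # g = 0.6
```

Dividing by the maximum possible gain normalizes out the pre-test score, which is what lets courses with different starting populations be compared on one scale.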
To access the complete 13 kB post please click on http://bit.ly/gKWO1S.
Richard Hake, Emeritus Professor of Physics, Indiana University
Honorary Member, Curmudgeon Lodge of Deventer, The Netherlands
President, PEdants for Definitive Academic References which Recognize the Invention of the Internet (PEDARRII)
<rrhake@earthlink.net>
http://www.physics.indiana.edu/~hake
http://www.physics.indiana.edu/~sdi
http://HakesEdStuff.blogspot.com
http://iub.academia.edu/RichardHake
“Few faculty members have any awareness of the expanding knowledge about learning from psychology and cognitive science. Almost no one in the academy has mastered or used this knowledge base. One of my colleagues observed that if doctors used science the way college teachers do, they would still be trying to heal with leeches.”
- James Duderstadt (2000), President Emeritus and University Professor of Science and Engineering at the University of Michigan
REFERENCES [All URLs shortened by http://bit.ly/ and accessed on 19 March 2011.]
Duderstadt, J.J. 2000. A University for the 21st Century. Univ. of Michigan Press, publisher's information at http://bit.ly/cvJ1yI. Amazon.com information at http://amzn.to/fUnbj5, note the “Look Inside” feature.
Hake, R.R. 2011. “Re: Interactive Engagement Typically Lowers Student Evaluations of Teaching?” online on the OPEN! AERA-L archives at http://bit.ly/gKWO1S. Post of 19 Mar 2011 15:51:49-0700 to AERA-L, Net-Gold, and PhysLrnR. The abstract and link to the complete 13 kB post are also being transmitted to various discussion lists.