Friday, January 31, 2014
The Defiant Parents: Testing’s Discontents – Response to Hunt
***************************************************
ABSTRACT: In a post “Re: The Defiant Parents: Testing's Discontents” [Hake (2014)] at http://bit.ly/1mYwWoa, I pointed to the vigorous leadership, voluminous messaging, and pro-public-/anti-private-education positions of (a) Diane Ravitch and (b) FairTest http://www.fairtest.org/. Then I commented that neither appeared to be informed regarding the virtues of rigorous measurement of students' higher-order learning by means of zero-stakes formative evaluation, “designed and used to improve an intervention, especially when it is still being developed” [JCSEE, as copied on p. 132 of Frechtling et al. (2010) at http://bit.ly/1aYcgYn].
In response, Russ Hunt (2014) at http://bit.ly/1hSsq6L wrote: “The virtues of rigorous testing aren't really the point: it's how the tests are administered and what uses they’re put to that Ravitch and Fair Test (and I) are concerned with.”
However, it IS to the point for many of those who wish to enhance students’ higher-level learning, as I have emphasized in, e.g.:
1. “Lessons from the Physics Education Reform Effort” [Hake (2002)] at http://bit.ly/aL87VT;
2. “The Physics Education Reform Effort: A Possible Model for Higher Education” [Hake (2005)] at http://bit.ly/9aicfh;
3. “Should We Measure Change? Yes!” [Hake (2007a)] at http://bit.ly/d6WVKO (2.5 MB);
4. “Re: pre-to-post tests as measures of learning/teaching” [Hake (2008a)] at http://bit.ly/MmPxwp;
5. “Design-Based Research in Physics Education Research: A Review” [Hake (2008b)] at http://bit.ly/9kORMZ (1.1 MB);
6. “The Impact of Concept Inventories On Physics Education and Its Relevance For Engineering Education” [Hake (2011a)] at http://bit.ly/nmPY8F (8.7 MB);
7. “SET's Are Not Valid Gauges of Students' Higher-Level Learning #2” [Hake (2011b)] at http://bit.ly/jLZaz5;
8. “The NRC Finally Comes to Its Senses on Improving STEM Education” [Hake (2013a)] at http://bit.ly/154M5yf; and
9. “Can the Cognitive Impact of Calculus Courses be Enhanced?” [Hake (2013b)] at http://bit.ly/1loHgC4 (2.7 MB).
***************************************************
To access the complete 66 kB post please click on http://bit.ly/1lqiR4u.
Richard Hake, Emeritus Professor of Physics, Indiana University; Honorary Member, Curmudgeon Lodge of Deventer, The Netherlands; President, PEdants for Definitive Academic References which Recognize the Invention of the Internet (PEDARRII); LINKS TO: Academia http://bit.ly/a8ixxm; Articles http://bit.ly/a6M5y0; Blog http://bit.ly/9yGsXh; Facebook http://on.fb.me/XI7EKm; GooglePlus http://bit.ly/KwZ6mE; Google Scholar http://bit.ly/Wz2FP3; Linked In http://linkd.in/14uycpW; Research Gate http://bit.ly/1fJiSwB; Socratic Dialogue Inducing (SDI) Labs http://bit.ly/9nGd3M; Twitter http://bit.ly/juvd52.
“Physics educators have led the way in developing and using objective tests to compare student learning gains in different types of courses, and chemists, biologists, and others are now developing similar instruments. These tests provide convincing evidence that students assimilate new knowledge more effectively in courses including active, inquiry-based, and collaborative learning, assisted by information technology, than in traditional courses.” - Wood & Gentile (2003).
REFERENCES [URLs shortened by http://bit.ly/ and accessed on 31 Jan 2014.]
Hake, R.R. 2014. “The Defiant Parents: Testing's Discontents – Response to Hunt,” online on the OPEN! AERA-L archives at http://bit.ly/1lqiR4u. The abstract and link to the complete post are being transmitted to several discussion lists.
Wood, W.B., & J.M. Gentile. 2003. “Teaching in a research context,” Science 302: 1510; 28 November; online as a 213 kB pdf http://bit.ly/SyhOvL, thanks to Ecoplexity http://bit.ly/152aFQ9.
Wednesday, January 29, 2014
Two Different Meanings of “Formative Evaluation” #2
*******************************************************
ABSTRACT: In response to my post “FairTest Appears To Be Uninformed on ‘Formative Evaluation’ In The JCSEE Sense” at http://bit.ly/1dL3c5K, Michael Paul Goldenberg and FairTest’s Monty Neill posted what appeared to be non sequiturs on EDDRA2, symptomatic of the general failure of educators to recognize the existence of two very different meanings of “Formative Evaluation”:
(1) “Evaluation designed and used to improve an intervention, especially when it is still being developed” – JCSEE (1994), as copied on p. 132 of Frechtling et al. (2010) at http://bit.ly/1aYcgYn . . . . (FE-JCSEE)
(2) “All those activities undertaken to provide information to be used as feedback so as to adapt the teaching to meet student needs” – paraphrased from Black & Wiliam (1998, p. 2) at http://bit.ly/1jTqiwK . . . . (FE-B&W)
An example of FE-JCSEE is zero-stakes pre/post testing utilizing Concept Inventories http://bit.ly/dARkDY, which are constructed by disciplinary experts through arduous qualitative and quantitative research – see e.g.: (a) “The Impact of Concept Inventories on Physics Education and Its Relevance for Engineering Education” [Hake (2011)] at http://bit.ly/nmPY8F (8.7 MB), and (b) “Can the Cognitive Impact of Calculus Courses be Enhanced?” [Hake (2013)] at http://bit.ly/1loHgC4 (2.7 MB).
An example of FE-B&W is its use in the “interactive engagement” (IE) methods that have been shown [Hake (1998a) at http://bit.ly/9484DG, and many others] to achieve average normalized gains g(ave) on Concept Inventories that are about two standard deviations above those of traditional passive-student lecture courses. Here IE methods are defined [Hake (1998a)] as: “methods designed to promote conceptual understanding through the active engagement of students in heads-on (always) and hands-on (usually) activities that yield immediate feedback through discussion with peers and/or instructors.”
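The normalized gain invoked above can be made concrete with a short sketch (Python; the class averages below are hypothetical illustrations, not data from Hake 1998a). The class-average normalized gain is the gain achieved as a fraction of the maximum possible gain, g = (post% − pre%) / (100 − pre%):

```python
def normalized_gain(pre_avg, post_avg):
    """Class-average normalized gain:
    g = (<post%> - <pre%>) / (100 - <pre%>),
    i.e., the gain achieved as a fraction of the maximum possible gain.
    Both arguments are class-average scores in percent (0-100)."""
    if not 0 <= pre_avg < 100:
        raise ValueError("pre_avg must be in [0, 100)")
    return (post_avg - pre_avg) / (100.0 - pre_avg)

# Hypothetical classes with the same pretest average (illustrative only):
g_traditional = normalized_gain(pre_avg=45.0, post_avg=57.0)
g_interactive = normalized_gain(pre_avg=45.0, post_avg=80.0)
print(round(g_traditional, 2))  # 0.22
print(round(g_interactive, 2))  # 0.64
```

Because g is normalized by the room left for improvement, it allows comparison of courses with quite different incoming populations, which is what makes it useful for the IE-vs-traditional comparisons discussed above.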
******************************************************
To access the complete 90 kB post please click on http://bit.ly/1e8Zhpr.
Richard Hake, Emeritus Professor of Physics, Indiana University.
“There is substantial evidence that scientific teaching in the sciences, i.e., teaching that employs instructional strategies that encourage undergraduates to become actively engaged in their own learning, can produce levels of understanding, retention and transfer of knowledge that are greater than those resulting from traditional lecture/lab classes. But widespread acceptance by university faculty of new pedagogies and curricular materials still lies in the future.” - Robert DeHaan (2005)
REFERENCES [URLs shortened by http://bit.ly/ and accessed on 29 Jan 2014.]
Hake, R.R. 2014. “Two Different Meanings of 'Formative Evaluation' #2” online on the OPEN! AERA-L archives at http://bit.ly/1e8Zhpr. The abstract and link to the complete post are being transmitted to several discussion lists.
DeHaan, R.L. 2005. “The Impending Revolution in Undergraduate Science Education,” Journal of Science Education and Technology 14(2): 253-269; online as a 152 kB pdf at http://bit.ly/ncAuQa.
Saturday, January 25, 2014
Re: The Defiant Parents: Testing’s Discontents
***********************************************
ABSTRACT: John Denker, in his PhysLrnR post “The Defiant Parents: Testing’s Discontents,” called attention to Rebecca Mead's New Yorker article of the same name at http://nyr.kr/M6qz4j. Mead wrote: “. . . . there is a burgeoning opt-out movement, with parents, teachers, and administrators questioning the efficacy of the tests as they are currently administered, in measuring both the performance of teachers and the progress of students.”
Denker wrote (paraphrasing): “There's a medium-sized revolt going on. But it's not a huge revolt because of weak messaging and weak leadership. Specifically the anti-testers haven't come up with a crisp description of what they are revolting against . . . . or where they would like to go instead.”
Denker is either oblivious or dismissive of the vigorous leadership, voluminous messaging, and pro-public-/anti-private-education positions of (a) Diane Ravitch and (b) “Fair Test” http://www.fairtest.org/ with its discussion list ARN-L with OPEN! archives at http://bit.ly/jeiTPm.
Unfortunately, both Ravitch and “Fair Test” appear to be uninformed regarding the virtues of rigorous measurement of students’ higher-order learning by means of zero-stakes FORMATIVE (i.e., “designed and used to improve an object, especially when it is still being developed”) pre/post testing utilizing Concept Inventories http://bit.ly/dARkDY which are constructed by disciplinary experts through arduous qualitative and quantitative research – see e.g., (a) “The Impact of Concept Inventories On Physics Education and Its Relevance For Engineering Education” (Hake, 2011) at http://bit.ly/nmPY8F (8.7 MB); and (b) “Can the Cognitive Impact of Calculus Courses be Enhanced?” (Hake, 2013) at http://bit.ly/1loHgC4 (2.7 MB).
***********************************************
To access the complete 61 kB post please click on http://bit.ly/1mYwWoa.
Richard Hake, Emeritus Professor of Physics, Indiana University.
REFERENCES [URL shortened by http://bit.ly/ and accessed on 25 Jan 2014.]
Hake, R.R. 2014. “Re: The Defiant Parents: Testing’s Discontents,” online on the OPEN! AERA-L archives at http://bit.ly/1mYwWoa. The abstract and link to the complete post are being transmitted to several discussion lists.
Friday, September 13, 2013
Re: Are Tenure Track Professors Better Teachers?
ABSTRACT: Scott Jaschik (2013), in his Inside Higher Ed report “The Adjunct Advantage” at http://bit.ly/19PGOZn, has pointed to “Are Tenure Track Professors Better Teachers?” [Figlio et al. (2013)] at http://bit.ly/1erGvKp. In my opinion, the latter's attempt to indirectly measure students’ learning in introductory courses by means of their next-class-taken performance is problematic at best.
Unfortunately, most of academia is either unaware or dismissive of the direct gauging of students’ higher-order learning by means of pre/post testing with Concept Inventories http://bit.ly/dARkDY, pioneered independently by economist Rendigs Fels (1967) at http://bit.ly/162KSBv and physicists Halloun & Hestenes (1985a) at http://bit.ly/fDdJHm.
For a discussion of pre/post testing with Concept Inventories, see e.g., “Should We Measure Change? YES!” [Hake (2013)] at http://bit.ly/d6WVKO. For recent use of this method see “The Calculus Concept Inventory - Measurement of the Effect of Teaching Methodology in Mathematics” [Epstein (2013)] at http://bit.ly/17a8XJd.
****************************************
To access the complete 12 kB post please click on http://bit.ly/1829UU4.
Richard Hake, Emeritus Professor of Physics, Indiana University
“Physics educators have led the way in developing and using objective tests to compare student learning gains in different types of courses, and chemists, biologists, and others are now developing similar instruments. These tests provide convincing evidence that students assimilate new knowledge more effectively in courses including active, inquiry-based, and collaborative learning, assisted by information technology, than in traditional courses.”
- William Wood & James Gentile (2003)
REFERENCES [URLs shortened by http://bit.ly/ and accessed on 13 Sept. 2013.]
Hake, R.R. 2013. “Re: Are Tenure Track Professors Better Teachers?” online on the OPEN! AERA-L archives at http://bit.ly/1829UU4. Post of 13 Sep 2013 16:41:24 -0400 to AERA-L and Net-Gold. The abstract and link to the complete post are being transmitted to various discussion lists.
Wood, W.B., & J.M. Gentile. 2003. “Teaching in a research context,” Science 302: 1510; 28 November; online as a 213 kB pdf http://bit.ly/SyhOvL.
Monday, May 20, 2013
Is Higher Education Running AMOOC?
*********************************************
ABSTRACT: My discussion-list post “Evaluating the Effectiveness of College” at http://yhoo.it/16cJ7HO concerned the failure of U.S. higher education to emphasize student learning rather than the delivery of instruction [Barr and Tagg (1995)] at http://bit.ly/8XGJPc. In response, a correspondent asked me “Is There Some Hope In Coursera’s Pedagogical Foundations?”
Despite the serious cracks detected in all but one of Coursera’s five pedagogical foundation stones, I don’t think Coursera is necessarily doomed to pedagogic collapse. Instead I think there may actually be some hope IF its MOOCs are evaluated by measurement of pre-to-post-course student learning gains using Concept Inventories http://bit.ly/dARkDY. If the physics education reform effort is any guide, then (a) such assessment will demonstrate that MOOCs are actually MOORFAPs (Massive Open Online Repetitions of FAiled Pedagogy), and (b) there will be some incentive to transform MOOCs into MOOLOs (Massive Open Online Learning Opportunities).
But even if MOOCs fail to become MOOLOs there still may be some hope since, as Keith Devlin (2013) points out at http://bit.ly/14440kt, MOOCs have the potential to uncover individuals world-wide who have the talent to learn from MOORFAPs, in the same way that most current professional physicists were able to learn physics from FAPs (Failed Academic Pedagogy).
For those who may wish to dig deeper into the MOOC milieu I recommend Nathan Heller’s (2013) scholarly “LAPTOP U: Has the future of college moved online?” at http://nyr.kr/10MmItb (probably for a limited time).
*********************************************
To access the complete 39 kB post please click on http://yhoo.it/12nPMZB.
Richard Hake, Emeritus Professor of Physics, Indiana University
There was a giant MOOC,
based on a little book http://nyti.ms/114klB5,
When MOOC hype was trending torrid.
And when MOOCS were good,
They were very, very good,
But when they were bad they were horrid http://nyti.ms/14ixnQ7.
Profs stood on their heads,
‘students’ watching from beds,
With nobody by for to hinder.
Peer-graded squalor,
plagiarized ‘http://bit.ly/10cZ7W0’ in the holler,
And drummed all their palms against Winders.
Foundations heard the noise,
and thought it was the boys,
Playing Coursera and edX.
They funded and Ventured,
noncredit adventures,
While we all suffer the headX.
- Slightly reformatted version of Sherman Dorn's (2013) “The MOOC Poem”
REFERENCES [URL’s shortened by http://bit.ly/ and accessed on 20 May 2013.]
Dorn, S. 2013. “The MOOC Poem,” Inside Higher Ed, 11 March; online at http://bit.ly/14wXPVG, with apologies to Henry Wadsworth Longfellow’s “There was a little girl” http://bit.ly/12M7gPM.
Hake, R.R. 2013. “Is Higher Education Running AMOOC?” online on the OPEN! Net-Gold archives at http://yhoo.it/12nPMZB. Post of 19 May 2013 18:47:06 -0700 to AERA-L and Net-Gold.
Wednesday, November 28, 2012
Concept Inventories Alone Don't Gauge “Good Teaching”
******************************************
ABSTRACT: In response to my post “The Value of Student Evaluations of Teaching” [Hake (2012b)] at http://bit.ly/XX7OUX, Math-Learn’s Ed Wall stated that: (a) he used approximations to both “Student Evaluations of Teaching” (SET’s) and “Concept Inventories” (CI’s), but wondered what a “good” evaluation of “teaching” was; and (b) he’s reasonably okay with a statement such as “I know good teaching when I see it” versus a statement such as “I know good teaching when I see a CI score.”
Let alone a single CI score, in my opinion even the normalized pre-to-posttest GAIN . . . .
In the editor-suppressed “Interactive-engagement methods in introductory mechanics courses” [Hake (1998b)] at http://bit.ly/aH2JQN, I pointed out on p. 14 that among desirable outcomes of the introductory physics course . . . .
******************************************
To access the complete 13 kB post please click on http://bit.ly/VeDTWI.
Richard Hake, Emeritus Professor of Physics, Indiana University
“What we assess is what we value. We get what we assess, and
if we don't assess it, we won't get it.”
- Lauren Resnick [quoted by Grant Wiggins (1990)]
REFERENCES [URL shortened by http://bit.ly/ and accessed on 28 Nov 2012.]
Hake, R.R. 2012a. “Concept Inventories Alone Don't Gauge ‘Good Teaching’,” online on the OPEN! AERA-L archives at http://bit.ly/VeDTWI. Post of 28 Nov 2012 13:24:52 -0800 to AERA-L and Net-Gold. The abstract and link to the complete post are being transmitted to several discussion lists.
Hake, R.R. 2012b. “The Value of Student Evaluations of Teaching,” online on the OPEN! AERA-L archives at http://bit.ly/XX7OUX. Post of 26 Nov 2012 13:35:35 -0800 to AERA-L and Net-Gold. The abstract and link to the complete post are being transmitted to several discussion lists and are also on my blog “Hake'sEdStuff” at http://bit.ly/TqLDls with a provision for comments.
Wiggins, G. 1990. “The Truth May Make You Free, But the Test May Keep You Imprisoned: Toward Assessment Worthy of the Liberal Arts,” AAHE Assessment Forum: 17-31; online at http://bit.ly/a7g09T.
Monday, November 26, 2012
The Value of Student Evaluations of Teaching
*******************************************
ABSTRACT: Bill Brescia of the DrEd list wrote: “We are looking for best practices to solicit student feedback on the teaching quality of individual faculty members where in some cases a fairly large number of faculty are involved in a single course. What mechanism do you use to survey the students?”
The value of student evaluations is a hotly contested topic, witness the 496,000 hits at http://bit.ly/TqEilE generated by a Google search for “Student Evaluations” (with the quotes) on 26 Nov 2012 12:58-0800.
Judging from my own experience, and after careful consideration of the above hits, my opinion is that:
a. Student Evaluations of Teaching (SET’s) are useful for gauging the affective impact of teaching, but are worse than useless for gauging the cognitive impact - see e.g. “Student Evaluations of Teaching Are Not Valid Gauges of Teaching Performance - Yet Again!” [Hake (2012)] at http://bit.ly/KGK687.
b. The cognitive impact of teaching is best measured by average normalized pre-to-post-test gains on “Concept Inventories” http://en.wikipedia.org/wiki/Concept_inventory - see e.g., “The Impact of Concept Inventories On Physics Education and Its Relevance For Engineering Education” [Hake (2011)] at http://bit.ly/nmPY8F.
*******************************************
To access the complete 6 kB post please click on http://bit.ly/XX7OUX.
Richard Hake, Emeritus Professor of Physics, Indiana University
REFERENCES [URL shortened by http://bit.ly/ and accessed on 26 Nov 2012.]
Hake, R.R. 2012. “The Value of Student Evaluations of Teaching,” online on the OPEN! AERA-L archives at http://bit.ly/XX7OUX. Post of 26 Nov 2012 13:35:35-0800 to AERA-L and Net-Gold. The abstract and link to the complete post are being transmitted to several discussion lists.
Wednesday, September 5, 2012
Assessment in Higher Education (was Re: Grading, Evaluation and Bonus Points)
*******************************************
ABSTRACT: Chris Rust (2012), in a POD post “The unscholarly use of numbers in our assessment practices; what will make us change?” at http://bit.ly/PIwxUd, raised the “fundamental question . . . what are marks, points or grades actually meant to represent?”
Consistent with Rust's concern for the meaning of grades, pre/post testing with Concept Inventories http://bit.ly/dARkDY – see e.g. http://bit.ly/aH2JQN – has strongly suggested that most course grades in traditional passive-student introductory physics lecture classes (not to mention most “Student Evaluations of Teaching” – see e.g. http://bit.ly/jLZaz5) are essentially meaningless as gauges of students’ higher-order learning, since:
(a) students in such courses attain pre-to-posttest gains that average only about 23% of the maximum possible gain; while at the same time
(b) it's probably safe to say that well over half of the students in those courses had received course grades of A, B, or C, normally (but erroneously) considered to mean, respectively, “excellent,” “good,” and “fair.”
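Point (a) can be put in score terms with a minimal sketch (Python; the 45% pretest average is a hypothetical illustration, not survey data), by inverting the normalized-gain relation: a class realizing fraction g of its maximum possible gain posts post% = pre% + g·(100 − pre%):

```python
def posttest_for_gain(pre_avg, g):
    """Posttest class average implied by normalized gain g:
    post% = pre% + g * (100 - pre%).
    pre_avg is the class-average pretest score in percent."""
    return pre_avg + g * (100.0 - pre_avg)

# A hypothetical class averaging 45% on the pretest that realizes only
# ~23% of its maximum possible gain ends up near 58% on the posttest --
# even though most of its students receive grades of A, B, or C.
print(posttest_for_gain(45.0, 0.23))
```

The sketch merely shows the arithmetic behind the disconnect: a 23%-of-maximum gain leaves the class answering barely more than half the conceptual questions correctly.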
Rust (2012) references his earlier article “Towards a scholarship of assessment” [Rust (2007)], with a preview at http://bit.ly/RErhV7, in which he stated “it is vital that we explicitly articulate and establish a scholarship of assessment, which should be at the very heart of our scholarship of teaching and learning”. . . . [[my bold text: NO! the bold text does NOT mean that Rust was “shouting”]]. . . . As a guide to such articulation, I recommend Peggy Maki’s excellent book Assessing for Learning: Building a Sustainable Commitment Across the Institution.
*******************************************
To access the complete 15 kB post please click on http://bit.ly/TXF4nx.
Richard Hake, Emeritus Professor of Physics, Indiana University
“What we assess is what we value. We get what we assess,
and if we don’t assess it, we won’t get it.”
- Lauren Resnick [quoted by Grant Wiggins (1990)]
REFERENCES [URL’s shortened by http://bit.ly/ and accessed on 05 Sept 2012.]
Hake, R.R. 2012. “Assessment in Higher Education (was Re: Grading, Evaluation and Bonus Points),” online on the OPEN AERA-L archives at http://bit.ly/TXF4nx. Post of 5 Sep 2012 12:12:19-0700 to AERA-L and Net-Gold. The abstract and link to the complete post are being transmitted to several discussion lists.
Wiggins, G. 1990. “The Truth May Make You Free, But the Test May Keep You Imprisoned: Toward Assessment Worthy of the Liberal Arts,” online at http://bit.ly/a7g09T on the MAA’s SAUM (Supporting Assessment in Undergraduate Mathematics) page “Getting Started With Assessment” http://bit.ly/LR1Exe.
Monday, August 13, 2012
What Mathematicians Might Learn From Physicists
*********************************************
ABSTRACT: Mary Shepherd of the RUME list has called attention to David Bressoud's recent MAA “Launchings” columns (a) “Learning from the Physicists” [Bressoud (2012a)] at http://bit.ly/MrAuyZ, and (b) “Barriers to Change” [Bressoud (2012b)] at http://bit.ly/NkW9dE.
Unfortunately, Bressoud neglects to point out that the most important lesson mathematicians might learn from physicists is the advantage of discovering what instructional methods do and do not work by means of pre/post testing with Concept Inventories http://en.wikipedia.org/wiki/Concept_inventory - see e.g., “Lessons from the Physics Education Reform Effort” [Hake (2002)] at http://bit.ly/aL87VT and “Bioliteracy and Teaching Efficiency: What Biologists Can Learn from Physicists” [Klymkowsky et al. (2003)] at http://bit.ly/9A1Arx. Pre/post testing with Concept Inventories has only recently been brought to math education by Jerry Epstein http://bit.ly/bqKSWJ with his “Calculus Concept Inventory.”
In my opinion, Bressoud is handicapped by the neglect of any mention of pre/post testing in his primary source “The Use of Research-Based Instructional Strategies in Introductory Physics: Where do Faculty Leave the Innovation-Decision Process?” [Henderson, Dancy, & Niewiadomska-Bugaj (2012)] at http://bit.ly/MWSxIU. The second sentence of their abstract reads: “Significant empirical research has shown that student learning can be substantially improved when instructors move from traditional, transmission-style instruction to more student-centered, interactive instruction [Bransford et al. (2000), Handelsman et al. (2004)].”
In referencing Bransford et al. (2000) and Handelsman et al. (2004), Henderson et al. carry on the PER tradition of mindlessly dismissing the breakthrough research of Halloun & Hestenes (1985a) - see e.g. “The Initial Knowledge State of College Physics Students” http://bit.ly/b1488v (scroll down to “Evaluation Instruments”). As far as I know, that research constituted the first “significant empirical research [showing] that student learning can be substantially improved when instructors move from traditional, transmission-style instruction to more student-centered, interactive instruction.” Instead of emphasizing the preeminent role of PER in education research, Henderson et al. erroneously imply that physicists simply followed the lead of cognitive scientists [Bransford et al. (2000)] and biologists [Handelsman et al. (2004)].
*********************************************
To access the complete 22 kB post please click on http://bit.ly/ROjN2T.
Richard Hake, Emeritus Professor of Physics, Indiana University
“Physics educators have led the way in developing and using objective tests to compare student learning gains in different types of courses, and chemists, biologists, and others are now developing similar instruments. These tests provide convincing evidence that students assimilate new knowledge more effectively in courses including active, inquiry-based, and collaborative learning, assisted by information technology, than in traditional courses.”
- William Wood & James Gentile (2003).
REFERENCES [URL's shortened by http://bit.ly/ and accessed on 13 August 2012.]
Hake, R.R. 2012. “What Mathematicians Might Learn From Physicists” online on the OPEN AERA-L archives at http://bit.ly/ROjN2T. Post of 13 Aug 2012 16:59:34 -0700 to AERA-L and Net-Gold. The abstract and link to the complete post are being transmitted to several discussion lists.
Wood, W.B. & J.M. Gentile. 2003. “Teaching in a research context,” Science 302: 1510; 28 November; online as a 213 kB pdf at http://bit.ly/SyhOvL.
Monday, June 25, 2012
In Defense of Pre/Post Testing With Concept Inventories
********************************************
Bruce Sherwood and Paul Camp, in recent posts on the “Physics Education Research Topical Group” (PERTG) discussion list, denigrate pre/post testing with concept inventories, typifying a trend in some physics education research circles to discount such practice as naive and useless.
I think such sentiment betrays an ignorance of the history of PER. In “Lessons from the Physics Education Reform Effort” [Hake (2002)] at http://bit.ly/aL87VT I wrote:
“For more than three decades, physics education researchers have repeatedly shown that traditional introductory physics courses with passive student lectures, recipe labs, and algorithmic problem exams are of limited value in enhancing students’ conceptual understanding of the subject. Unfortunately, this work was largely ignored by the physics and education communities until Halloun and Hestenes (1985a,b) devised the Mechanics Diagnostic test of conceptual understanding of Newtonian mechanics.”
For a review of pre/post testing with concept inventories in physics and engineering see “The Impact of Concept Inventories On Physics Education and Its Relevance For Engineering Education” [Hake (2011)] at http://bit.ly/nmPY8F (8.7 MB).
********************************************
To access the complete 12 kB post please click on http://bit.ly/LaPNbm.
Richard Hake, Emeritus Professor of Physics, Indiana University
"He who knows only his own generation
Remains always a child."
- Cicero (in Orator)
REFERENCES [URL shortened by http://bit.ly/ and accessed on 25 June 2012.]
Hake, R.R. 2012. “In Defense of Pre/Post Testing With Concept Inventories,” online on the OPEN! AERA-L archives at http://bit.ly/LaPNbm. Post of 25 Jun 2012 12:31:15-0700 to AERA-L and Net-Gold. The abstract and link to the complete post are being transmitted to several discussion lists.
Monday, May 21, 2012
Re: How Reliable Are the Social Sciences?
****************************************************
ABSTRACT: Rick Froman of the TIPS discussion list has pointed to a New York Times Opinion Piece “How Reliable Are the Social Sciences?” by Gary Gutting at http://nyti.ms/K0xVQL. Gutting wrote that Obama, in his State of the Union address http://wapo.st/JnuBCO cited “The Long-Term Impacts of Teachers: Teacher Value-Added and Student Outcomes in Adulthood” (Chetty et al., 2011) at http://bit.ly/KkanoU to support his emphasis on evaluating teachers by their students' test scores. That study purportedly shows that students with teachers who raise their standardized test scores are “more likely to attend college, earn higher salaries, live in better neighborhoods, and save more for retirement.”
After comparing the reliability of social-science research unfavorably with that of physical-science research, Gutting wrote [my italics]: “is there any work on the effectiveness of teaching that is solidly enough established to support major policy decisions? the case for a negative answer lies in the [superior] predictive power of the core natural sciences compared with even the most highly developed social sciences.”
Most education experts would probably agree with Gutting's negative answer. Even economist Eric Hanushek http://en.wikipedia.org/wiki/Eric_Hanushek, as reported by Lowrey http://nyti.ms/KnRvDh, states: “Very few people suggest that you should use value-added scores alone to make personnel decisions.”
But then Gutting goes on to write (slightly edited): “While the physical sciences produce many detailed and precise predictions, the social sciences do not. The reason is that such predictions almost always require randomized controlled trials (RCT’s), which are seldom possible when people are involved. . . . Jim Manzi . . . [[according to Wikipedia http://bit.ly/KqMf1M, a senior fellow at the conservative Manhattan Institute http://bit.ly/JvwKG1]] . . . in his recent book Uncontrolled http://amzn.to/JFalMD offers a careful and informed survey of the problems of research in the social sciences and concludes that non-RCT social science is not capable of making useful, reliable, and nonobvious predictions for the effects of most proposed policy interventions.” BUT:
(1) Randomized controlled trials may be the “gold standard” for medical research, but they are not such for the social science of educational research - see e.g., “Seventeen Statements by Gold-Standard Skeptics #2” (Hake, 2010) at http://bit.ly/oRGnBp.
(2) Unknown to most of academia, and probably to Gutting and Manzi, ever since the pioneering work of Halloun & Hestenes (1985a) at http://bit.ly/fDdJHm, physicists have been engaged in the social science of Physics Education Research that is “capable of making useful, reliable, and nonobvious predictions,” e.g., that “interactive engagement” courses can achieve average normalized pre-to-posttest gains about two standard deviations above those of comparison “traditional” passive-student lecture courses. This work employs pre/post testing with Concept Inventories http://en.wikipedia.org/wiki/Concept_inventory - see e.g., (a) “The Impact of Concept Inventories on Physics Education and Its Relevance For Engineering Education” (Hake, 2011) at http://bit.ly/nmPY8F, and (b) “Why Not Try a Scientific Approach to Science Education?” (Wieman, 2007) at http://bit.ly/anTMfF.
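For readers unfamiliar with the metric behind this claim, the average normalized gain of Hake (1998a) is <g> = (%post − %pre)/(100 − %pre): the fraction of the maximum possible pre-to-post improvement that a class actually achieved. A minimal sketch of the computation follows; the class averages are hypothetical, chosen only to make the arithmetic visible, and are not data from any of the studies cited above:

```python
def normalized_gain(pre_pct, post_pct):
    """Average normalized gain <g> = (%post - %pre) / (100 - %pre),
    i.e., the gain achieved as a fraction of the maximum possible gain
    (Hake 1998a). Scores are class-average percentages on a Concept
    Inventory given before and after instruction."""
    if not 0 <= pre_pct < 100:
        raise ValueError("pre-test class average must be in [0, 100)")
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# Hypothetical class averages (same pre-test, different pedagogy):
g_trad = normalized_gain(45.0, 58.0)  # "traditional" lecture course, ~0.24
g_ie = normalized_gain(45.0, 75.0)    # "interactive engagement" course, ~0.55
```

Note that dividing by (100 − %pre) is what lets courses with different pre-test averages be compared on a common footing, which is why raw post-test scores alone are a poorer gauge.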
****************************************************
To access the complete 26 kB post please click on http://bit.ly/K432fC.
Richard Hake, Emeritus Professor of Physics, Indiana University
rrhake@earthlink.net
Links to Articles: http://bit.ly/a6M5y0
Links to SDI Labs: http://bit.ly/9nGd3M
Blog: http://bit.ly/9yGsXh
Academia: http://iub.academia.edu/RichardHake
Twitter https://twitter.com/#!/rrhake
“In some quarters, particularly medical ones, the randomized experiment is considered the causal ‘gold standard.’ It is clearly not that in educational contexts, given the difficulties with implementing and maintaining randomly created groups, with the sometimes incomplete implementation of treatment particulars, with the borrowing of some treatment particulars by control group units, and with the limitations to external validity that often follow from how the random assignment is achieved.”
- Tom Cook & Monique Payne (2002, p. 174)
“. . .the important distinction. . .[between, e.g., education and physics]. . . is really not between the hard and the soft sciences. Rather, it is between the hard and the easy sciences.”
- David Berliner (2002)
“Physics educators have led the way in developing and using objective tests to compare student learning gains in different types of courses, and chemists, biologists, and others are now developing similar instruments. These tests provide convincing evidence that students assimilate new knowledge more effectively in courses including active, inquiry-based, and collaborative learning, assisted by information technology, than in traditional courses.”
- Wood & Gentile (2003)
REFERENCES [All URLs shortened by http://bit.ly/ and accessed on 21 May 2012.]
Berliner, D. 2002. “Educational research: The hardest science of all,” Educational Researcher 31(8): 18-20; online as a 49 kB pdf at http://bit.ly/GAitqc.
Cook, T.D. & M.R. Payne. 2002. “Objecting to the Objections to Using Random Assignment in Educational Research” in Mosteller & Boruch (2002).
Hake, R.R. 2012. “Re: How Reliable Are the Social Sciences?” online on the OPEN! AERA-L archives at http://bit.ly/K432fC. Post of 20 May 2012 20:08:07-0700 to AERA-L and Net-Gold. The abstract and link to the complete post are also being transmitted to several discussion lists.
Mosteller, F. & R. Boruch, eds. 2002. Evidence Matters: Randomized Trials in Education Research. Brookings Institution. Amazon.com information at http://amzn.to/n6T0Uo. A searchable expurgated Google Book Preview is online at http://bit.ly/mTcPIE.
Wood, W.B. & J.M. Gentile. 2003. “Teaching in a research context,” Science 302: 1510; 28 November; online to subscribers at http://bit.ly/9izfFz. A summary is online to all at http://bit.ly/9qGR6m.
Monday, May 14, 2012
Re: texts on educational assessment
***********************************************
ABSTRACT: In an EvalTalk thread “texts on educational assessment,” Teri Lyn Hinds wrote (paraphrasing) “I'm fond of Linda Suskie’s (2009) book Assessing Student Learning: A common sense guide http://bit.ly/mA5nVq.”
I THINK A WORD OF CAUTION IS IN ORDER. Judging from her ASSESS post “Re: pre- post testing in assessment” at http://bit.ly/9yIOyf [Suskie (2004)], Suskie, like many education specialists, appears to have little use for pre/post testing.
In sharp contrast, Peggy Maki http://www.peggymaki.com/, former Director of Assessment at the American Association for Higher Education (AAHE), realizes the value of pre/post testing, as evidenced in Chapter 4, “Identifying or Designing Tasks to Assess the Dimensions of Learning,” of her excellent book Assessing for Learning: Building a Sustainable Commitment Across the Institution.
In “Should We Measure Change? Yes!” [Hake (2011)] at http://bit.ly/d6WVKO (2.5 MB) I wrote: “Formative pre/post testing is being successfully employed to improve the effectiveness of courses in undergraduate astronomy, biology, chemistry, economics, engineering, geoscience, math, and physics. But such testing is still anathema to many members of the psychology-education-psychometric (PEP) community. I argue that this irrational bias impedes a much needed enhancement of student learning in higher education.”
Fortunately, the importance of the pre/post testing with Concept Inventories http://en.wikipedia.org/wiki/Concept_inventory has NOT been ignored by the NRC - see e.g., Promising Practices in Undergraduate Science, Technology, Engineering, and Mathematics Education: Summary of Two Workshops [NRC (2011)] at http://bit.ly/nCMLk7.
***********************************************
To access the complete 14 kB post please click on http://bit.ly/KnlDmy.
Richard Hake, Emeritus Professor of Physics, Indiana University
rrhake@earthlink.net
Links to Articles: http://bit.ly/a6M5y0
Links to SDI Labs: http://bit.ly/9nGd3M
Blog: http://bit.ly/9yGsXh
Academia: http://iub.academia.edu/RichardHake
Twitter https://twitter.com/#!/rrhake
“Physics educators have led the way in developing and using objective tests to compare student learning gains in different types of courses, and chemists, biologists, and others are now developing similar instruments. These tests provide convincing evidence that students assimilate new knowledge more effectively in courses including active, inquiry-based, and collaborative learning, assisted by information technology, than in traditional courses.”
Wood & Gentile (2003)
REFERENCES [All URLs shortened by http://bit.ly/ and accessed on 14 May 2012.]
Hake, R.R. 2012. “Re: texts on educational assessment,” online on the OPEN! AERA-L archives at http://bit.ly/KnlDmy. Post of 14 May 2012 12:30:04-0700 to AERA-L and Net-Gold. The abstract and link to the complete post are also being transmitted to several discussion lists.
Wood, W.B., & J.M. Gentile. 2003. “Teaching in a research context,” Science 302: 1510; 28 November; online to subscribers at http://bit.ly/9izfFz. A summary is online to all at http://bit.ly/9qGR6m.
Monday, December 5, 2011
Re: Research on the Extent of Active Learning
***************************************************
ABSTRACT: POD’s Rae Jean Goodman, in her post “Research on extent of active learning,” evidently equating active learning with collaborative work, posed this question (paraphrasing): “It is common knowledge that more ‘collaborative work’ is being assigned and carried out by students, but can anyone recommend authoritative reports or articles that attest to changing learning/teaching modalities?”
In “The Impact of Concept Inventories On Physics Education and Its Relevance For Engineering Education” [Hake (2011) http://bit.ly/nmPY8F (8.7 MB)] I discussed the implementation of non-traditional reform pedagogy in higher education - relevant to Goodman’s post because reform methods often involve collaborative work and/or active learning. Therein I:
(1) EMPHASIZED economist Bill Goffe’s complaint that psychologists Banta & Blaich (2011), evidently unaware of Physics Education Research, find few cases of improved learning after a teaching innovation despite the work of e.g., Hestenes et al. (1992), Hake (1998a), Crouch et al. (2007), and Deslauriers et al. (2011);
(2) POINTED OUT that:
(a) the glacial inertia of the educational system, though not well understood, appears to be typical of the slow Diffusion of Innovations [Rogers (2003)] in human society;
(b) there are at least “Eleven Barriers to Change in Higher Education”;
(c) even so, for physics education, Rogers’ early adopters of reform have now appeared at e.g., Harvard, North Carolina State University, MIT, the Univ. of Colorado, California Polytechnic at San Luis Obispo, and the Univ. of British Columbia, possibly presaging a Rogers take off for physics education reform, about two decades after the first use of Concept Inventories http://en.wikipedia.org/wiki/Concept_inventory ; and
(3) CONCLUDED that:
(a) Concept Inventories can stimulate reform, but judging from the results in physics it may take about two decades before even early adopters become evident;
(b) there are at least seven reasons why the rate of adoption of reforms may be greater in engineering education than in physics education.
***************************************************
To access the complete 27 kB post please click on http://bit.ly/u63GbO.
Richard Hake, Emeritus Professor of Physics, Indiana University
Honorary Member, Curmudgeon Lodge of Deventer, The Netherlands
President, PEdants for Definitive Academic References
which Recognize the Invention of the Internet (PEDARRII)
rrhake@earthlink.net
Links to Articles http://bit.ly/a6M5y0
Links to SDI Labs: http://bit.ly/9nGd3M
Blog: http://bit.ly/9yGsXh
Academia: http://iub.academia.edu/RichardHake
“Physics educators have led the way in developing and using objective tests to compare student learning gains in different types of courses, and chemists, biologists, and others are now developing similar instruments. These tests provide convincing evidence that students assimilate new knowledge more effectively in courses including active, inquiry-based, and collaborative learning, assisted by information technology, than in traditional courses.”
Wood & Gentile (2003)
REFERENCES [All URLs shortened by http://bit.ly/ and accessed on 5 Dec 2011.]
Hake, R.R. 2011. “Re: Research on the Extent of Active Learning,” online on the OPEN! AERA-L archives at http://bit.ly/u63GbO. Post of 4 Dec 2011 19:01:51 -0800 to AERA-L and Net-Gold. The abstract and link to the complete post are being transmitted to several discussion lists.
Wood, W.B., & J.M. Gentile. 2003. “Teaching in a research context,” Science 302: 1510; 28 November; online as a 209 kB pdf at http://bit.ly/oK46p7.
Sunday, October 16, 2011
No Standard Outcome Measures For Science Education? #2
The abstract reads:
*************************************************
ABSTRACT: Robin Millar and Jonathan Osborne, in Chapter 3, “Research and Practice: A Complex Relationship,” of Shelley et al. (2009), claimed that: (a) NO STANDARD OR COMMONLY AGREED OUTCOME MEASURES EXIST FOR ANY MAJOR TOPIC IN SCIENCE EDUCATION. . . . [[my CAPS]]. . . ; (b) the Force Concept Inventory (FCI) reflects a choice of values that is arguable; and (c) the FCI has not been subjected to the same rigorous scrutiny of factorial structure and content validity as have standard measures in psychology.
That no standard outcome measures exist for any major topic in science education is negated by the existence of Concept Inventories.
That the FCI reflects values that are arguable is correct only if the arguers think that there’s little value in students’ learning the basic concepts of Newtonian mechanics.
That the FCI has not been subjected to rigorous scrutiny of factorial structure ignores the 1995 factor analyses of Huffman & Heller and Heller & Huffman; and responses to those analyses by Hestenes & Halloun and Halloun & Hestenes.
That the FCI has not been subjected to rigorous scrutiny of content validity ignores section IIB, “Validity and reliability of the mechanics test” (Mechanics Diagnostic), in Halloun & Hestenes (1985a) - that verification of validity applies also to the FCI since it's almost the same as the Mechanics Diagnostic.
*************************************************
To access the complete 24 kB article please click on http://bit.ly/rfyamc.
Richard Hake, Emeritus Professor of Physics, Indiana University
Honorary Member, Curmudgeon Lodge of Deventer, The Netherlands
President, PEdants for Definitive Academic References
which Recognize the Invention of the Internet (PEDARRII)
rrhake@earthlink.net
http://www.physics.indiana.edu/~hake
http://www.physics.indiana.edu/~sdi
http://HakesEdStuff.blogspot.com
http://iub.academia.edu/RichardHake
“50 years of research, curriculum development, and implementation have not presented consistent and compelling patterns of outcomes.”
Shelley et al. (2009, p. 4), summarizing a claim by Osborne (2007)
“Physics educators have led the way in developing and using objective tests to compare student learning gains in different types of courses, and chemists, biologists, and others are now developing similar instruments. These tests provide convincing evidence that students assimilate new knowledge more effectively in courses including active, inquiry-based, and collaborative learning, assisted by information technology, than in traditional courses.”
Wood & Gentile (2003) "Teaching in a research context"
REFERENCES [All URLs shortened by http://bit.ly/ and accessed on 16 Oct 2011.]
Hake, R.R. 2011. “No Standard Outcome Measures For Science Education? #2” online on the OPEN! AERA-L archives at http://bit.ly/rfyamc . Post of 16 Oct 2011 11:04:41-0700 to AERA-L and Net-Gold. The abstract and link to the complete post were transmitted to various discussion lists.
Osborne, J. 2007. “In praise of armchair science education,” contained within E-NARST News 50(2), online as a 3.2 MB pdf at http://bit.ly/qsRwaK . The talk itself is online as a 112 kB pdf at http://bit.ly/r4Khl7.
Shelley, M.C., L.D. Yore, & B. Hand, eds. 2009. Quality Research in Literacy and Science Education: International Perspectives and Gold Standards. Springer, publisher's information at http://bit.ly/b58vbP. Amazon.com information at http://amzn.to/97OVJx, note the searchable “Look Inside” feature. Barnes & Noble information at http://bit.ly/p40bKu. An expurgated (teaser) version is online as a Google “book preview” at http://bit.ly/qK8T9P.
Wood, W.B., & J.M. Gentile. 2003. “Teaching in a research context,” Science 302: 1510; 28 November; online as a 209 kB pdf at http://bit.ly/oK46p7.
Wednesday, June 22, 2011
The Science/Math Education Shift from Teaching to Learning
The abstract reads:
*******************************************
ABSTRACT: This post is a slight expansion and improvement of an earlier post “Anatomy Education.” The revision was stimulated by the interesting 11-post POD thread “Re: Open inquiry is bad? (in some intro tech courses)” at http://bit.ly/miR63T, especially Anton Tolman’s (2011) emphasis on John Tagg’s (2003) important book The Learning Paradigm College.
Robin Hopkins in a POD post “SHIFT IN THE TEACHING OF SCIENCE” wrote: “I'm interested in the shift that is required of traditional anatomists as the medical school moves toward a curriculum that requires anatomy to be taught/learned in ways that are more aligned with the clinical application of anatomy than simply ‘knowing’ anatomy (usually for tests).”
If the tests are of the usual classroom type then they require only the regurgitation of memorized material rather than higher-order learning such as the understanding of scientific concepts. I suspect that that higher-order learning is required for the effective clinical application of anatomy just as it is for the successful pursuit of science/math professions generally.
In my opinion, THE MAJOR SHIFT IN SCIENCE/MATH EDUCATION IS THE SHIFT “FROM TEACHING TO LEARNING: A NEW PARADIGM FOR UNDERGRADUATE EDUCATION” [Barr & Tagg (1995), Tagg (2003)].
But unknown to most of academia, education researchers have developed “Concept Inventories” http://en.wikipedia.org/wiki/Concept_inventory that can be used in formative pre/post testing to gauge the impact of courses on students' learning and understanding of scientific concepts. At least in physics such testing demonstrates that “Interactive Engagement” (IE) courses result in course-averaged normalized learning gains g(ave) that are about two standard deviations above those of “Traditional” (T) passive-student lecture courses [Hake (1998a,b; 2008)].
I give 31 hot-linked references to some of the relevant literature.
*******************************************
To access the complete 24 kB post please click on http://bit.ly/ijJeCm.
Richard Hake, Emeritus Professor of Physics, Indiana University
Honorary Member, Curmudgeon Lodge of Deventer, The Netherlands
President, PEdants for Definitive Academic References which Recognize the Invention of the Internet (PEDARRII)
rrhake@earthlink.net
http://www.physics.indiana.edu/~hake
http://www.physics.indiana.edu/~sdi
http://HakesEdStuff.blogspot.com
http://iub.academia.edu/RichardHake
“…I point to the following unwelcome truth: much as we might dislike the implications, research is showing that didactic exposition of abstract ideas and lines of reasoning (however engaging and lucid we might try to make them) to passive listeners yields pathetically thin results in learning and understanding - except in the very small percentage of students who are specially gifted in the field.”
Arnold Arons in Teaching Introductory Physics (p. vii, 1997)
REFERENCES [URLs shortened by http://bit.ly/ and accessed on 22 June 2011.]
Arons, A.B. 1997. Teaching Introductory Physics. Wiley. Amazon.com information at http://amzn.to/bBPfop. Note the searchable “Look Inside” feature.
Hake, R.R. 2011. “The Science/Math Education Shift from Teaching to Learning” online on the OPEN! AERA-L archives at http://bit.ly/ijJeCm. Post of 22 Jun 2011 08:13:15-0700 to AERA-L and NetGold. The abstract and link to the complete 24 kB post are being transmitted to various discussion lists.
Sunday, June 19, 2011
Anatomy Education
The abstract reads:
****************************************
ABSTRACT: Robin Hopkins in a POD post “Shift in the teaching of science” wrote: “I'm interested in the shift that is required of traditional anatomists as the medical school moves toward a curriculum that requires anatomy to be taught/learned in ways that are more aligned with the clinical application of anatomy than simply ‘knowing’ anatomy (usually for tests).”
If the tests are of the usual classroom type then they require only the regurgitation of memorized material rather than higher-order learning such as the understanding of scientific concepts. I suspect that higher-order learning is required for the effective clinical application of anatomy.
In my opinion, the major shift in the teaching of science is the shift “From Teaching to Learning: A New Paradigm for Undergraduate Education” [Barr & Tagg (1995)]. Unknown to most of academia, education researchers have developed “Concept Inventories” http://en.wikipedia.org/wiki/Concept_inventory that can be used in formative pre/post testing to gauge the impact of courses on students' learning and understanding of scientific concepts.
At least in physics such testing demonstrates that “Traditional” (T) passive-student lecture courses result in course-averaged normalized learning gains g(ave) that are about two standard deviations below those of “Interactive Engagement” (IE) courses. I give 28 hot-linked references to some of the relevant literature.
*******************************************
To access the complete 22 kB post please click on http://bit.ly/m8e4v2.
Richard Hake, Emeritus Professor of Physics, Indiana University
Honorary Member, Curmudgeon Lodge of Deventer, The Netherlands
President, PEdants for Definitive Academic References which Recognize the Invention of the Internet (PEDARRII)
rrhake@earthlink.net
http://www.physics.indiana.edu/~hake
http://www.physics.indiana.edu/~sdi
http://HakesEdStuff.blogspot.com
http://iub.academia.edu/RichardHake
“[Educating Physicians] is a very important book that comes at a critical time in our nation’s history. We will not have enduring health care reform in this country unless we rethink our medical education paradigms. This book is a call to arms for doing just that.”
George E. Thibault, president, Josiah Macy, Jr. Foundation
REFERENCES [URLs shortened by http://bitly.com/ and accessed on 19 June 2011.]
Cooke, M., D.M. Irby, & B.C. O'Brien. 2010. Foreword by Lee S. Shulman. Educating Physicians: A Call for Reform of Medical School and Residency. Jossey-Bass, publisher's information at http://bit.ly/d52HEP. Amazon.com information at http://amzn.to/jhZJ0l. Note the searchable “Look Inside” feature.
Hake, R.R. 2011. “Anatomy Education,” online on the OPEN! AERA-L archives at http://bit.ly/m8e4v2. Post of 19 Jun 2011 14:15:36-0700 to AERA-L and NetGold. The abstract and link to the complete 22 kB post are being transmitted to various discussion lists.
Monday, March 14, 2011
Culture of Science Education - Response to Woods
Some blog followers might be interested in a recent post “Culture of Science Education - Response to Woods” [Hake (2011b)]. [“Woods” is Don Woods http://bit.ly/etekAw, Emeritus Professor of Chemical Engineering at McMaster University and problem-based-learning pioneer.]
The abstract reads:
************************************
ABSTRACT: In response to “Changing the Culture of Science Education at Research Universities #3” [Hake (2011a)], STLHE-L's Don Woods wrote (paraphrasing):
“. . . by my latest count there are at least 20 valid forms of evidence that can be used for measuring teaching ‘productivity.’ These include Concept Inventories . . . as well as well-designed course evaluations, . . . exams and assignments. . . . More details are given in my forthcoming book Motivating and Rewarding University Teachers to Improve Student Learning: A Guide for Faculty and Administrators.”
I comment with regard to (1) neglect of Campbell’s and Dunkenfield’s Laws, (2) “well-designed course evaluation,” (3) Concept Inventories, and (4) yet more ways to gauge “teaching productivity” or “student learning.”
************************************
To access the complete 19 kB post please click on http://bit.ly/fetCy6.
Richard Hake, Emeritus Professor of Physics, Indiana University
Honorary Member, Curmudgeon Lodge of Deventer, The Netherlands
President, PEdants for Definitive Academic References which Recognize the Invention of the Internet (PEDARRII)
rrhake@earthlink.net
http://www.physics.indiana.edu/~hake
http://www.physics.indiana.edu/~sdi
http://HakesEdStuff.blogspot.com
http://iub.academia.edu/RichardHake
“What we assess is what we value. We get what we assess, and if we don't assess it, we won't get it.”
- Lauren Resnick [quoted by Grant Wiggins (1990)]
REFERENCES [URLs shortened by http://bit.ly/ and accessed on 14 March 2011.]
Hake, R.R. 2011a. “Changing the Culture of Science Education at Research Universities #3,” online on the OPEN! AERA-L archives at http://bit.ly/gSNTGi. Post of 12 Mar 2011 16:53:33-0800 to AERA-L & Net-Gold. The abstract and link to the complete post were transmitted to various discussion lists and are also online on my blog “Hake'sEdStuff” at http://bit.ly/hmX5GL with a provision for comments.
Hake, R.R. 2011b. “Culture of Science Education - Response to Woods” online on the OPEN! AERA-L archives at http://bit.ly/fetCy6. Post of 14 Mar 2011 15:30:33-0700 to AERA-L & Net-Gold. The abstract and link to the complete post were transmitted to various discussion lists.
Wiggins, G. 1990. “The Truth May Make You Free, But the Test May Keep You Imprisoned: Toward Assessment Worthy of the Liberal Arts,” AAHE Assessment Forum: 17-31; online at http://bit.ly/a7g09T.
Friday, February 18, 2011
Are Ed Administrators Aware of Physics Ed Research?
Some blog followers might be interested in a recent post “Are Ed Administrators Aware of Physics Ed Research?” [Hake (2011)]. The abstract reads:
ABSTRACT: PhysLrnR’s John (Texas) Clement posed the question (paraphrasing): “Are Ed Administrators Aware of Physics Ed Research (PER)?”
Bill Goffe responded (paraphrasing): “I would guess that PER is rarely mentioned in conferences that Ed administrators might attend or material they're likely to read: e.g., American Educator http://bit.ly/exyqTr or the Chronicle of Higher Education's ProfHacker blog http://bit.ly/ejFDah.”
There have been a few isolated mentions of PER in the:
a. American Educator: Paul Gross’ “Learning Science: Content - with Reason” at http://bit.ly/fGnzMp.
b. Chronicle's ProfHacker blog: Heather Whitney's “Just-in-Time Teaching: Beyond Physics” at http://bit.ly/dX1hRW and “Concept Inventories: Beyond Physics” at http://bit.ly/hlgmJ1.
********************************************
To access the complete 13 kB post please click on http://bit.ly/fyACYK.
Richard Hake, Emeritus Professor of Physics, Indiana University
Honorary Member, Curmudgeon Lodge of Deventer, The Netherlands
President, PEdants for Definitive Academic References which Recognize the Invention of the Internet (PEDARRII)
rrhake@earthlink.net
http://www.physics.indiana.edu/~hake
http://www.physics.indiana.edu/~sdi
http://HakesEdStuff.blogspot.com
http://iub.academia.edu/RichardHake
“An important scientific innovation. . .[or ‘unorthodox idea’] . . . rarely makes its way by gradually winning over and converting its opponents: it rarely happens that Saul becomes Paul. What does happen is that ITS OPPONENTS GRADUALLY DIE OUT and that the growing generation is familiarized with the idea from the beginning: another instance of the fact that the future lies with the young.”
Max Planck (1936)
REFERENCES [URLs shortened by http://bit.ly/ and accessed on 18 Feb 2011.]
Hake, R.R. 2011. “Are Ed Administrators Aware of Physics Ed Research?” online on the OPEN! AERA-L archives at http://bit.ly/fyACYK. Post of 18 Feb 2011 14:23:50-0800 to AERA-L, Net-Gold, and PhysLrnR. The abstract and link to the complete post are being transmitted to various discussion lists.
Planck, M. 1936. The Philosophy of Physics. W.W. Norton. Amazon.com information at http://amzn.to/e0Ao1G.
Wednesday, January 5, 2011
Learning Outcomes: Face-to-Face vs Online #3
Some blog followers might be interested in a recent post “Learning Outcomes: Face-to-Face vs Online #3” [Hake (2011)]. The abstract reads:
*************************************
ABSTRACT: In an earlier post, “Learning Outcomes: Face-to-Face vs Online,” I responded to a question posed by STLHE-L’s Martin Rosenzweig: “Does anyone know of any published studies comparing online to face-to-face instruction with regards to learning outcomes?” I wrote: “As far as I know the answer is ‘NO.’ The reason is [as pointed out in ‘Can Distance and Classroom Learning Be Increased?’ (Hake, 2008a)] ‘scholars of teaching and learning continue to rely on low-resolution gauges of students’ learning.’ ”
In response, several discussion-list subscribers called attention to “Evaluation of Evidence-Based Practices in Online Learning: A Meta-Analysis and Review of Online Learning Studies” [USDE (2009)] at http://bit.ly/e1VXvA, of which I had been unaware.
In my opinion, the USDE (2009) study is yet another example of reliance on low-resolution gauges of students’ learning. On pages 11-12 it is stated that examples of learning outcome measures included: (a) scores on standardized tests, (b) scores on researcher-created assessments, (c) grades/scores on teacher-created assessments (e.g., assignments, midterm/final exams), and (d) grades or grade point averages.
But among lessons of the physics education research effort [Hake (2002)] are that: (1) “c” and “d” are *invalid* measures of students' *higher-order* learning; and (2) analyses of “a” and “b” are best carried out in terms of the course average *normalized* gain [g], ignored in USDE (2009).
Furthermore, on page 18 of http://bit.ly/e1VXvA the USDE report states: “The mean effect size for all 50 contrasts of online vs face-to-face instruction was +0.20.”
Contrast the above with the effect size d = +2.43 for the superiority of [[g]] (the average of the course average [g]) for 48 face-to-face “interactive engagement” physics courses vs 14 face-to-face “traditional” introductory physics courses [Hake (1998a,b; 2002; 2008b)].
I suspect that similar large effect sizes would be found for the superiority of online “interactive engagement” courses vs online “traditional” courses.
In my opinion it makes little sense to meta-analyze online vs face-to-face instruction without taking into account the relatively large effects on higher-order learning of “interactive-engagement” vs “traditional” instruction.
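For readers unfamiliar with the metric compared above, an effect size of this kind is the difference between two group means divided by a pooled standard deviation. A minimal sketch follows; the numbers are hypothetical, chosen only so the arithmetic is visible and of the same order as the 48-course vs 14-course comparison, and are not Hake's actual data:

```python
import math

def effect_size(mean1, mean2, sd1, sd2, n1, n2):
    """Cohen's-d-style effect size: the difference between two group
    means divided by the pooled standard deviation of the two groups."""
    sd_pooled = math.sqrt(
        ((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2)
    )
    return (mean1 - mean2) / sd_pooled

# Hypothetical averages of course-average normalized gains [g]:
# 48 "interactive engagement" courses vs 14 "traditional" courses.
d = effect_size(0.48, 0.23, 0.10, 0.10, 48, 14)  # -> 2.5
```

An effect size above 2 means the two distributions barely overlap, which is why a d of about +2.4 for interactive engagement dwarfs the +0.20 mean effect size the USDE report found for online vs face-to-face contrasts.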
*************************************
To access the complete 20 kB post please click on http://bit.ly/egC8I3.
Richard Hake, Emeritus Professor of Physics, Indiana University
Honorary Member, Curmudgeon Lodge of Deventer, The Netherlands
President, PEdants for Definitive Academic References which Recognize the Invention of the Internet (PEDARRII)
rrhake@earthlink.net
http://www.physics.indiana.edu/~hake
http://www.physics.indiana.edu/~sdi
http://HakesEdStuff.blogspot.com
http://iub.academia.edu/RichardHake
“Physics educators have led the way in developing and using objective tests to compare student learning gains in different types of courses, and chemists, biologists, and others are now developing similar instruments. These tests provide convincing evidence that students assimilate new knowledge more effectively in courses including active, inquiry-based, and collaborative learning, assisted by information technology, than in traditional courses.”
Wood & Gentile (2003)
REFERENCES [URLs shortened by http://bit.ly/ and accessed on 5 Jan 2011.]
Hake, R.R. 2002. “Lessons from the physics education reform effort,” Ecology and Society 5(2): 28; online at http://bit.ly/aL87VT.
Hake, R.R. 2008a. “Can Distance and Classroom Learning Be Increased?” IJ-SoTL 2(1): January; online at http://bit.ly/98dL0Y.
Hake, R.R. 2011. “Learning Outcomes: Face-to-Face vs Online #3,” online on the OPEN! AERA-L archives at http://bit.ly/egC8I3. Post of 5 Jan 2011 13:54:14 -0800 to AERA-L and Net-Gold. The abstract and link to the complete post are being transmitted to various discussion lists.
