Friday, September 13, 2013
Re: Are Tenure Track Professors Better Teachers?
ABSTRACT: Scott Jaschik (2013), in his Inside Higher Ed report “The Adjunct Advantage” at http://bit.ly/19PGOZn, has pointed to “Are Tenure Track Professors Better Teachers?” [Figlio et al. (2013)] at http://bit.ly/1erGvKp. In my opinion, the latter’s attempt to indirectly measure students’ learning in introductory courses by means of their next-class-taken performance is problematic at best.
Unfortunately, most of academia is either unaware of, or dismissive of, the direct gauging of students’ higher-order learning by means of pre/post testing with Concept Inventories http://bit.ly/dARkDY, pioneered independently by economist Rendigs Fels (1967) at http://bit.ly/162KSBv and physicists Halloun & Hestenes (1985a) at http://bit.ly/fDdJHm.
For a discussion of pre/post testing with Concept Inventories see, e.g., “Should We Measure Change? YES!” [Hake (2013)] at http://bit.ly/d6WVKO. For a recent use of this method see “The Calculus Concept Inventory - Measurement of the Effect of Teaching Methodology in Mathematics” [Epstein (2013)] at http://bit.ly/17a8XJd.
****************************************
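For readers new to this literature, the figure of merit generally used in such pre/post Concept Inventory testing is the class-average normalized gain introduced in Hake (1998), i.e., the actual average gain divided by the maximum possible average gain:

\langle g \rangle = \frac{\langle \%\,\mathrm{post} \rangle - \langle \%\,\mathrm{pre} \rangle}{100\% - \langle \%\,\mathrm{pre} \rangle}

so that, for example, a class averaging 30% on the pretest and 65% on the posttest has \langle g \rangle = 35/70 = 0.5.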
To access the complete 12 kB post please click on http://bit.ly/1829UU4 .
Richard Hake, Emeritus Professor of Physics, Indiana University
Links to Articles: http://bit.ly/a6M5y0
Links to Socratic Dialogue Inducing (SDI) Labs: http://bit.ly/9nGd3M
Academia: http://bit.ly/a8ixxm
Blog: http://bit.ly/9yGsXh
GooglePlus: http://bit.ly/KwZ6mE
Google Scholar: http://bit.ly/Wz2FP3
Twitter: http://bit.ly/juvd52
Facebook: http://on.fb.me/XI7EKm
Linked In: http://linkd.in/14uycpW
“Physics educators have led the way in developing and using objective tests to compare student learning gains in different types of courses, and chemists, biologists, and others are now developing similar instruments. These tests provide convincing evidence that students assimilate new knowledge more effectively in courses including active, inquiry-based, and collaborative learning, assisted by information technology, than in traditional courses.”
- William Wood & James Gentile (2003)
REFERENCES [URLs shortened by http://bit.ly/ and accessed on 13 Sept. 2013.]
Hake, R.R. 2013. “Re: Are Tenure Track Professors Better Teachers?” online on the OPEN! AERA-L archives at http://bit.ly/1829UU4. Post of 13 Sep 2013 16:41:24 -0400 to AERA-L and Net-Gold. The abstract and link to the complete post are being transmitted to various discussion lists.
Wood, W.B., & J.M. Gentile. 2003. “Teaching in a research context,” Science 302: 1510; 28 November; online as a 213 kB pdf at http://bit.ly/SyhOvL.
Monday, August 13, 2012
What Mathematicians Might Learn From Physicists
*********************************************
ABSTRACT: Mary Shepherd of the RUME list has called attention to David Bressoud's recent MAA “Launchings” columns (a) “Learning from the Physicists” [Bressoud (2012a)] at http://bit.ly/MrAuyZ, and (b) “Barriers to Change” [Bressoud (2012b)] at http://bit.ly/NkW9dE.
Unfortunately, Bressoud neglects to point out that the most important lesson mathematicians might learn from physicists is the advantage of discovering what instructional methods do and do not work by means of pre/post testing with Concept Inventories http://en.wikipedia.org/wiki/Concept_inventory - see e.g., “Lessons from the Physics Education Reform Effort” [Hake (2002)] at http://bit.ly/aL87VT and “Bioliteracy and Teaching Efficiency: What Biologists Can Learn from Physicists” [Klymkowsky et al. (2003)] at http://bit.ly/9A1Arx. Pre/post testing with Concept Inventories has only recently been brought to math education by Jerry Epstein http://bit.ly/bqKSWJ with his “Calculus Concept Inventory.”
In my opinion, Bressoud is handicapped by the neglect of any mention of pre/post testing in his primary source “The Use of Research-Based Instructional Strategies in Introductory Physics: Where do Faculty Leave the Innovation-Decision Process?” [Henderson, Dancy, & Niewiadomska-Bugaj (2012)] at http://bit.ly/MWSxIU. The second sentence of their abstract reads: “Significant empirical research has shown that student learning can be substantially improved when instructors move from traditional, transmission-style instruction to more student-centered, interactive instruction [Bransford et al. (2000), Handelsman et al. (2004)].”
In referencing Bransford et al. (2000) and Handelsman et al. (2004), Henderson et al. carry on the PER tradition of mindlessly dismissing the breakthrough research of Halloun & Hestenes (1985a) - see e.g. “The Initial Knowledge State of College Physics Students” at http://bit.ly/b1488v (scroll down to “Evaluation Instruments”). As far as I know, that research constituted the first “significant empirical research [showing] that student learning can be substantially improved when instructors move from traditional, transmission-style instruction to more student-centered, interactive instruction.” Instead of emphasizing the preeminent role of PER in education research, Henderson et al. erroneously imply that physicists simply followed the lead of cognitive scientists [Bransford et al. (2000)] and biologists [Handelsman et al. (2004)].
*********************************************
To access the complete 22 kB post please click on http://bit.ly/ROjN2T.
Richard Hake, Emeritus Professor of Physics, Indiana University
Links to Articles: http://bit.ly/a6M5y0
Links to SDI Labs: http://bit.ly/9nGd3M
Academia: http://bit.ly/a8ixxm
Blog: http://bit.ly/9yGsXh
Twitter: http://bit.ly/juvd52
GooglePlus: http://bit.ly/KwZ6mE
“Physics educators have led the way in developing and using objective tests to compare student learning gains in different types of courses, and chemists, biologists, and others are now developing similar instruments. These tests provide convincing evidence that students assimilate new knowledge more effectively in courses including active, inquiry-based, and collaborative learning, assisted by information technology, than in traditional courses.”
- William Wood & James Gentile (2003).
REFERENCES [URL's shortened by http://bit.ly/ and accessed on 13 August 2012.]
Hake, R.R. 2012. “What Mathematicians Might Learn From Physicists” online on the OPEN AERA-L archives at http://bit.ly/ROjN2T. Post of 13 Aug 2012 16:59:34 -0700 to AERA-L and Net-Gold. The abstract and link to the complete post are being transmitted to several discussion lists.
Wood, W.B., & J.M. Gentile. 2003. “Teaching in a research context,” Science 302: 1510; 28 November; online to subscribers at http://bit.ly/9izfFz.
Wednesday, December 7, 2011
Physics Education Research (PER) Could Use More PR
***********************************************
ABSTRACT: PhysLrnR’s Bill Goffe wrote (paraphrasing): “PHYSICS EDUCATION RESEARCH (PER) COULD USE MORE PR. In the last year I've only seen PER in the popular press twice: (1) a slew of reports on ‘Improved Learning in a Large Enrollment Physics Class’ [Deslauriers, Schelew, and Wieman (2011)] at http://bit.ly/sNVYKI, and (2) ‘Don't Lecture Me’ http://bit.ly/vw3b5H broadcast on local NPR stations. As I understand it, JOURNALISTS DON'T SO MUCH READ THE SCIENTIFIC LITERATURE (OR LISTSERVS!) BUT GET IDEAS PITCHED TO THEM. I would bet that an awful lot of pitching was done for Deslauriers et al. - it suddenly appeared in numerous publications. It would seem that more needs to be done along these lines.”
Among the reports on Deslauriers et al. were: (a) “Study: It’s not teacher, but method that matters” [Borenstein (2011)] in the Associated Press; (b) “Less Talk, More Action: Improving Science Learning” [Carey (2011)] in the New York Times; (c) “An Alternative Vote: Applying Science to the Teaching of Science” [The Economist (2011)]; (d) “A Better Way to Teach?” [Mervis (2011)] in ScienceNOW; (e) “The Worst Way to Teach” [Bressoud (2011a)]; and (f) “The Best Way to Learn” [Bressoud (2011b)]; the last two in the Launchings column of the Mathematical Association of America.
Consistent with Goffe’s idea that PER needs more PR, the non-physicists Daniel Willingham http://bit.ly/p8aPpM and James Stigler http://bit.ly/ofJSwU, interviewed by Carey (2011), and Jere Confrey http://bit.ly/pZXKm1, interviewed by Mervis (2011), revealed no acquaintance with any physics education research other than Deslauriers et al., despite many references to such research in: (1) Deslauriers et al. (2011); and (2) many articles dating back to 2001 in influential journals, including Science.
Unfortunately, the two examples of PER in the popular press cited above by Goffe both contain substantive errors: (a) Deslauriers et al. erroneously claim that “As reviewed by Froyd (2007) other science and engineering classroom studies report effect sizes less than 1.0”; (b) David Hestenes at http://bit.ly/ncfVQI in the “Don't Lecture Me” http://bit.ly/vw3b5H broadcast, erroneously states “. . .Eric Mazur was unusual. He was the first one who took it. . . . .[[Halloun & Hestenes (1985a)]]. . . . to heart.”
***********************************************
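For context on the “effect sizes” at issue above: the effect size usually quoted in these classroom comparisons is Cohen’s d, the difference of group means in units of a pooled standard deviation (stated here generically, not as the exact computation of any particular study):

d = \frac{\bar{x}_1 - \bar{x}_2}{s_{\mathrm{pooled}}}, \qquad s_{\mathrm{pooled}} = \sqrt{\frac{(n_1 - 1)s_1^2 + (n_2 - 1)s_2^2}{n_1 + n_2 - 2}}

so an effect size of 1.0 means the two class means differ by one pooled standard deviation.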
To access the complete 25 kB post please click on http://bit.ly/uQ7X5U.
Richard Hake, Emeritus Professor of Physics, Indiana University
Honorary Member, Curmudgeon Lodge of Deventer, The Netherlands
President, PEdants for Definitive Academic References
which Recognize the Invention of the Internet (PEDARRII)
rrhake@earthlink.net
Links to Articles: http://bit.ly/a6M5y0
Links to SDI Labs: http://bit.ly/9nGd3M
Blog: http://bit.ly/9yGsXh
Academia: http://iub.academia.edu/RichardHake
“There is substantial evidence that scientific teaching in the sciences, i.e., teaching that employs instructional strategies that encourage undergraduates to become actively engaged in their own learning, can produce levels of understanding, retention and transfer of knowledge that are greater than those resulting from traditional lecture/lab classes. But widespread acceptance by university faculty of new pedagogies and curricular materials still lies in the future.”
Robert DeHaan (2005) in “The Impending Revolution in Undergraduate Science Education”
REFERENCES [All URL's shortened by http://bit.ly/ and accessed on 7 Dec 2011.]
DeHaan, R.L. 2005. “The Impending Revolution in Undergraduate Science Education,” Journal of Science Education and Technology 14(2): 253-269; online as a 152 kB pdf at http://bit.ly/ncAuQa.
Hake, R.R. 2011. “Physics Education Research (PER) Could Use More PR,” online on the OPEN! AERA-L archives at http://bit.ly/uQ7X5U. Post of 7 Dec 2011 13:45:18 -0800 to AERA-L and Net-Gold. The abstract and link to the complete post are being transmitted to several discussion lists.
Sunday, October 16, 2011
No Standard Outcome Measures For Science Education? #2
The abstract reads:
*************************************************
ABSTRACT: Robin Millar and Jonathan Osborne in Chapter 3, “Research and Practice: A Complex Relationship,” of Shelley et al. (2009) claimed that: (a) NO STANDARD OR COMMONLY AGREED OUTCOME MEASURES EXIST FOR ANY MAJOR TOPIC IN SCIENCE EDUCATION. . . . [[my CAPS]]. . . ; (b) the Force Concept Inventory (FCI) reflects a choice of values that is arguable; and (c) the FCI has not been subjected to the same rigorous scrutiny of factorial structure and content validity as have standard measures in psychology.
That no standard outcome measures exist for any major topic in science education is negated by the existence of Concept Inventories.
That the FCI reflects values that are arguable is correct only if the arguers think that there’s little value in students’ learning the basic concepts of Newtonian mechanics.
That the FCI has not been subjected to rigorous scrutiny of factorial structure ignores the 1995 factor analyses of Huffman & Heller and Heller & Huffman; and responses to those analyses by Hestenes & Halloun and Halloun & Hestenes.
That the FCI has not been subjected to rigorous scrutiny of content validity ignores section IIB, “Validity and reliability of the mechanics test” (Mechanics Diagnostic), in Halloun & Hestenes (1985a) - that verification of validity applies also to the FCI since it's almost the same as the Mechanics Diagnostic.
*************************************************
To access the complete 24 kB article please click on http://bit.ly/rfyamc.
Richard Hake, Emeritus Professor of Physics, Indiana University
Honorary Member, Curmudgeon Lodge of Deventer, The Netherlands
President, PEdants for Definitive Academic References
which Recognize the Invention of the Internet (PEDARRII)
rrhake@earthlink.net
http://www.physics.indiana.edu/~hake
http://www.physics.indiana.edu/~sdi
http://HakesEdStuff.blogspot.com
http://iub.academia.edu/RichardHake
“50 years of research, curriculum development, and implementation have not presented consistent and compelling patterns of outcomes.”
Shelley et al. (2009, p. 4), summarizing a claim by Osborne (2007)
“Physics educators have led the way in developing and using objective tests to compare student learning gains in different types of courses, and chemists, biologists, and others are now developing similar instruments. These tests provide convincing evidence that students assimilate new knowledge more effectively in courses including active, inquiry-based, and collaborative learning, assisted by information technology, than in traditional courses.”
Wood & Gentile (2003) "Teaching in a research context"
REFERENCES [All URL’s shortened by http://bit.ly/ and accessed on 16 Oct 2011.]
Hake, R.R. 2011. “No Standard Outcome Measures For Science Education? #2” online on the OPEN! AERA-L archives at http://bit.ly/rfyamc . Post of 16 Oct 2011 11:04:41-0700 to AERA-L and Net-Gold. The abstract and link to the complete post were transmitted to various discussion lists.
Osborne, J. 2007. “In praise of armchair science education,” contained within E-NARST News 50(2), online as a 3.2 MB pdf at http://bit.ly/qsRwaK . The talk itself is online as a 112 kB pdf at http://bit.ly/r4Khl7.
Shelley, M.C., L.D. Yore, & B. Hand, eds. 2009. Quality Research in Literacy and Science Education: International Perspectives and Gold Standards. Springer, publisher's information at http://bit.ly/b58vbP. Amazon.com information at http://amzn.to/97OVJx, note the searchable “Look Inside” feature. Barnes & Noble information at http://bit.ly/p40bKu. An expurgated (teaser) version is online as a Google “book preview” at http://bit.ly/qK8T9P.
Wood, W.B., & J.M. Gentile. 2003. “Teaching in a research context,” Science 302: 1510; 28 November; online as a 209 kB pdf at http://bit.ly/oK46p7.
Monday, September 19, 2011
The Impact of Concept Inventories On Physics Education and Its Relevance For Engineering Education
The abstract reads:
*********************************************
I review the:
(a) Before Concept Inventory (BCI) dark ages of post-secondary introductory physics education;
(b) 1985 advent of the first physics CI, the Mechanics Diagnostic (MD) by Halloun & Hestenes (HH);
(c) 1987-90 early research use of the (MD) by HH, Hake, and Crouch & Mazur;
(d) 1992 Force Concept Inventory (FCI), successor to the MD, and the Factor Analysis Debate (1995);
(e) 1995 revision of the FCI by Halloun, Hake, Mosca, and Hestenes;
(f) 1998 meta-analysis of FCI/MD results on 62 introductory physics courses (N = 6542) showing about a two-standard-deviation superiority in average normalized gains for interactive-engagement courses over traditionally taught courses.
I then indicate:
(a) fourteen hard lessons from the physics education reform effort;
(b) suggestions for the administration and reporting of CI’s;
(c) listings of CI’s, including those for physics and engineering; and comment that:
(d) for physics education the road to reform has been all uphill;
(e) the glacial inertia of the educational system, though not well understood, appears to be typical of the slow Diffusion of Innovations [Rogers (2003)] in human society;
(f) there are at least “Eleven Barriers to Change in Higher Education”;
(g) but, even so, for physics education, Rogers’ “early adopters” of reform have now appeared at Harvard, North Carolina State University, MIT, the Univ. of Colorado, California Polytechnic at San Luis Obispo, and the Univ. of British Columbia, possibly presaging a Rogers “take off” for physics education reform, about two decades ACI (After Concept Inventory).
I conclude that:
(a) CI’s can stimulate reform, but judging from the results in physics it may take about two decades before even early adopters become evident;
(b) there are at least seven reasons why the rate of adoption of reforms may be greater in engineering education than in physics education.
In an Appendix I respond to criticisms of the FCI and the average normalized gain g(ave).
*********************************************
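For readers who want to see how the average normalized gain g(ave) mentioned in the abstract is computed in practice, here is a minimal sketch in Python; the function and variable names are illustrative only and not taken from any published code:

```python
# Minimal sketch: class-average normalized gain <g> as defined in Hake (1998),
# computed from matched pre- and post-instruction Concept Inventory scores.
# Names are illustrative; this is not code from the talk referenced above.

def average_normalized_gain(pre_scores, post_scores):
    """Return <g> = (<%post> - <%pre>) / (100 - <%pre>) for one course.

    pre_scores, post_scores: matched lists of percentage scores (0-100)
    on the same Concept Inventory, taken before and after instruction.
    """
    pre_avg = sum(pre_scores) / len(pre_scores)
    post_avg = sum(post_scores) / len(post_scores)
    return (post_avg - pre_avg) / (100.0 - pre_avg)

# Example: a course averaging 30% on the pretest and 65% on the posttest
# has <g> = (65 - 30) / (100 - 30) = 0.5.
print(average_normalized_gain([25, 30, 35], [60, 65, 70]))  # prints 0.5
```

Because the gain is expressed as a fraction of the maximum possible gain, courses with different average pretest scores can be compared on this single ratio.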
To access the complete 8.7 MB article please click on http://bit.ly/nmPY8F.
Richard Hake, Emeritus Professor of Physics, Indiana University 24245 Hatteras Street, Woodland Hills, CA 91367
Honorary Member, Curmudgeon Lodge of Deventer, The Netherlands.
rrhake@earthlink.net http://www.physics.indiana.edu/~hake/ http://www.physics.indiana.edu/~sdi/ http://HakesEdStuff.blogspot.com/ http://iub.academia.edu/RichardHake
REFERENCES
Hake, R.R. 2011. “The Impact of Concept Inventories On Physics Education and Its Relevance For Engineering Education,” invited talk, 8 August, second annual NSF-sponsored “National Meeting on STEM Concept Inventories,” Washington, D.C., online as an 8.7 MB pdf at http://bit.ly/nmPY8F or as ref. 64 at http://bit.ly/a6M5y0.
Tuesday, May 17, 2011
SET's Are Not Valid Gauges of Students’ Higher-Level Learning #2
Some blog followers might be interested in discussion-list post “SET’s Are Not Valid Gauges of Students’ Higher-Level Learning #2” [Hake (2011)].
The abstract reads:
****************************************
ABSTRACT: In response to “Changing the Culture of Science Education at Research Universities #3” [Hake (2011a)] at http://bit.ly/gSNTGi, problem-based-learning pioneer http://bit.ly/etekAw Don Woods (2011a) wrote at http://bit.ly/h1VrME [my CAPS; my insert at “. . . . .[[insert]]. . . . .”]:
“. . . . there are at least 20 valid forms of evidence that can be used for measuring teaching ‘productivity.’ These include . . . . . well-designed COURSE EVALUATIONS. . . . .[[I shall assume (please correct me if I’m wrong) that Woods uses ‘course evaluations’ as shorthand for ‘Student Evaluations of Teaching (SET’s)’]]. . . . . , exams and assignments, . . . . . More details are given in my forthcoming book ‘Motivating and Rewarding University Teachers to Improve Student Learning: A Guide for Faculty and Administrators’. . . . . .[[Woods (2011b)]]. . . . .”
In “Culture of Science Education - Response to Woods” [Hake (2011b) http://bit.ly/fetCy6] I wrote (paraphrasing):
“I disagree that SET’s are a valid method of measuring ‘teaching productivity’ IF ‘teaching productivity’ means ‘student learning’ - see e.g., ‘Re: Problems with Student Evaluations: Is Assessment the Remedy?’ [Hake (2002a)], ‘SET’s Are Not Valid Gauges of Teaching Performance #4’ [Hake (2006e)], and ‘Effectiveness of Student Evaluations’ [PhysLrnR (2011)].”
In the present post I give 8 EXHIBITS suggesting that “SET’s ARE NOT VALID GAUGES OF STUDENTS' HIGHER-LEVEL LEARNING”: (1) Halloun & Hestenes (1985a); (2) Crouch & Mazur (2001); (3) Eric Mazur (1997, 2009); (4) John Belcher (2003); (5) Richard Hake (2006f); (6) Richard Hake (2011c); (7) Russ Hunt (2011); and (8) David Gavrin (2003).
****************************************
To access the complete 76 kB post please click on http://bit.ly/jLZaz5.
Richard Hake, Emeritus Professor of Physics, Indiana University
Honorary Member, Curmudgeon Lodge of Deventer, The Netherlands
President, PEdants for Definitive Academic References which Recognize
the Invention of the Internet (PEDARRII)
rrhake@earthlink.net
http://www.physics.indiana.edu/~hake
http://www.physics.indiana.edu/~sdi
http://HakesEdStuff.blogspot.com
http://iub.academia.edu/RichardHake
“Physics educators have led the way in developing and using objective tests to compare student learning gains in different types of courses, and chemists, biologists, and others are now developing similar instruments. These tests provide convincing evidence that students assimilate new knowledge more effectively in courses including active, inquiry-based, and collaborative learning, assisted by information technology, than in traditional courses.”
Wood & Gentile (2003)
REFERENCES [URL’s shortened by http://bit.ly/ and accessed on 17 May 2011.]
Hake, R.R. 2011. “SET’s Are Not Valid Gauges of Students' Higher-Level Learning #2,” online on the OPEN! AERA-L archives at http://bit.ly/jLZaz5. Post of 17 May 2011 09:47:36-0700 to AERA-L and Net-Gold. The abstract and link to the complete post are also being distributed to various discussion lists.
Wood, W.B., & J.M. Gentile. 2003. “Teaching in a research context,” Science 302: 1510; 28 November; online to subscribers at http://bit.ly/9izfFz. A summary is online to all at http://bit.ly/9qGR6m.
Saturday, June 5, 2010
Is a Valid and Reliable Concept Test as Impossible as Perpetual Motion?
************************************************
ABSTRACT: In a post "Why Password Protection of Concept Tests is Critical: A Response to Klymkowsky" [Hake (2010a)] I wrote ". . . once arduous qualitative and quantitative research by disciplinary experts has culminated in a valid and reliable concept test I think it should be password protected." Marion Brady quipped "Yes. Certainly. Along with password protection for perpetual motion and eternal life, which are likely to appear simultaneously." But Marion's apparent belief in the impossibility of a valid and reliable concept test is contradicted by the evidence [see e.g., Halloun & Hestenes (1985a), Savinainen & Scott (2002), and Hake (2010b)].
************************************************
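As background on what “reliable” typically means for such tests: developers of concept tests usually report an internal-consistency coefficient such as Kuder-Richardson KR-20 (a standard psychometric formula, given here for orientation rather than as the specific analysis in Halloun & Hestenes):

KR\text{-}20 = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} p_i (1 - p_i)}{\sigma_X^2}\right)

where k is the number of test items, p_i is the fraction of students answering item i correctly, and \sigma_X^2 is the variance of the students’ total scores; values near 1 indicate that the items hang together as a coherent measure.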
To access the complete 14 kB post please click on http://tinyurl.com/38equwj .
Richard Hake, Emeritus Professor of Physics, Indiana University
Honorary Member, Curmudgeon Lodge of Deventer, The Netherlands
President, PEdants for Definitive Academic References which Recognize the
Invention of the Internet (PEDARRII)
rrhake@earthlink.net
http://www.physics.indiana.edu/~hake
http://www.physics.indiana.edu/~sdi
http://HakesEdStuff.blogspot.com
http://iub.academia.edu/RichardHake
REFERENCES [Tiny URL's courtesy http://tinyurl.com/create.php .]
Hake, R.R. 2010a. "Why Password Protection of Concept Tests is Critical: A Response to Klymkowsky," online on the OPEN! AERA-L archives at http://tinyurl.com/26r8bhh . Post of 4 Jun 2010 15:56:17-0700 to AERA-L and Net-Gold. The abstract and link to the complete post were distributed to various discussion lists and are also online at
Hake, R.R. 2010b. "Is a Valid and Reliable Concept Test as Impossible as Perpetual Motion?" online on the OPEN! AERA-L archives at http://tinyurl.com/38equwj . Post of 5 Jun 2010 14:59:29 -0700 to AERA-L and Net-Gold. The abstract and link to the complete post are being distributed to various discussion lists.
Tuesday, April 6, 2010
Re: Multiple Choice Exam Questions #2
Some blog followers might be interested in a post of the above title. The abstract reads:
***********************************************
ABSTRACT: Karol Dean of the POD list asked: "Is there any research or folklore to support the 1 question/minute formula [for multiple-choice questions] that I've heard?"
To which Ken Bain replied: ". . . . . multiple-choice questions that simply require the regurgitation of isolated information, or worse yet, the ability to recognize correct answers . . . tend to foster surface or strategic rather than deep approaches to learning. . . . .THIS DOESN'T MEAN THAT YOU CANNOT DEVELOP MULTIPLE-CHOICE EXAMINATIONS THAT CAN FOSTER DEEP APPROACHES. Look, for example, at the way Eric Mazur develops what are basically multiple-choice questions for his Peer Learning approach. But that approach is embedded in an environment designed to promote deep considerations. . . . . . To understand and appreciate Mazur's approach, you must understand both the way he develops the questions and how he uses them. Once you understand that (and both the need to promote deep approaches to learning and the research on what fosters deep approaches), I think you will quickly see that requiring students to answer multiple choice questions in less than a minute each (130 questions in 90 minutes) will foster the most shallow of approaches to learning, and cannot possibly foster deep approaches."
In this post I:
(a) quote psychometricians Mark Wilson and Meryl Bertenthal in support of Bain's claim that it is possible to develop “multiple-choice examinations that can foster deep approaches,” and
(b) elaborate on the physics education environment in which Mazur came to desert the traditional passive student lecture for an "Interactive Engagement" method.
***********************************************
To access the complete 28 kB post, please click on http://groups.yahoo.com/group/Net-Gold/message/32417 .
Regards,
Richard Hake, Emeritus Professor of Physics, Indiana University
24245 Hatteras Street, Woodland Hills, CA 91367
Honorary Member, Curmudgeon Lodge of Deventer, The Netherlands
President, PEdants for Definitive Academic References (PEDAR)
rrhake@earthlink.net
http://www.physics.indiana.edu/~hake/
http://www.physics.indiana.edu/~sdi/
http://HakesEdStuff.blogspot.com/
http://iub.academia.edu/RichardHake
REFERENCES [Tiny URL's courtesy http://tinyurl.com/create.php .]
Hake, R.R. 2010. "Re: Multiple Choice Exam Questions #2," online on the OPEN! Net-Gold archives at http://groups.yahoo.com/group/Net-Gold/message/32417. Post of 6 April 2010 17:57:00-0700 to AERA-L and Net-Gold.
Thursday, July 9, 2009
Is Scientifically-based Education an Oxymoron?
Some blog followers may be interested in a recent post of the above title [Hake (2009)]
The abstract reads:
*************************************
ABSTRACT: Jerry Bracey in his book Education Hell: Rhetoric vs. Reality listed what he regarded as 10 lessons from the “Eight-Year Study” of 1942, in which more than 30 high schools in the 1930s were encouraged to try non-traditional approaches to teaching. Washington Post education columnist Jay Mathews then (a) repeated Bracey's 10 lessons along with comments by Bracey and by himself, and (b) bravely invited his readers to kick sand in the faces of Bracey and himself by letting him know which of the Bracey/Mathews comments were most inane. Taking Mathews at his word, in my view the most inane Bracey/Mathews comments center around Bracey's Lesson #8 that SCIENTIFICALLY BASED EDUCATION IS AN OXYMORON. If this lesson is correct then it would appear that the following authors all have their heads buried in the sand: David Hestenes (1979), Edward (Joe) Redish (1999), Richard Shavelson & Lisa Towne (2002) and members of the National Academy's “Committee on Scientific Principles for Education Research,” Paula Heron & David Meltzer (2005), Carl Wieman (2007), and Richard Hake (2007).
*************************************
To access the complete 24 kB post, please click on http://tinyurl.com/n9cyjy .
REFERENCES
Hake, R.R. 2009. “Is Scientifically-based Education an Oxymoron?” online on the OPEN! AERA-L archives at http://tinyurl.com/n9cyjy . Post of 7 Jul 2009 17:03:51-0700 to AERA-L and Net-Gold. The abstract only was transmitted to various discussion lists.