Monday, September 14, 2009
Once again, my workload is interfering with my blogging, but I came across a 2007 article that resonates with my interests in higher education assessment. This report by Richard Fliegel and John Holland at the University of Southern California describes one university's approach to assessing students' development of critical thinking over the course of college. The authors question the value of large-scale measurement instruments and intend their instrument for local use by faculty within their own university. They also provide a useful, if brief, overview of the many critical thinking assessment instruments out there.

These efforts contrast with our work on domain-specific assessment. We're defining the forms of reasoning that emerge from understanding fundamental non-intuitive concepts unique to specific disciplines, such as evolution in biology or supply-and-demand models in economics. Although some aspects of reasoning well within a domain likely contribute to improved critical thinking overall, we think a clear analysis of the cognitive aspects of domain-specific reasoning would be useful in its own right. We have seen that common-sense critical thinking does not work well when you're asked to analyze a problem through the lens of a specific domain's core concepts. At best, generic critical thinking can position you to learn the "big ideas" in a domain, but it does not substitute for that content knowledge.