5 – Discussion of Our Flipping Paper

During May 2014, the ACS DivChEd Committee on Computers in Chemical Education (CCCE) is hosting an online conference (a “ConfChem”) on flipping chemistry:  moving part of the traditional lecture content to student study time, to allow more time in class for demonstrations, discussions, and problem solving.

In the ConfChem format, authors submit papers, and for a one-week period readers submit questions online, which the authors answer (other readers may contribute as well). The Q&A period moves to a new paper each week, but each paper and its week-long discussion remain online for future reference.

The Hartman/Dahm/Nelson group was invited to discuss the results of our experiments in shifting lecture content to homework in first-year chemistry — without videotaping lectures.  Our paper with comments and discussion is posted at

http://confchem.ccce.divched.org/2014SpringConfChemP2 .

The period for CCCE comment is closed, but if you wish to submit further questions or comments on the paper or its discussion, please submit a Comment for this blog post.

 All of the conference abstracts and papers are posted at http://confchem.ccce.divched.org/2014SpringConfChem.  Readers interested in exploring flipped instruction will find quite a few innovative ideas in the papers and their discussion.

We wish to thank Jennifer Muzyka, Chris Luker, Bob Belford and all of the CCCE volunteers for their many weeks of work to promote discussion of these issues.

 #   #   #   #   #

4 – Comprehension and Cognition

… much of the information needed to understand a text is not provided in the information expressed in the text itself but must be drawn from the language user’s knowledge…

— Van Dijk and Kintsch, Strategies of Discourse Comprehension

The previous post included 4 short experiments. If you have not yet tried those, please do so and return here.

In this post, we propose interpretations of the results of the experiments in Post #3. In the Comments, you are invited to agree or disagree with those views.

The model in cognitive science that explains comprehension includes the following:

  • We talk and write in code. Often subconsciously, whether speaking or writing, we leave out detail and assume that our audience’s background knowledge will let them fill in the gaps fluently. When reading a science text, comprehension is heavily dependent on the reader’s ability to fluently recall domain fundamentals.
  • To gain background knowledge, an individual must construct long-term memory: a slow process involving physiological changes in the brain. For most topics, persuading the brain to build initial memory requires substantial effort and practice.
  • Experts can fluently construct “mental models” of problem scenarios because they can fluently recall linkages among the facts and rules of their field: just a few words can cue linked memories that allow technical text to be understood (see Experiment 4), but expertise across a scientific domain takes years of study to achieve.
  • Scientific reading and problems that experts find easy, non-experts often find nearly impossible, even if they have good reasoning skills, because they lack fluent recall of assumed background knowledge.
  • Standard general chemistry texts are an incredible resource for interesting problems, 4-color diagrams, fascinating sidebars, and “refreshing the memory” when content is later needed. However, those reference texts generally proceed at the fast pace appropriate to refresh prior memory, rather than the slower pace required to construct initial memory.

To free more time in lecture for activities that build both interest and conceptual understanding, we will need ways to explain the “code” of chemistry during study time – using materials designed to promote initial building of memory.

In upcoming posts, we’ll talk about how that challenge might be met – without massive demands on limited instructor time.

References:

At the “Read Recs” tab at the top, for more on memory, see the short Clark article. On oral and reading comprehension, The Knowledge Deficit by E. D. Hirsch (who suggested the sports scenarios in Post #3 and the epigraph above) is superb.

For Comment:

  • Is the analysis above consistent with your experience as an instructor?
  • What are the best ways to help students learn new content from general chemistry reference texts?   Are “study guides” useful?  Is an “introduction to the topic in lecture” enough so that most students can read their text with the comprehension needed to gain additional learning?

#   #   #   #   #

3 – Reading Science With Comprehension

To free more time in lecture for demonstrations and higher-level activities, one strategy is to have students read their textbook to learn the content of lecture, but many instructors report that a high percentage of students are either unwilling or unable to do so.  To explore why this may be the case, try these brief experiments in reading comprehension.

Experiment 1:  Explain the meaning.

The tracheal chimera was fully lined with mucosa, which consisted of respiratory epithelium from the donor and buccal mucosa from the recipient.  (NEJM, 1/14/10)

*  *  *  *  *

Easier reading is on the sports page.

Experiment  2:  Read, then answer the questions below.

When Sarwan and Chanderpaul were going on strongly, England were looking down the barrel. But they came back with Broad removing both of them within 8 overs of taking the 2nd new ball.   It was always going to be difficult to survive with that kind of a batting line up and England then seemed to be on top. But the last pair hung around for ages to ensure that light is offered and they walk off.   (Times, New Delhi)

 What happened?  Who won?  Which words in the passage are unfamiliar?

*  *  *  *  *

For probably 1/3 of our worldwide English-speaking audience, that was easy!  Now try

Experiment  3:   For this historical anecdote, answer the questions below.

With the game tied in the bottom of the ninth, Jeter scored on a sacrifice by Rodriguez to the warning track in right.

What happened?  What does “sacrifice” mean?  How many “outs” were there?  If this was a regular season game, exactly where in the universe was Derek’s right foot planted at the moment the ball was caught?  Why did he not run until the fly was caught?

Where are those answers supplied?  To answer each question, about how long did you take?

*  *  *  *  *

Finally, here are comprehension questions that less than 1% of most audiences can complete.

Experiment  4:

For carbon tetrachloride:

What is its shape?  What are its bond angles?  What is its molecular dipole moment in debyes?  Does it tend to dissolve in oils or water?

How long did it take you to answer these questions?  Were you slowly reasoning — or answering pretty much instantly?

In the next post, we will note what cognitive science has to say in interpreting the results above.  But first, let us invite you to Comment on one or more of these questions:

Given time, “using the internet and reasoning skills” would likely work on Experiment 1.  For whichever of Experiments 2 and 3 you found difficult, would that strategy work?  Why or why not?

On the question of the ability of students to read general chemistry texts with comprehension, what are some possible implications of the experiments above?

Finally, on Experiment 2, would someone kindly explain to the blog author who won?

 #   #   #   #   #

2 – The Necessity for Initial Memorization

 “At all ages, there are several ways to improve the functional capacity of working memory.  The most central of these is the achievement of automaticity, that is, the fast, implicit, and automatic retrieval of a fact or a procedure from long-term memory.”

— Final Report of the National Mathematics Advisory Panel (2008).

The goal of learning is to be able to solve problems. As instructors, the most important questions we face are:

  • How does the student brain solve problems in chemistry?
  • What actions should we advise students to take to learn to solve problems?

Over the past decade, research in cognitive science has reached agreement on how students between about age 12 and graduate school solve “well-structured problems” (those with clear “right answers”) in math and science.  A brief summary of that model follows.  Citations and opportunity for discussion are provided at the end of this post.

How the Brain Solves Problems

To solve problems, the brain uses two types of memory:  working memory (WM) and long-term memory (LTM).  WM is where the brain manipulates knowledge: where you think and reason.  WM can accept input from your senses (such as seeing an object or reading a problem) and from your LTM.

We will define LTM very roughly for now (more in later postings) as where the brain holds knowledge that you can recall but have not seen or heard in the past 2 minutes. An example is 6 x 7 = 42. As knowledge in LTM is encountered in a variety of contexts, it is tagged with meaning, via links to other knowledge, to form a conceptual framework (also termed a schema).

Since 2001, the characteristics of WM have been scientifically measured and verified. The key finding is this:

During problem solving, working memory can manipulate:

  • virtually all related elements of knowledge that can be recalled quickly and accurately from LTM based on “element cues” in the problem, plus
  • up to 3-5 elements of knowledge, each for up to 30 seconds, that cannot be fluently recalled from LTM based on element cues.

To help with understanding, let’s restate this law and its implications in different ways.

  • Working memory is very limited when dealing with information that has not previously been well memorized.
  • In the working memory where you think, space for non-memorized information is minimal, but space for well-memorized relationships is enormous.
  • When trying to solve a problem, if even a few elements of knowledge needed to solve the problem cannot be recalled from LTM, limits on the capacity of WM are exceeded. This will likely lead to confusion that prevents the problem from being solved.
  • Your ability to solve problems depends primarily on how much knowledge you have “automated” — how much you can “recall with automaticity” (fluently) from LTM.
  • To get around the quantified bottleneck in the ability of the mind to reason, students must begin with memorization:  They must work to move elements and relationships into a long-term memory that is resistant to change. Then, as they work to gain procedural fluency and conceptual understanding in the use of new knowledge, automaticity in recall from LTM is gradually achieved.
  • The strength of LTM is that it is long term. Once information is well memorized and well organized, the ability to recall and apply that knowledge fluently (automatically and effortlessly) often lasts for decades.

In a nutshell, that’s the foundation for learning math and science.  The unexpected finding that the brain is very limited when reasoning with what has not been well memorized is now verified, accepted science, and it will force a re-examination of many recent theories on teaching and learning.

Some educators will be dismayed by a finding that memorization is required for problem-solving, but the value of science is that it measures what is true, whether we like that truth or not.  Thanks to recent research, instructors now have a much better understanding of how to advise our students to learn efficiently and effectively.  That’s the best possible scientific progress.

At this point, we’d like to ask our readers to answer some questions in the Comments that will “set the agenda” for this blog.  (Click on Comment below the title of this post.)

  • After reading one or two of the brief references below, do you see any discrepancies between the statements above and what cognitive science is saying?
  • What questions would you like to see addressed to further explore these data?

In upcoming posts, we will explore these issues in additional detail. We will also try to suggest answers to your questions based on our reading of cognitive research.

* * * * *

References:

For short summaries in non-technical terms of recent research in cognition, see the brief Clark and NMAP readings in the “Read Recs” tab above.

Make It Stick, a new book describing the importance of memorization in learning, was recently reviewed in the Chronicle of Higher Education.

For a recent review on the NIH website of the “3-5 chunk limit” research, see http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2864034/

Any first-year textbook in cognitive psychology with a copyright date in the past 5 years will detail the interaction between WM and LTM in its chapter titled “Problem Solving” or “Cognitive Architecture.”

# # # # #

1 – First-Year Chemistry: The Great Debate

To solve problems in first-year chemistry, should instructors encourage students to learn concepts and then apply reasoning skills?  Or must students begin by memorizing fundamental facts and algorithms?

How can we help students to improve their skills in solving scientific calculations?  Given computer software to solve problems, are calculation skills still important?

Which strategies can best help instructors “flip” instruction — to find more time in lecture for activities that build interest and understanding? 

Which reform proposals are most likely to improve achievement and retention in first-year courses? 

Since 1990, those and related questions have been debated in chemistry education.  In this blog, we hope to offer a forum to discuss these issues from two scientific perspectives:

  •  The findings of recent cognitive research on how the brain solves problems, and
  •  Your experience as instructors in first-year chemistry (including all college and high school courses in, or in preparation for, college general chemistry).

Special emphasis will be placed on the apparent disconnect between many of the reform strategies suggested over the past decade in “chemical education research” and the recommendations of cognitive scientists, whose area of expertise is how the human brain works, and in particular how it learns.

For the author of this post (Eric Nelson), interest in the intersection of chemistry education and cognitive science grew out of a collaboration with Dr. Don Dahm at Rowan University (NJ) and Dr. JudithAnn Hartman at the Naval Academy.  Our goal was to find ways to transfer parts of the traditional content of first-year chemistry lecture to study time, to gain more time in lecture to work with students on higher-level problems, discuss issues of chemistry in society, and conduct more of the “wet chem demonstrations” that deepen student understanding of the science we love.

Research summaries by cognitive scientist Daniel Willingham suggested that if we gave students “typed lecture notes that included clicker questions,” they could learn content effectively during homework.  We tried this approach and met with measurable success.

Since then, we have prepared additional materials aimed at helping instructors gain additional time in lecture for higher-level activities. In this blog, we will discuss how these or similar materials might provide you with time for instructional experiments with activities of your choosing. We have also done considerable reading in cognitive research, looking for findings that are especially applicable to chemistry instruction. We will share what we have found that we think can help instructors better support student success in chemistry.

In the Comments section of the blog, we hope to entice you into sharing your experiences and discussing what science says on the “great debate” issues above.

To start, you are invited to click on Comments (below the title of this post) and share your ideas on the following:

To introduce a topic, cognitive studies recommend an activity that sparks student curiosity.  Given that goal, on your first day with a first-year section, if you were free of pressure to cover content, what demonstration/discussion/inquiry would you recommend — or like to try?  What would your “first-day dream lesson” be?

* * *

(Comments are moderated only to assure that they pertain to science education.)