8 – The AP Chemistry Reforms

The new 2013 AP Chemistry Curriculum Framework and its 2014 exam asked students to solve problems while de-emphasizing factual recall and memorization in favor of inquiry and reasoning. Recent studies in cognitive science have proven that those strategies simply do not work for students in first-year college-level science courses. The result was a 2014 AP Chemistry Exam in which 73% of students received a 3 or below. Directing students to use problem-solving methods that science has proven do not work was profoundly unfair to students and their instructors.

The College Board (CB) is evaluating the 2013/14 changes to the AP Chemistry program over the next several months. Below are findings from cognitive research that we hope the Board will consider during that review.

The 2013 AP Chemistry reforms focused on 4 areas:

  1. Limiting content but improving understanding;
  2. Decreasing reliance on memorization/recall;
  3. Increasing reliance on reasoning and guided inquiry in learning; and
  4. Applying knowledge to new contexts.

Over the past 4 years, researchers in cognitive science (the field that studies how the brain works and how it learns) have agreed on recommendations for how students can best be taught to solve problems in math and the physical sciences. Unfortunately, in 3 of the 4 areas above, the AP Curriculum Framework gives direction that is the opposite of the recommendations of cognitive science.

Going forward, to be in agreement with science’s understanding of how the brain solves problems, the AP Chemistry Framework will need to change. Let’s examine the “reforms” above one at a time.

1.  The new AP framework (p. 8) calls for “balancing breadth of content coverage with depth of understanding.” Cognitive research supports this change. During initial learning, the primary goal is to increase knowledge in a student’s long-term memory (LTM). Building LTM requires repeated practice spaced over multiple days. If students are asked to learn too much new information at once, they tend to “cram,” and the learning does not “stick.” Limiting new content, mastering it, and focusing on its connections is a change consistent with cognitive science.

2.  The new AP framework (p. 8) advocates that students “spend less time on factual recall” and disparages “mere rote recall” (p. 16). A “Course Planning and Pacing Guide” on the CB website states: “The framework emphasizes moving students beyond formulaic algorithms and focusing instead on conceptual reasoning.”

Cognitive scientists agree that students must devote years to moving facts and problem-solving algorithms into memory before they can “reason conceptually” in the way experts in science do. In pre-graduate-school math and science, students must be able to recall, quickly and automatically from LTM, nearly all of the facts and algorithms needed to solve a problem. This is because working memory (WM), where the brain solves problems, can “conceptually reason” with essentially all information recallable from LTM, but with only very limited amounts of non-memorized knowledge.

In the article linked below, on pages 8-12, leading cognitive scientists document these findings:

 Putting Students on the Path to Learning: The Case for Fully Guided Instruction, at:


As of 2014, among scientists who study how the brain solves problems, the strengths and weaknesses of working memory are undisputed facts. In disparaging factual recall, the College Board was advising students based on outdated understandings that no longer agree with what experts in how the brain works say is true.

In the book Make It Stick and articles by Daniel Willingham (see the Read Recs tab above), researchers in cognition advise instructors on ways to help students gradually construct both memory and understanding.

For AP teachers, the message from cognitive science is that memorization is even more important than we thought.

3.  The AP framework advises instructors and students to “spend more time on inquiry-based learning.” The website’s “AP Science Inquiry Statement” (4/21/11) advocates both “guided” and “open inquiry” using “student-designed procedures.” The first three pages of the Clark summary document the overwhelming evidence that, for the learning of new content to be efficient, students need to be clearly informed of what experts over centuries have found to be true. These studies show that “open inquiry,” in which students formulate their own questions and study procedures, is particularly inefficient.

As Clark et al. note (page 7):

 “Researchers…   tested whether those who had learned through discovery were better able to transfer their learning to new contexts…. Direct instruction involving considerable guidance … resulted in vastly more learning than discovery.”

That said, as the Clark authors note, guided inquiry can be a way to practice the application of newly acquired knowledge. In addition, inquiry can “prepare students for learning,” spark curiosity, and foster essential motivation. To the extent that explaining new content and committing it to memory can be done during study time, cognitive studies indicate that instructor-guided inquiry can be a good use of class time.

Again, these research conclusions are likely not surprising to AP instructors.

4.  The CB staff explanation of the 2014 Exam Results, posted on the AP Chemistry bulletin board on June 16, states (p. 6):

 The redesigned AP Chemistry Exam questions each require students to go beyond factual recall and calculations …, demonstrate a clear understanding … and apply that understanding to new scenarios, just as they will be required to do in science majors and careers.

How well students can transfer facts and procedures to new situations has been found to depend on the number of elements of the situation that are familiar in memory. That expert knowledge takes years of study to acquire (Clark, pp. 9-11). “Far transfer” is a key skill for professionals, but finding employment in chemistry generally requires at least 30 college chemistry credits, not 8. If we push students to use strategies that do not fit what their brains at that point can do, many aspiring science students will be handicapped, discouraged, and frustrated (Clark, p. 8).

Additional evidence that the type of instruction now being advocated by the CB does not work is provided by the 2014 AP exam results. The AP framework (page 3) states that questions are reviewed to ensure they are “fair and that there is an appropriate spread of difficulty.” That’s the right goal. However, on the 2014 free-response questions, average scores ranged from 11% to 42%, with a median of 33%. Is that “an appropriate spread of difficulty”?

The percentage of students scoring 4 or 5 was 40.4% in 2013 but 26.7% in 2014; 73% this year were given scores of 3 or below. Do the 2014 scores, for an academically select group with special interest in chemistry, indicate a fair test of what students should be able to do?

NSF data show that the percentage of U.S. college graduates who received bachelor’s degrees in chemistry fell from 1.61% in 1969 to 1.07% in 1990 to 0.74% in 2010. That’s a 54% decline since 1969, a serious concern for our nation. Will the 2014 AP results encourage more students to major in chemistry?
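The cited decline can be double-checked with one line of arithmetic; a minimal sketch:

```python
# Check the cited decline in the share of U.S. bachelor's degrees in chemistry.
share_1969 = 1.61  # percent of all bachelor's degrees, per the NSF data cited above
share_2010 = 0.74

decline = (share_1969 - share_2010) / share_1969 * 100
print(f"Relative decline since 1969: {decline:.0f}%")  # → 54%
```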

Five years ago, when CB leaders first proposed these reforms, many educational philosophies held that students could be spared the hard work of memorization if they instead learned reasoning skills. Those proposals were well intended, but science is about what is true. Those who study the brain scientifically have proven that those hopes were wishful thinking. As a result of new measurements of working memory, many past learning theories are being re-examined.

Cognitive science says that students can learn to solve problems by inquiry the way an expert does, but doing so requires constructing the memory of a scientist, which takes many years of effort.

In designing the new framework and exam, the CB cited the expectations of college professors — who are experts in chemistry. When shown the new data from cognitive science, it is likely that those professors will say, “Respect the domain expertise of those who study the brain. Don’t deny the science.”

Some without credentials in cognitive science continue to advocate for their philosophies, but in making decisions that determine the welfare of young people, the findings of experts in a scientific domain must prevail over beliefs.

For our best high school students, seeking to learn chemistry and willing to take a demanding AP course, the CB has given 73% a 3 or below – what the world calls a C or below – because they could not learn in one course what science has proven takes years of intense study. That is profoundly unfair.

The CB is using its Curriculum Framework and exam to pressure teachers to teach in a certain way. Cognitive science has found that the methods the CB is pressuring teachers to use do not work.

But the cognitive science findings are recent, and the CB deserves the opportunity to make this right.  An easy way to do so is for the CB to reverse its decision to tell teachers how to teach.

Until 2014,  the College Board provided a fair test of the range of problems that students can be taught to solve in a first course in college chemistry. That rigorous external standard motivated instructors to find ways to improve achievement.  To return to providing such a test would continue the College Board’s substantial heritage of service to students, instructors, and the nation.

 # # # # #


7 – Automaticity For Elements Encountered Frequently

“In each field, certain procedures are used again and again. Those procedures must be learned to the point of automaticity so that they no longer consume working memory space. Only then will the student be able to bypass the bottleneck imposed by working memory and move on to higher levels of competence.”

 —    UVa. cognitive scientist Daniel Willingham in Practice Makes Perfect. Am. Educator 2004, 28(1), 31-33

In Post #2, we examined the bottleneck that cognitive science has discovered when students try to solve problems in science and math: the severe constraints on working memory when reasoning with information that has not previously been well-memorized.

When solving a problem, if a student must “look up” the symbol for a potassium ion, the answer takes up one of the “3-5 slots” in working memory (WM) available for non-memorized elements of knowledge. And while a student looks up any fact or procedure, the “30 seconds or less” limit on WM retention of the other data that must be held to solve the problem ticks away.

In contrast, when knowledge is well memorized, slots in WM open up that allow the student to focus on the characteristics of facts and procedures: the associations (cues) that assist in recalling background information needed to solve a problem. When knowledge elements in LTM are accessed at the same time during problem solving, connections grow between neurons storing those elements (“neurons that fire together, wire together”). Cues which activate one neuron activate others in the framework, and activated elements can be recalled into WM to guide problem solving. Those links are the physical substance of conceptual frameworks.

For instructors, what are the implications of this research? If we gradually identify for our students the relationships used most often in chemistry, and encourage self-testing until they can recall these fundamentals automatically, their success in problem solving should significantly improve.

Here’s an experiment applying this research. First-year problems generally involve about 40 elements: those in the first 3 rows and the first and last two columns of the periodic table, plus the elements “known to the ancients” with symbols based on Latin names.

“Memorizing to automaticity” the “name/symbol correlation” for these elements will free capacity in WM during problem solving. Knowing element location in the periodic table will speed finding atomic numbers, molar masses, and predicted monatomic charges, and during problem solving in WM, speed is important.
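As one illustration, the self-testing this research recommends can be sketched as a short drill script. The name/symbol pairs below are an assumed subset of the “top 40,” not the full assignment:

```python
# A minimal self-quiz sketch for element name/symbol recall.
# The pairs below are an illustrative subset of the "top 40" elements,
# including some whose symbols come from Latin names.
PAIRS = {
    "sodium": "Na",      # Latin: natrium
    "potassium": "K",    # Latin: kalium
    "iron": "Fe",        # Latin: ferrum
    "chlorine": "Cl",
    "magnesium": "Mg",
}

def check(name: str, answer: str) -> bool:
    """Return True if the answer matches the element's symbol."""
    return PAIRS.get(name.lower()) == answer

def drill() -> None:
    """Prompt for each symbol; repetition to automaticity is the goal."""
    for name in PAIRS:
        answer = input(f"Symbol for {name}? ").strip()
        print("correct" if check(name, answer) else f"no: {PAIRS[name]}")
```

A student could extend the dictionary to the full list and repeat the drill on successive days, matching the spaced practice described above.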

To promote automaticity in recall of fundamentals, in our tutorials we ask students to memorize the name, symbol, and (for most) the location of the “top 40” elements. An example of one of these assignments is on page 49 (pdf page 55) of our “preparatory chem” sample chapters at


The 40 elements are assigned in 4 batches in our prep chem lessons and 3 in gen chem, with a short quiz on the assignments to encourage completion. A typical quiz is posted at www.ChemReview.Net/PeriodicQuiz.pdf , and blank practice forms are posted at www.ChemReview.Net/BlankPeriodicTable.pdf .

You may want to hand out a “blank table” copy with each element assignment, and let students know that the same blank table will be part of an upcoming quiz. The last quiz might be scheduled just before the names and symbols are needed for compound nomenclature.

Does fast recall of fundamentals help during problem solving?  We look forward to hearing your observations and results.

#   #   #   #   #

6 – Finding Time For A First Day Activity

In Post #1 we asked:

 On your first day with a section, if you were free of pressure to cover content, what demonstration/discussion/inquiry would you recommend — or like to try?  What would your “first-day dream lesson” be?

In this post, we’d like to suggest a way to find time for an activity of your choosing on Day One without falling behind in a tight schedule for covering content.

“Exponential Notation” and “The Metric System” are covered near the beginning of most courses. Our recommendation: ask students to complete online tutorials that cover those two topics as homework. The following procedure has been used with success in a variety of first-year settings.

  1. Review the tutorials posted for free student use at http://www.chemreview.net/ChemFreeCh1and2.pdf . Decide which topics you would like students to complete during single or multiple assignments.

If access to computer printers is an issue for some of your students, you may want to print and hand out a packet of the small number of “printed pages” that are needed for topics you assign.

  2. A key to encouraging completion of the homework is a short, announced “closed notes quiz.” Quiz options on this assignment are posted at www.ChemReview.Net/Mods1and2Quiz.docx in a format that allows you to select, add, and edit questions.

One or two “online homework” questions due each class may also encourage students to work at the steady pace that promotes long-term retention of learning in memory.  However, online homework is inherently “open notes.”  In our experience, a “closed notes” quiz on content is also necessary to convey that the goal is quick recall from memory of  fundamentals (see Post #2).

  3. Try your favorite Day One activity. On the day before the quiz on the homework tutorials, you might put up a problem or two covering the topics for the class to do and discuss. Or you might hand out one of the 3 quiz versions as a practice quiz to then go over, with opportunity for questions.

The intent of the tutorials is not to replace discussing a topic in class, but to permit coverage more quickly as a review of homework, freeing time in lecture for additional higher-level activities.

Review copies of additional tutorials for preparatory and general chemistry are available to instructors (see the Resources tab above). Editable quizzes on all of the tutorials are provided to instructors using the lessons with classes.

(Additional “First Week” activities were posted 8/8/2018, see WeekOneFiles ).

In the Comments, we’d like to hear:

 If you tried the above experiment, what were student outcomes?  Next time, what would you do differently?

 # # # # #

5 – Discussion of Our Flipping Paper

During May 2014, the ACS DivCHED Committee on Computers in Chemical Education (CCCE) is hosting an online conference (a “ConfChem”) on flipping chemistry: moving part of traditional lecture content to student study time, to allow more time in class for demonstrations, discussions, and problem solving.

In the ConfChem format, authors submit papers, and for a one-week period readers submit questions online which the authors answer (other readers may contribute as well). The Q&A period moves to a new paper each week, but each paper and its week-long discussion remain online for future reference.

The Hartman/Dahm/Nelson group was invited to discuss the results of our experiments in shifting lecture content to homework in first-year chemistry — without videotaping lectures.  Our paper with comments and discussion is posted at

http://confchem.ccce.divched.org/2014SpringConfChemP2 .

The period for CCCE comment is closed, but if you wish to submit further questions or comments on the paper or its discussion, please submit a Comment for this blog post.

 All of the conference abstracts and papers are posted at http://confchem.ccce.divched.org/2014SpringConfChem.  Readers interested in exploring flipped instruction will find quite a few innovative ideas in the papers and their discussion.

We wish to thank Jennifer Muzyka, Chris Luker, Bob Belford and all of the CCCE volunteers for their many weeks of work to promote discussion of these issues.

 #   #   #   #   #

4 – Comprehension and Cognition

 … much of the information needed to understand a text is not provided in the information expressed in the text itself but must be drawn from the language user’s knowledge…

— Van Dijk and Kintsch, Strategies of Discourse Comprehension

The previous post included 4 short experiments. If you have not yet tried those, please do so and return here.

In this post, interpretations of the results of the experiments in post #3 will be proposed. In the Comments, you will be invited to agree or disagree with those views.

The model in cognitive science that explains comprehension includes the following:

  • We talk and write in code. Often subconsciously, whether speaking or writing, we leave out detail and assume that the background knowledge of our audience will be able to fluently fill in gaps. When reading a science text, comprehension is heavily dependent on the reader’s ability to fluently recall domain fundamentals.
  • To gain background knowledge, an individual must construct long-term memory: a slow process involving physiological changes in the brain. For most topics, persuading the brain to build initial memory requires substantial effort and practice.
  • Experts can fluently construct “mental models” of problem scenarios because they can fluently recall linkages among the facts and rules of their field: just a few words can cue linked memories that allow technical text to be understood (see Experiment 4), but expertise across a scientific domain takes years of study to achieve.
  • Scientific reading and problems that experts find easy, non-experts often find nearly impossible, even if they have good reasoning skills, because they lack fluent recall of assumed background knowledge.
  • Standard general chemistry texts are an incredible resource for interesting problems, 4-color diagrams, fascinating sidebars, and “refreshing the memory” when content is later needed. However, those reference texts generally proceed at the fast pace appropriate to refresh prior memory, rather than the slower pace required to construct initial memory.

To find more time in lecture for activities that build both interest and conceptual understanding, we will need to find ways to explain the “code” of chemistry during study time – using materials designed to promote initial building of memory.

In upcoming posts, we’ll talk about how that challenge might be met – without massive demands on limited instructor time.


At the “Read Recs” tab at the top, for more on memory, see the short Clark article. On oral and reading comprehension, The Knowledge Deficit by E. D. Hirsch (who suggested the sports scenarios in Post #3 and the epigraph above) is superb.

For Comment:

  • Is the analysis above consistent with your experience as an instructor?
  • What are the best ways to help students learn new content from general chemistry reference texts?   Are “study guides” useful?  Is an “introduction to the topic in lecture” enough so that most students can read their text with the comprehension needed to gain additional learning?

#   #   #   #   #

3 – Reading Science With Comprehension

To free more time in lecture for demonstrations and higher-level activities, one strategy is to have students read their textbook to learn the content of lecture, but many instructors report that a high percentage of students are either unwilling or unable to do so.  To explore why this may be the case, try these brief experiments in reading comprehension.

Experiment 1:  Explain the meaning.

The tracheal chimera was fully lined with mucosa, which consisted of respiratory epithelium from the donor and buccal mucosa from the recipient.  (NEJM, 1/14/10)

*  *  *  *  *

Easier reading is on the sports page.

Experiment  2:  Read, then answer the questions below.

When Sarwan and Chanderpaul were going on strongly, England were looking down the barrel. But they came back with Broad removing both of them within 8 overs of taking the 2nd new ball.   It was always going to be difficult to survive with that kind of a batting line up and England then seemed to be on top. But the last pair hung around for ages to ensure that light is offered and they walk off.   (Times, New Delhi)

 What happened?  Who won?  Which words in the passage are unfamiliar?

*  *  *  *  *

For probably 1/3 of our worldwide English-speaking audience, that was easy!  Now try

Experiment  3:   For this historical anecdote, answer the questions below.

With the game tied in the bottom of the ninth, Jeter scored on a sacrifice by Rodriguez to the warning track in right.

What happened?  What does “sacrifice” mean?  How many “outs” were there?  If this was a regular season game, exactly where in the universe was Derek’s right foot planted at the moment the ball was caught?  Why did he not run until the fly was caught?

Where are those answers supplied?  To answer each question, about how long did you take?

*  *  *  *  *

Finally, here are comprehension questions that less than 1% of most audiences can complete.

Experiment  4:

For carbon tetrachloride:

What is its shape?  What are its bond angles?  What is its molecular dipole moment in debyes?  Does it tend to dissolve in oils or water?

How long did it take you to answer these questions?  Were you slowly reasoning — or answering pretty much instantly?

In the next post, we will note what cognitive science has to say in interpreting the results above.  But first, let us invite you to Comment on one or more of these questions:

Given time, “using the internet and reasoning skills” would likely work on Experiment 1. For Experiment 2 and/or 3, whichever you found difficult, would that strategy work? Why or why not?

On the question of the ability of students to read general chemistry texts with comprehension, what are some possible implications of the experiments above?

Finally, on Experiment 2, would someone kindly explain to the blog author who won?

 #   #   #   #   #

2 – The Necessity for Initial Memorization

 “At all ages, there are several ways to improve the functional capacity of working memory.  The most central of these is the achievement of automaticity, that is, the fast, implicit, and automatic retrieval of a fact or a procedure from long-term memory.”

                —      Final Report of the National Mathematics Advisory Panel (2008).

The goal of learning is to be able to solve problems. As instructors, the most important questions we face are:

  • How does the student brain solve problems in chemistry?
  • What actions should we advise students to take to learn to solve problems?

Over the past decade, research in cognitive science has reached agreement on how students between about age 12 and graduate school solve “well-structured problems” (those with clear “right answers”) in math and science.  A brief summary of that model follows.  Citations and opportunity for discussion are provided at the end of this post.

How the Brain Solves Problems

To solve problems, the brain uses two types of memory:  working memory (WM) and long-term memory (LTM).  WM is where the brain manipulates knowledge: where you think and reason.  WM can accept input from your senses (such as seeing an object or reading a problem) and from your LTM.

We will define LTM very roughly for now (more in later postings) as where the brain holds knowledge that you can recall but have not seen or heard in the past 2 minutes. An example is 6 × 7 = 42. As knowledge in LTM is encountered in a variety of contexts, it is tagged with meaning, via links to other knowledge, to form a conceptual framework (also termed a schema).

Since 2001, the characteristics of WM have been scientifically measured and verified. The key finding is this:

During problem solving, working memory can manipulate:

  • Virtually all related elements of knowledge that can be recalled quickly and accurately from LTM based on “element cues” in the problem, plus

  • Up to 3-5 elements of knowledge, each for up to 30 seconds, that cannot be fluently recalled from LTM based on element cues.

To help with understanding, let’s restate this law and its implications in different ways.

  • Working memory is very limited when dealing with information that has not previously been well memorized.
  • In the working memory where you think, space for non-memorized information is minimal, but space for well-memorized relationships is enormous.
  • When trying to solve a problem, if just a few elements of knowledge needed to solve the problem cannot be recalled from LTM, limits on the capacity of WM are exceeded. This will likely lead to confusion that prevents the problem from being solved.
  • Your ability to solve problems depends primarily on how much knowledge you have “automated” — how much you can “recall with automaticity” (fluently) from LTM.
  • To get around the quantified bottleneck in the ability of the mind to reason, students must begin with memorization: they must work to move new elements of knowledge and their relationships into a long-term memory that is resistant to change. Then, as they work to gain procedural fluency and conceptual understanding in the use of new knowledge, automaticity in recall from LTM is gradually achieved.
  • The strength of LTM is that it is long term. Once information is well memorized and well organized, the ability to recall and apply that knowledge fluently (automatically and effortlessly) often lasts for decades.
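As a rough illustration (not a cognitive simulation), the bullet points above can be restated as a toy model. The slot count and the knowledge-element names below are illustrative assumptions:

```python
# Toy illustration of the working-memory bottleneck described above:
# elements recallable from LTM are "free"; each non-memorized element
# consumes one of a small number of WM slots.
def within_wm_limits(needed, memorized, slots=4):
    """Return True if the non-memorized knowledge a problem needs
    fits in the assumed 3-5 WM slots (4 used here)."""
    not_memorized = [k for k in needed if k not in memorized]
    return len(not_memorized) <= slots

memorized = {"K+ symbol", "molar mass of O", "mole definition", "M = mol/L"}

# A problem whose elements are nearly all in LTM fits easily...
assert within_wm_limits(["K+ symbol", "M = mol/L", "dilution equation"], memorized)

# ...but one requiring many look-ups exceeds WM capacity.
assert not within_wm_limits(
    ["half-reaction method", "E° table", "Nernst equation",
     "Faraday constant", "cell notation"], memorized)
```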

In a nutshell, that’s the foundation for learning math and science.  The unexpected finding that the brain is very limited when reasoning with what has not been well memorized is now verified, accepted science, and it will force a re-examination of many recent theories on teaching and learning.

Some educators will be dismayed by a finding that memorization is required for problem solving, but the value of science is that it measures what is true, whether we like that truth or not. Thanks to recent research, instructors now have a much better understanding of how to advise our students to learn efficiently and effectively. That’s the best possible scientific progress.

At this point, we’d like to ask our readers to answer some questions in the Comments that will “set the agenda” for this blog.  (Click on Comment below the title of this post.)

  • After reading one or two of the brief references below, do you see any discrepancies between the statements above and what cognitive science is saying?
  • What questions would you like to see addressed to further explore these data?

In upcoming posts, we will explore these issues in additional detail. We will also try to suggest answers to your questions based on our reading of cognitive research.

* * * * *


For short summaries in non-technical terms of recent research in cognition, see the brief Clark and NMAP readings in the “Read Recs” tab above.

Make It Stick, a new book describing the importance of memorization in learning, was recently reviewed in the Chronicle of Higher Education  at this link:   MakeItStick .

For a recent review on the NIH website of the “3-5 chunk limit” research, see http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2864034/

Any first-year textbook in cognitive psychology with a copyright date in the past 5 years will detail the interaction between WM and LTM in its chapter titled “Problem Solving” or “Cognitive Architecture.”

# # # # #

1 – First-Year Chemistry: The Great Debate

To solve problems in first-year chemistry, should instructors encourage students to learn concepts and then apply reasoning skills?  Or must students begin by memorizing fundamental facts and algorithms?

How can we help students to improve their skills in solving scientific calculations?  Given computer software to solve problems, are calculation skills still important?

Which strategies can best help instructors “flip” instruction — to find more time in lecture for activities that build interest and understanding? 

Which reform proposals are most likely to improve achievement and retention in first-year courses? 

Since 1990, those and related questions have been debated in chemistry education.  In this blog, we hope to offer a forum to discuss these issues from two scientific perspectives:

  •  The findings of recent cognitive research on how the brain solves problems, and
  •  Your experience as instructors in first-year chemistry (including all college and high school courses in, and to prepare for, college general chemistry).

A special emphasis will be on addressing the apparent disconnect between many of the reform strategies suggested over the past decade in “chemical education research” and the recommendations of cognitive scientists — whose area of expertise is how the human brain works, and in particular how it learns.

For the author of this post (Eric Nelson), interest in the intersection of chemistry education and cognitive science grew out of a collaboration with Dr. Don Dahm at Rowan University (NJ) and Dr. JudithAnn Hartman at the Naval Academy.  Our goal was to find ways to transfer parts of the traditional content of first-year chemistry lecture to study time, to gain more time in lecture to work with students on higher-level problems, discuss issues of chemistry in society, and conduct more of the “wet chem demonstrations” that deepen student understanding of the science we love.

Research summaries by cognitive scientist Daniel Willingham suggested that if we gave students “typed lecture notes that included clicker questions,” they could learn content effectively during homework.  We tried this approach and met with measurable success.

Since then, we have prepared additional materials aimed at helping instructors gain more time in lecture for higher-level activities. In this blog, we will discuss how these or similar materials might provide you with time for instructional experiments with activities of your choosing. We have also done considerable reading in cognitive research, looking for findings that are especially applicable to chemistry instruction. We will share what we have found that we think can help instructors better support student success in chemistry.

In the Comments section of the blog, we hope to entice you into sharing your experiences and discussing what science says on the “great debate” issues above.

To start, you are invited to click on Comments (below the title of this post) and share your ideas on the following:

To introduce a topic, cognitive studies recommend an activity that sparks student curiosity.  Given that goal, on your first day with a first-year section, if you were free of pressure to cover content, what demonstration/discussion/inquiry would you recommend — or like to try?  What would your “first-day dream lesson” be?

* * *

(Comments are moderated only to assure that they pertain to science education.)