
29 July 2015

Why do I teach: a rather prickly response

My school posed the entirely reasonable, in context, question: Why do you teach?  Each faculty member was asked to respond in a narrative and post to our faculty development site.

The folks who asked the question are friends; they're not just colleagues, they're the best teachers at the school.  As you see below, I take offense at the question, but not offense toward the people asking it.  They didn't know, were unlikely to know, why they pushed my buttons.  

Me, I still don't understand why teachers are expected to have kumbaya moments around the fireplace in regard to their employment, yet e.g. bankers, videogame programmers, and professional athletes are not.  Nevertheless... and with advance recognition that I very much enjoy my job and my school...

Why do I teach?

This answer is going to be rather prickly.  I know this exercise was intended in good faith and without ulterior motives.  Yet the question itself hits a major nerve.


Short answer: None of your business. The question is offensive to me, though I know you intended no offense.

Most folks in academia are aware that seemingly every woman physicist has a story about someone in her life – family, professors, colleagues – who made an extraordinarily rude statement suggesting that she is in the wrong profession for a girl.  “You’ll never get a husband as a physics major,” or “Why don’t you take this lower-level class, the girls usually need a bit of catch-up,” or, famously and recently, “Three things happen when [girls] are in the lab: you fall in love with them, they fall in love with you, and when you criticize them they cry."

What folks don’t often recognize is a different social problem faced by physics teachers, especially male physics teachers.  Many of us have stories of family, administrators, and colleagues who don’t quite understand what a person with a degree in a “real” or “useful” field – and a man, to boot – is doing as a teacher. 

In interviews I was asked the question, “They say those who can, do, but those who can’t, teach.  So what made you decide to teach?”  The automatic assumption was that a person with physics and engineering degrees must be a crappy engineer indeed if he must resort to teaching for a living.  I’ve been asked repeatedly over the years, “Why don’t you use your physics degree?” as if I’m letting said degree go to waste by teaching.  And, of course, “Gregory Charles, we paid all that money to send you away to an elite college, and you’ve decided to teach?  What are you thinking?!?”  Somehow, my sister with a theater degree from Dartmouth never got those sorts of questions when she began her teaching career.

And these questions are from well-meaning people.  I’m not even including the outright condescending discrimination from female colleagues and administrators who have no use for men in education.  I mean, obviously, a man who teaches is either gay, perverted, or on a power trip to become an administrator, right?

Then there’s the Soviet undertones of the question.  “Let’s all share why we’re so happy living in a Workers’ Paradise.”  What, this person isn’t happy?  He wants more of a say in how paradise is run?  He mentioned that perhaps the education establishment subjects us to vacuous dogma rather than encouraging engagement in substantive, intellectual discussion with creative, professional craftspeople?  He’s obviously a troublemaker sowing dissent and discord.  Take him to Lefortovo. 

I guess I’d like to rephrase the question… of course smart, interesting people choose to teach.  That’s obvious.  The question should instead be directed to the less intelligent or less dedicated folks, and should say “How in the hell are you a teacher?”

Why do I teach?  Because it pays the bills, I’m good at it, and I usually enjoy my students and my colleagues.  That’s all you need to know.  That’s all I’m willing to state publicly.  Actions speak louder than words: I encourage you to judge my commitment to my profession not by this sort of essay, but rather by the feedback from two decades of students, from colleagues at our school, and from fellow physics teachers around the country. 


17 July 2015

Rule 3 of teaching: Your students don't listen to you. (And a non-ohmic light bulb.)

Rule 3 of teaching, as described in the 5 Steps to a 5: AP Physics 1 teacher's manual:  Your students don't listen to you.  Don't worry, they don't listen to me, either.  

I hear regularly from physics and non-physics teachers fretting over the material they "cover" in class, over the precise content and activities they do.  I suggest taking a holistic view of the course, recognizing that students will rarely remember a specific classroom event more than a week or so later.  The College Board has gone over-the-top with this philosophy, prioritizing "science practices" and "big ideas" over content.  Their heart is in the right place.  An understanding of experimental physics isn't about "spit back the procedure, analysis, and results from this experiment you did six months ago."  It's more about, "Here's a new situation that you've never seen; how would you answer a well-formed question with an experiment?"

More on-point, did you do an experiment measuring the resistance of a light bulb this year in AP Physics 1?  Did you show that the bulb's resistance changes depending on the voltage across it?  Did you have students design and carry out an experiment to determine whether, and to what extent, the bulb obeys Ohm's law?

Some of you are hanging your heads in shame, because you didn't -- and this very experiment showed up as free response problem 2 on the 2015 AP Physics 1 exam.  My big, friendly point is, don't worry about trying to match experiments with what might show up on the exam.  Not only is it an impossible fool's errand, but it doesn't even matter.

My class did this exact experiment in January.  I even made it what education professors would call an "open inquiry" exercise.  Toward the end of the circuits unit, during which we had always treated electronic devices as having constant resistance, I pointed out that some books suggest that a light bulb under some conditions might not obey Ohm's law.  It was each lab group's job to test the validity of those books' contention.

Oh what wonderful results we got!  Most groups figured out quickly and independently to graph voltage as a function of current.  You can see one of the graphs in the picture at the top of the post.  The curve is apparent as soon as you smack a ruler down on the page.  The slope varied from 51 ohms at about 2 V, to 77 ohms at about 8 V.  The bulb is non-ohmic, with a 30%-plus difference in resistance across a useful range of voltages.
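
If you'd like to see that slope analysis spelled out numerically, here's a minimal Python sketch of the same idea, with chord slopes on a V-I graph standing in for the tangent lines drawn with a ruler.  The data points are made-up placeholders, not any group's actual measurements.

```python
# A rough numerical version of the graph analysis: on a V-vs-I graph, the
# local slope is the bulb's resistance in that operating range.  The data
# points below are made-up placeholders, not any lab group's measurements.
import numpy as np

current = np.array([0.030, 0.050, 0.075, 0.100, 0.125])  # amps (illustrative)
voltage = np.array([1.5, 2.5, 4.1, 6.0, 8.0])            # volts (illustrative)

# Chord slopes between neighboring points stand in for hand-drawn tangents.
local_resistance = np.diff(voltage) / np.diff(current)
midpoints = (voltage[:-1] + voltage[1:]) / 2

for v_mid, r in zip(midpoints, local_resistance):
    print(f"near {v_mid:.1f} V: roughly {r:.0f} ohms")

# An ohmic device would print the same slope everywhere; a slope that grows
# with voltage is the non-ohmic behavior described above.
```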

Since we did such a good job with this experiment, one might expect that my students kicked arse on 2015 free response problem 2.  Um, nope.  My students performed far worse on that problem than on the others.  The College Board just released some class statistics, showing which quartile our students fell into on each of the free response problems.  On problems 1, 3, 4, and 5, the vast majority of my class performed in the top 25% nationally.  On problem 2, more than a third of my students were in the bottom half.

Back in 2003, the same sort of thing happened in reverse.  I remember kicking myself because that was the first year ever when I didn't do an optics-bench-style experiment with my class; sure enough, that was the year when problem 4 was a laboratory-based optics bench question.  Turned out, though, my students did fine, indistinguishably from other years when I had sometimes done the very experiments that showed up on the exam.

The precise lab exercises you do don't matter.  And that's because of Rule 3. Don't take this rule as a complaint, or as the "get off my lawn" ramblings of an old man carrying on about the danged kids these days.  It's just a well-verified observation.  I see it as my job to be sure that my students succeed despite Rule 3.


12 July 2015

Do NOT allow questions during tests... repost

Never even allow a student to ask a question during a test or quiz.  This is perhaps the most important piece of teaching advice I can give.  

I am utterly convinced that your school could raise its SAT scores by 20 points across the board, and its AP math/science scores by a third of a grade, simply by having the math and science departments never allow questions during tests.

It is a dirty little secret that no one ever discusses... so many teachers talk their students through difficult problems.  No wonder those students struggle when they're faced with standardized tests, when their friendly lifeline is taken away.

I hear people argue with me, saying that they answer questions on tests because they want to help the students succeed.  Well, so do I -- and I take offense to the ridiculous connection that refusing to answer test questions equates to not caring about students.  I want my students to succeed over the time frame of their physics course.  That doesn't mean they must ace every individual test or quiz.  It is crucial that we allow our students to make mistakes, and then to learn from those mistakes.  I judge my success by how well students perform at year's end, not by whether one student got one question right on one test.

Here is the critical post explaining my approach, including some help with the issue that I know many of you already brought up, that "I could never get away with this at my school."  :-)


11 July 2015

AP Physics 1 scores 2015 -- more people passed Physics 1 than Physics B, and other commentary.

By now, those of you who taught the inaugural year of AP Physics 1 have seen how your students did.  Not like the old Physics B, eh?  Let's talk about the reasons for the ostensibly precipitous decline in scores.

Firstly, the raw score necessary to earn each AP grade has increased, by about 5-6% across the board.  Trinna Johnson and Trevor Packer sent a letter to the "AP Teacher Community" discussion group describing the score-setting process in tremendous detail.  In that letter, they revealed the grade cutoffs, which I've converted to percentage of available points necessary for each grade:

AP Physics 1 grade      Percentage of available points on the test
5                       71%
4                       55%
3                       41%
2                       26%

The old Physics B exam typically had cutoff scores of 65%-50%-35%-25%.  It takes more correct answers to pass now.
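
For concreteness, here's a quick sketch of how those cutoffs turn a raw-score percentage into a grade.  The boundary numbers are the ones quoted above; treating each cutoff as "at or above" is my reading, not an official College Board algorithm.

```python
# A quick sketch of the quoted grade boundaries.  The percentages come from
# the table above; the "at or above" comparison is my assumption.
AP1_CUTOFFS = [(71, 5), (55, 4), (41, 3), (26, 2)]  # (percent needed, grade)

def ap1_grade(percent_of_points):
    for cutoff, grade in AP1_CUTOFFS:
        if percent_of_points >= cutoff:
            return grade
    return 1

# Example: 60% of the available points earns a 4 on Physics 1.  Under the old
# Physics B cutoffs (65-50-35-25), 60% would also have been a 4.
print(ap1_grade(60))  # -> 4
```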

The AP Physics 1 exam, though, is considerably more difficult than Physics B.  There are no pity points available for simple calculations.  Synthesis is prized over recall.  There's no room to hide -- the questions probe for explanations rather than answers.  Due to the higher raw score cutoffs, we would expect fewer of our students to pass even on an exam of equivalent difficulty to AP Physics B.  Now we have two effects that combine to reduce overall exam grades: a harder exam AND higher cutoff scores.

And finally, consider the population of students who took AP Physics 1 this year.

In 2014, 90,000 students worldwide took the AP Physics B exam; of these, 60% passed, and 14% earned 5s.  (That itself is a bit down from previous years, because the number of students taking AP B doubled over the previous decade.)  That works out to about 13,000 students earning 5s on Physics B, and 55,000 passing.

In 2015, 170,000 students took the AP Physics 1 exam -- just about double the population who previously took AP B.  Part of the intent of the redesign was to increase the pool of students who could handle an AP physics course.  Physics B was intended as a second-year course, and was so broad that it did not encourage serious, deep understanding.  Physics 1 is in fact for first-time advanced physics students.  Many schools appropriately replaced their "honors physics" courses with AP Physics 1.  Good.

But this twofold expansion in the student -- and teacher -- pool means a much broader range of student -- and teacher -- ability.  Many of the 80,000 additional students taking the exam were intrinsically weaker students.  And a bunch of teachers who were not experienced with college-level physics, or who were simply not yet capable of teaching college-level physics, were nevertheless thrown into an AP 1 course.  No wonder only 4% of the country earned 5s; no wonder only 37% passed.

Let's look at raw numbers now, not percentages.  On AP Physics 1, about 7,000 students earned 5s.  This is about half as many as earned 5s on Physics B.

But 63,000 students passed the AP Physics 1 exam -- that's considerably more than the 55,000 who passed AP Physics B  the previous year.  Even on a more difficult exam, even with higher standards for passing, more students passed this year than last.  Of course... because Physics 1 is intended as a first year course.  Sure, a bunch of folks tried this exam who weren't ready (or whose teachers weren't ready).  So what.  Thousands of folks who were ready just fine tried the new exam, and found out that they could do it.
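
For anyone who wants to check the arithmetic, the raw counts above come straight from the quoted populations and percentages:

```python
# Back-of-the-envelope check of the raw counts quoted above, using the stated
# exam populations and percentages (all figures rounded in the text).
physics_b_2014 = 90_000
physics_1_2015 = 170_000

print(round(physics_b_2014 * 0.14))   # ~12,600 -> "about 13,000" 5s on Physics B
print(round(physics_b_2014 * 0.60))   # ~54,000 -> "about 55,000" passing Physics B
print(round(physics_1_2015 * 0.04))   # ~6,800  -> "about 7,000" 5s on Physics 1
print(round(physics_1_2015 * 0.37))   # ~62,900 -> "about 63,000" passing Physics 1
```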

As teachers forcibly learn that physics is about more than plugging numbers into equations, as students figure out that they can't write a bunch of baloney and expect to earn credit, AP Physics 1 scores should eventually improve.  It's on us to adjust our teaching to help these scores improve.

09 July 2015

Open Lab 2015 -- Last Call

[Photo: The Manning Family Science Building, site of the Open Lab]
This year's Open Lab will run Sunday to Tuesday, July 19-21, at Woodberry Forest School. Participants have requested things to do such as:

* discuss materials and laboratory ideas for conceptual physics; 

* do quantitative demonstrations and experiments with waves;

* show some options for computer simulations beyond the typical PhET; 

* set up each of the five AP Physics 1 free response problems experimentally while brainstorming short- and long-form lab exercises -- for all levels of physics class -- with these problems.

My goal is to do all these things and more, with the opportunity to improvise further with whatever equipment we have in the storeroom or the hardware store.  

The best part of any gathering of physics teachers is the shop talk and sharing.  To that end, on the first night the Woodberry Forest science department is providing dinner at my house.  Everyone's welcome; I just need to know soon if you'd like to come.  Send me an email, and I'll get you on the list.

Festivities start around 4:00 (pm, of course) on Sunday afternoon in my classroom, followed by dinner around 6:30.  Monday we'll work 8:30-4:00 or so; Tuesday we'll be done at noon.  Feel free to attend for all or part of this time.  No fees, no hassle; an email telling me your plans is the complete registration process.  You'll need to find a place to stay... I recommend the Holiday Inn Express in Orange.  Several folks will also be staying at the Doubletree in Charlottesville and carpooling up.

GCJ

06 July 2015

Mail Time: Is it "fair" to evaluate students on the quality of their homework?

As I was going through emails in preparation for the 2015 open lab (please let me know if you'd like to attend!), I found this:

I saw in your "Less is More" article that homework [was at that time] 25% of the total grade for your classes.  I was considering making homework a much lower percentage but mostly a "good faith effort completion" grade, since I've found it difficult to justify to myself grading students on their knowledge of the material while they're still in the process of learning it, rather than an exam where they are reviewing the material.  What are your thoughts on this?

That's an important question for any physics teacher to be able to answer.  Remember that physics teaching is art, not science -- there are few hard truths of physics teaching, only ideas that work or do not work for each of us.  Would pointillism have worked for Picasso?  Could Rodgers and Hammerstein have written about singing cats?  Maybe, maybe not; yet pointillism and Cats! are indisputably successful things that other artists should at least be aware of.  I have my answer to the homework question that indisputably has worked wonders for me and many others.  Some good teachers may disagree with me on principle, or may choose not to use my approach in their teaching environment.  Yet everyone should acknowledge, whether they use it or not, that my approach does in fact produce considerable success for me and my students.  

To the question, then:  When you grade homework, you're not grading students on their knowledge of the material; you're grading the skill of problem solving with new concepts, along with students' diligence in seeking the correct answer.  Fact is, homework (or any work) is worthless if it's not done carefully with a full effort toward getting the correct answer and approach.  "Good faith effort completion" sounds great, but ask yourself -- if you graded students' homework carefully, would they do a better job?  Would they perform better on tests?  

My answer is, I grade homework carefully and thoroughly on a regular basis, especially early in the year.  I grade such that the students expect that their work will be judged, such that the students do their work to the highest standard they can.  And therefore, my students don't have to study for tests.  And, they perform well on those tests, because they've practiced carefully.  The one year when I didn't carefully grade homework, many students did a half-arsed job on the homework, then were upset when their test performance was poor, then complained to all who would listen that physics was too hard and that I was mean and unreasonable in my expectations, that I didn't understand my students.

As for the "fairness" issue, is it fair for the football coach to choose a starting quarterback based on his performance in practice?  I mean, practice is when players are supposed to develop their skills, right, and only the game really matters?  Yes, but everything's a test; everyone is evaluated all the time.  If you grade homework regularly -- even every other night, even only part of one problem, even on a 0-1-2 scale -- then you'll have enough data that one bad performance on something a student couldn't grasp quickly will be a mere blip.

I now count homework and daily quizzes as half of each student's term grade, with the other half coming from monthly tests.  Not surprisingly, there is a very high -- nearly 1.0 -- correlation between homework and test grades.  I am virtually certain that this correlation exists regardless of how much homework you grade, or how heavily you count it.  The goal, therefore, is to create an incentive mechanism so that students do everything they can to get homework right.  Then test and exam performance will take care of itself.

05 July 2015

AP Physics 1 Lab Ideas: ticker-tape machine to determine acceleration of a cart (and preparing students for open-ended labs)

[Photo: Tape Timer from Sargent Welsh]
The College Board has released an official lab manual for AP Physics 1 and 2.  It's important to understand that, though they call it a teacher's manual for laboratory investigations, the experiments listed are not "required" for the AP exam.  Your choice of in-class experiments should be based on your interests, available equipment, and so on.

The manual might be 348 pages long, but no worries.  Just read the 30 or so pages that describe the actual suggested experiments.  These are as gold to the AP physics teacher, except more practically useful than gold.  I don't suggest you use those 30 pages exactly as written; rather, use the activities they describe as the basis for a couple of ideas in your course.

One of the activities in the book asks students to determine whether a wind-up toy car moves with constant acceleration.  What a great question!  Acceleration by itself is a difficult enough concept, but then understanding what is meant by "constant" acceleration is tougher still.

However.  Were I to ask that open-ended question early in the year, right after finishing the kinematics unit, I'd get such poor lab performance as to make the activity worthless.  "Open inquiry," as the College Board calls it, is a waste of time if your students aren't ready for it.  An open-ended problem followed by incessant questions about what to do, followed by frustration on your and the students' part and you finally just giving them step-by-step directions, isn't really what's intended by "open inquiry."  

Students must be carefully prepared throughout the year for open-ended laboratory exercises.  Early on, you need to teach some laboratory skills that they can eventually fall back on when it's time to answer a truly free-form question in lab.  For example, using a motion detector to measure distance, instantaneous speed, and acceleration is a skill that students must be taught.  Similarly, it's important to give your class practice in using photogates, spring scales, video analysis, ammeters and voltmeters, and other basic equipment.  I'm not suggesting one of those beginning-of-the-year "let's measure a bunch of random stuff and talk about error" exercises; I'm suggesting that you teach such skills in context.  Do demonstrations with this equipment.  Have students use the equipment to verify the answers to homework problems.  Do a long-form lab with graph linearization where they must use equipment for multiple-data-point collection.

Want a practical example of the difference between an early-year experiment and a late-year experiment?  

Here's the early-year version:  I have students release a PASCO cart from rest on an inclined track.  They use a tape-timer* to get the position of the cart 60 times per second.  Graphing the cart's position every 6th dot** makes a position-time graph with 0.1 s precision.

* You can buy such a timer from PASCO for $180, or you can get the cheap version from Sargent Welsh for $17.  The cheap version works fine.

** Why only every 6th dot?  Because every 6/60 of a second works out to a nice decimal: 0.1 s, 0.2 s, 0.3 s, and so on.  Trying to graph every 1/60 of a second leads to numerical confusion, a graph that takes ten times as long to make, and incorrect accelerations.  Thanks to Curtis Phillips for pointing this easy trick out to me after I had struggled with the graphical analysis of this experiment for nigh on two decades.

Next, I have students take the slope of two tangent lines to find two instantaneous speeds.  The change in speed divided by the time it took for the speed to change is the cart's acceleration.  You can see here the homework assignment that students fill out.  I determine the "theoretical acceleration" by measuring the angle of each group's track with an angle indicator, and using gsinθ.

This experiment takes a full 90-minute lab period plus a night's homework assignment to complete.
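
For reference, here's a minimal Python sketch of the same analysis, with centered differences standing in for the hand-drawn tangent lines.  The position data and track angle are made-up placeholders, not a real group's tape.

```python
# A minimal numerical sketch of the early-year analysis.  Centered differences
# stand in for the hand-drawn tangent lines; the position data and track angle
# below are made-up placeholders, not a real group's tape.
import math

dt = 0.1  # s between graphed points (every 6th dot at 60 dots per second)
position = [0.000, 0.006, 0.024, 0.054, 0.096, 0.150, 0.216]  # meters (illustrative)

def speed_at(i):
    """Instantaneous speed near point i, approximated by a centered difference."""
    return (position[i + 1] - position[i - 1]) / (2 * dt)

v_early = speed_at(1)                      # speed near t = 0.1 s
v_late = speed_at(5)                       # speed near t = 0.5 s
a_expt = (v_late - v_early) / (0.5 - 0.1)  # change in speed / time for the change

theta = math.radians(7)                    # illustrative track angle
a_theory = 9.8 * math.sin(theta)           # "theoretical" acceleration, g*sin(theta)

print(f"experimental a = {a_expt:.2f} m/s^2, g*sin(theta) = {a_theory:.2f} m/s^2")
```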

Then the late-year version:  In the last month of the course, I assign the homework problem with a direct measurement video that you can read here.  Everyone can view the video, then determine for himself how he's going to check for constant acceleration.  No one really, truly remembers the tape timer experiment from October.  However, they now have a reasonable understanding of what acceleration is, and they have used multiple methods of finding instantaneous speeds all year.

This assignment provoked such a wonderful in-class discussion.  Some folks compared the change in speed over two time intervals.  Others used four time intervals.  Some made a velocity-time graph for four or eight data points and looked for a straight line.  An argument ensued as to what the distance scale on the video was; a student pointed out that it didn't matter, as arbitrary distance units work just fine to answer the question.  The one student who confused speed and acceleration discovered his mistake quickly and authentically, without me having to say a word.

In other words, at the end of the course, my class not only could perform a complicated, creative experimental task... they had the skills to discuss the merits of different methods.  That's the holy grail of introductory physics laboratory work.  But, searching for the literal Holy Grail requires a long, difficult journey filled with peril.  Don't expect to hold the grail immediately -- guide your class through the journey.  Can't I have just a bit more peril?


04 July 2015

AP Physics 1 lab ideas: Spring constant of a hopping spring toy

[Photo: spring toy available from Oriental Trading]
For the first time in ten months I'm not in constant preparation mode -- preparation for class, for workshops, for the USIYPT, for department meetings... I have a few weeks off with no immediate obligations.  Now is a good time to take stock of the activities I used in my first year of teaching AP Physics 1.

My advice about laboratory in AP Physics 1 is to teach the material up front and quickly, such that students know and can use basic facts, equations, and problem solving techniques.  Then do experimental work.  Set up problems you've solved for homework or on quizzes as laboratory activities.  Some of these will be quick and dirty -- did the speed at the bottom of the hill double when the hill's height quadrupled?  Some will be more involved, with extensive data collection and graphical analysis.  

Over the next few posts, I'll describe some of the extensive, long-form laboratory activities I used this year that were successful.  Many of these are based on old AP Physics B questions.  I'd suggest that scouring the released AP Physics B free response questions since 1996 could provide a fantastic lab manual for any advanced physics course.  

Today's experiment comes directly from the 2009 AP Physics B exam problem 1.  I show the class how the pictured pop-up spring toy works: push it down, and when the suction cup loses suckiness, the toy pops up.  I show them that we can use flexible aluminum wire wrapped around the top to change the toy's mass while still allowing it to pop up.  I ask them to graph the height to which the toy pops as a function of the toy's mass.  As always, I give no handout with instructions or prompts.  Each group is to produce and turn in a raw graph of the experimental data along with a data table.

When I'm satisfied with the data collection, I xerox the data tables so that each student in the group has his own copy.  Then I hand out the linked homework assignment.  Each individual student now must linearize the graph, take the slope, and use the slope to determine the spring constant of the toy.  
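
If you want to see what that linearization can look like, here's a sketch of one natural route -- not necessarily the one on my handout.  It assumes the spring's compression is measured separately and that energy losses are ignored; all the numbers are made up for illustration.

```python
# A sketch of one natural linearization -- not necessarily the one on my
# handout.  Assuming the spring's compression x is measured separately and
# energy losses are ignored, (1/2)k x^2 = m g h, so h = (k x^2 / 2g)(1/m):
# a graph of hop height vs. 1/mass is a line whose slope gives k.
# All numbers below are made up for illustration.
import numpy as np

g = 9.8      # m/s^2
x = 0.020    # m, assumed spring compression

mass = np.array([0.010, 0.015, 0.020, 0.025, 0.030])  # kg (illustrative)
height = np.array([0.82, 0.55, 0.41, 0.33, 0.27])     # m  (illustrative)

slope, intercept = np.polyfit(1 / mass, height, 1)    # fit of h vs. 1/m

k = 2 * g * slope / x**2
print(f"slope = {slope:.4f} kg*m, spring constant k = {k:.0f} N/m")
```
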
Before you try this experiment, be sure you've done at least two or three graph linearization experiments, and be sure everyone can deal with basic energy conversion problems.  I never did assign the official 2009 AP Physics B problem.  However, if you use that problem on a test, then this experiment could be a perfect follow-up.  Or, do the experiment in November, and put the 2009 problem on the semester exam in late January.*

* That's not as crazy as it sounds.  Students don't remember your lab exercises as much as you think.  For example, my class did the "does a light bulb obey Ohm's law?" experiment this year, with everyone making a graph of voltage vs. current to see if the slope was constant.  Nevertheless, when we debriefed and discussed the 2015 AP Physics 1 free response, no one at all mentioned that we had done that very experiment.  Most of my students got it essentially right, but without the elegant graphical solution -- they said they checked several times to see if the V/I ratio was constant.  That's Rule 3 of teaching: Your students don't listen to you.  (But no worries, they don't listen to me, either.)