180: Day 9: Whiteboard walk

photo of whiteboard walk

I usually do whiteboards as "meetings," but recently heard about "whiteboard walks." These walks are designed to mimic scientific poster sessions: one student stays with their board while the others walk to other boards and have conversations with the "authors" of those boards.

I tried this with my AP Physics C class today, and it seemed to go well. I like it because it takes less time than a full "board meeting," and it also allows students to engage in small groups with others who are not in their lab group. I told them that each member of their group should go to a different group.

We were whiteboarding a few problems they had for homework, so after the walk was finished, I did a quick review of the important items they should have noticed. I'm still a bit worried about my "summarizing"/"highlighting" at the end, fearing that students may not take the board sessions as seriously since they can "get the answers at the end," but I guess that's what high school teaching is all about–transforming mindsets slowly 🙂


180: Day 2: Learning whiteboards

Bouncy ball lab whiteboard samples

Today my AP Physics C students got through creating whiteboards for the bouncy ball lab, and it’s clear I didn’t give quite enough instructions…but it is only their first “modeling” whiteboard. Items for me to work on with them:

  • Graph axes are labeled with variable and units.
  • Equations are "translated" from "math" (e.g. y=3.6x+1.2) to "science": x and y are replaced with the physical variables, and all numbers need units (see the example after this list).
  • Conclusions are different from predictions.
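
For example (purely illustrative, and assuming both heights were measured in centimeters), the sample equation above would "translate" as:

y = 3.6x + 1.2  →  bounce height = 3.6 × (drop height) + 1.2 cm

The slope (3.6) stays unitless because both heights carry the same units, while the intercept picks up centimeters.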

Here are a couple of resources I created to help: a PowerPoint slide show with examples (feel free to edit/steal), and a PDF version of this presentation, fit to one page. I probably won't show the slide show, but will post the PDFs around the class and on my Schoology page.


180: Day -2: Updating the physics mechanics inventory

FCI sample image

Like many physics teachers, I use a concept inventory as a pre- and post-"test" to help me understand how and where my students are learning and struggling. The concept inventory is an amazing tool, but the images in it are of poor quality (think '90s computer art). I decided this year to update many of them, as well as tweak some of the wording. This means I won't necessarily be able to compare with other teachers (since they are using the old version), but I'll still be able to look at my pre and post scores (I haven't changed any of the actual problems, just some of the wording within each). Here are a couple of the images I updated (I have blurred out the words, since I don't want students to be able to "prepare" for the "test"). If you have the old version, you'll probably see what I'm talking about in terms of clarity of images.

image showing circular paths

How close are you? The “Why” behind Percent Error

Photo of flasks with different colors

Photo of two men playing horseshoes

There's an old saying, "Close only counts in horseshoes and hand grenades." People would say this in response to someone saying "Well, I was close." So, why does close count in horseshoes and hand grenades, and why is this the introduction to a science paper?

Horseshoes: If you’ve never played before (I bet most of you haven’t), the goal is to throw a horseshoe towards a pipe in the ground and have the shoe end up around the pipe. You score if the shoe is around the pipe, but you also score if your horseshoe is closer than one horseshoe away. Thus, close counts in the game of horseshoes.

I think we can all understand how close could count with a hand grenade.

“Close” in science

In science, you may run an experiment in which you make measurements and are able to compare that to some "true" value. This tends to be one of two types:

Bouncing Ball Lab Introduces Models and Foreshadows Future Physics Concepts

One photo of a group's whiteboard

I use this lab as an introduction to my physics class at all levels: First year and AP. The goal of the lab is to introduce students to the role of physics in making predictions, and for me to see where they have strengths and weaknesses in lab procedures and data analysis.

The Challenge

Students are presented with a variety of balls and a meterstick. They are challenged to develop a method of determining how high their ball will bounce if dropped from 1.5 meters and from 5 meters.

collection of various balls

Experimental Design

  1. Students spend about ten minutes with their lab group discussing their design.
  2. After this, a class-wide discussion helps generate:
    • "Rules" around what is fair (e.g. you can't measure the height of a table, then place the meterstick on the table to get 1.5 meters–but congratulate any group that develops this or a similar "illegal" procedure as being creative!)
    • Some class-wide practices (we’ll all drop on the floor so everyone has a similar bounce surface).
    • Demonstration that there are different ways to approach the challenge.

Goals of discussion

In the class discussion, the goals are:

  • Students define the “Independent variable” as the drop height, and the “Dependent variable” as the bounce height.
  • Develop a “best practice” of collecting ten data points to analyze:
    • Some groups will say “bounce it from 1 meter and from 0.5 meters, then add them.” I usually show two points on a graph, then draw a straight line through them and a parabola through them, and ask which trend is the best predictor.
    • Students may suggest using camera phones to record the height of the bounce—work out with your students what you will accept, but be open to creativity.
    • Here is where you can be more "inquiry" or "guided inquiry": e.g. do you want to specify up front that you want many (10?) data points and/or several trials for each height (3?), or have that come out as groups share their experimental designs on their whiteboards? I prefer the "best practice" of three trials for each of ten initial conditions.
  • It is important to bring up in the discussion where on the ball each group is going to measure from. Some may want to measure from the top of the ball, some from the bottom. It is best not to force either choice upon them, but to let each group determine and record where they will be measuring from. The location of the measurement will not affect the slope of their graph, but it will affect the y-intercept, and letting different groups use different measuring points allows this to be revealed in the whiteboard meeting. (Hint: If you measure from the top of the ball, how high will the ball bounce if you drop the ball from zero…and can you drop the ball from zero?)
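
A quick bit of algebra behind that hint: if a group measuring from the bottom of the ball finds bounce height b = k × h for drop height h, then a group measuring every height from the top of the ball is really using b' = b + d and h' = h + d, where d is the ball's diameter. Substituting gives

b' = k(h' − d) + d = k·h' + (1 − k)·d

so the slope is still k, but since k < 1 the y-intercept (1 − k)·d comes out positive.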

General experimental design

  • Most students will develop an experiment that involves dropping the ball from 1.0 meters, 0.9 meters, 0.8, etc.
  • Three trials at each drop height improves the reliability of each measurement.
  • Students are told that the accuracy of their prediction will be part of their grade, so they should be careful with their measurements!

Data analysis

One photo of a group's whiteboard

After conducting their experiment, each group analyzes their data by plotting a graph, then presents their analysis to the class on a whiteboard. Results include:

  • Shape of graph.
  • Slope of graph.
  • Y-intercept.

In the “Board meeting,” the class develops the meaning of each of these concepts.

Each group then determines a predicted bounce height. In determining their bounce height, students are encouraged to use both their graph (find 1.5 m on the horizontal axis, go up to the line, then go over to the vertical axis) and their equation (input 1.5 and find the answer).
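
If you want a quick way to check a group's equation and prediction yourself, here is a minimal sketch using numpy in place of the TI regression (the drop/bounce numbers below are made up for illustration, not a real group's results):

```python
# Minimal sketch of the "equation" route: fit bounce height vs. drop height,
# then plug in 1.5 m. The data are hypothetical.
import numpy as np

drop_height = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0])              # meters
bounce_height = np.array([0.06, 0.13, 0.18, 0.25, 0.31, 0.37, 0.44, 0.49, 0.56, 0.62])  # meters

slope, intercept = np.polyfit(drop_height, bounce_height, 1)  # linear regression
predicted = slope * 1.5 + intercept                           # "input 1.5 and find the answer"

print(f"bounce = {slope:.2f} * drop + {intercept:.3f}")
print(f"Predicted bounce from 1.5 m: {predicted:.2f} m")
```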

Once we finish the discussion, we are ready for the testing!

Another photo of a whiteboard

Graphing

Here are a couple of computer-generated graphs. The one on the left shows one group's results, while the one on the right shows the slope of each type of ball (I have students submit their slope and ball type online as homework, so I can put the results from all my classes on the same graph).

graph showing one group's results
graph comparing ball type with slope

Testing:

1.5 meter challenge

  • I have two metersticks taped together, and each group brings up their ball for the test.
  • Students find that their prediction matches their results, usually within 10%. (Depending on the level of your class, you can have them calculate a “percent error” or simply have them calculate their prediction as a percent of their actual.)
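
If you go the percent error route, one common form (there are variations) is

percent error = |predicted − actual| / actual × 100%

so a hypothetical 0.57 m prediction against a 0.60 m measured bounce would come out to 0.03 / 0.60 × 100% = 5%.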

5 meter challenge

  • We then go out to our courtyard, which has a stairwell where students line up to observe the height of the bounce. This presents an opportunity to discuss parallax (whose eyes are most "lined up" with the bounce height). Finding a good location for your five meter challenge is important: you need a spot where students can line up along the stairs so at least a few are near the height the ball will rise to.
  • I hang a tape measure with every meter marked off with blue tape, and every half-meter marked off with another color. This allows them to see where they line up on the tape, as they are likely to be too far away from it to be able to read the numbers.
  • About half the groups find their prediction is not close to their results (their prediction is too high).
  • Generally, it’s the balls with low density that have the worst results (usually a student notices this in the discussion, but be ready to prompt it if it does not come out).

Concluding discussion

  • The first question I ask when we have our concluding discussion is “Would it be fair for me to grade you based on the 1.5 m drop?” The class generally says “Yes!” since they all got A’s on it.
  • My next question is "Would it be fair to grade you on the 5 meter drop?" The answer is a resounding "No."
  • I inform them that their accuracy grade will only be based on the 1.5 m drop, and that those who got 90%+ accuracy get an A.

Moral of the story: Physics works.

  • We then address the meaning of the slope and y-intercept:
    • Since balls that “bounce high” have high slopes, we define the slope as the “bounciness” of the ball.
    • The y-intercept is “how high the ball will bounce if dropped from 0 m.”
    • This leads to a discussion of experimental design: most students get a y-intercept close to zero, but some get a significant positive y-intercept. Usually, on examination of their procedures, it becomes clear that they were measuring from the top of their ball, and thus it was physically impossible to drop from 0.0 m–but there is still a mathematical y-intercept.
  • We then discuss why the five meter challenge wasn’t successful. Students bring up different ideas:
    • The floor outside is different (one year, after this variable was raised, I had a group go back out and test the floor and determine that this was not a problem).
    • Air resistance must be playing a role.
    • More energy is lost when the ball falls further.
  • Most of these are left as "we'll come back and look at this lab when we address these concepts later in the semester," but some can be resolved right there. I had one group who went out with their ball and a meterstick to check whether the floor was different, and proudly came back to say it was not.

Pedagogy

What I value most about this lab:

  • It starts physics by presenting students with a challenge that they can successfully complete.
  • It lets me see who is having difficulty with graphing before I get to the “real physics.”
  • It introduces data analysis without introducing new physics concepts. It allows students to use their own language, which sometimes allows me to say "Many terms often mean something different in physics than in everyday language. Let's leave the physics terms out of our discussion until we have defined them in class."
  • It creates a foundational experience that we can return to throughout the year.

Acknowledgements

I would like to thank the American Modeling Teachers Association and Arizona State University for their work in designing and training teachers in the use of a rich curriculum that develops thinkers rather than memorizers.

Frictional force on a block on a ramp that has a wall holding it up

Ramp with a block on it and a wall holding the block from sliding
Here is the drawing of the problem.

Today in class, students were asked to draw a force diagram for the drawing shown on the right. The drawing shows a block on a ramp with a “wall” holding it back. The question that we couldn’t answer easily is,

'Does the friction between the ramp's surface and the block need to be accounted for in the force diagram?'

Then, we can take this a step further, and ask

‘Does the frictional force between the wall and the block decrease the normal force provided by the ramp?’

We couldn’t decide, so this afternoon I created a quick test to see if I could answer it. Here is my experimental design:

Book on ramp with force sensor holding it from sliding.

I started with just the book on the metal ramp. I placed the book against the force sensor, then tapped a bunch on the ramp above the book. I then collected data for ten seconds and recorded the mean of all the data collected. I repeated this three times. Here are the results:

  • The ramp is 73°.
  • Mass of book: 1.85 kg

Book on metal track

            Trial 1   Trial 2   Trial 3   Average
Mean (N)     4.370     5.001     5.275     4.882
St Dev (N)   0.0773    0.0695    0.0410
Reading on LabQuest. The numbers are negative because it is pushing on the sensor. Range is only 0.05 N.

Book on sandpaper on track

            Trial 1   Trial 2   Trial 3   Average
Mean (N)     2.541     2.821     2.796     2.719
St Dev (N)   0.0306    0.0129    0.0073

Other results

I tried measuring the force of static friction between the book and the ramp when the ramp was horizontal. My results weren't very consistent, but I got 3.06 N, 3.96 N, and 3.47 N.

Conclusion

The force exerted by the force sensor (acting as the "wall" in the problem) was significantly different when comparing the metal surface to the sandpaper, and both values are less than the "x component" of the force of gravity ("x" being parallel to the ramp), which should be 5.3 N:
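
(As a check on that 5.3 N figure: if the quoted 73° is taken as measured from the vertical, i.e. an incline of roughly 17° above the horizontal, the component of gravity parallel to the ramp is mg·cos(73°) = 1.85 kg × 9.8 m/s² × 0.292 ≈ 5.3 N. Taking that value, the implied friction force along the ramp is roughly 5.3 − 4.88 ≈ 0.4 N on the bare metal track and 5.3 − 2.72 ≈ 2.6 N on the sandpaper.)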

So, while the problem may be assuming you can ignore the friction force between the ramp and the block, I conclude that this is not wise unless the problem specifically states to ignore friction.

Other thoughts

I noticed that on every trial, the magnitude of the force decreased during the ten seconds I collected data (the LabQuest screen above shows negative numbers, so the force is getting less negative).

Correlations and Scatter Plots

After a couple of hand-graphs, my students use their TI 83/84 calculator for graphing. We set the calculators so they give an “R squared” correlation, which we often use to help determine which regression type (linear, quadratic, power) fits the data the best. Many of my students have a hard time understanding the nature of the correlation, and I have developed a presentation that I hope will help them.
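
For anyone curious what the comparison looks like outside the calculator, here is a rough sketch in Python with numpy standing in for the TI (the data points are hypothetical):

```python
# Sketch: compare how well a linear vs. quadratic regression fits the same
# data by computing R^2 for each. The data points are hypothetical.
import numpy as np

t = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
p = np.array([0.02, 0.13, 0.51, 1.10, 2.05, 3.10, 4.55])  # roughly quadratic

def r_squared(x, y, degree):
    """Coefficient of determination for a polynomial fit of the given degree."""
    coeffs = np.polyfit(x, y, degree)
    residuals = y - np.polyval(coeffs, x)
    ss_res = np.sum(residuals ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return 1 - ss_res / ss_tot

print(f"Linear fit R^2:    {r_squared(t, p, 1):.4f}")
print(f"Quadratic fit R^2: {r_squared(t, p, 2):.4f}")  # closer to 1 for this data
```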

The slideshow above advances every ten seconds, but feel free to click the link above to give yourself control over the timing.

Next Generation Science Standards, second draft published

Next Generation Science Standards

The second draft of the Next Generation Science Standards was released this week. You can find them all here:

www.nextgenscience.org/next-generation-science-standards

You can search them here:

www.nextgenscience.org/search-standards-dci

I have exported pages containing all of the DCI-arranged standards (public release, high school only).

Grading labs: A faster method using pre-printed return address labels.

lab grading label

Grading tends to be the most time-consuming task for teachers. For those teachers who want students to complete lab write-ups that truly reflect discovery and learning, providing feedback on labs can be an even more daunting task. This year I started using a system that has significantly sped up grading the mechanics of the lab, allowing me to spend more time on their analysis. I use address labels with seven check boxes that let me quickly look over a lab and see what parts are missing. Then I'm able to go in and make more detailed comments about the sections I want to focus on.

lab grading label
Lab grading label (return address size)

I use Modeling Instruction (this is an older site; AMTA will be launching an updated site in the next week or so) in my class, which focuses on analyzing data to develop models and reach conclusions. In most labs students plot data to look for trends, then develop the equation of the line/curve. As you can see above, my checklist focuses on their data organization and plotting skills. Again, this frees me up to spend my time on their analysis and conclusions.

As I grade, I check off each item that I see completed, and circle items the student hasn't included (e.g. those dreaded missing units). Then I peel off the label and stick it on their report. The label is only an abbreviation, and needs to be backed up with more detailed expectations for the checklist items. I provide students with a longer "Lab grading guidelines" page that they can refer to. My latest version is shown below:

lab grading guidelines

Copies of Word documents for both of these can be found here (both are .docx format):

Modeling Instruction: Review of acceleration lab

I've been using Modeling Instruction in my physics class for the past five years, and keep wondering how to handle the problem of students who miss the whiteboard sessions (where students share their results and we reach class conclusions). These sessions are critical for student growth, as this is where students are challenged to look for patterns in their data, reflect on them, and compare them with other groups' results.

I have created a video PDF file that walks students through the process we would do as a class, which you can find here (PDF format, 3.9 MB). This is an animated PDF file, with voice-over as I write. You can move forward or backward using the navigator buttons, or click on any section of the document to jump to that place in the timeline.

This being my first attempt at such a video, I would appreciate people's feedback.

A bit of background on my class graphing methodology

  1. I don't 'linearize' data (a method where students graph p vs. t^2 to get a linear relationship).
  2. I start with hand graphing constant velocity cars, then students use their calculators to conduct a linear regression on the data. They generally like the calculator option because it's so much faster than hand graphing. (All our students have TI calculators, so we use these instead of Excel. I have a handout that walks them through the process; you can download it here.)
  3. When we graph the data from the accelerating wheel on a ramp, students notice that the relationship is not linear (and they had predicted this from the pre-lab observations). At this point I introduce Quadratic Regression on their calculators, which presents them with y=ax^2+bx+c as the solution. They can determine that “c” is the initial position (plug in t = 0, and y=c), but they do not know what “a” and “b” represent. To solve for that, we move to creating a velocity/time graph:
  4. Using the "Draw tangent" feature on their calculator, students find the velocity at a series of increasing times (and notice that their velocity values are increasing, again as they predicted).
  5. Finally, with a velocity/time graph, we see a linear relationship, and can define the slope as acceleration and the y-intercept as the initial velocity.
  6. Their final step is to determine what "a" and "b" in the position/time graph equation represent. You'll have to watch the video to get the answer (or at least to see how we arrive at what you may already know: that "a" = 0.5 × acceleration and "b" = vi). A quick numerical check of this appears after this list.
  7. I use "p" for position in my equations and graphs. My students seem to have too much trouble with "x" for position, especially when we plot it on the "y" axis. (When we get to two-dimensional motion, I introduce "x" and "y" as positions.)
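
Here is that quick numerical check (simulated data, with numpy's quadratic regression standing in for the TI; the acceleration and initial velocity values are assumed purely for illustration):

```python
# Simulate position/time data for constant acceleration, then run a quadratic
# regression to show that the fit gives a ≈ 0.5*acceleration and b ≈ v_i.
import numpy as np

np.random.seed(0)
accel = 2.0   # m/s^2 (assumed)
v_i = 0.5     # m/s   (assumed)
p_i = 0.1     # m     (assumed)

t = np.linspace(0, 3, 16)
p = p_i + v_i * t + 0.5 * accel * t ** 2
p_measured = p + np.random.normal(0, 0.01, t.size)  # add a little measurement noise

a, b, c = np.polyfit(t, p_measured, 2)  # fit p = a*t^2 + b*t + c
print(f"a = {a:.3f}   (0.5 * acceleration = {0.5 * accel:.3f})")
print(f"b = {b:.3f}   (v_i = {v_i:.3f})")
print(f"c = {c:.3f}   (initial position = {p_i:.3f})")
```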

Hardware/software

I'm using a Livescribe Echo Smartpen and the Livescribe Desktop software to generate the file. The pen allows you to write with normal ink and record all your pen strokes while simultaneously recording your voice. I don't like that it doesn't allow different colors, but it's fairly simple to use, and doesn't take a lot of post-production work (just save as PDF and the file is created). I also don't like that it appears not to allow landscape orientation, which would display much nicer on monitors/tablets/etc.

This hardware does not project live, so it’s not something that lends itself to use during class, but it seems to give decent results in just a short amount of time outside of class. This video runs about 15 minutes, and it probably took me 30-45 minutes to create and publish.

There are multiple ways of saving/exporting the final products, and I selected PDF because it is so widely readable, regardless of Mac/Windows/iThing/etc.