Since we have covered the three NGSS Units that we are covering this year, for the remainder of the year we will be doing supplemental activities that may or may not be structured like typical NGSS assessments.
I will be posting these agendas here in this one post with a brief description of the aim for the assignments, updating as we go.
Week 26 Marine Plastics and Plastic Chemistry Week
- For this series of assignments I wanted to continue on from their capstone projects, looking at resource consumption. I also wanted to bring back in some chemistry as a review for the CAST and for the students to refresh on for high school science.
Week 27 Finishing Plastics and Video Project
- For this assignment I wanted them to sort of review material from the year or work on answering a science question that interested them, but was never covered in class. So the aim is both to give them support in researching an interesting topic and somewhat indirectly review the material from the year.
Week 28 Finishing Video Project
Week 29 CAST Review
Week 30 CAST Testing and Bio Activity
- For this week I am bringing in some biology material with a simple engineering challenge. The students will be designing beaks that could be used to pick up rice and beans.
Week 31 Bio Activity
- This week they'll be testing out their beak designs and then we will move onto an inquiry topic about the structure of DNA.
Week 32 Bio Activity and Random
- This week the students will finish with DNA, and then there are some random activities and challenges built in. The schedule can be somewhat unpredictable and the students will not have iPads this week.
On this blog you can find examples, agendas, and reflections on transitioning fully to the NGSS 8th grade integrated science standards.
Wednesday, April 26, 2017
Monday, April 17, 2017
Week 25 Agenda
Monday, April 3, 2017
Week 23 and 24 Agendas and Assessment 4.3 Overview
Below you can find links to the Week 23 and 24 Agendas. This will be updated as the week passes and at the end of the week a reflection will be shared in a different post.
Week 23 Agenda
Week 24 Agenda
Also, as a general resource, this is what my preliminary plan for this whole assessment will look like. (Things will certainly change as I see what my students need extra support with, but I find it really helpful to have a very general idea of how I might want to approach the whole task.)
Reflecting on Unit 4 Assessment 1
This assessment was somewhat influenced by my students' practice with the example CAST questions online. Beyond the obvious need to incorporate more math (something I am a bit unsure where to incorporate, as the SEP for mathematical and computational thinking is not really covered much in the standards about motion and force, but is tested using that standard in the practice questions), I felt like my students needed more practice analyzing given data and interpreting models or diagrams that were given to them, forcing them to make assumptions about the data. This assessment heavily looked at those two skills.
This unit was somewhat tricky for me. I am used to incorporating light boxes, lenses, and mirrors into any unit on waves, but our school doesn't have any supplies for that, so I approached it differently. This made the unit less interactive than I would have liked. Getting something for the students to interact with in this assessment is something we need to factor into the budget.
Introduction
We started with a Pear Deck about how we can use wave interactions to understand the layers of the Earth and, as part of that, introduced the types of waves. This allowed me to bring in more Earth Science (at the Earth, rather than space, level) to this unit. One of my goals is to have every phenomenon and introduction tie in another science domain.
Part 1
There is really not much to say about part one: it was designed to introduce the key vocabulary and basic concepts for the unit, and it did that. It wasn't particularly interesting, but it was functional.
Part 2
For part two the students worked with provided data about tsunamis, with our focus in this part being on ocean waves. As part of this I used Google MyMaps to plot out the locations, which worked well and was a familiar interface for the students, so they had no trouble using it. The students had to use the provided data to create a graph or other visual representation and form a claim based on that data. The open ended nature of the graphs allowed me to assess which students better understood how to represent data in a graph.
Some examples of their graphs and other representations are presented below. Although they were all different, there was value in each one.
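As a side note for anyone recreating the tsunami data analysis: tsunamis travel as shallow-water waves, so their open-ocean speed depends only on water depth, v = sqrt(g·d). This relationship was not part of the assessment itself, but a quick sketch of it (with an illustrative depth) can give students a striking number to anchor their graphs:

```python
import math

g = 9.81  # gravitational acceleration, m/s^2

def tsunami_speed(depth_m):
    """Shallow-water wave speed v = sqrt(g * d), in m/s."""
    return math.sqrt(g * depth_m)

# Illustrative open-ocean depth of 4000 m (not from the assessment data):
v = tsunami_speed(4000)
print(round(v), "m/s")         # roughly 198 m/s
print(round(v * 3.6), "km/h")  # roughly 713 km/h, jet-airliner speed
```

That "as fast as a jet" comparison tends to land well with 8th graders when discussing why warning time matters.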
Part 3
In part 3 the students were presented with about 7 models and a data table to complete looking at the waves of the electromagnetic spectrum and how they are absorbed, reflected, or transmitted through the layers of the atmosphere. This one was, obviously, focused on light waves.
This part was a struggle for many of the students. The models are not always specific, and it is not as if one picture gave them all the content for a single row or column of the table: they had to put it all together and make assumptions. For their conclusion they discussed how the interaction of EM waves and the atmosphere supports life on Earth (bringing in a bit more life science and Earth science). Their answers were great and varied, from the protection from gamma and UV rays, to the greenhouse effect, to visible light.
Part 4
Part 4 was the part of the assessment I was most worried about going in. They were looking at one of a few scenarios and discussing how soundproofing could be used in that situation. This part of the assessment focused on sound.
I gave them three options to look at: sound walls by a freeway, soundproofing in the band room, and soundproofing a garage for drum practice. The students researched materials, discussed benefits and costs, and wrote up a formal proposal. The biggest issue was that, while I want how they present the material to be up to them, I need to add more structure in specifying the expectations.
Overall
I thought the assessment went well, though part 3 was rather challenging for some students. I liked having the focus on the different parts being about different categories of waves. I would like to bring in an interactive element, which could go in part 1, though part 1 still needs to have them working on constructing a basic model of a wave, so it would have to allow for that as well.
Sunday, March 19, 2017
Grading assessments as a whole or as parts?
One of the key changes that has gone along with our 4 part assessment model is the move to a sort of modified standards- or mastery-based grading on a 4 point scale. At an approximate level this means that a 4 is an A, 3 is a B, 2 is a C, 1 is a D, and below that is an F (though the actual comparison varies slightly). The key effect of this is that passing becomes a much more achievable goal for a struggling student.
When discussing grades with struggling students we can discuss an achievable goal to get them to reach a passing grade and demonstrate understanding. On the traditional 100% scale you would have students who would work and work to try to raise their grade, but seem stuck below that 60% mark. If I have a student who wants to put in the effort and show their understanding I want to have a grade system to let them do that.
As part of this change students get one score per assessment, based on how far they demonstrated proficiency. So a student who mastered the first two parts of the assessment, but not the last two, would score a 2/4 on the assessment. A student who mastered all parts would score a 4. This system worked fine, but I felt like my students (and parents) didn't really get where that grade was coming from. It is our first year making this change to NGSS, this assessment model, and mastery grading. So I began semester 2 differently.
For the second semester, with the aim of improving understanding of the grading system, I moved to giving a score for each part. If a student received mastery on the first and second parts of the assessment, but none of the others, they scored a 4/4 on part 1, a 4/4 on part 2, and zeros on the next two. Ultimately, they scored an 8/16 on the combined assessment grades giving them the same percent of points that they would have had the first semester. I felt like this did help clarify the grading system, and now that it has done that I am happily changing my grades back to what I did semester one, the combined assessment grade.
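For anyone curious how the two bookkeeping approaches line up, here is a minimal sketch (the variable names are my own, not from any gradebook software) showing that per-part scores and the combined score work out to the same percentage:

```python
# Hypothetical example: a student who mastered parts 1 and 2 only.
# 4 = mastery on that part, 0 = not yet demonstrated.
part_scores = [4, 4, 0, 0]

# Semester 2 style: one gradebook entry per part, each out of 4 points.
separate_percent = sum(part_scores) / (4 * len(part_scores))  # 8/16

# Semester 1 style: one combined entry, 1 point per part mastered.
combined_score = sum(1 for s in part_scores if s == 4)  # 2 out of 4
combined_percent = combined_score / 4

print(separate_percent, combined_percent)  # both 0.5
```

Same percentage either way; the only real difference is how many entries have to be maintained and revised.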
There are a few reasons why I like this combined grade:
1) It makes student understanding on the assessment and standard more obvious. A student scoring 3.5/4 on the whole assessment is a more succinct measurement than having to consider a number of parts.
2) Not every student is able to master the whole assessment. Yes, I am there to support them and try to guide them to understanding, but the fourth part of the assessment can be rather difficult. Having a zero (or a missing assignment) in the gradebook puts a lot of focus on an assignment, rather than assessing overall understanding.
3) Finally, it is way more time consuming entering each individual score and keeping those updated with revisions and work that comes in late than just managing one score. I honestly didn't expect it to be so much more time consuming on my end, but it has been.
I don't regret having the separate scores for a while, I feel like my students now have a better idea of how the assessment score comes together, but I am sure glad to go back to the combined score.
Thursday, March 16, 2017
Week 22 Agenda
Sunday, March 12, 2017
Reflecting on Unit 2 Assessment 3
This assessment ended up being both easy and hard for me to implement. I did almost the same set of activities last year when first starting to implement NGSS standards and practices, which made it familiar. At the same time, I have evolved a lot in how I implement the standards and practices and it was hard to not fall back to my previous ways.
The reason I find this demo (more on it under Part 2 below) really valuable, other than the fact that it is easy to take cool pictures through your phone with the diffraction grating, is that so much of space is distant. You can show them pictures, even have them interact with websites to explore exoplanets, but it is all distant. This is a real application. Plus it is cool; everyone should look at emission spectra.
We started this unit with an article about a planet with three suns. We used Verso, which allows anonymous discussion, to post what we thought would be different if we lived on that planet. There was a lot of great discussion, before we covered the material in class, on how temperatures, seasons, eclipses, and more would be different. I actually used the same article last year, but in that case it was our final application project where they used the knowledge they had developed to discuss what would be different.
For the assessment itself there were, as always, four parts, which I will be discussing below. At the end of our implementation of the unit there were some major changes we made to the structure of the assessment, and I will add some comments about those.
Part 1
In this part of the assessment they gathered information about an object in the solar system and added it to a set of class slides. The goal is to look at the scale of objects in the solar system. I did the same activity last year; both times the students were randomly given a name as they walked in and had to find the partner assigned the same name, a pairing mechanism that really works well for this low impact activity. I was happy with that part of the task, but at the same time it was too easy as an entire task. I want part 1 to be accessible to all students, but there needs to be more thought than just research to the task. In the end, on reflection, we moved Part 4 to join Part 1.
Part 2
In this part of the assessment the aim was to look at how we use technology to research space. This task is also nice because it starts to bring in light and waves, which are discussed later in the year. This is also a point where I brought in some whole class activities (more on that in a sec). In the task the students picked some space mission (Cassini, Hubble, Curiosity,...) and discussed who was involved, when it occurred, the technology involved, and the major discoveries. They used this information to make a poster (on paper, PicCollage, some even used Slides) and present the information to their group.
As for the whole class activity, one of my favorite topics in space is how we know the composition of the sun using light. I begin with some quick slides discussing subtractive vs additive color and how we perceive color. As part of that I have three colored bulbs (red, green, blue) that I can use to show how, when light combines, we perceive a different color not both or all three separately. Then we look at emission spectra and discuss how the wavelengths emitted help us identify different elements and looking at it all together helps us understand what something like a star is made of.
Examples of Spectra
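Since the demo hinges on specific wavelengths identifying elements, the wavelength-frequency relation c = λf is worth a quick worked example. The 656 nm line below is hydrogen's well-known red Balmer-alpha emission line; this calculation is a supplement of mine, not part of the assessment:

```python
c = 3.0e8  # speed of light in a vacuum, m/s

def frequency(wavelength_m):
    """Frequency of a light wave from its wavelength: f = c / wavelength."""
    return c / wavelength_m

# Hydrogen's red emission line (Balmer-alpha) at about 656 nm:
f = frequency(656e-9)
print(f"{f:.2e} Hz")  # about 4.57e14 Hz
```

Numbers like 10^14 Hz also make a nice scale discussion alongside the radio and gamma ends of the EM spectrum table from Part 3.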
Overall I thought this part of the assessment went well. I was happy with the student products and the students seemed to enjoy being able to pick whatever mission they wanted.
Part 3
In this part of the assessment we tackled the Sun, Earth, Moon models. If you haven't looked at the evidence statements for this standard, it is a beast (and that is not even including the seasons model that we tackled in assessment 4.2).
We approached this model over three days and broke down the information as shown below:
Day 1 - Basic Model (If they look at the simulations from the previous assessment this can help them think about this basic model.)
- The Sun
- The Earth
- The Moon
- The Orbital Plane for Each
- The Orbit of Each
- The Tilt of Earth's Axis
- The Distance from the Earth to the Sun and from the Moon to the Earth
- The Time it Takes for the Earth to Rotate and Revolve
- The Time it Takes for the Moon to Revolve
Day 2 - Phases Model (The teacher may want to create a real model of phases with a ball or balloon with one side colored black and have them think about how much they can see as being illuminated as it revolves around them.)
- The Sun
- The Earth
- The Moon
- The Path Light Travels
- The Names of the Phases
- The Amount of the Moon Illuminated by the Sun
- Where, As The Moon Orbits, Each Phase Is
- What Arrangement of The Sun, Earth, and Moon Causes Lunar Eclipses
- What Arrangement of The Sun, Earth, and Moon Causes Solar Eclipses
- How Much of the Earth Sees Each Type Of Eclipse
It was a lot to tackle. There were some things that worked well. We did a picture sort before starting the phases model, which elicited some great group discussions and debates. I let the students choose how they made their model, some drew on paper, some on the iPad, and some used apps like PicCollage to combine actual pictures, with drawings, text, and more to make their models. But this task was a beast, and that doesn't even include the ending paragraph where they need to describe which phases we see during which type of eclipse and explain why we don't have an eclipse twice each month. Plus, some students take a long time working on making their models look super cool, which is great, but makes the assessment drag.
As such we changed this part of the assessment to be broken down over Parts 3 and 4.
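For teachers wanting a numerical check to pair with the phases model, the illuminated fraction of the Moon's visible disk follows from the Sun-Earth-Moon geometry. This is the standard phase-angle formula, not something from the assessment itself:

```python
import math

def illuminated_fraction(phase_angle_deg):
    """Fraction of the Moon's visible disk that is lit,
    where 0 deg = new moon and 180 deg = full moon."""
    return (1 - math.cos(math.radians(phase_angle_deg))) / 2

for angle, name in [(0, "new"), (90, "first quarter"), (180, "full")]:
    print(name, round(illuminated_fraction(angle), 2))
# new 0.0, first quarter 0.5, full 1.0
```

The quarter-phase result (exactly half lit at a 90 degree angle) is a good sanity check against the ball-and-balloon demonstration mentioned above.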
Part 4
In this last part of the assessment the students used Actively Learn (thank you Core teachers for giving me ideas on other ways to use this site) to write a three paragraph essay discussing one planet that we could use for habitation with technology and one that we couldn't. They returned to their data from the slides to do this. To set up the assignment on Actively Learn I uploaded directions for each part and then inserted questions for each paragraph. This let me give them quick feedback as they worked. Plus, having it broken down into paragraph tasks made the overall task less imposing. I loved this task, but it didn't really fit in Part 4 and thus it moved to join Part 1.
Overall
Overall I was happy with how the assessment turned out, but as I said there were changes. One of the big things that didn't work how I wanted was the models. Because the task was so overwhelming in complexity, it was hard to implement revision and reflection (a key part of modeling). As a result, for many students their models reflected what they found online more than their own individual thoughts.
Week 21 Agenda and Assessment 4.2 Overview
Below you can find a link to the Week 21 Agenda. This will be updated as the week passes and at the end of the week a reflection will be shared in a different post.
Week 21 Agenda
Also, as a general resource, this is what my preliminary plan for this whole assessment will look like. (Things will certainly change as I see what my students need extra support with, but I find it really helpful to have a very general idea of how I might want to approach the whole task.)
Additionally, if you visit the page linked below it will take you to a list of links showing which agenda corresponds to which assessment.
Wednesday, March 1, 2017
Week 20 Agenda
Sunday, February 26, 2017
Reflecting on Unit 2 Assessment 2
This assessment posed a challenge. It is about the universe and gravity, and while there are some physical activities that can be done, most of the activities must be theoretical to an extent...which made the assessment drag.
Don't get me wrong, I feel like my students ended up with a good understanding of what I wanted them to know, but in the day to day there was a lot of just sitting at desks and completing assignments using simulations. I felt like the students took longer than needed to finish the parts because they weren't engaged in the same way they were with the previous assessment.
To look at the parts individually:
Part 1:
In this part of the assessment the students used a PhET simulation that related mass and force. The two main points the simulation showed were that when mass increases, force increases, and that the force is equal between the two objects even if one is more massive than the other. The simulation is great and to the point, but it was a little hard for some students to apply the information from it to the task, making this a hard first part of the assessment for them. That being said, with feedback, and with me readjusting my expectations given the difficult start, the task became one that all could accomplish. (I will also say my evaluation of this part of the assessment is a little biased because I was out sick when they started the task. I think they would have had a clearer idea of what was expected if I were there. Though that does tell me that the instructions may need to give them more initial support, such as an example of what a model of the forces between a planet and its moon might look like.)
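An example model like the one I mention could lean on Newton's law of gravitation, which is presumably what the simulation implements and which makes the "equal forces on unequal masses" point concrete. The planet and moon values below are made up for illustration, not taken from the simulation:

```python
G = 6.674e-11  # gravitational constant, N*m^2/kg^2

def gravitational_force(m1, m2, distance_m):
    """Newton's law of gravitation: F = G * m1 * m2 / r^2.
    The same magnitude acts on BOTH objects, however unequal their masses."""
    return G * m1 * m2 / distance_m**2

# Hypothetical planet and moon (illustrative numbers only):
planet_mass, moon_mass, r = 6.0e24, 7.0e22, 4.0e8
f_on_moon = gravitational_force(planet_mass, moon_mass, r)
f_on_planet = gravitational_force(moon_mass, planet_mass, r)
print(f_on_moon == f_on_planet)  # True: the order of the masses doesn't matter
```

The fact that swapping m1 and m2 changes nothing is the algebraic version of the counterintuitive idea students wrestle with: the planet pulls the moon exactly as hard as the moon pulls the planet.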
Part 2:
This was one of my favorite parts of the assessment, while also being the one I would change for next year. This part of the assessment used an interactive website from the Smithsonian. It is great for showing scale in the solar system and beyond. We had the students use this website to answer questions on a worksheet. Next year, since I think the students need more out of their seats time, I would structure this part of the assessment differently. I would likely still begin with the website; it is easy, interactive, and supplies information. Then I would challenge the students to make their own model of the solar system. This model could be a physical thing, like a diorama, but it could also have them calculating distances, spreading their friends across the field as planets, and taking a picture. Something creative or interactive. I would also like to tie the idea of gravity into this part of the assessment specifically, by discussing how gravity affects things even when they are so hugely spread out.
Part 3:
This part of the assessment used a PhET simulation that was thankfully translated recently into HTML5 and so suitable for use on the iPad. The simulation looks at gravity and orbits of a few different systems. The students used the simulation to answer questions in a Google Slideshow, using screenshots of the simulation to support their thoughts. The simulation worked well, the students showed the understanding I wanted, and it was all great. (As a note, there is a question that uses the term 'hierarchy'; this was difficult for some of the students, but a quick explanation of what hierarchy meant there made the question accessible for any who needed help.)
Part 4:
The last part of the assessment had the students explaining how gravity keeps objects (such as satellites and the space station) in orbit or how gravitational assist works. While I liked this part of the assessment, it was difficult for many students. I have to say, though, I give my students a lot of credit. The vast majority of my students always try to reach and conquer the last part of the assessment, and this part 4 was no exception to that rule. However, it was difficult for many to really describe how gravity kept an object in orbit; they had trouble applying the information from part 3. The biggest challenge was addressing that it was gravity pulling it down plus the forward motion that kept it in orbit.
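The "gravity pulling it down plus forward motion" idea can also be quantified: an object stays in a circular orbit when gravity supplies exactly the centripetal force, giving v = sqrt(GM/r). The sketch below uses standard physical constants and the ISS's rough altitude; it is a supplement of mine, not part of the assessment:

```python
import math

G = 6.674e-11        # gravitational constant, N*m^2/kg^2
M_EARTH = 5.972e24   # mass of Earth, kg
R_EARTH = 6.371e6    # radius of Earth, m

def orbital_speed(altitude_m):
    """Speed for a circular orbit: gravity supplies exactly the
    centripetal force, so v = sqrt(G * M / r)."""
    r = R_EARTH + altitude_m
    return math.sqrt(G * M_EARTH / r)

# The ISS orbits at roughly 400 km up:
v = orbital_speed(400e3)
print(round(v / 1000, 1), "km/s")  # about 7.7 km/s
```

That 7.7 km/s figure is a nice hook: the station is falling the whole time, it is just moving sideways fast enough to keep missing the Earth.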
Overall:
I feel like this reflection sounds more critical of this part of the unit than it should. It was overall great; it used simulations and resources to make real something that is often abstract, and for the most part it succeeded. But the parts were similar in their expectations of student interaction with the material (play with the simulation, complete the worksheet/assignment...play with the simulation, complete the worksheet/assignment...). Some of that is likely my own fault for not bringing in more interactive elements to enhance the assessment and go beyond. Though not intended as an excuse, being out sick and having not much of a voice did influence how much I was able to discuss beyond the assessment requirements. I only note that because in reflection I also need to think about what I can do with the assessment next year to bring in more engagement and interactivity.
Reflecting on Unit 2 Assessment 1
(I am way behind on writing up my reflections; to get caught up, for the moment, I am just writing reflections on the entire assessment.)
Unit 2 has been very different from unit 1. Due to the structure of the standards and how we are approaching them, they are going by much quicker. This has been its own challenge. A quicker timeline has meant less time for me to grade each part, which has admittedly left me exhausted at times. To help me out, for some assignments, I really focus on just grading the part that matters. Take the third part of this assessment, the investigation stations: I focused my time on the end of the assessment, their claim with evidence. Yes, I skimmed for completion on the station observations and questions, but I honestly don't have time to read through 250 complete lab reports. What I care about is that they can use evidence to support the claim that action-at-a-distance forces do just that, act at a distance, so I used my time to see if they met that understanding.
This assessment, as they all do, had four parts.
Part 1:
The first part was a thinking map comparing electric and magnetic forces. This was a very easy start to the assignment, which I like. I want all my students to be able to be successful in completing part 1 of the assessment. The one challenge with this was that the articles themselves were not great. They are on two slightly different topics which made comparing them a bit weird. Next year I want to find different articles or just write my own.
Part 2:
Do you ever have labs you overly worry about and overly prepare for? This was mine. I've done electromagnets before, but never at this scale, and I just wasn't sure what I would need. I think I have enough material for the next few years.
I gave a rundown of this in that week's reflection, but once again this is what I used:
D Batteries (I was worried about how well these would last, but each of the batteries held out for the whole day so I have plenty left for next year)
Electrical Tape (Helps hold down the wires to the end of the batteries and stop the students from getting shocked)
Nails (I got a 5lb box of 10d 3" Galvanized Nails. I don't think the nails you get really matter, but you do want steel nails. This is enough nails to last forever, because there is no reason to not just keep using them)
Sandpaper (The students need to sand off the insulation from the wire)
Paper Clips (To test the strength, though really anything with steel, even other nails, would work)
Wire (This is what I got from Amazon. I gave each group about 4' and they were able to use it for all their different tests. I bought 2 rolls, but one was well more than enough. Note that the dimensions of the spool listed on Amazon are not right; this thing is about 3 inches high)
That is by no means the definitive list of materials, but it was successful for my students.
The main challenge I had with this assignment was that it is hard to complete in a class period (let alone the short Wednesday schedule, which I placed the lab on without thinking about it until 1st period ended sooner than I expected). The problem is that if they don't finish testing in the class period, they will have a different set of materials the next day, which could skew their testing. The time itself isn't necessarily the problem, but some groups had more trouble getting their electromagnets working in the first place, which left them less time to test their proposed changes.
The other challenge was that some students had trouble applying their findings from the lab to the conclusion; that being said, this is part 2 of the assessment, so while most students should be able to complete it, it may start to be a challenge for some. The main problems were students suggesting things in the conclusion such as more electricity (which might work, but they didn't test it) or using magnetic material, which suggested they just tried to Google an answer or guessed.
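As an aside, connecting the "more electricity" suggestion to the physics: for an idealized air-core solenoid the field strength scales with both the current and the number of turns per length, so more electricity plausibly would work; the students just hadn't tested it. A rough sketch of that scaling (the numbers here are purely illustrative, not measurements from my lab, and a real nail-core electromagnet is stronger because the steel core amplifies the field):

```python
import math

# Ideal-solenoid model: B = mu0 * (N / L) * I.
# The useful takeaway is the scaling, not the absolute value.
MU0 = 4e-7 * math.pi  # permeability of free space, T*m/A

def solenoid_field(turns, length_m, current_a):
    """Field in tesla at the center of an air-core solenoid."""
    return MU0 * (turns / length_m) * current_a

# Doubling either the turns or the current doubles the field.
base = solenoid_field(turns=50, length_m=0.075, current_a=1.0)
double_turns = solenoid_field(turns=100, length_m=0.075, current_a=1.0)
double_current = solenoid_field(turns=50, length_m=0.075, current_a=2.0)
```

So a group proposing "more electricity" is effectively proposing to increase I, which the ideal model predicts would scale the field proportionally.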
Part 3:
This part of the assessment was a rotation lab about action-at-a-distance forces, using electric and magnetic forces as examples. Generally this went well; a downside is that I think this is easier than part 2, but I really wanted all the students to engage with the electromagnet (and it is much easier to do as a whole group), so it needed to go first.
My main challenge with this part of the assessment was that I couldn't get one of the rotation stations to work (the one where they were supposed to get a circle of paper to float). It just stuck to my hand or flipped off the balloon and onto the table. No idea what was going wrong, so I scrapped it and replaced it with a demo of the Van de Graaff generator to add another electric example. I was planning on doing the demo anyway, but just as a demo.
One change I might make is, instead of having written observations, to have the students take pictures that show the non-contact interaction. Some students did this, and I felt a picture showing the interaction was just as effective, if not more so, than trying to describe it. Plus, they still need to describe it overall at the end of the assessment.
Part 4:
The last part of the assessment was drawing models showing how any of the phenomena they observed worked. This was easy for some students, but a real challenge for many of them. It felt accessible to most of my students, which was great, as most of them tried to create the models. But I still have students revising their models with increasingly specific feedback from me. I did give my students a model of the type of information their models should show; since we didn't cover field lines in a lot of detail, but they were important for the model, I felt like an example was needed.
Overall:
I liked this assessment. The lab activities were good. Some of the parts were a bit easier than they could have been, but that is not necessarily a bad thing. And most of my students felt like they could tackle all the parts, which is great. The small problems I had in parts 1 and 3 were minor and are just something to think about for small revisions next year.
Week 19 Agenda and Assessment 4.1 Overview
Below you can find a link to the Week 19 Agenda. This will be updated as the week passes and at the end of the week a reflection will be shared in a different post.
Week 19 Agenda
Also, as a general resource, this is what my preliminary plan for this whole assessment will look like. (Things will certainly change as I see what my students need extra support with, but I find it really helpful to have a very general idea of how I might want to approach the whole task.)
Assessment Overview
(As a note, we are skipping Unit 3 for the time being. The topics in Unit 3 were covered for this group of students in 7th grade, so it is not critical to cover them this year. If there is time at the end of Unit 4 we may return to Unit 3.)
Week 18 Agenda
Friday, February 10, 2017
Week 17 Agenda and Assessment 2.3 Overview
Below you can find a link to the Week 17 Agenda. This will be updated as the week passes and at the end of the week a reflection will be shared in a different post.
Week 17 Agenda
Also, as a general resource, this is what my preliminary plan for this whole assessment will look like. (Things will certainly change as I see what my students need extra support with, but I find it really helpful to have a very general idea of how I might want to approach the whole task.)
Sunday, February 5, 2017
Week 16 Agenda
Saturday, January 28, 2017
Week 15 Agenda and Assessment 2.2 Overview
Below you can find a link to the Week 15 Agenda. This will be updated as the week passes and at the end of the week a reflection will be shared in a different post.
Week 15 Agenda
Also, as a general resource, this is what my preliminary plan for this whole assessment will look like. (Things will certainly change as I see what my students need extra support with, but I find it really helpful to have a very general idea of how I might want to approach the whole task.)
Week 14 Reflection
This week brought about the end of the semester and moved us towards the end of the first assessment in Unit 2.
Starting this week I felt like my students needed more practice constructing models and clarifying field lines, so we began the week by looking at iron filings and a magnet. The students had magnets and a sealed bag of filings and had to use both to model the magnetic field around one magnet and then around two interacting magnets.
To give them some guidance I showed the groups a demo I have that shows the magnetic field of a magnet pretty clearly with just iron filings.
It is a plastic bottle to which I added iron filings, with a test tube glued into the top. In the test tube I placed a cow magnet (the magnet I could find that best fits in the tube). It is not a very strong magnet, but for showing the field that tends to work best; I've tried a similar setup with a bar neodymium magnet with a 75lb pull, and while it worked, the stronger pull made the field lines, if anything, less clear.
We then moved on to the rotation lab. It went fairly well, though I could not get the balloon/plastic bag flyer activity to work; it would repel from the balloon, but then either stick to my hand or fall to the ground. So instead I replaced that station on the worksheet with the Van de Graaff generator demo and had the students write about their observations with the generator.
The actual demos I do with the generator are rice crispies on top (which fly off due to the buildup of negative charge in them), paper hair (thin strips of paper taped in the middle like the spokes of a wheel; this lets you see both repulsion and attraction when you bring the discharge wand close), and then student hair (I also have a wand with metallic strings that shows the same thing for students with short hair). All of the demos show the basic idea that like charges repel and opposites attract.
While the demo is certainly not necessary, I think it is really worthwhile to include if there is the one time budget money for it. The supplies I use are:
Van de Graaff Generator
Discharge Wand
Static Hair Wand
It is worth noting, for anyone that hasn't used a Van de Graaff generator before, that how well it works depends greatly on the weather that day.
We ended the week starting our models on how potential energy relates to the orientation and position of the objects, most importantly the distance between them. Continuing my thoughts from the start of the week, I felt like my students needed more support creating their models, so I began the lesson by going over what a model might look like for the plastic bottle magnet demo.
Ignore the end-of-the-semester note the demo shared the board with; I don't have much whiteboard space in my room.
Sunday, January 22, 2017
Week 13 Reflection
This week brought with it the start of Unit 2, which I've been looking forward to since last year. Unit 1 meant a lot of basic physics with motion and forces. This new unit lets us get into some more interesting topics, starting with electricity and magnets.
We started out the assessment creating a double bubble map about electric fields and magnets; it was a fine activity. The only real issue is that the focus of the two articles is not that similar, so in comparing differences some students focused more on the fact that a magnet is a thing while an electric field is a field, rather than on topics like electric charges having a single polarity while magnets have two. I would like to find different articles for next year, or possibly just write something of my own for them to use.
Then we moved on to the electromagnet lab. I will admit to being a bit worried about this one; I've had my students explore electromagnets before, but I've never set it up so that I had to be ready with materials for 7 periods of students to make them. As general background, I had my students work in table groups of around 4. While this could certainly be done with groups of 2, I felt like the larger groups helped the groups be successful, as more students were involved to problem solve and think of ideas to test.
For the materials for the students I got:
D Batteries (I was worried about how well these would last, but each of the batteries held out for the whole day so I have plenty left for next year)
Electrical Tape (Helps hold down the wires to the end of the batteries and stop the students from getting shocked)
Nails (I got a 5lb box of 10d 3" Galvanized Nails. I don't think the nails you get really matter, but you do want steel nails. This is enough nails to last forever, because there is no reason to not just keep using them)
Sandpaper (The students need to sand off the insulation from the wire)
Paper Clips (To test the strength, though really anything with steel, even other nails, would work)
Wire (This is what I got from Amazon. I gave each group about 4' and they were able to use it for all their different tests. I bought 2 rolls, but one was well more than enough. Note that the dimensions of the spool listed on Amazon are not right; this thing is about 3 inches high)
The groups were generally very successful in getting their magnets to work.
For the groups that were not successful at first, the main problems were not sanding the wire well enough, not attaching the wire to the actual connection points of the battery, or just not having it taped down well enough. For some groups I helped them reattach the wire to the battery; for others, just slightly pinching the tape holding the electromagnet to the battery made it work. And with that, a caution: these get hot. We're not talking burn-your-hand hot, but a bit unpleasant. It is actually mostly the battery itself that gets hot (along with any uninsulated parts of the wire). As part of my instructions I cautioned students not to touch the uninsulated parts of the wire and to leave the electromagnet hooked up only while actually testing, to give it a chance to cool.
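As a side note on the heat: the electromagnet is essentially a dead short across the battery, which is why the battery itself warms up the most. A back-of-the-envelope Ohm's law sketch (every number here is an illustrative guess, not a measurement from my classroom):

```python
# Rough picture of why the battery gets warm: a few feet of thin wire has
# very little resistance, so a large current flows and a good share of the
# power dissipates inside the battery's own internal resistance.
V = 1.5            # volts, nominal D cell (assumption)
R_WIRE = 0.6       # ohms, rough guess for ~4 ft of thin coiled wire
R_INTERNAL = 0.3   # ohms, ballpark internal resistance of an alkaline D cell

current = V / (R_WIRE + R_INTERNAL)            # Ohm's law: I = V / R_total
power_in_battery = current ** 2 * R_INTERNAL   # P = I^2 * R, heat in the cell
```

Even with these conservative guesses the cell is dissipating the better part of a watt continuously, which is why disconnecting between tests matters.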
The day after the testing was done I showed my students a video of three different electromagnets. I will note that there is always variation in how well the electromagnets perform based on just how good the connection is. (Enjoy the jolly music, it is my attempt to seem hip.)
This let us discuss what they should have observed, even if their electromagnets were not as successful or just not as consistent in how they performed.
Week 14 Agenda
Wednesday, January 18, 2017
Reflecting on Unit 1
With the end of last week we reached the end of unit 1. This first unit took a total of 12 weeks and had three parts. Given that there are about 38 weeks in the school year and 4 units, this is a pace that needs to increase. I think that is certainly achievable. The biggest challenge for me is that a quicker pace obviously means I need to be more efficient in grading as well.
Pacing:
Looking at the whole thing, as a general pacing, the three parts took the following amounts of time:
First part: 4 Weeks
Second part: 6 Weeks
Third part: 2 Weeks
I know I can shave at least a week off both of the first two assessments, which would put me more on track with where I eventually need to be.
Tracking Sheets
I have been using the tracking sheets for the students to monitor their progress through the assessments. And while there are things I like about them, other things just are not working. I would like the sheets to serve as a way for students to reflect specifically on the SEPs, so I want to work those into them. The date tracking just didn't work well because too many students didn't fill in the dates initially and couldn't remember when they completed parts. That being said, I am going to keep the sheets, just adapt them. They work well in helping the students see how the whole assessment fits together.
Grading
As a quick overview of what I am doing: our grades are set up on a 4-point scale. Although the numbers are a bit different, basically a 4 is an A, 3 a B, 2 a C, 1 a D. In the gradebook the assessments are the majority of the grade. So, if a student got mastery on the first 3 parts of the assessment, but not the 4th, they would get a 3/4 on the assignment; basically a point for each part. That is working fine. However, I am entering this into the gradebook as one grade (with comments saying what the student hasn't completed).
Next semester I am going to break up the assessment into the 4 parts in the gradebook. I feel like it will give the students and parents quicker feedback on where they are and make it clearer what still needs to be done. So, if a student finished just the first two parts (and I require mastery before they move on), they would have a 4/4 on each of those two parts and a 0/4 on the other two. Essentially this is the same grade as the combined score (just out of 4 times as many points). Now, I may decide this doesn't help, in which case it would be easy enough to combine it all back together into one grade.
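To show why the split grades come out equivalent to the combined grade, here is a quick sketch of the arithmetic (the helper names are hypothetical, not anything from my actual gradebook):

```python
# Compare the two gradebook approaches for a student who has mastered
# `parts_done` of the 4 assessment parts.
def combined_score(parts_done, total_parts=4):
    # One grade: a point per mastered part, out of 4.
    return parts_done, total_parts

def split_scores(parts_done, total_parts=4):
    # One 4-point grade per part: 4/4 if mastered, 0/4 if not.
    return [(4, 4) if i < parts_done else (0, 4) for i in range(total_parts)]

# A student with the first two parts done:
earned = sum(s for s, _ in split_scores(2))    # 4 + 4 + 0 + 0 = 8
possible = sum(p for _, p in split_scores(2))  # 4 * 4 = 16
# 8/16 is the same fraction as the combined 2/4.
```

The percentages match either way; the split version just surfaces which parts are missing without needing a comment.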
With grading as well, although I use rubrics to give feedback and record scores on Google Classroom, the biggest help for me in staying organized in tracking student progress has been the assessment record sheets, where I mark off the student score for each part and highlight it blue when they are done. It has made it very easy to see progress for each student at a glance. I will definitely continue doing that.
Overall
I am really happy with our move to NGSS. There is still a lot to learn, things to change, and time for more reflection. But the assessments are working well and I feel like my students are understanding the topics at a deeper level than before. While we have some supplemental activities and lessons, I've been very focused on mostly just working through the assessments. Now, I still have my doubts. I can put together a beautiful PowerPoint or Pear Deck, I can make a fairly engaging lecture, I can get good students to answer questions on a test with respectable levels of correctness. While we still have some Pear Deck lessons, we've moved away from all of that, things I felt very comfortable doing. There are times I wonder if I am still effective. But, I believe in our plan and I believe in student inquiry. While I will improve in implementing NGSS each year I know this is the right path.
Reflecting on Unit 1 Assessment 3
This final assessment for Unit 1 was a challenge for some of the students. Unlike the earlier assessments, where there really was ample time for all students to finish, for this one there was time for some students to finish the fourth part, but not all. For those who didn't finish in class it became a homework assignment. While this was a challenge in its way, I think it reflects the quicker pace I feel I need to set for the assessments. Most won't be the quick two weeks this one was, but the time needs to be used more efficiently.
Beyond the time, the last section of this assessment was a challenge because it was math heavy. I did what I could to support all the students by providing examples on the board and showing that even the more difficult calculation, for speed, is solved the same exact way for each value. If they could follow the procedure they could get it.
Another challenge was in the graphing. The first two parts of the assessment involved working with real-world data, and sometimes that can be hard to use. I believe the benefit of working with this data was worthwhile, but it was a challenge. In a way, it would be easier to just give the students data that I know would give good results.
Now, that being said, there is no reason the data shouldn't work, other than issues of scale with the wide range in masses. For the NASA data, they calculate the values using the equation for kinetic energy, so it should all line up. They take the magnitude (brightness) and use it to estimate the size. They use the size and the average density of an asteroid to estimate the mass. Finally they take the mass and speed to find the energy. Basically, the numbers should work. For my students whose graphs didn't show a trend, the issue tended to be mistakes in graphing or mistakes in adjusting for scale.
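For anyone curious, that estimation chain can be sketched out roughly as follows (the density value and example numbers are illustrative assumptions on my part; NASA's actual magnitude-to-size conversion also depends on an assumed reflectivity, which I've skipped):

```python
import math

# Sketch of the chain described above: size -> mass -> kinetic energy.
# The asteroid is treated as a sphere with an assumed average rocky density.
ASTEROID_DENSITY = 2600.0  # kg/m^3, a commonly assumed average (assumption)

def mass_from_diameter(diameter_m, density=ASTEROID_DENSITY):
    radius = diameter_m / 2
    return density * (4 / 3) * math.pi * radius ** 3  # sphere volume * density

def kinetic_energy_joules(mass_kg, speed_m_s):
    return 0.5 * mass_kg * speed_m_s ** 2  # KE = 1/2 m v^2

# An illustrative ~20 m object moving at 19 km/s:
m = mass_from_diameter(20.0)
e = kinetic_energy_joules(m, 19_000.0)
```

Since the energy column is computed from the mass and speed columns with KE = ½mv², a correctly scaled graph has to show the trend; that's why graphing or scaling mistakes were the likely culprit.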
And finally, while my students could certainly tell you with certainty that more mass and more speed mean more energy, I am not 100% sure how well they can apply it. Looking at some students' answers on the final assessment, I felt like their responses came across as more of a practiced response than a deeply thought-out analysis.
Positives
While that may sound like a lot of challenges, I did feel like this assessment went well. My students got the basic concept, the asteroid drop lab went well, and most students didn't seem overwhelmed by the math in the last part. While the challenges need to be remembered for next year, I don't know that I would specifically change anything, other than possibly adding some activity at the end to really assess their overall ability to apply this information in detail.
Friday, January 13, 2017
Reflecting on Unit 1 Assessment 2
I went into this assessment curious how my students would do compared to the first one. This assessment was mostly engineering challenges and labs (generally engaging); the last was a large research paper (not the most thrilling for all students... at least the writing part).
What intrigued me in the end was that the classes didn't do dramatically better on this assessment in terms of overall score, but there were differences. The paper had a really clear progression, both through the task itself and through difficulty; what ended up happening was that a number of students scored low because it was difficult to get them engaged in the assignment, while a lot of students were able to make it to the last part and successfully complete the paper. With this new assessment, focused on the engineering task of building the balloon-powered car, almost all students finished the first two parts, but fewer received a perfect score of 4 because the last two parts were more challenging.
Looking at the whole assessment, one of the successes was how the students did in choosing an investigative approach for the third task. They had to investigate both how balanced and unbalanced forces change motion and how mass changes motion. That is not an easy task. However, the students amazed me with the creativity of their approaches to the challenge.
One of the challenges was the engineering design task itself. The students were engaged and all groups were ultimately successful in building a working car, but the whole process was new to me, and getting the groups to focus on all of its steps was a challenge. I imagine this only gets easier as the process becomes more familiar.
Grading
I feel like as the year progresses I am getting more of a handle on the grading process. The structure of this assessment helped with one of my challenges from the last assessment, which was getting things graded quickly. The last assessment had some really quick parts that were hard to grade before the students moved on to the next. For this assessment, all the parts took more time, which made grading easier, though I am sure this will vary by assessment.
What I Would Do Different
- I wanted to try out wikiprojects with the first assessment for the engineering design process. While this worked for some students, the lack of structure meant that other groups who were less familiar with the process missed documenting steps. A Google Slides document with a page for each step of the process might help with this.
- The second activity, on Newton's laws, went well; breaking it up by one law per day seemed to work. At the same time, some students struggled more than expected with making an annotated picture, or with understanding what one really is. I would want to clarify that expectation at the start of the assignment.
- I am struggling with the tracking and monitoring sheet; it comes across to my students as just another task rather than something they are benefiting from. I understand its purpose and want something to fill that role, but that sheet is not working for me.
What I Would Do the Same
- The amount of materials provided for the car build seemed right: enough for any group to make a car, while still encouraging students to explore other materials and bring stuff from home. And I liked the challenge in general; making the car was achievable for all groups.
- I also really like the Teacher Assessment Record sheet. It works well for quickly monitoring each student's progress through the assessment and takes little time to complete.
Week 12 Reflection
This week began the core of the third and last assessment for Unit 1. I began by introducing the setup for the lab. I explained that the pan would be half full of flour with a layer of cocoa on top, and that they were to look at the impact to figure out how energy varies with mass and speed. I also explained why we were looking at height rather than speed: speed is hard to measure, but when something is dropped from higher up it has more speed when it hits the surface. My students then worked on their hypotheses, figuring out what they would actually measure, and setting up their data tables.
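That height-for-speed substitution can be shown with a quick calculation: ignoring air resistance, an object dropped from height h hits the surface at v = sqrt(2gh), so greater height means greater impact speed. A short Python sketch (the drop heights are just example values):

```python
import math

G = 9.8  # m/s^2, acceleration due to gravity near Earth's surface

def impact_speed(height_m: float) -> float:
    # Ignoring air resistance, a dropped object hits at v = sqrt(2*g*h),
    # so a higher drop point means a faster impact.
    return math.sqrt(2 * G * height_m)

for h in (0.5, 1.0, 2.0):  # example drop heights in meters
    print(f"dropped from {h} m -> {impact_speed(h):.2f} m/s at impact")
```

This is why drop height works as a stand-in for impact speed in the lab.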
Asteroid Impact Lab:
I really wasn't sure how much material I would need. I have 7 class periods of 8th grade science and wasn't sure how much I could reuse the material (I also wanted enough that the other 8th grade teacher could use it as well). I bought four 10 lb bags of flour, one container of cocoa powder, and four trays. (In the end I needed more cocoa powder and had extra flour.) I mixed the cocoa powder about 1 to 1 with flour; this makes it easier to sift over the surface and makes it last a bit longer. (With the cocoa powder it is worth noting that the color varies; the Hershey's was darker than the Kroger brand I got later.)
As seen in the pictures, I didn't worry too much about making sure the flour was perfectly covered, especially at the edges. I tested this at home, and the impact was clear even without a perfect covering.
To keep things a bit cleaner, I had the students test with their pan inside a cardboard box; this kept the flour that splashed out somewhat contained rather than spread over my whole room.
Above is an example of what the student results looked like for their mass trial. The ejecta is particularly apparent around the biggest rock's impact, but it is there for all of them. I had to make it clear to my students that they needed to pay attention to that white ejecta, not just the hole itself. (I showed the students this picture as an example to illustrate the point.) Below is an example of before and after one impact.
For my groups who didn't finish testing the first day (only because they had to share the pans, so about 5 groups across all the classes ran out of time), I set up the second day using flour as the surface covering, as seen below.
So the brown pattern around the rock is the ejecta from that impact.
Altogether the lab went well, but it does have the potential to be messy or to tempt students to fool around with the pan. To help with that, I had the students keep the pans in the box. If I were more concerned about misbehavior, keeping only a couple of pans at the front of the room under my direct management the whole time could help.
After the asteroid drop lab we moved on to the math-heavy portion of the assessment: the conservation of energy investigation. The students collect some real-world data on how more height = more energy, and then work on representing the conservation of energy of a falling object using math.
For some of my students the math was easy; for most of them it was tricky. To help them out, I modeled how to solve the problems to fill out the chart. If they can follow the steps, all they need to do is plug in their own numbers to work through the problem.
As a rundown of this math, they start by calculating gravitational potential energy. As the ball drops it loses potential energy, so they should see this number decrease to zero as they fill out the chart.
The total energy is the sum of the potential and kinetic energy, so they can find the kinetic energy by subtracting the potential energy at that height from the total (which is the potential energy before the ball falls). (As a note, the unit for energy is actually "×10^-3 J" because I didn't have them convert the grams into kilograms. I felt that made the math easier, but they certainly could convert the mass to kilograms.) They should see the kinetic energy increase from 0 as they fill out the chart.
To get speed, they use the equation for kinetic energy and solve for speed. This is the tricky part of the work, and I plan on giving partial credit to those who solve for potential and kinetic energy but can't figure out speed. Since the initial kinetic energy and speed were zero, I did a second example showing how to calculate the speed if the kinetic energy was 20. They should see the speed increase from 0, though because of the square root it doesn't follow the linear trend we see in the other columns.
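Putting those three steps together, here is a hedged sketch of how one chart might be filled in. The mass and starting height are assumed example values, and the mass is kept in kilograms here so the energies come out in joules:

```python
import math

g = 9.8    # m/s^2, acceleration due to gravity
m = 0.05   # kg; assumed example mass (a 50 g ball)
h0 = 2.0   # m; assumed starting height

total = m * g * h0  # total energy = potential energy before the drop

for h in (2.0, 1.5, 1.0, 0.5, 0.0):
    pe = m * g * h             # potential energy falls toward zero
    ke = total - pe            # kinetic = total - potential
    v = math.sqrt(2 * ke / m)  # solve KE = 1/2 * m * v^2 for speed
    print(f"h={h:.1f} m  PE={pe:.3f} J  KE={ke:.3f} J  v={v:.2f} m/s")
```

Each printed row corresponds to one row of the students' chart: potential energy decreasing, kinetic energy increasing, and speed increasing but not linearly.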
While this last part is tricky for some, it is the last part of the assessment, so I don't expect all students to complete it. Plus, I feel it is important for the students to actually work with the equation to fully understand how the variables change the result.