Wednesday, April 26, 2017

Non-NGSS Agendas

Since we have covered all three NGSS units planned for this year, for the remainder of the year we will be doing supplemental activities that may or may not be structured like typical NGSS assessments.

I will be posting these agendas here in this one post with a brief description of the aim for each assignment, updating as we go.

Week 26 Marine Plastics and Plastic Chemistry
- For this series of assignments I wanted to continue on from their capstone projects, looking at resource consumption. I also wanted to bring back in some chemistry, both as a review for the CAST and as a refresher for the students heading into high school science. 

Week 27 Finishing Plastics and Video Project
- For this assignment I wanted them to either review material from the year or answer a science question that interested them but was never covered in class. So the aim is both to support them in researching an interesting topic and to indirectly review the material from the year.

Week 28 Finishing Video Project


Monday, April 17, 2017

Week 25 Agenda

Below you can find links to the Week 25 Agenda. This will be updated as the week passes and at the end of the week a reflection will be shared in a different post. 

Monday, April 3, 2017

Week 23 and 24 Agendas and Assessment 4.3 Overview

Below you can find links to the Week 23 and 24 Agendas. This will be updated as the week passes and at the end of the week a reflection will be shared in a different post. 


Week 23 Agenda

Week 24 Agenda

Also, as a general resource, this is what my preliminary plan for this whole assessment will look like. (Things will certainly change as I see what my students need extra support with, but I find it really helpful to have a very general idea of how I might want to approach the whole task.)

Reflecting on Unit 4 Assessment 1

This assessment was somewhat influenced by my students' practice with the example CAST questions online. Beyond the obvious need to incorporate more math (something I am a bit unsure where to incorporate, as the SEP for mathematical and computational thinking is not really covered much in the standards about motion and force, yet the practice questions test it using that standard), I felt like my students needed more practice analyzing given data and interpreting models or diagrams that were given to them, forcing them to make assumptions about the data. This assessment heavily targeted those two skills.

This unit was somewhat tricky for me. I am used to incorporating light boxes, lenses, and mirrors into any unit on waves, but our school doesn't have any supplies for that, so I approached it differently. This made the unit less interactive than I would have liked. Getting something for the students to interact with in this assessment is something we need to factor into the budget. 

Introduction
We started with a Pear Deck about how we can use wave interactions to understand the layers of the Earth, and as part of that we introduced the types of waves. This allowed me to bring more Earth science (at the Earth, rather than space, level) into this unit. One of my goals is to have every phenomenon and introduction tie in another science domain. 

Part 1
There is really not much to say about part one: it was designed to introduce the key vocabulary and basic concepts for the unit, and it did that. It wasn't particularly interesting, but it was functional.

Part 2
For part two the students worked with provided data about tsunamis, the focus of this part being ocean waves. As part of this I used Google MyMaps to plot out the locations, which worked well and was a familiar interface for the students, so they had no trouble using it. The students had to use the provided data to create a graph or other visual representation and form a claim based on that data. The open-ended nature of the graphs allowed me to assess which students better understood how to represent data in a graph. 

Some examples of their graphs and other representations are presented below. Although they were all different, there was value in each one. 



Part 3
In part 3 the students were presented with about seven models and a data table to complete, examining how the waves of the electromagnetic spectrum are absorbed, reflected, or transmitted through the layers of the atmosphere. This one was, obviously, focused on light waves. 

This part of the assessment was a struggle for many of the students. The models are not always specific, and no single picture gave them all the content for a row or column of the table: they had to put it all together and make assumptions. For their conclusion they discussed how the interaction of EM waves and the atmosphere supports life on Earth (bringing in a bit more life science and earth science). Their answers were great and varied, ranging from protection from gamma and UV rays, to the greenhouse effect, to visible light.

Part 4
Part 4 was the part of the assessment I was most worried about going in. They were looking at one of a few scenarios and discussing how soundproofing could be used in that situation. This part of the assessment focused on sound. 

I gave them three options to look at: sound walls by a freeway, soundproofing in the band room, and soundproofing a garage for drum practice. The students researched materials, discussed benefits and costs, and wrote up a formal proposal. The biggest issue was that, while I want how they present the material to be up to them, I need to add more structure in specifying the expectations. 

Overall
I thought the assessment went well, though part 3 was rather challenging for some students. I liked having each part focus on a different category of waves. I would like to bring in an interactive element, which could go in part 1, though part 1 still needs to have them constructing a basic model of a wave, so it would have to allow for that as well.

Sunday, March 19, 2017

Grading assessments as a whole or as parts?

One of the key changes that has gone along with our four-part assessment model is the move to a sort of modified standards- or mastery-based grading on a 4-point scale. Roughly speaking, a 4 is an A, 3 is a B, 2 is a C, 1 is a D, and below that is an F (though the actual comparison varies slightly). The key effect is that it is much more achievable for a student to avoid failing. 

When discussing grades with struggling students, we can set an achievable goal to get them to a passing grade and demonstrate understanding. On the traditional 100% scale you would have students who would work and work to try to raise their grade, but seem stuck below that 60% mark. If I have a student who wants to put in the effort and show their understanding, I want a grading system that lets them do that.

As part of this change students get one score per assessment, based on how far they demonstrated proficiency. So a student who mastered the first two parts of the assessment, but not the last two, would score a 2/4 on the assessment. A student who mastered all parts would score a 4. This system worked fine, but I felt like my students (and parents) didn't really understand where that grade was coming from. It is our first year making this change to NGSS, this assessment model, and mastery grading. So I began semester 2 differently. 

For the second semester, with the aim of improving understanding of the grading system, I moved to giving a score for each part. If a student received mastery on the first and second parts of the assessment, but none of the others, they scored a 4/4 on part 1, a 4/4 on part 2, and zeros on the next two. Ultimately, they scored an 8/16 on the combined assessment grades, giving them the same percentage of points that they would have had the first semester. I felt like this did help clarify the grading system, and now that it has done that, I am happily changing my grades back to what I did semester one: the combined assessment grade.
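The equivalence between the two schemes can be sketched in a few lines of Python. (The part scores here are hypothetical, and scoring each part as all-or-nothing is my simplification of the mastery rubric.)

```python
# Hypothetical student: mastery on parts 1-2, none on parts 3-4.
part_scores = [4, 4, 0, 0]  # each part graded on a 4-point mastery scale

# Semester 2 scheme: every part entered separately, summed out of 16 points.
separate_total = sum(part_scores)                           # 8 out of 16
separate_percent = separate_total / (4 * len(part_scores))  # 0.5

# Semester 1 scheme: one combined score, i.e. how many parts
# reached mastery, out of 4.
combined_score = sum(1 for s in part_scores if s == 4)      # 2 out of 4
combined_percent = combined_score / 4                       # 0.5

# Either way the student lands at the same percentage of points.
assert separate_percent == combined_percent
```

Either way of entering grades produces the same percentage; the difference is purely in how many line items sit in the gradebook.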

There are a few reasons why I like this combined grade:
1) It makes student understanding on the assessment and standard more obvious. A student scoring 3.5/4 on the whole assessment is a more succinct measurement than having to consider a number of parts.
2) Not every student is able to master the whole assessment. Yes, I am there to support them and try to guide them to understanding, but the fourth part of the assessment can be rather difficult. Having a zero (or a missing assignment) in the gradebook puts a lot of focus on an assignment, rather than assessing overall understanding.
3) Finally, it is way more time consuming to enter each individual score and keep those updated with revisions and late work than to manage one score. I honestly didn't expect it to take so much more of my time, but it has. 

I don't regret having the separate scores for a while; I feel like my students now have a better idea of how the assessment score comes together. But I am sure glad to go back to the combined score. 

Thursday, March 16, 2017

Week 22 Agenda

Below you can find a link to the Week 22 Agenda. This will be updated as the week passes and at the end of the week a reflection will be shared in a different post. 


Sunday, March 12, 2017

Reflecting on Unit 2 Assessment 3

This assessment ended up being both easy and hard for me to implement. I did almost the same set of activities last year when first starting to implement NGSS standards and practices, which made it familiar. At the same time, I have evolved a lot in how I implement the standards and practices, and it was hard not to fall back into my previous ways. 

We started this unit with an article about a planet with three suns. We used Verso, which allows anonymous discussion, to post what we thought would be different if we lived on that planet. There was a lot of great discussion, before we covered the material in class, on how temperatures, seasons, eclipses, and more would be different. I actually used the same article last year, but in that case it was our final application project where they used the knowledge they had developed to discuss what would be different. 

For the assessment itself there were, as always, four parts. I will be discussing those below. At the end of our implementation of the unit there were some major changes we made in the structure of the assessment, and I will add some comments about those. 

Part 1 
In this part of the assessment they gathered information about an object in the solar system and added it to a set of class slides. The goal was to look at the scale of objects in the solar system. I did the same activity last year; both times the students were randomly given a name as they walked in and had to find the partner randomly assigned the same name. That mechanism of pairing up works really well for this low-impact activity. I was happy with that part of the task, but at the same time it was too easy as an entire task. I want part 1 to be accessible to all students, but there needs to be more thought, not just research, to the task. In the end, on reflection, we moved Part 4 to join Part 1.

Part 2
In this part of the assessment the aim was to look at how we use technology to research space. This task is also nice because it starts to bring in light and waves, which are discussed later in the year. This is also a point where I brought in some whole class activities (more on that in a sec). In the task the students picked a space mission (Cassini, Hubble, Curiosity, ...) and discussed who was involved, when it occurred, the technology involved, and the major discoveries. They used this information to make a poster (on paper, in PicCollage, some even used Slides) and present the information to their group.

As for the whole class activity, one of my favorite topics in space is how we know the composition of the sun using light. I begin with some quick slides discussing subtractive vs. additive color and how we perceive color. As part of that I have three colored bulbs (red, green, blue) that I can use to show how, when light combines, we perceive a different color, not both or all three separately. Then we look at emission spectra and discuss how the wavelengths emitted help us identify different elements, and how looking at it all together helps us understand what something like a star is made of. 
Examples of Spectra
The reason I find this demo really valuable (other than the fact that it is easy to take cool pictures with your phone through the diffraction grating) is that so much of space is distant. You can show them pictures, even have them interact with websites to explore exoplanets, but it is all distant. This is a real application. Plus it is cool; everyone should look at emission spectra. 

Overall I thought this part of the assessment went well. I was happy with the student products and the students seemed to enjoy being able to pick whatever mission they wanted. 

Part 3
In this part of the assessment we tackled the Sun, Earth, Moon models. If you haven't looked at the evidence statements for this standard, it is a beast (and that is not even including the seasons model that we tackled in assessment 4.2). 

We approached this model over three days and broke down the information as shown below:

Day 1 - Basic Model (If they look at the simulations from the previous assessment this can help them think about this basic model.)
- The Sun
- The Earth
- The Moon
- The Orbital Plane for Each
- The Orbit of Each
- The Tilt of Earth's Axis
- The Distance from the Earth to the Sun and from the Moon to the Earth
- The Time it Takes for the Earth to Rotate and Revolve
- The Time it Takes for the Moon to Revolve

Day 2 - Phases Model (The teacher may want to create a real model of phases with a ball or balloon with one side colored black and have them think about how much they can see as being illuminated as it revolves around them.)
- The Sun
- The Earth
- The Moon
- The Path Light Travels
- The Names of the Phases
- The Amount of the Moon Illuminated by the Sun
- Where, As The Moon Orbits, Each Phase Is

Day 3 - Eclipse Model (Worksheet that can be used with eclipses)
- What Arrangement of The Sun, Earth, and Moon Causes Lunar Eclipses
- What Arrangement of The Sun, Earth, and Moon Causes Solar Eclipses
- How Much of the Earth Sees Each Type Of Eclipse

It was a lot to tackle. There were some things that worked well. We did a picture sort before starting the phases model, which elicited some great group discussions and debates. I let the students choose how they made their model: some drew on paper, some on the iPad, and some used apps like PicCollage to combine actual pictures with drawings, text, and more. But this task was a beast, and that doesn't even include the ending paragraph where they need to describe which phases we see during which type of eclipse and explain why we don't have an eclipse twice each month. Plus, some students take a long time making their models look super cool, which is great, but makes the assessment drag. 

As such we changed this part of the assessment to be broken down over Parts 3 and 4. 

Part 4
In this last part of the assessment the students used Actively Learn (thank you Core teachers for giving me ideas on other ways to use this site) to write a three-paragraph essay discussing a planet that we could inhabit with the help of technology and one that we couldn't. They returned to their data from the slides to do this. To set up the assignment on Actively Learn, I uploaded directions for each part and then inserted questions for each paragraph. This let me give them quick feedback as they worked. Plus, having it broken down into paragraph tasks made the overall task less imposing. I loved this task, but it didn't really fit in Part 4, and thus it moved to join Part 1. 

Overall
Overall I was happy with how the assessment turned out, but as I said, there were changes. One of the big things that didn't work the way I wanted was the models. Because the task was so overwhelming in the complexity of what was involved, it was hard to implement revision and reflection (a key part of modeling). As a result, for many students, their models reflected what they found online more than their own individual thoughts.