SLO NO!! Now What?


I have been staring at this data for TWO days! CRAP!! What happened?? I am looking at my SLO (student learning objective) data: the long-term, measurable academic growth targets I wrote for each of my students. I wrote two goals this year, and together they made up my SLOs. Both were written in English Language Arts, more specifically in guided reading and writing, admittedly my weakest areas of practice by far. I’ll take math over ANYTHING any day!! However, I digress…

Let me preface all of this by first explaining what SLOs are and how they came to be. Student Learning Objectives are one of two components that make up the new Ohio Teacher Evaluation System (OTES). The other component is teacher performance, determined by a written professional growth plan, formal observations, and administrative walkthroughs, among other measures. Each component is weighted at 50% (50% for teacher performance and 50% for student academic growth); together they make up the teacher’s final summative rating at the end of the school year (Ohio Department of Education, 2014).

[Image: OTES framework diagram, Ohio Department of Education]

Now, while there are a few different ways to obtain student growth measures, only two options really applied to me: shared attribution and SLOs. Different from the individually developed SLO, the Ohio Department of Education (2014) defines shared attribution as “an optional local student growth measure that can be attributed to a group of teachers. It encourages collaborative goals and may be used as data in the student growth component of teacher and principal evaluations.” Essentially, what this means, if I understand correctly, is that a school or district could decide, collaboratively, to base its SLO on, say, the success or value-added measures of its fourth-grade students’ state assessment results. If those fourth graders meet the state’s performance index and/or value-added measures, the ENTIRE STAFF meets their student learning objective for that year!

Of course, for the shared attribution measure to be successful, you would need a completely invested staff that believes wholly in the mission and vision of the school and trusts one another without any doubt. To be frank, the staff would need to fully accept that old Three Musketeers mantra, “all for one and one for all”! Apparently, several surrounding districts have staffs that do. I absolutely understand the reservation about putting your trust in someone else’s practice and progress. It is definitely a risk, especially when your name is attached to your students’ scores. However, it is vitally important for teachers’ actions to support their spoken beliefs. Saying you understand the importance of vertical curriculum alignment and the effect each grade level has on the next, then closing your door to others and your mind to new knowledge, and losing all hope for the success of our students, is a misalignment of practice. It just does not make sense! Shared attribution would not benefit a staff such as this.


Maybe there were others as confused as I was about the two. I have to admit that at the time of its rollout, this portion of the evaluation process was murky for me. I had some other things clouding my mind and impeding my ability, or rather my willingness, to even try to comprehend any of this at all. I put it all on the back burner to attend to at a later date. Oh, but how quickly things have become very clear.

As stated earlier, I wrote two SLOs for English Language Arts, for a couple of reasons. The first is that part of the district’s improvement plan is a focus on writing. The second goal, established by our staff, focused on guided reading levels. In creating my goals, my team and I put a lot of thought into our student growth targets. We created them collaboratively in order to support each other and ensure consistency in our instructional practices. Using the SMART goal characteristics (specific, measurable, attainable, realistic, and timely), I developed the following learning objectives for my students:

1) For guided reading, each student will be expected to demonstrate at least one year’s growth, minus one level, based on the Fountas and Pinnell text gradient chart.

This essentially meant my students would increase their reading by two levels (or one year’s growth) by mid-March. As I said earlier, my focus was not on SLO development, so I may have made an error in the targets I set. Even still, an increase of two levels did not seem overly ambitious to me…at the time.

2) For writing, each student will be expected to demonstrate 4 points, or 40%, of growth over their original baseline data gathered in September using the district-adopted STOP rubric. For example, if a student scored 2 out of 10 in September, he or she is expected to score 6 out of 10 on the rubric by March.

Now, as student baseline scores increased, the growth target decreased. So, students who scored a 7 as a baseline only needed to demonstrate 3 points of growth over their original baseline, and so on. Again, this target did not appear to be unattainable for my students over a six-to-eight-month period.
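For anyone who wants the math spelled out, here is how I read that sliding target: the expected growth is four points, but the target is capped at the top of the rubric, which is where the smaller targets for higher baselines come from. The little sketch below is only my own illustration of that reading; the rubric maximum of 10 and the function name are my assumptions, not district language.

    # A minimal sketch of how I read the sliding writing target.
    # Assumptions (mine, not the district's): the STOP rubric tops out
    # at 10 points, and the March target is simply capped at that maximum.

    RUBRIC_MAX = 10   # assumed top score on the STOP rubric
    BASE_GROWTH = 4   # the 4-point (40%) growth expectation

    def writing_target(baseline):
        """Return the March target for a September baseline score."""
        return min(baseline + BASE_GROWTH, RUBRIC_MAX)

    # The examples above: a baseline of 2 yields a target of 6,
    # while a baseline of 7 only needs 3 more points because of the cap.
    for baseline in (2, 7):
        print(baseline, "->", writing_target(baseline))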

I have high expectations for my students, of which I am very proud. I will never change nor deviate from the expectations I uphold for my students. For that, I may learn a potentially unfortunate lesson. Even though student growth is what schools, districts, and states desire to see over a year’s time, as with value-added measures, and although many of my students made growth in both areas, the only growth that mattered came from students who either met or exceeded the growth target I set for them. Those who made small amounts of progress were not taken into consideration at all.

Because of the high expectations and the ambitious hopes and dreams I have for my students, there is now a great possibility that I will be rated this year as a DEVELOPING teacher! That’s “developing,” as in one step above ineffective and one step below proficient. “Developing,” defined as undergoing development, growing, or evolving. The logic eludes me. Let me get this straight. I am a teacher with 10 years of experience, a doctorate degree in education, a budding business, and the determination to become a premier leader in school improvement and educational reform. Yet, according to my student growth data, I could possibly be rated as a developing teacher!!! I am suddenly bothered, extremely annoyed, and incredibly disappointed.


Since I am undoubtedly aware of and acknowledge my areas of strength and weakness, I realize that there always has been and always will be an opportunity to learn and grow. No way am I perfect, nor do I want to be. A developing expert in my field, perhaps? Yes, absolutely, but even experts continue to research and learn within their area of expertise. Identifying gaps in my practice is clearly not the issue. As I continue to sit and go over the lessons I’ve done, the small groups I’ve facilitated, the strategies I’ve taught, and even the things I didn’t quite do as effectively as I could have, I identify the real basis of my irritation.

My data clearly shows a gap in my instruction. It is true data. It communicates the areas of practice that need attention, as data is intended to do. But the longer I sit and stare, the more I begin to think about the ways in which this data will affect my future as an educator and any goals I have beyond the classroom. How could I possibly turn this data in as it is? This does not look good at all! I stop right there and shake off those thoughts as the leader within me emerges. It seems to me that the purpose of this process is to help teachers become more effective in their practice, right? Yet the first thing I contemplated was falsifying my data to meet the needs of whom…MYSELF!! Immediately, I cast aside the needs of my students.

I wonder how many teachers have had, or will have, the same thoughts if and when they see that their data is not up to par. I wonder how many will change their data to meet their own needs. It makes sense to me, because none of us wants to receive a low summative rating. That is just human nature. We would all like to be rated as knowledgeable educators who understand our practice and are able to help every single one of our students grow every single year. No extraneous factors will ever get in the way. Our targets will always be set perfectly every year, and our students will never fall short. That’s how we all want to be viewed. But the reality is, the state of education is forever changing in practice and pedagogy. Therefore, those of us on the front line will fall short at some point because of the inevitable rate at which these changes occur. However, in this process as it stands right now, teachers will always come out looking good, because we will all make certain that our ratings reflect proficient and accomplished performance. In the end, the students will be the ones who suffer, because their instruction will not be at the top of our priority list. Meeting our SLO targets will. Now, that’s what I call fair! Um, not so much…


Needless to say, I could not and did not change my data. The moment I thought about doing that, the evaluation process lost its intended purpose. For me, the process no longer focused on instructing students, but rather on making certain that I received the rating I needed to keep my job. The process was no longer promoting growth in effective instructional practices, but instead effectiveness in falsifying documents and perfecting the practice of lying. The proverbial dog and pony show, more commonly known as the scheduled evaluation observation (we ALL put our best foot forward during those), along with a lowered expectation bar that will soon, convincingly, be presented as the rigor I claim to provide in my classroom, has the great potential to guide my future in this field. How does that even make sense? Doesn’t this defeat the purpose of this whole process? Thankfully, I would never shortchange my students by lowering my expectations for them. It just does not align with my personal or professional morals. They need to be challenged, want to be challenged, and should be challenged. In the meantime, however, there is clearly a flaw. In my opinion, whether through student learning objectives or shared attribution, the process is ambiguous, inequitable, unreliable, and holds no validity with regard to teacher accountability. In fact, it only evokes this simple question: NOW what??


6 comments on “SLO NO!! Now What?”

  1. Laurie Loomis says:

    Your two days of data staring were the same for me! Several teachers told me that I should retest the students who did not do well on my Writing SLO, since it was my SLO. I did not do so. I felt it would be quite unethical. I heard through the grapevine of several teachers who “fixed” their data. Shame on them. Goals are just goals. They are not meant to be permanent but ever-changing. Instead of using this data to evaluate me as a teacher, I would love to see it passed along to the next grade level so that my students’ new teachers are made aware of their strengths and weaknesses. Then new goals can be set and new supports put in place without reinventing the wheel once again.

    • I agree wholeheartedly! I heard the same thing, but just couldn’t bring myself to “fix” it. It defeats the real purpose of how data is to be used! If we are to refine our instruction, we need to use the data that is provided to do that! Unfortunately, because SLOs are used as part of our evaluation, integrity will always be at risk. I just can’t take it! As a colleague shared with me, it is not an equitable process and ultimately does not benefit our students at all.

      What a great idea! Passing along SLO data, along with guided reading levels, would provide a good baseline picture of our students in the fall! Thank you for sharing!

  2. Johnc641 says:

    Good website! I truly love how it is easy on my eyes and the data are well written. I’m wondering how I could be notified whenever a new post has been made. I have subscribed to your RSS feed, which must do the trick! Have a great day!

  3. Smithg782 says:

    When I originally commented I clicked the “Notify me when new comments are added” checkbox, and now every time a comment is added I get four emails with the same comment. Is there any way you can remove me from that service? Thanks!

    • Oh no! I do apologize for that. It must be a flaw with WordPress. It does not appear that I can control my followers’ settings from my side. Maybe if you go back through the way you subscribed, the option will show up again to change the notifications? I’m sorry I cannot offer a better solution. Let me know if that helps, though. Thank you again so much. Please, keep following.
