Standard 4
Copyright Christina and Richard Hum
“We show understanding of something by using it, adapting it, customizing it” (Wiggins and McTighe 2005, p.93).
How do I ever know that I am on the right path? How will I know that I am inspiring others? Our individuality and uniqueness, like grains of sand, combine to build a structure of knowledge and experience, one that is constantly reviewed, broken apart, and rebuilt into a variety of other structures. Stopping along the way to reflect allows me to consider my growth and change by comparing where I started, where I am, and where I am headed. It is the self-assessment and peer-assessment component that yields the greatest growth. Each step of Kigluait was evaluated with tools designed to measure our growth and its direction, to be sure we were moving toward the target end goals. This could not have been done without technology to provide surveys and feedback from our customers, the students. Every student product, every survey, and every set of comments was evaluated for success and for ways to improve. As a result, curricula were written and rewritten to maximize student learning and provide valuable experiences.
Standard 4: Educational technology leaders communicate research on the use of technology to implement effective assessment and evaluation strategies. Educational technology leaders:
A. Apply technology in assessing student learning of subject matter using a variety of assessment techniques.
B. Use technology resources to collect and analyze data, interpret results, and communicate findings to improve instructional practice and maximize student learning.
C. Apply multiple methods of evaluation to determine students' appropriate use of technology resources for learning, communication, and productivity.
Statement: Just as technology can be used to create, it can also be used to assess our progress and our creativity in meeting needs. Evidence for this standard is demonstrated through a variety of assessment tools and techniques that use technology to easily collect and analyze data.
Artifact 1: Action Research paper and Units from Earth Science Wikispace Page
Artifact 2: Student Information System Examination Report
Artifact 3: Technology Assessment Tool (the live version can be accessed here)
Artifact 4: Survey Data for Kigluait
Artifact 5: Student Surveys for PE
What does this standard mean to me?
Assessment can be viewed as the circle that connects all components together. It is used at the beginning to identify needs and set goals, throughout to monitor progress toward those goals, and at the end to determine whether the goals were met and what further goals need to be set. In terms of technology, assessment operates at all levels and in all areas: from identifying equipment to meet learners' needs, to evaluating the technology itself, to helping teachers and administrators complete their daily tasks efficiently. Berger explains, “assessment is most often a process of shepherding growth, rather than deriving a final grade or level. It is the transition from formal critique to ongoing informal critique which signifies to me the real adoption of this culture” (Berger, 1997, A Shift in the Modes of Assessment in the Classroom), keeping in mind that growth includes the use of the technology itself, for both teachers and students, in a classroom culture that sees technology as a supportive tool.
For example, the use of technology for student assessment should be examined through the lens of summative and formative assessment. Teachers begin with the end in mind: they think about what tasks will help students learn what they need to reach the culminating task, and they focus on specific guiding questions and activities to support that learning, which is precisely where assessment with technology shines. Wiggins and McTighe (2005, p. 152) state, “Effective assessment is more like a scrapbook of mementos and pictures than a single snapshot.” Technology can be used creatively to provide a variety of methods for building those different assessments and for students to demonstrate their understanding.
Assessment techniques range from formal quizzes and tests, created with websites built for designing and administering them, to informal checks using clickers or mobile devices to take quick polls and pose check-up questions during lectures or discussions. They can also include rubrics, which can be generated online and shared with other teachers, and digital journaling or blogging (or any other online communication tool) for self- and peer assessment, using a rubric and the global community for authentic feedback.
Furthermore, sites like Merlot aim “to improve the effectiveness of teaching and learning by increasing the quantity and quality of peer reviewed online learning materials that can be easily incorporated into faculty designed courses” (“Merlot: About us,” 2009), and sites such as Rubistar (“Rubistar: Create rubrics,” 2011) provide similar services. Finally, technology can help students demonstrate their understanding in innovative ways by creating a variety of products, from movies to VoiceThreads to blog posts to websites, products that can be shared back into the global community. In short, technology offers a teacher a wealth of possibilities for helping students demonstrate their understanding, in addition to giving students a way to culminate their learning through e-portfolios.
But Petrilli (2011, p. 90) reminds us that with tools such as blogs or Twitter, “It's hard to know whether all this tweeting adds up to anything significant. Of course, much the same was once said of blogs; now it's well-accepted that a well-written blog post can be just as influential as a newspaper.” It is a matter of how a teacher uses the technology to reach the end need, as with teachers like “R. Richard Wojewodzki (twitter.com/teachpaperless), who uses Twitter as a way of having students collect snippets of information, keep a running assessment of what they know, and build vocabulary and grammar skills” (Richardson, 2010, p. 88).
Along with using technology to assess student learning from the teacher’s perspective, technology can be used to collect data on both students and teachers to help improve instructional practice. As mentioned above, online polls and surveys can gather such data, and a plethora of mobile apps let administrators and teachers collect informal data on practices. Spreadsheets and databases then allow the data to be sorted and organized, helping to identify target goals for improving areas of weakness. Though this can be done at a very simple level, it can also be set up at a very complex one, as is starting to occur in student information systems (SIS), which give teachers immediate tools for making data-informed decisions and plans. For example, several SIS now allow teachers to collect data on mastery of standards, peer assessment, and self-assessment. Such systems make it more possible to be sure that “the performances we demand are appropriate for the particular understanding sought” (Wiggins and McTighe, 2005, p. 183).
Finally, assessment within technology also includes assessing students’ use of technology. With any skill, progress is marked by set benchmarks or goals that demonstrate ability. Wiggins and McTighe (2005, p. 154) explain, “Real challenges involve specific situations with ‘messiness’ and meaningful goals: Important constraints, ‘noise,’ purposes, and audiences at work.” This is supported by Moskal (2003, Part I, Developing Performance Assessments), who states that “performance assessments allow students the opportunity to display their skills and knowledge in response to ‘real’ situations (Airasian, 2000; 2001; Wiggins, 1993).” Therefore, assessments linked with content should also assess technology skills, as these are most likely embedded in the real challenges used to demonstrate learning. Technology skills should thus be assessed throughout the learning cycle, because “whatever the subject we learn best by going through many part-whole-part learning cycles-trying it out, reflecting, adjusting. We learn just enough content to be able to use it, and we make progress by tackling increasingly complicated ideas and aspects of performance” (Wiggins and McTighe, 2005, p. 292). Furthermore, Berger (1997, A Shift from Quantity to Quality) reminds us that otherwise “the emphasis is on keeping up with production, not falling behind in classwork or homework, rather than in producing something of lasting value. Like a fast food restaurant, the products are neither creative nor memorable.”
In short, the use of technology for assessment should start with questions like these: “If the desired result is for learners to…,” “Then you need evidence of the student’s ability to…,” “So the assessments need to require something like…” (Wiggins and McTighe, 2005, p. 162). Wiggins and McTighe (2005, p. 167) also make a great point when they state, “If we have done a good job in framing the unit around essential questions, then we have another helpful way to think through and to test the appropriateness of our assessment ideas.” When students answer those questions, we can look to “their explanation of why they did what they did, their support of the approach or response, and their reflection on the result that we may gain fuller insight into their degree of understanding” (Wiggins and McTighe, 2005, p. 161), which is precisely where technology plays a supportive and much needed role.
What artifacts demonstrate my mastery?
The first two artifacts I have chosen to demonstrate mastery of this standard are an Action Research paper and a Wikispace page with a set of units for my current Earth Science class. The units were designed using the Backwards Design model and mimic the concept of Quests developed by Quest to Learn, a school in New York. Quest to Learn uses Boss Levels (a term inherited from World of Warcraft), which it describes as “‘synthesizing spaces,’ and provide opportunities both for students who need a little extra work or those seeking accelerated opportunities to extend their learning” (Quest to Learn, Curriculum and Assessment, para. 1). Wiggins and McTighe (2005, p. 94) capture the same idea in quoting Swiss child psychologist Jean Piaget (1973/1977): “student understanding reveals itself by student innovation in application.”
The second artifact I have provided as evidence is a report examining a Student Information System. I have included this artifact because it examines one of the three most used SIS in Alaska. It demonstrates my understanding of the importance of a solid SIS in helping teachers pinpoint and identify student needs to focus on in the classroom. A quality SIS allows teachers not only to collect data by standard, but also to sort, organize, and view data, providing valuable information about students’ strengths and areas that need improvement. This can be used for instructional planning or for identifying why a student may be struggling with a particular topic or area.
The third artifact I am including is a technology assessment tool (the live version can be accessed here) created for training teachers to use technology. I have included this tool because it demonstrates both a digital means of collecting and evaluating data and the teachers’ use of that data to assess their own use of technology. The tool was used as both a pre- and post-assessment and was delivered through Google Docs to allow quick, easy access and collaboration around the data. Based on the pre-assessment data, teachers were asked to identify one or two areas on which to focus an individualized plan for demonstrating proficiency. The assessment, based on ISTE’s NETS for Teachers, also took into consideration the technology available to each teacher and their basic knowledge of how to use it, providing me, the trainer, with further data to help them meet their goals. Upon completing their plans, teachers were asked to take the post-assessment and to reflect on their growth, on what elements allowed them to complete their plans, and on what their future technology goals might be.
In general, this set of artifacts demonstrates the variety of ways technology can be wielded for assessment in its many components and facets. These artifacts have also provided valuable experience in applying technology skills. And regardless of who the learner is or what the content is, learning, as I seem to discover over and over, is not black and white, right answer or wrong answer. It is a cyclical process, and hence “mistakes are not avoidable, or shameful, but key episodes in gaining understanding” (Wiggins and McTighe, 2005, p. 239).
References
Dessoff, A. (2009). Reaching graduation with credit recovery. District Administration, 45(9), 43-48. Retrieved from EBSCOhost.
Moskal, B. M. (2003). Developing classroom performance assessments and scoring rubrics - Part I. ERIC Clearinghouse on Assessment and Evaluation. (ERIC Document No. ED481714)
Moskal, B. M. (2003). Developing classroom performance assessments and scoring rubrics - Part II. ERIC Clearinghouse on Assessment and Evaluation. (ERIC Document No. ED481715)
Wiggins, G., & McTighe, J. (2005). Understanding by design (2nd ed.). Alexandria, VA: ASCD.
Merlot: About us. (2009, March 3). Retrieved from http://taste.merlot.org/
Rubistar: Create rubrics for your project-based learning activities. (2011, September 19). Retrieved from http://rubistar.4teachers.org/
Quest to Learn. (n.d.). Retrieved from http://q2l.org/node/14
Richardson, W. (2010). Blogs, wikis, podcasts, and other powerful web tools for classrooms (3rd ed.). Thousand Oaks, CA: Corwin. Retrieved from http://books.google.com/books?id=CArG5bfUy-sC&printsec=frontcover&dq=Will Richardson
Petrilli, M. J. (2011, Fall). All a-Twitter about education. Education Next, 11(4), 90-91.