Monday, August 17, 2009

Classroom teachers need to acquire strategies for facilitating the development of higher-order thinking skills (HOTS) and problem-solving/process skills, and to know when a student has developed these skills. They need to acquire skills in developing assessment tasks that can accurately measure students’ conceptual understanding in mathematics/science and their ability to analyze and solve problems as a result of that understanding. Moreover, they must also know how to select the most important mathematics/science skills and processes to assess and to ensure that the most appropriate form of assessment is used. It is in this context that the seminar-workshop was undertaken.

Ms. Edna Callanta guides the participants in formulating test items in Mathematics.

A total of 63 science and mathematics teachers from 14 public elementary schools and two public high schools in the District of Limay, Bataan attended the seminar-workshop, which was sponsored by Alstom Foundation, Inc. The seminar-workshop was held on May 28-30, 2009 at the Alstom Clubhouse, Alangin, Limay, Bataan, and consisted of lectures and workshops.

The plenary lecture on Assessing Student Learning Effectively focused on the meaning of assessment and why it is considered a powerful tool for influencing the learning process; the different types of assessment and the salient features of each type; the three cognitive domains and the accompanying thinking skills under each domain in the Trends in International Mathematics and Science Study (TIMSS) framework; and objectively scoring constructed-response items and other authentic assessment formats using rubrics. The parallel lectures in Science and Mathematics centered on developing HOTS and the guidelines for formulating good multiple-choice and constructed-response items.

In the two parallel workshops, the participants developed, critiqued, and revised assessment items. The following criteria for critiquing were used: (1) congruency between the lesson objective and the test item; (2) accuracy of the correct answer/s; (3) plausibility of the incorrect choices for multiple-choice items and clarity of the scoring rubric table for constructed-response items; and (4) accuracy of the illustrations and their labels, if any.