Sunday, October 10, 2010

Attack on ISD

ISD = ADDIE

Main criticisms
- ISD as it is used is too slow
Current times call for the ability to change, and to do so quickly to stay ahead. The speed of the internet has made the ADDIE model archaic and slow.
- There is no destination
ISD is more focused on methods than results. People seem to get along well, often even better and faster without applying ISD principles.
- When it is followed rigidly, it produces bad solutions
Learning is far from merely scientific, so scientific-like procedures are bound to miss the mark.
- It clings to the wrong world view
Most ISD products carry the assumption that they are training idiots with no knowledge. This is never the case, and it builds resentment for training.

The article describes ISD as a scientific approach. As a purely scientific approach to a non-scientific problem, it is bound to produce a disconnect if it is followed... scientifically.

I'm happy to say that I agree with all of the claims presented in this argument. Thank heavens that our field isn't defined by the ADDIE model. I think it's time that our field loses its "science envy" and begins to recognize that the problems we face are complex, multi-disciplinary, and beyond simple science. Learning deals with emotions, characters, individuals, attitudes, and personalities as much as it deals with methodologies, prior knowledge, and so on. We consider it an art to write a novel that helps people think a certain way or accept a different perspective on a world issue. We consider it an art to paint a picture that stirs people's emotions and causes them to dig deep into their understanding of what is depicted. How is it any less of an art to design a context and a learning environment that melds with varying emotional states, personalities, and so forth? Let's get past the ADDIE model already.

Tuesday, October 5, 2010

Evaluation Report -- Timez Attack

Evaluation Report of Big Brainz' Timez Attack


Prepared for

Big Brainz, Educational Gaming Company







Prepared By

The IP&T 661: Intro to Evaluation in Education Class, Spring 2010
Taught by Dr. Richard E. West
Instructional Psychology and Technology Department
David O. McKay School of Education
Brigham Young University, Provo

June 15, 2010
 TABLE OF CONTENTS


EXECUTIVE SUMMARY

INTRODUCTION

    Evaluation Purpose

    Limitations of the Evaluation and Disclaimers

    Description of Timez Attack

LITERATURE REVIEW

    Multiplication Mastery and Math Anxiety

    Computer Assisted Instruction (CAI) Benefits

    Recommendations from the National Mathematics Advisory Panel

    Conclusion

METHODOLOGY

    Research Design

    Participants

    Research Methods

        Quantitative Research Measures

        Qualitative Research Methods

    Data Collection Procedures

    Data Analysis

        Quantitative Data Analysis

        Qualitative Data Analysis

PRESENTATION OF RESULTS

    Summary of Findings

    Timez Attack and Student Mastery of Multiplication Facts

    Timez Attack and Student Self-Efficacy

DISCUSSION OF FINDINGS

    Achievement Comparison

    Mixed Methods Benefits

    Age Differences

    Anxiety and Self-efficacy

    Peer Interaction

    Learning vs. Gameplay

CONCLUSIONS

    Conclusions

    Recommendations

        Evaluation Questions

        Evaluation Methods

    Future Planning for TA

APPENDICES

    Appendix A – Data Collection Instruments

        Multiplication Attitudes Survey Pre-test

        Multiplication Attitudes Survey Post-test

        Multiplication with 12s Pre/Post-test

    Appendix B – Student Comments

        Student Comments Recorded by Observers During TA Play

        Student Responses to Post-Survey Question 11

        Student Responses to Post-Survey Question 12

REFERENCES

METAEVALUATION: Evaluation Criteria, Standards, Objectives and Judgments

    Utility

    Feasibility

    Accuracy

    Propriety

    Conflicts of Interest

 



Executive Summary
 

Big Brainz, an educational gaming company, developed Timez Attack (TA) to help students learn basic multiplication facts through stimulating, game-based learning. With keyboard and mouse controls, students navigate various three-dimensional environments and encounter "assessment stations" to demonstrate their mastery of multiplication facts.


Big Brainz requested an evaluation of TA to identify changes in student mastery of multiplication facts and student self-efficacy regarding mathematics. In preparation for the evaluation, our evaluation team--most of whom are graduate IP&T students--conducted a literature review covering multiplication mastery and math anxiety; computer assisted instruction (CAI) benefits, including motivation, achievement, and self-efficacy; and recommendations from the National Mathematics Advisory Panel.


Our evaluation team, in cooperation with administrators and faculty at Wasatch Elementary School, completed a controlled exercise using a variety of methods in order to collect and code the data. We administered a 12s times table test to two groups of students and collected data from pre/post-test surveys. We also collected data through observations and informal interviews throughout the experiment and from the participants' written responses to open-ended questions on the surveys.


Our data suggest that TA is comparable to other instructional methods for teaching times tables: there were no significant differences in outcomes between a session of playing Timez Attack and a session of classroom instruction. Our preliminary findings also suggest that students who experienced both teacher instruction and TA gameplay attempted more multiplication problems on their post multiplication tests and made fewer errors than students without the additional instruction.


Our data shows a positive change in students' mathematics self-efficacy, or their belief in their own capacity to solve multiplication problems. Responses indicate that exposure to TA resulted in generally decreased math anxiety and an increased confidence toward learning multiplication facts.


It is recommended that the findings in this report be utilized for marketing purposes to help consumers understand the positive ways the product contributes to students' experiences and complements in-class instruction. While it was observed that Timez Attack can be engaging and beneficial for the students, the benefits the game provides may not be readily understood by teachers, administrators, and parents.

 

The narrow time frame of this project limited the scope of this evaluation. As a final recommendation, this project should not be viewed as conclusive, but should instead serve as an initial step toward a more comprehensive and thorough evaluation with more controlled conditions and a larger subject pool.



 

Introduction

“I did it!” Joe exclaimed with excitement. The rest of his class had begun filling out a survey about Timez Attack – the game they’d recently finished playing – just as the evaluation team had asked them to. “I just have to beat the sixes and I'm done,” pleaded Joe. “Can I play the game?”


The evaluator whom Joe had asked felt a mix of frustration and pleasure. They needed the questionnaires filled out so they could evaluate the game, but Joe was so excited about mastering his times tables!


Joe began trying to multitask, switching back and forth between the game and the questionnaire. “Yes! I got it done!”  Joe raised his arms in victory.

The evaluator was satisfied because Joe still got the questionnaire finished before his class had to leave the lab. Joe and the evaluator both got what they wanted. And for Joe, at the very least, Timez Attack seems to be a great success.


Timez Attack (TA) is a computer game developed by Big Brainz Incorporated. The game was developed to help students master basic multiplication facts. The design allows students to practice multiplication facts while overcoming challenges presented by the game. Students engage in learning and mastering the facts as they progress through the game.


Big Brainz designed TA to reduce or eliminate math-related anxiety that can, according to Scarpello (2007), decrease students' "confidence in their ability to do math." To achieve this, they implemented high-quality game effects to engage and motivate students to master their times tables in a less stressful, self-evaluative setting rather than the stress-inducing peer setting often found in other instructional methods.


This evaluation of Timez Attack was conducted with forty-eight 2nd and 3rd grade students and their instructors from two separate classes at Wasatch Elementary School in Provo, Utah. The report that follows outlines the planning, execution, and the results of this evaluation project.


Evaluation Purpose

This evaluation assesses the effectiveness of TA in helping students achieve multiplication mastery and self-efficacy (one’s belief in one’s ability to succeed in specific situations). The principal evaluation questions were:

  1. Does playing TA improve multiplication fact mastery?

  2. Does playing TA contribute to positive changes in students’ multiplication self-efficacy?

Limitations of the Evaluation and Disclaimers

Several variables limited our ability to draw inferences about these questions. These variables included the following:

 

  1. The time frame for the evaluation project was compressed. The evaluation was a class project conducted within a short academic period, which limited the scope of the work that could be completed.
  2. The studies were conducted at the conclusion of the school year, when many of the students had already mastered the math facts in the game. In addition, an Eagle Scout project focused on math instruction provided math facts support to the students in one of the groups (Study 1).
  3. The evaluation included a small population of students who were not randomly selected. This does not necessarily invalidate the evaluation observations, but it does limit the inferences that may be drawn about larger populations.
  4. Limitations in the physical environment where the observations were conducted may have affected the students’ attention, particularly during the administration of the post-tests.
  5. Novice (graduate student) data collectors were used for the evaluation.
  6. The students’ time on task was not monitored. The time on task in the control group in Study 2 was not compared to that of the experimental group, and time on task may account for some of the observed differences.
  7. The experimental groups were not consistent. For example, one study (Study 1) did not have a control group.
  8. The control group received cutting-edge math instruction from their teacher that did not represent the average teacher-led math instruction students typically receive.
 

Description of Timez Attack

 
Figure 1. Throwing creatures as multiplicands.

TA is a computer-based video game set in a three-dimensional graphical environment wherein a single player controls the movements of an alien-like character in the third-person through fantasy worlds including a dungeon, machine world, a lava-filled underworld, and a jungle. The objective of the game is to guide this character through various regions or levels using the multiplication skills the game teaches them. To reach higher levels they must overcome frequent encounters with opposing computer characters who challenge the player with multiplication problems, or tests.


Encounters typically begin with the player arriving at an impassable obstacle, such as a locked door. The game then presents a multiplication problem with a multiplicand and multiplier and provides the player with a number of creatures (e.g. snails, robot spiders) equal to the multiplier, each with the value of the multiplicand. The player must gather all of these creatures and throw them at the obstacle (Figure 1). This action visually adds up each creature as a copy of the multiplicand until the sum equals the correct answer. The player is then tasked with typing in the correct answer, which was just revealed by throwing the creatures.

 
 
Figure 2. An encounter with an opposing non-player character.

The obstacle, or door, opens if the player types the correct answer. An opposing computer character (e.g., a troll, a dragon) then emerges to challenge the player. The player must rapidly answer multiplication problems that appear on the opposing character’s body by typing the correct answer within a short period of time. Each time the player provides a correct answer, the opposing character’s “health” bar depletes by a degree (Figure 2). When the player provides an incorrect answer, their character is often punished by an attack that depletes part of their health bar, and the opposing character’s health bar recovers some degree of health. The player defeats the opposing computer character after delivering a certain number of correct answers. The computer character disappears, allowing the player to pass to the next area or obtain a key by which the player may then advance past a locked door.

 

Each world is composed of twelve levels, which culminate in a final “boss” challenge wherein the player must answer all the variations of multiplication problems encountered previously in that particular world. Unlike regular computer character encounters, the final boss does not respond to incorrect answers; however, to defeat the final boss the player must have mastered their times tables and answer with 100% accuracy.

 

Beyond encounters with opposing computer characters, the player must evade environmental hazards such as falls from cliffs or bridges, falling rocks, etc. Failure to do so may result in depletion of the player character’s “health”, or in resetting the character to an earlier position in that level.

 

The player’s character is controlled via the computer’s keyboard and/or mouse. While the game may be played with one hand, optimal in-game performance depends on using both hands: one on the keyboard, the other on the mouse.

 

TA software is available for both Windows and Mac, and is playable on Linux through a Windows emulator such as Wine. Additional details on hardware requirements and support are available at http://www.bigbrainz.com/Support.php.


Literature Review

Educational researchers have debated the effectiveness and worth of educational computer and video game investments through many studies and experiments. For brevity's sake, this literature review will not address whether or not high-tech educational games compete well with other recreational computer and video games. Rather, it will provide research-based evidence that multiplication math anxiety is a problem worth addressing (Berk, 2009; Jackson and Leffingwell, 1999; Meece et al, 1990; NRC, 1989; Sarason, 1980; Scarpello, 2007; Swetman, 1994; Tankersley, 1993) and that computer assisted instruction (CAI) can lessen students' math anxiety by increasing their motivation to achieve—resulting in higher self-efficacy (Berk, 2009, pp. 644-645; Kebritchi, 2008; Ke & Grabowski, 2007; Klawe, 1998; Moreno, 2002; National Mathematics Advisory Panel, 2008; NCTM online, 2010; Programme for International Student Assessment, 2005; Rosas, 2003; Sedighian & Sedighian, 1996; Shaffer et al., 2005). It also discusses mathematical automaticity improvements induced by CAI, and addresses the impact of real-world game aspects (Carter & Norwood, 1997; Federation of American Scientists, 2006; Jackson, 1999; Meece et al., 1990; NMAP, 2008; NRC, 1989, 2001; Scarpello, 2007; Shaffer et al., 2005).

 

Multiplication Mastery and Math Anxiety

According to the National Council of Teachers of Mathematics (NCTM) standards, third to fifth grade students need to develop multiplication fluency. Memorizing times tables 1-12 constitutes a fundamental portion of children's expected mathematical learning and understanding (NCTM online, 2010). Metaphorically speaking, one cannot understand a conversation spoken in a language one does not know. Students need to gain proficiency with the times-tables language to gain fluency in higher mathematics.

 

Comparison studies between U.S. children and children in nations with higher mathematics achievement, in addition to studies tracking cross-generational changes within the U.S., reveal that in the earliest stages of learning math, many U.S. children today are slower and less efficient with solving whole number multiplication problems (NMAP, 2008). Some never attain proficiency. The panel attributes the deficiencies to a lack of “quantity and quality” practice, curricula emphases, and parental educational involvement (pg. 26). As a result, students are less capable when facing more complex mathematical problems. This in turn increases the students’ math anxiety and lowers their self-efficacy because, according to Meece et al. (1990), past performance is a leading factor in anxiety.

 

For example, Carol Jackson and Jon Leffingwell (1999) carried out a study to determine when students first experience mathematics anxiety, and to determine what instructor behaviors create or exacerbate the anxiety. 93% of their study's participants—college seniors—reported experiencing math anxiety of some kind, the first occurrences beginning mostly in 3rd and 4th grade (16%). Other research also indicates 4th grade as a key time when more students begin to experience anxiety (Swetman, 1994; Tankersley, 1993).  

 

Computer Assisted Instruction (CAI) Benefits

Educational computer and video games offer an approach to learning that can potentially reduce negative anxiety by improving achievement and self-efficacy in mathematics.  They do this by motivating students to complete and win the game through mastering the necessary educational skills (Kebritchi, 2008; Ke & Grabowski, 2007; Klawe, 1998; Moreno, 2002; Rosas, 2003; Sedighian & Sedighian, 1996).


Kebritchi, Hirumi, and Bai's study (2008) demonstrated the effectiveness educational games can have at increasing motivation and achievement. These researchers tested the effectiveness of the pre-algebra and algebra games Evolver™ and Dimenxian™ in improving achievement scores and motivation for a sample of 193 students. They also tested the games to see if their impact varied according to the students’ prior knowledge, computer experience, and language background.

 

Kebritchi et al. randomly assigned the teachers to experimental and control groups; teachers in the experimental groups were trained and encouraged to use the video games in class. The researchers found a significant achievement difference between the experimental and control group on exam scores. Most interviewed students said they preferred the game to other school activities. In addition, student and teacher interviews revealed that both teachers and students agreed that the game reinforced the students’ mastery and achievement by motivating them to succeed and was overall a more effective, motivating, and differentiating approach to learning math.

 

Recommendations from the National Mathematics Advisory Panel

The National Mathematics Advisory Panel (2008) recommends that mathematics teachers consider high-quality, well-designed and implemented CAI programs such as Evolver™ and Dimenxian™ that incorporate drill and practice to develop students' mathematical automaticity. By automaticity they mean effortless, accurate, and quick computation performance that frees working memory so students can focus their attention on more complicated tasks. However, they also recommend that students develop computational fluency simultaneously with conceptual understanding and problem-solving skills, because each supports and facilitates learning the others.

 

Many CAI games address these issues by implementing a real-world aspect making it possible for the players to “inhabit roles that are otherwise inaccessible to them” (Shaffer et al., 2005, pg. 105). They teach students how to think, act, and learn like the professionals in order to survive, or be competitive in the real world. The National Mathematics Advisory Panel advocates mathematics instruction involving real-world problems because it improves the children's assessment performance involving similar problems (NMAP, 2008). However, it does not seem to affect computation of simple word problems or equation solving if they are not aspects of the real-world problems taught.

 

It is not evident from the Panel's report whether the Panel is aware of the real-world benefits of educational games. Even so, they encourage the use and research of CAI, while cautioning teachers to “critically inspect individual software packages and the studies that evaluate them” (pg. 51). They also advise teachers to consider the practicalities of using the software—the necessary software, hardware, and technical support, curriculum integration, and professional development.

 

Conclusion

Students whose teachers take the Panel's recommendations can benefit from increased motivation and mathematical achievement (Berk, 2009, pg 644-5; Kebritchi, 2008; Ke & Grabowski, 2007; Klawe, 1998; Moreno, 2002; NCTM online; NMAP, 2008; Programme for International Student Assessment, 2005; Rosas, 2003; Sedighian & Sedighian, 1996;  Shaffer et al., 2005). This will, in turn, increase students’ self-efficacy, thereby reducing their math anxiety (Berk, 2009; Jackson, 1999; Jackson & Leffingwell, 1999; Meece et al, 1990; NRC, 1989; Sarason, 1980; Scarpello, 2007; Swetman, 1994; Tankersley, 1993). They will also gain greater mathematical automaticity enabling them to learn more complex math with less effort (NMAP, 2008; Shaffer et al., 2005). However, how they will benefit from these positive effects depends, in part, on the real-world aspect of the games they use (Carter & Norwood, 1997; Federation of American Scientists, 2006; Jackson, 1999; Meece et al., 1990; NMAP, 2008; NRC, 1989, 2001; Scarpello, 2007; Shaffer et al., 2005).


Methodology  

Research Design

Our evaluation team collected data in two studies that addressed TA's potential influence on mastery of the 12s times tables and on self-efficacy toward learning multiplication facts. Multiplication with 12s was chosen for observation and analysis because the teachers had already taught times tables 1-11 to all of the participants. We employed a mixed methods approach in both studies in order to collect, code, and analyze the data. We collected the data using a pre/post 12s times test, a pre/post Multiplication Attitudes Survey (MAS), and field notes including observations and informal interviews.
 
Study 1 included 3rd-grade participants whose teacher had already taught the 12s multiplication facts. Due to limited computer access, participants were randomized into two separate groups; however, since both groups had the same experience, this study had no control group. Pre/post-test instruments were used to measure changes in ability and self-efficacy of participants after a 30-minute intervention with Timez Attack. As there was no control group, we employed a single-group pre/post-test research design. This design is illustrated in Figure 1 below, where PRE = pre-test, POST = post-test, and X = observed intervention (30 minutes of playing Timez Attack).

   

 Figure 1 -- Research Design for Study 1

 All Participants:   PRE   X   POST

  
 

Study 2 included 2nd and 3rd-grade participants whose teacher had not yet taught them the 12s multiplication facts. Students were randomly assigned to an experimental or control group by generating a randomized table. Study 2 used an experimental pre/post-test research design, as illustrated in Figure 2, where R=Randomization, E=Experimental Group, C=Control Group, PRE=pre-test, POST=post-test, Y=Traditional Teacher-Led Instruction, and X=Observed Intervention (30-minutes of playing Timez Attack). 

  

 Figure 2 -- Research Design for Study 2

 Group 1 (R,E):   PRE   X   POST   Y

 Group 2 (R,C):   PRE   Y   POST   X   POST2
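The report does not specify how the randomized table mentioned above was generated. As a rough illustration only, under the assumption of a simple class roster split evenly in two, a random assignment like the one in Figure 2 could be produced along these lines (hypothetical sketch, not the team's actual procedure):

    import random

    def assign_groups(roster, seed=None):
        # Split a class roster evenly into experimental and control groups.
        # Hypothetical sketch: the evaluation team generated its own
        # randomized table rather than using this code.
        rng = random.Random(seed)
        shuffled = list(roster)
        rng.shuffle(shuffled)
        half = len(shuffled) // 2
        return {
            "experimental": shuffled[:half],   # plays Timez Attack first (X)
            "control": shuffled[half:],        # receives teacher-led instruction first (Y)
        }

    # Example with placeholder names for the 22 Study 2 participants
    roster = ["Student %02d" % i for i in range(1, 23)]
    groups = assign_groups(roster, seed=1)
    print(len(groups["experimental"]), len(groups["control"]))   # 11 11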

  

  

 

Participants
 

Participants in Study 1 included 26 3rd-grade students. Participants in Study 2 included 12 2nd-grade and 10 3rd-grade students all from the same elementary school. Prior to data collection, participants from both studies used Timez Attack during computer lab time. Study 1 participants' teacher had previously taught them the 12s times-tables while Study 2's teacher had not.

 

Research Methods 

Quantitative Research Measures
 
We used the teachers' existing instrument to assess student proficiency with multiplication facts: a 100-item multiplication facts test limited to 12s. It included both 12 x (a number from 1-12) and (a number from 1-12) x 12 problems. It was distributed before and after the TA playing session (or the teacher-led instruction, in the Study 2 control group). The same instrument was used as both a pre- and post-test and can be found in Appendix A.
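As an illustration of the item structure only (the actual instrument was the teachers' existing paper worksheet, reproduced in Appendix A), a 100-item 12s test mixing the two problem forms could be generated as follows; the function name and format string are assumptions:

    import random

    def build_12s_test(n_items=100, seed=0):
        # Build a 100-item 12s worksheet mixing "12 x n" and "n x 12" problems.
        # Hypothetical sketch: the evaluation used the teachers' existing
        # paper instrument, not a generated test.
        rng = random.Random(seed)
        facts = [(12, n) for n in range(1, 13)] + [(n, 12) for n in range(1, 13)]
        items = [rng.choice(facts) for _ in range(n_items)]
        return ["%d x %d = ____" % pair for pair in items]

    for line in build_12s_test()[:5]:
        print(line)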
 
A pre/post Multiplication Attitudes Survey (MAS) was developed to assess (1) attitudes toward learning multiplication facts and (2) attitudes toward and perceptions of Timez Attack. The pre-MAS contained eight questions. The post-MAS contained twelve questions. Questions 1-6 on both the pre-MAS and post-MAS were designed to measure students’ self-efficacy, along with physiological and emotional states involved in multiplication math fact mastery. Questions 1, 2, 4 and 5 were identical, while 3 and 6 were only slightly different on the pre- and post-tests. Questions 7 and 8 on the pre-test were designed to provide information regarding home learning of multiplication facts and previous exposure to Timez Attack. Questions 7-12 on the post-test elicited information regarding perceived utility of and motivation to use Timez Attack.
 
Response options for the first seven questions on the pre-test and first ten questions on the post-test involved a five-point Likert-type scale. Five possible responses were provided -- “NO, no, ?, yes, YES” -- with the two extreme options being in bolder type to make it easier for students to interpret the rating choices. Instructions on how to interpret the scale were verbally explained to students in advance by saying, “Circle the big ‘YES’ if you think what the statement says is completely true and you very much want to say yes, circle the little ‘yes’ if you think it is basically true,” and so forth. Similar methods of administration were used by Nichols et al. (1990). Response options for question 8 on the pre-test included four options: “never, a few times, many times, almost every day.” (See Qualitative Research Methods regarding open-ended questions 11 and 12 on the post-MAS)
 

Likert-type questions measuring self-efficacy and psychological and emotional states regarding multiplication fact mastery were adapted from the Fennema-Sherman Survey (Fennema & Sherman, 1976). This is one of the most widely used math attitude surveys in math education.  Changes to Fennema-Sherman survey questions included changing the word “math” to “multiplication math facts.”  Other adaptations were done in order to make the questions age-appropriate (see Table 1). The complete pre- and post-MAS can be found in Appendix A.   

 

 Table 1 -- Likert-type Questions Pre/Post MAS

 Index                          | Statement                                                                  | Pre/Post
 Self-Efficacy                  | I can learn new multiplication math facts easily.                         | Pre/Post
                                | I am good at memorizing multiplication math facts.                        | Pre
                                | Remembering new multiplication facts is hard for me.                      | Post
 Psychological/Emotional states | I like learning multiplication math facts.                                | Pre/Post
                                | Learning new multiplication math facts scares me.                         | Pre/Post
                                | I usually do not worry about remembering multiplication math facts.       | Pre
                                | Learning new multiplication math facts makes me feel bored.               | Post
                                | Learning new multiplication math facts makes me feel uneasy and confused. | Pre
 Home influence                 | I study my multiplication math facts at home.                             | Pre
 Exposure to Timez Attack       | I have played Timez Attack.                                               | Pre
 Perception of TA Utility and   | I will play Timez Attack at home.                                         | Post
 Motivation to Use              | I think playing Timez Attack would help me know my multiplication math facts. | Post
                                | Learning to play Timez Attack was easy.                                   | Post
                                | Playing Timez Attack is a fun way to learn my multiplication math facts.  | Post

 

  

Qualitative Research Methods

   

Qualitative data collection methods included participant responses to open ended questions 11 and 12 on the post-MAS (see Table 2), field note observations, and short informal interviews. Field notes were taken during all stages of the studies. There was no specific focus for the observations, and observers did not have a specific checklist of items to watch for. Rather, observers were instructed to note anything that seemed interesting about the students’ actions and statements while participating in our evaluation. So, though we shared knowledge of the evaluation questions, our observations may be particular to the individual observers.

 

During Timez Attack gameplay, however, observers took particular note of participants' enjoyment, frustrations, improved self-efficacy, and learning (including conceptual understanding) of multiplication facts. Interview questions, also not specifically predetermined, were intended to elicit participants’ feelings and attitudes toward learning multiplication facts by playing Timez Attack, along with their feelings toward Timez Attack game play in general.

 

Some questions regarding learning with Timez Attack included, “Why do you throw the balls at the gate?”  “Why do you pick up the spiders?” and “Do you think Timez Attack is useful?”  Other questions meant to determine Timez Attack game play satisfaction included, “If you could change anything about Timez Attack what would you change?” and “What did you like most about playing Timez Attack?” All student responses to open-ended questions as well as field note observations and interviews can be found in Appendix B. 

 

 Table 2 -- Short Response Indices and Statements Post-MAS

 Index                          | Statement                            | Pre/Post
 Timez Attack Utility/Attitude  | What I love about Timez Attack is…   | Post
                                | What I hate about Timez Attack is…   | Post

 

 

Data Collection Procedures

All participants were given a three-minute test on the 12s times table in their regular classroom, followed by the MAS pre-test. All Study 1 participants then played thirty minutes of Timez Attack on the 12s level. After playing the game, Study 1 students again took the 12s timed test followed by the MAS post-test. During all stages of data collection, researchers took field notes including observations of and short interviews with students. Table 3 illustrates data collection procedures for Study 1.

 

 Table 3 -- Study 1 data collection procedures

 Time       | Study 1 (Prior 12s learning)
 15 minutes | All Participants - Times-test and Pre-MAS; Observations and Interviews
 30 minutes | All Participants - Timez Attack (12s level); Observations and Interviews
 15 minutes | All Participants - Times-test and Post-MAS; Observations and Interviews

 
 
 
In Study 2, after administering the pre-test to all study participants, Group 1 (the experimental group, consisting of six 2nd graders and five 3rd graders) played the multiplication-by-12s level of Timez Attack for thirty minutes. During this same time Group 2, the control group (consisting of six 2nd graders and five 3rd graders), was taught the 12s times table through teacher-led instruction.
 

Teacher instruction was carried out in the students’ regular classroom by their classroom teacher, and consisted of multiple representations of multiplication. The teacher helped students access prior knowledge about arrays, applied that knowledge to the new fact (12s), discussed the problem-solving strategy of breaking up 12s facts into the sum of 10s and 2s, and drilled using a kinetic game involving rhythm.

 

At the end of thirty minutes Group 1 and Group 2 were again given a three-minute 12s times test. The post-MAS was then administered only to Group 1. At this point in Study 2, Group 1 went back to their classroom for teacher-led instruction and Group 2 was also allowed to play the 12s level of Timez Attack. As with Group 1, Group 2 played Timez Attack at level 12 for thirty minutes and was given the 12s times test followed by the post-MAS. Similar to Study 1, researchers recorded observations of and short informal interviews with Study 2 participants throughout the various stages of this experiment. Table 4 illustrates data collection procedures for Study 2.

   

 Table 4 -- Study 2 data collection procedures

 Time       | Study 2 (No prior 12s learning)
 15 minutes | Groups 1 and 2 - Times-test and Pre-MAS; Observations and Interviews
 30 minutes | Group 1 (experimental group) - Timez Attack (Level 12); Observations and Interviews
            | Group 2 (control group) - Teacher instruction of 12s; Observations and Interviews
 15 minutes | Group 1 - Times-test and Post-MAS
            | Group 2 - Times-test
 30 minutes | Group 1 - Teacher instruction of 12s; Observations and Interviews
            | Group 2 - Timez Attack (Level 12); Observations and Interviews
 15 minutes | Group 2 - Times-test and Post-MAS

 

 
 
 
Data Analysis 

  

Quantitative Data Analysis

 

In scoring the times tests, evaluators determined students’ overall correctness and the correctness of each individual 12s math fact (i.e. 12 x 1, 12 x 2 …12 x 12) by considering any written answer to a problem as an "attempt," every correct answer as a "correct response," and every incorrect attempt as an "error." These definitions allowed evaluators to measure changes in attempts, changes in the number of correct responses, changes in the number of errors, and ratio of errors to attempts, or the "error rate." Changes in correct responses, errors, and the error rate were calculated and also analyzed using Excel in order to measure the achievement of mastery by the students. Changes in attempts may reflect not only mastery, but also changes in student self-efficacy.   
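To make these definitions concrete, here is a minimal Python sketch of how a single scored test could be tallied into attempts, correct responses, errors, and the error rate; the data structure and field names are assumptions for illustration, as the team's actual tallies were done in Excel:

    def score_test(responses, answer_key):
        # responses:  dict of item number -> the student's written answer;
        #             blank items are simply omitted (not counted as attempts).
        # answer_key: dict of item number -> correct product.
        # Illustrative sketch only; the evaluation team performed this tally in Excel.
        attempts = len(responses)                            # any written answer counts as an attempt
        correct = sum(1 for item, ans in responses.items() if ans == answer_key[item])
        errors = attempts - correct                          # incorrect attempts
        error_rate = errors / attempts if attempts else 0.0  # ratio of errors to attempts
        return {"attempts": attempts, "correct": correct,
                "errors": errors, "error_rate": error_rate}

    # Example: pre/post change for one hypothetical student
    answer_key = {i: 12 * i for i in range(1, 13)}
    pre = score_test({1: 12, 2: 24, 3: 35}, answer_key)               # 3 attempts, 1 error
    post = score_test({i: 12 * i for i in range(1, 11)}, answer_key)  # 10 attempts, 0 errors
    print(post["attempts"] - pre["attempts"], post["errors"] - pre["errors"])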

 

The coding of Likert-type responses (YES, yes, ?, no, NO) from the pre/post-MAS ranged from 1 to 5, with 1 corresponding to “NO” and 5 corresponding to “YES.” Coding of the “never, a few times, many times, almost every day” response question ranged from 1 to 4, with 1 being “never” and 4 being “almost every day.”

 


All Likert-type responses and changes in identical Likert-type student pre/post-MAS responses were then calculated, summarized, compared and analyzed using Excel tables and graphs. Information resulting from pre/post-times test analysis and Likert-type pre/post-MAS can be found in written, table and graph form in the Presentation of Results section of this report.
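As a sketch of this coding and pre/post comparison step (the CSV layout, file name, and column names below are assumptions for illustration; the team carried out the equivalent steps in Excel tables and graphs):

    import csv
    from collections import defaultdict

    LIKERT = {"NO": 1, "no": 2, "?": 3, "yes": 4, "YES": 5}   # coding described above

    def mean_change(rows, question):
        # Average post-minus-pre difference on one Likert question, by group.
        diffs = defaultdict(list)
        for row in rows:
            pre = LIKERT[row["pre_" + question]]
            post = LIKERT[row["post_" + question]]
            diffs[row["group"]].append(post - pre)
        return {group: sum(d) / len(d) for group, d in diffs.items()}

    # Hypothetical input file with one row per student
    with open("mas_responses.csv", newline="") as f:
        rows = list(csv.DictReader(f))
    print(mean_change(rows, "q1"))   # e.g. {'experimental': 1.0, 'control': 0.09}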
 
Qualitative Data Analysis
 
At least two evaluators coded each open-ended question response, observation, and interview, and sought agreement on the interpretation. Codes were not pre-determined but emerged during data collection and analysis. A complete list of categories used in coding qualitative data, along with other tables and graphs used in qualitative data analysis can be found in the Presentation of Results section of this report.

     

 

Presentation of Results


Summary of Findings


The primary questions to be answered in this evaluation were


    1. Does playing TA improve multiplication fact mastery?

    2. Does playing TA contribute to positive changes in students’ multiplication self-efficacy?

Our evaluation showed improved proficiency in multiplication of 12s among students who played Timez Attack. The tables and discussion that follow explain the observed differences. On average, students answered 11-13 more questions correctly on the post-test, after playing the game, than they had on the pre-test.


Our evaluation also showed that there was a positive change in students' self-efficacy toward multiplication. The tables and discussion that follow will explain the observed differences in the self-efficacy measures as well.  

 

Timez Attack and Student Mastery of Multiplication Facts
As discussed in the methodology section, the evaluation took the form of two separate studies.

Study One

The first study involved an experimental group consisting entirely of third-grade students and did not have a corresponding control group. The only treatment applied to the experimental group between the pre- and post-tests was playing Timez Attack. This group had already studied the 12s, and many had "passed off" these facts in normal classroom instruction. The results of the evaluation study on this group showed students attempted an average of 10.3 more problems on the post-test than on the pre-test. The students answered an average of 11.1 more problems correctly, made an average of .8 fewer errors, and reduced their error rate from 2.8% of attempts to 0.66% of attempts.
 
 Table 5 -- Students in Study #1

 Description                 | Pre-Test | Post-Test | Change | Avg Change
 # of Students in Class      | 24       | 24        |        |
 # Questions Attempted       | 968      | 1,215     | +247   | +10.3
 Attempted (% of Possible¹)  | 40.3%    | 50.6%     |        |
 Correct Answers             | 941      | 1,207     | +266   | +11.1
 Correct % (of Attempts)     | 97.2%    | 99.3%     |        |
 Errors                      | 27       | 8         | -19    | -.8
 Error Rate (of Attempts)    | 2.8%     | 0.66%     |        |

 ¹ Total attempts possible is 100 problems per test multiplied by the number of students.
 
 

Study Two

The second study conducted for this evaluation differed from the first study in three critical aspects:
 
  1. The students were in a combined class of 2nd and 3rd grade students.
  2. The students had not yet studied the 12s multiplication facts.
  3. The students were divided into an experimental group and a control group.

The experimental group was taught and practiced the 12s facts using Timez Attack. The control group was taught and practiced the 12s facts through instruction from the classroom teacher. Students received pre- and post-tests to evaluate the effect of the different modes of instruction and practice.

Experimental Group

The results from the experimental group (Table 6) showed that students attempted an average of 11.5 more problems while answering an average of 12.9 more problems correctly. The students also made an average of 1.5 fewer errors and reduced their error rate from 21.7% to 6.6% of attempts following their Timez Attack experience.
 
Table 6 -- Students in Study #2 - Experimental Group

 Description                 | Pre-Test | Post-Test | Change | Avg Change
 # of Students in Class      | 11       | 11        |        |
 # Questions Attempted       | 161      | 287       | +126   | +11.5
 Attempted (% of Possible¹)  | 14.6%    | 26.1%     |        |
 Correct Answers             | 126      | 268       | +142   | +12.9
 Correct % (of Attempts)     | 78.3%    | 93.4%     |        |
 Errors                      | 35       | 19        | -16    | -1.5
 Error Rate (of Attempts)    | 21.7%    | 6.6%      |        |

 ¹ Total attempts possible is 100 problems per test multiplied by the number of students.
 

Control Group
 
The results from the control group (Table 7) showed that students attempted an average of 10.5 more problems while answering an average of 15.1 more problems correctly. The students made an average of 4.5 fewer errors and reduced their error rate from 35.2% to 4.4% of attempts following the classroom instruction received from their teacher.

Table 7 -- Students in Study #2 - Control Group

 Description                 | Pre-Test | Post-Test | Change | Avg Change
 # of Students in Class      | 11       | 11        |        |
 # Questions Attempted       | 179      | 295       | +116   | +10.5
 Attempted (% of Possible¹)  | 16.3%    | 26.8%     |        |
 Correct Answers             | 116      | 282       | +166   | +15.1
 Correct % (of Attempts)     | 64.8%    | 95.6%     |        |
 Errors                      | 63       | 13        | -50    | -4.5
 Error Rate (of Attempts)    | 35.2%    | 4.4%      |        |

 ¹ Total attempts possible is 100 problems per test multiplied by the number of students.
 

Additional Question of Interest

Study group two consisted of one combined class of 2nd and 3rd grade students. The random assignment of students into the experimental and control groups resulted in groups with approximately the same number of 2nd and 3rd grade students in each. Combined, there were 11 students in each grade who were part of the evaluation study.

The evaluation team considered the possibility that grade level may have influenced student performance. In order to address this question, further analysis separated Study Two's groups into 2nd and 3rd grade subgroups to examine the results more granularly (Table 8 and Table 9).

These data show 3rd grade students achieved significantly higher improvement than the 2nd grade students in both the experimental and control groups.  

3rd grade students in the experimental group attempted an average of 16.7 more problems, answered an average of 17.2 more problems correctly, and reduced their error rate from 18.7% of attempts to 6.9% of attempts following game play. This compares to the results for 2nd grade students in the experimental group, who attempted an average of 5.8 more questions, answered an average of 7.8 more questions correctly, and reduced their error rate from 27.8% to 6.0%.

Data for the control group show similar results. The 3rd grade students attempted an average of 12.2 more problems, answered an average of 19.0 more problems correctly, and reduced their error rate from 38.0% of attempts to 2.5% of attempts following teacher instruction. This compares to the results for 2nd grade students, who attempted an average of 9.2 more questions, answered an average of 11.8 more questions correctly, and reduced their error rate from 30.1% to 2.1%.

The data held consistent in grade-level comparisons following the final assessment, where 3rd grade students attempted an average of 14.6 more problems, answered an average of 19.5 more problems correctly, and reduced their error rate from 28.0% to .03%. The 2nd grade students attempted an average of 9.5 more problems, answered an average of 12.6 more problems correctly, and reduced their error rate from 30.1% to 2.1%.
 
Table 8 -- Students in Study #2 - 3rd Grade Students Only

 Description                 | Experimental Group             | Control Group                  | Combined Treatments
                             | Pre    Post    Chng   Avg      | Pre    Post    Chng   Avg      | Pre    Post    Chng   Avg
 # of Students in Class      | 6      6                       | 5      5                       | 11     11
 # of Attempts               | 107    204     +103   +16.7    | 100    161     +61    +12.2    | 207    368     +161   +14.6
 Attempted (% of Possible¹)  | 17.8%  34.0%                   | 20.0%  32.2%                   | 18.8%  33.5%
 Correct                     | 87     190     +103   +17.2    | 62     157     +95    +19.0    | 149    367     +218   +19.5
 Correct % (of Attempts)     | 81.3%  93.1%                   | 62.0%  97.5%                   | 72.0%  99.7%
 Errors                      | 20     14      -6     -1.0     | 38     4       -34    -6.8     | 58     1       -57    -5.18
 Error Rate (% of Attempts)  | 18.7%  6.9%                    | 38.0%  2.5%                    | 28.0%  .03%

 ¹ Total attempts possible is 100 problems per test multiplied by the number of students.

Table 9 -- Students in Study #2 - 2nd Grade Students Only

 Description                 | Experimental Group             | Control Group                  | Combined Treatments
                             | Pre    Post    Chng   Avg      | Pre    Post    Chng   Avg      | Pre    Post    Chng   Avg
 # of Students in Class      | 5      5                       | 6      6                       | 11     11
 # of Attempts               | 54     83      +29    +5.8     | 79     134     +55    +9.2     | 133    237     +104   +9.5
 Attempted (% of Possible¹)  | 10.8%  16.6%                   | 13.2%  22.3%                   | 12.1%  21.5%
 Correct                     | 39     78      +39    +7.8     | 54     125     +71    +11.8    | 93     232     +139   +12.6
 Correct % (of Attempts)     | 72.2%  94.0%                   | 68.4%  93.3%                   | 69.9%  97.9%
 Errors                      | 15     5       -10    -2.0     | 25     9       -16    -2.7     | 40     5       -35    -3.2
 Error Rate (% of Attempts)  | 27.8%  6.0%                    | 31.6%  6.7%                    | 30.1%  2.1%

 ¹ Total attempts possible is 100 problems per test multiplied by the number of students.
 
 
 
 
 
 
 
 
The data was further analyzed to aggregate the two experimental groups and report the impact of Timez Attack after both studies. Table 10 reports the results for the experimental groups.  

The final data analysis viewed the experimental group by grade level.  That data is summarized in Table 11.
 
Table 10 -- Combined Report - Both Studies Experimental Groups

 Description                   | Count | Attempts                      | Correct                       | Errors
                               |       | Pre     Post    Chng   Avg    | Pre     Post    Chng   Avg    | Pre   Post   Chng   Avg
 Study #1 (all participants)   | 24    | 968     1,215   +247   +10.3  | 941     1,207   +266   +11.1  | 27    8      -19    -.8
 Study #2 (experimental group) | 11    | 161     287     +126   +11.5  | 126     268     +142   +12.9  | 35    19     -16    -1.5
 Totals                        | 35    | 1,129   1,502   +373   +10.7  | 1,067   1,475   +408   +11.7  | 62    27     -35    -1.0

 
Table 11 -- Combined Report - Both Studies Experimental Groups By Grade

 Description | Count | Attempts                      | Correct                       | Errors
             |       | Pre     Post    Chng   Avg    | Pre     Post    Chng   Avg    | Pre   Post   Chng   Avg
 3rd Grade   | 30    | 1,075   1,419   +344   +11.5  | 1,028   1,397   +369   +12.3  | 47    22     -25    -.8
 2nd Grade   | 5     | 54      83      +29    +5.8   | 39      78      +39    +7.8   | 15    5      -10    -2.0
 Totals      | 35    | 1,129   1,502   +373   +10.7  | 1,067   1,475   +408   +11.7  | 62    27     -35    -1.0

All of the data analyzed suggest that Timez Attack (whether used instead of or combined with teacher-led instruction) produces positive results for students learning multiplication facts, comparable to the results of teacher-led classroom instruction.


Timez Attack and Student Self-Efficacy
In addition to the pre- and post-tests of multiplication facts, we employed survey instruments, observation, and student interviews to assess students' experiences playing TA. Our focus in this section is on students' mathematics self-efficacy, which is their sense of their abilities to master math facts and their general learning capabilities.
 
The surveys were distributed to Study 1 as well as Study 2 (both experimental and control groups). Table 12 summarizes the analysis of the survey responses to these questions.

The student responses to question 1 suggest an increased positive feeling that "I can learn new multiplication math facts easily" for all participants. Study 2's experimental group showed a much greater increase in this sense of self-efficacy than the control group, and a significantly greater increase than students exposed to TA in Study 1.
 
Students who played Timez Attack--both in Study 1 and in Study 2's experimental group--agreed more with the statement "I like learning multiplication facts" after playing Timez Attack, whereas student agreement with question 2 declined in Study 2's control group.
 
Responses to question 4, "Remembering new multiplication facts is hard for me", showed nearly identical increased disagreement in both control and experimental groups, suggesting that self-efficacy in terms of memory simply improved with practice.
 
Question 5 attempted to measure anxiety about math, stating "I usually do not worry about remembering multiplication math facts". Agreement with this statement went up in both Study 1 and Study 2's experimental group, but down slightly in Study 2's control group.

Table 12 -- Student Responses to Likert-type Self-Efficacy Survey Questions

 Description                   | Question #1              | Question #2              | Question #4              | Question #5
                               | Pre     Post    Diff     | Pre     Post    Diff     | Pre     Post    Diff     | Pre     Post    Diff
 Study #1 (all participants)   | 3.913   4.478   0.565    | 4.000   4.565   0.565    | 1.500   1.409   -0.087   | 3.136   3.136   0.000
 Study #2 (experimental group) | 3.182   4.182   1.000    | 4.091   4.545   0.455    | 1.182   1.000   -0.182   | 3.091   3.273   0.182
 Study #2 (control group)      | 3.727   3.818   0.091    | 3.818   3.545   -0.273   | 1.364   1.300   -0.182   | 2.000   2.182   0.182

Survey Questions:

 Question 1: I can learn new multiplication facts easily.

 Question 2: I like learning multiplication facts.

 Question 4: Remembering new multiplication facts is hard for me.

 Question 5: I usually do not worry about remembering multiplication math facts.

 










 

 
Findings from Qualitative Data:

Seven observers recorded their impressions watching students engage with TA, while one observer recorded her observation of the control group. Students also responded to two open-ended post-survey questions about their experience with TA. We coded all statements and, in the course of coding, 14 main codes emerged. They are described below, with any relevant sub-codes and examples taken from the data:


Table 13 -- Descriptions and Examples of Codes

 

Comparison
  Description: The statement represents a student's comparison of Timez Attack to another game or another type of mathematics learning activity. Such statements were sub-coded as either Positive or Negative, with Positive indicating a preference for Timez Attack and Negative indicating a preference for another activity.
  Examples: Positive: "I play a few other math video games. It’s pretty good. Better. … I like it more.” Negative: "It was pretty good. It’s a little bit less good because there’s this [other] game I really like.”

Conceptual Understanding
  Description: These observation notes give evidence that the students might be developing a conceptual understanding of the process behind, or meaning of, multiplication as repeated addition.
  Example: “It is 7X12 so since there is 7 balls in each [and] I count by sevens to get the answer.“

Emotional Response
  Description: These statements include any action or declaration by students that gives insight into what emotional response they may be having to playing the game. Such statements were sub-coded as either Positive or Negative. Positive sub-codes correspond with positive emotions (such as enjoyment or happiness), while Negative sub-codes correspond with negative emotions (such as frustration or confusion).
  Examples: Positive: Student shouted, “This is fun!” Negative: It was obvious he knew the math facts and was looking to use them in the game, but the pace of the game and confusing instructions frustrated him.

Engagement
  Description: Statements with this code reflect how engaged the students were while playing the Timez Attack program. Statements were sub-coded as Positive or Negative. Positive sub-codes indicate active engagement with the game, and Negative sub-codes indicate disengagement from the game.
  Examples: Positive: He was engaged in the game throughout the entire session. Negative: Looked very bored pushing the button to throw the balls. Not looking at screen when doing this.

Game Construct
  Description: This code focuses on the basic appearance and design of the game. Sub-codes are Graphics, Audio, Timing, and Characters.
  Examples: Graphics: When the colors shine out at one point in a fantastical way, she "ooohs." Audio: The students interacted differently with the game. [Some students] used the headphones. [Two others] did not. Timing: “Umm... how when the spiders come out, there are blue things... it like turns blue near them and then you can't get them--it takes a long time to get them.” Characters: Commenting about the characters while playing it.

Gameplay
  Description: Statements given this code reflect the dynamic and interactive nature of gaming as students play Timez Attack.
  Example: Picked up the spider characters and threw them over the edge of the walkway opposite the wall where they were supposed to be thrown for the game. Played the game instead of the math.

Hawthorne Effect
  Description: These statements reflect an influence of the evaluator/observer on the student trying to take the test or play the game.
  Example: Very nervous about being watched. Edgy each time someone is looking over his shoulder. Looks up at person watching and has a difficult time proceeding with game.

Learning
  Description: These observation statements contain a reference to students learning multiplication facts using Timez Attack.
  Example: "I like how it's a fun game and you can learn in a fun game."

Problem-Solving Strategy
  Description: These statements are about students demonstrating any type of problem-solving strategy to solve multiplication problems.
  Example: Student was using his fingers as manipulatives as soon as he encountered a multiplication fact. He actually took his fingers off of the controls to figure the answer.

Self-Efficacy
  Description: Statements regarding students’ self-efficacy reflect how the student viewed himself as a mathematician and how competent he feels in his mathematical abilities. Such statements might include insights into a student’s confidence, or lack thereof, in doing mathematics. Sub-codes were either Positive or Negative, where Positive indicates a high level of self-efficacy and Negative indicates a lower level of self-efficacy.
  Examples: Positive: “The game makes me feel better at math.” Negative: “This is hard… I can’t think of it!”

Social
  Description: These statements involve an interaction between two or more students or observers while playing Timez Attack. Such statements give insight into what students are saying/doing to each other, how they interact with those around them, and how their interactions are influencing their use of Timez Attack.
  Example: Asked her neighbor for help. He showed her what to do.

Technical Issue
  Description: Statements indicating issues with the game not working due to problems with the hardware/software.
  Example: Froze a bit, then started working.

Usability
  Description: These statements indicate how easily the students are able to use the game and achieve their goals. Sub-codes were Positive or Negative, with Positive indicating student facility with the game and Negative indicating difficulties had by the students.
  Examples: Positive: Used mouse combined with direction arrows on keyboard to move more smoothly. Negative: Still confused by game controls and movements.

Usage
  Description: These observation notes deal with the extent to which the student has used Timez Attack in the past.
  Example: "I played it a lot last year at school.”

 

There were 335 total coded statements from these observation protocols, informal interviews, and survey responses. Because multiple codes were allowed for some statements, a total of 486 codes were assigned. The following table and accompanying chart display the frequency of code occurrences:


Table 14 -- Frequency of Codes


 Code                     | Sub-code   | Frequency | Code Total
 Emotional Response       | Positive   | 61        | 103
                          | Negative   | 42        |
 Gameplay                 |            | 84        | 84
 Usability                | Positive   | 26        | 58
                          | Negative   | 32        |
 Learning                 |            | 41        | 41
 Engagement               | Positive   | 22        | 35
                          | Negative   | 13        |
 Social                   |            | 31        | 31
 Game Construct           | Graphics   | 6         | 29
                          | Audio      | 6         |
                          | Timing     | 9         |
                          | Characters | 8         |
 Self-Efficacy            | Positive   | 14        | 24
                          | Negative   | 10        |
 Usage                    |            | 21        | 21
 Conceptual Understanding |            | 18        | 18
 Comparison               | Positive   | 16        | 17
                          | Negative   | 1         |
 Technical Issue          |            | 12        | 12
 Problem-Solving Strategy |            | 8         | 8
 Hawthorne Effect         |            | 5         | 5
 TOTAL                    |            |           | 486
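As an illustration of how a frequency table like this can be tallied once statements have been labeled (the data structure and placeholder statements below are assumptions; the team compiled its own counts from its coding sheets):

    from collections import Counter

    # Each coded statement carries one or more (code, sub_code) labels.
    # These statements are hypothetical placeholders, not the actual data.
    coded_statements = [
        [("Emotional Response", "Positive"), ("Engagement", "Positive")],
        [("Gameplay", None)],
        [("Usability", "Negative"), ("Technical Issue", None)],
    ]

    label_counts = Counter()
    code_totals = Counter()
    for labels in coded_statements:
        for code, sub_code in labels:
            label_counts[(code, sub_code)] += 1   # e.g. ("Usability", "Negative")
            code_totals[code] += 1                # rolled-up total per code

    for (code, sub), n in label_counts.most_common():
        print(code, sub or "", n)
    print("TOTAL codes assigned:", sum(code_totals.values()))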




 

Three tables listing all of the student comments noted during evaluator observations or recorded as responses to the open-ended post-survey questions can be found in Appendix B.

Discussion of Findings

Based on the data above, we found the following:

Achievement Comparison between TA and Classroom Instruction

Was there a difference in students' abilities to memorize multiplication facts? First, we split a combined 2nd and 3rd grade class in half and gave them all a pre-test. Then, for 30 minutes, one half experienced classroom instruction while the other half played TA. Last, all the students took a post-test. The data from pre- and post-tests show no significant difference between the two groups. On the bright side, students in both of these groups attempted more problems and made fewer errors on the post-test than other groups that experienced only one of these treatments.


Grade/Age Differences

Not surprisingly, 3rd graders who played TA made larger gains than 2nd graders who played TA for the same amount of time. It might follow to reason that the young tender minds of 2nd graders should thus be protected from TA. However, this same reasoning would also require that scrawny individuals should be disallowed from gyms, sick people should be barred from hospitals, and ignorant children should be prevented from attending school at all. In the face of all this, the ambitious educator will see tremendous opportunity. It is likely that if 2nd graders practice more than 3rd graders, those 2nd graders will eventually exceed 3rd grade performance. We highly recommend further evaluation to provide evidence for this assertion.

 

Anxiety and Self-efficacy

In the group that experienced 30 minutes of direct classroom instruction, there was no significant difference in reported self-efficacy from before to after. However, when we compared pre- to post- testing of the group that played TA for 30 minutes, there was a significant positive change in self-efficacy. Students who only played TA also reported much less anxiety toward learning multiplication facts after they had played for half an hour.


Peer Interaction

Students frequently engaged in social interactions with their peers during game play. This suggests that social networking applications within the TA game play structure could lead to greater usage and deeper effects on student self-efficacy.

 

Learning vs. Gameplay

Students enjoyed playing the game. When asked what they disliked about the game, they most often mentioned the robot, falling off of bridges, or dying and having to restart. Although these elements were intentionally built into the game, some math teachers, whom we spoke with informally, were concerned that they took too much time away from student learning.


Marketing


We found that TA helps students learn their multiplication facts about as well as traditional classroom approaches, making it a good supplement to traditional classroom teaching for visually and conceptually teaching the times tables. We also found that playing TA improves students' multiplication self-efficacy more than traditional classroom approaches to teaching multiplication facts. These two findings suggest that, to effectively market TA to school districts, teachers, and administrators, TA ought to be advertised as a powerful supplement to classroom instruction that helps build student confidence and mastery. This marketing approach also conveys that these benefits can be accessed outside of the time and social constraints inherent in traditional classroom approaches.


 

Conclusions

 

The focus of this evaluation project was to determine if playing TA improves multiplication fact mastery and/or contributes to positive changes in students' multiplication self-efficacy. According to the data collected, playing TA does improve mastery of multiplication facts, but this improvement was roughly equal to that of the control group. This suggests that TA is as effective as classroom instruction in helping students achieve multiplication mastery. Additionally, students who received teacher instruction first and then played TA exhibited the highest improvement overall. While this improvement could be attributed to time on task, it shows that time spent playing TA is time "on task" and that its use extends beyond the simple entertaining nature of other educational games.


According to the results of the multiplication tests and the student math attitude surveys, it appears that playing TA results in a positive change in students' multiplication self-efficacy. The pretest/post-test results showed that students in all of the study groups attempted more questions on the post-tests, which indicates a higher level of self-efficacy (students appear to have been more confident in attempting more test items). This increase was roughly equivalent to the increase experienced by those who received only classroom instruction, which indicates that playing TA can improve students' multiplication self-efficacy beyond the classroom setting.

 

The results from our study indicate that TA is useful as a supplement to classroom instruction. Teachers can only devote a limited amount of individualized attention to each student, while playing TA allows for an unlimited practice experience with multiplication facts outside of classroom instruction. For those students who may feel anxiety or experience difficulty in learning multiplication facts in classroom settings, TA may help provide an alternative environment for students to learn by trial and error without time or social pressures. Games like TA can offer alternative modes of learning that may reach students who are not as prepared to learn in traditional classroom settings. As more is required from students and teachers, it will become critically important to leverage technology to help everyone use time more efficiently in the classroom and in the home.


Possible Confounding Variables in Findings

The summary of findings is supported by the quantitative and qualitative data collected for this evaluation. However, there are some confounding variables, unrelated to the specific research questions, that may have influenced the reported results. The variables we consider pertinent to the analysis include the following:
 
  1. The students took the same test three different times within a compressed time period. The frequency of testing may have given students greater confidence unrelated to the method of instruction. It may also have allowed students to develop test-taking strategies, built on the prior test experience, that short-cut answering without considering each problem.
  2. The students in Study 2 were separated into two different groups, with one group led by a teacher and the other group using Timez Attack for instruction. There may be variations in time on task spent learning the math facts depending on whether a student was participating in the classroom instruction or the game, and this difference may partly account for the measured improvement in the quantitative analysis of these two groups.
  3. The students were immersed in math-related activities throughout the duration of the study. This focus may have influenced performance on the assessments. A controlled follow-up, with some interval of time between the evaluation exercise and a later assessment, would show whether the students had really improved in proficiency by retaining the facts over the longer term.

Recommendations

The results and findings of this evaluation can be helpful in two major ways. First, the findings can be used to drive enhancements to the Timez Attack game programming and marketing strategies. Timez Attack can be highly engaging for students, but teachers and other school personnel are largely unaware of how the game will benefit students; parents may experience the same confusion about the learning benefits of Timez Attack game play. We suggest that Big Brainz gear its marketing toward helping teachers, administrators, and parents understand that Timez Attack has been shown to provide a fun, confidence-building way to complement the teacher in the classroom, producing higher times-table competency than would otherwise be attained. It could be helpful to provide information on the website addressed to the different stakeholder groups, such as parents, teachers, and school administrators. Building rapport with these stakeholders can increase market penetration and client satisfaction.

 

Another recommendation is to make information about students' progress more readily available to teachers. More granular data reporting for the teacher could include measurements of time spent on encounters, time spent moving around, and the number of correct and failed attempts for each problem, sortable by multiplier and multiplicand. Additional reporting metrics will add value to the game when a buying decision is being considered.
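To make the idea concrete, here is one hypothetical shape such per-problem reporting could take; the record fields and names below are our own invention for illustration, not an existing Big Brainz data format.

    from dataclasses import dataclass
    from collections import defaultdict

    @dataclass
    class ProblemAttempt:
        # One hypothetical per-encounter log record; all field names are illustrative.
        student_id: str
        multiplier: int
        multiplicand: int
        correct: bool
        seconds_on_encounter: float
        seconds_moving: float

    attempts = [
        ProblemAttempt("s01", 12, 3, False, 9.2, 14.0),
        ProblemAttempt("s01", 12, 3, True, 6.5, 10.5),
        ProblemAttempt("s01", 12, 7, True, 5.1, 8.3),
    ]

    # Aggregate correct/failed counts per fact, sortable by multiplier and multiplicand.
    summary = defaultdict(lambda: {"correct": 0, "failed": 0})
    for a in attempts:
        key = (a.multiplier, a.multiplicand)
        summary[key]["correct" if a.correct else "failed"] += 1

    for multiplier, multiplicand in sorted(summary):
        counts = summary[(multiplier, multiplicand)]
        print("%d x %d: %d correct, %d failed"
              % (multiplier, multiplicand, counts["correct"], counts["failed"]))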


Future Evaluation Plan


Our initial findings can provide direction for a more comprehensive and thorough evaluation of the Timez Attack software with a more robust environment, set of conditions, and subject pool. In an effort to promote continuous improvement, we offer some suggestions for future evaluations.


We recommend that a future evaluation work with students at the beginning of their 3rd-grade year before they have been exposed to the multiplication tables. This will reduce the lurking variable of students' previous exposure to the multiplication facts. Also, we recommend that the evaluation be more longitudinal in nature with a greater number of students. This will allow for more concrete, trustworthy results, increasing the strength of the marketing strategy.


Additionally, as part of our data analysis, we discovered that for the 12s multiplication facts, students improved on some numbers much more than on others. The factors 3, 4, 5, 11, and 12 seem to be the ones that most students got wrong, and interestingly, 3, 4, 10, and 12 were the factors that students attempted most often; these also seem to be the factors where students made the most improvement. Reading between the lines, 5, 7, 8, 9, and 11 were attempted least, and 6 through 9 were the factors where the least improvement was made. We did not include these results in the evaluation because there are too many lurking variables, such as how often each question appears on the test, where it is placed on the test, and students' individual test-taking strategies. However, we do recommend that a future evaluation examine which questions are most problematic for students (controlling for the lurking variables by having students use the same strategy and distributing the questions more evenly). This information could inform how the game focuses more or less on certain multiplication facts, which would be a significant marketing advantage.
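The kind of per-fact tabulation we have in mind could look something like the following sketch, which uses entirely made-up counts simply to show how attempt rates and accuracy changes per factor might be computed in a future evaluation.

    # Hypothetical item-level results for facts of the form 12 x factor.
    # Each entry maps a factor to (questions attempted, questions correct).
    pre = {3: (20, 8), 4: (19, 9), 5: (15, 7), 11: (14, 6), 12: (21, 10)}
    post = {3: (24, 16), 4: (23, 15), 5: (16, 9), 11: (15, 8), 12: (25, 18)}

    for factor in sorted(pre):
        pre_attempted, pre_correct = pre[factor]
        post_attempted, post_correct = post[factor]
        accuracy_change = post_correct / post_attempted - pre_correct / pre_attempted
        print("12 x %d: attempts %d -> %d, accuracy change %+.0f%%"
              % (factor, pre_attempted, post_attempted, 100 * accuracy_change))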


Below are listed other possible questions and methods for future evaluations.

Evaluation Questions:

1.     What effect does game play have on the way students feel about their ability to learn and use math functions?

2.     How do different groups use the game?

  • Public school students
  • Public school teachers
  • Other school personnel (librarians, lab technicians, SPED, etc.)
  • School Administrators
  • Home schoolers
     

3.     What influence does the teacher have on the attitudes of the students with respect to the game?

4.     Does the confidence developed by game play transfer to other math areas? In what ways?

5.     Does the confidence developed by game play transfer to other curricular subjects? Which and in what ways?

6.     What other methods are used to teach the multiplication tables in traditional classrooms? How much time do they take? How do their effects compare with Timez Attack?

7.     How do teachers feel about Timez Attack?

8.     What do students remember the most after playing Timez Attack?

9.     What is the effect of Timez Attack as a remediation intervention?

Evaluation Methods:

In this evaluation, two classes of second- and third-grade students were observed before and after they played Timez Attack. Several serious factors limited the effectiveness of this evaluation and should be remedied in future evaluations. It is suggested that:


1.     Students have no previous experience with the Timez Attack game.

2.     Students be observed who have not yet begun to learn their multiplication facts.

3.     Feedback from teachers be intentionally solicited as part of the evaluation.

4.     Only online versions of the game be used so that usage analytics can be acquired.

5.     More than two classrooms be used in the study.

6.     More than one school demographic be used in the study.

7.     Multiple observations be conducted over a period of time.

8.     The study start at the beginning, not the end, of the school year.

9.     More interviews and case studies be generated to provide a thick description of the clients' experience.

10.    Students be chosen at random to participate in the study to enhance its validity.

11.  Student achievement be assessed over time to analyze long-term recall of math facts.

 

These suggestions can help drive a deeper and more meaningful exploration of the effects that Timez Attack has on the various stakeholder groups.



 

Appendices

Appendix A - Data Collection Instruments

Pre-Multiplication Attitudes Survey

 
Read each sentence.

 

Circle “NO” if you think what it says is not true AT ALL

Circle “no” if you think the sentence is mostly not true

Circle the “?” if you don’t know or are not sure

Circle “yes” if you think it is mostly true

Circle “YES” if you think it is VERY true

 

 

1.  I can learn new multiplication math facts easily.
        NO          no          ?          yes          YES

2.  I like learning multiplication math facts.
        NO          no          ?          yes          YES

3.  I am good at memorizing multiplication math facts.
        NO          no          ?          yes          YES

4.  Learning new multiplication math facts scares me.
        NO          no          ?          yes          YES

5.  I usually do not worry about remembering multiplication math facts.
        NO          no          ?          yes          YES

6.  Learning new multiplication math facts makes me feel bored.
        NO          no          ?          yes          YES

7.  I study my multiplication math facts at home.
        NO          no          ?          yes          YES

8.  I have played Timez Attack… (Circle the correct answer.)
        NO          no          ?          yes          YES

 


Post-Multiplication Attitudes Survey

Read each sentence. 

 

Circle “NO” if you think what it says is not true AT ALL

Circle “no” if you think the sentence is mostly not true

Circle the “?” if you don’t know or are not sure

Circle “yes” if you think it is mostly true

Circle “YES” if you think it is VERY true

 

 

1.  I can learn new multiplication math facts easily.
        NO          no          ?          yes          YES

2.  I like learning multiplication math facts.
        NO          no          ?          yes          YES

3.  Remembering new multiplication facts is hard for me.
        NO          no          ?          yes          YES

4.  Learning new multiplication math facts scares me.
        NO          no          ?          yes          YES

5.  I usually do not worry about remembering multiplication math facts.
        NO          no          ?          yes          YES

6.  Learning new multiplication math facts makes me feel uneasy and confused.
        NO          no          ?          yes          YES

7.  I will play Timez Attack at home.
        NO          no          ?          yes          YES

8.  I think playing Timez Attack would help me know my multiplication math facts.
        NO          no          ?          yes          YES

9.  Learning to play Timez Attack was easy.
        NO          no          ?          yes          YES

10. Playing Timez Attack is a fun way to learn my multiplication math facts.
        NO          no          ?          yes          YES

11. What I love about Timez Attack is… (Write your answer below.)
 

 

 

12. What I hate about Timez Attack is… (Write your answer below.)
 

 


Multiplication with 12s Pre/Post-test

 
 

Appendix B - Student Comments

Student Comments Recorded by Observers During TA Play

Shouted “This is fun!”

Said she had played the game “a few times."

Exclaimed “I’m Good!” when he defeated the robot after answering all of the facts in an end of level checkpoint.  “Yes!”

“Finally” when completing the end of level checkpoint after she had lost to the robot on two previous attempts

“When do we get the dvd so we can take it home?  I wish I could load it today at home and play it now.” 

"This is hard." "I can't think of it."

"Watch this, it's funny!" she says when she was at a challenge door.

"I played it a lot last year at school. I think it's really cool. It's, I don't know.” (shrugs)

"It's pretty fun. I don't know, it's just fun."

"I knew what it was, but I can't find the keys. It's hard."

"What the heck?" one student says when a challenge door is missing the second bottom numeral. Another girl said the same thing had happened to her.

"I really like it. I just enjoy doing my times tables. This is one of my favorite ways. It's times tables made into a game so double the fun." He said he had played it a long time at school, and liked the lava world because "it's the hardest one."

"Do you know if you wait, the spiders just walk to you?"

"It was fun. It was pretty good (compared with other games he has played). It's a little bit less good because there's this game I really like. You go around catching stuff and stuff. "

"Yes I did it" a girl pumps her fist when she defeats the robot.

"I hate this part. I always fall off the edge" when she's at the bridge, trying to cross.

"Finally I'm going to the next level. Are you on the third level? I'm on the third level. I passed you. Have you been keeping track of your levels?"

"Look at me!" she says this as she falls off the cliff.

"I've played the older version at school. Don't we get to download this at home? My mom will probably let me do that. I play a lot [of games]. I play Wii. I play a few other math video games. It's pretty good. Better (Timez Attack is vs. those others). I like it more. I like video games where you run around. The running around and fighting the bad guys. Do we get to play again? I want to play right now.”

"I thought it was fun because you can do challenges and stuff but math is involved. I've never gotten past the dungeon" "It is pretty much the same fun [as other games]."

"It was awesome! It was really fun to play." She most liked going against the robot but said it was " a little hard to learn the keys.”

"You can't backspace [when at a challenge spot]. It sucks."

Interview- Q: What did you like the most about Timez Attack? “Helps us learn times tables. Parents can't tell us not to use it. I like it the most cause it's long.”

Q: What did you NOT like about Timez Attack? “When you really want to finish it, try to go too fast, and make dumb mistakes.”

"Yes" when he beats final boss

"I can't see. It's too dark on the thing."

"Yes" "This is really cool"

"I don't know how to work it."

"It's angrifying" the thing in the game doesn't com back if you fall

"My computer's frozen."

"I can't think of it!"

"Ah, 12s are hard!"

Interview – Q: What do you like about Timez Attack? “It's not like having a piece of paper or flashcards.  It's like a game.  It's fun.  And it teaches you and makes it so you can memorize.”

Q: What don't you like about Timez Attack? “ Umm... how when the spiders come out, there are blue things... it like turns blue near them and then you can't get them--it takes a long time to get them.”

Q: Do you think playing Timez Attack is useful? “It's good practice.”

Q: if you could change anything about Timez Attack, what would you change? “On the last level, make it easier to get the dragon.”

Q: Do you play at home? “Not much.  I play one or two times in three or two weeks.”

Q: Do you have any siblings? Do they play at home? “I have three younger siblings, but they don't play.  My older brother used to play when he was in third grade, but he passed all of them, so he stopped.”

"I don't know how to get out.”

"Did I just come up from there?"

Interview - Q: Why do you throw the snails at the wall? “ Well, they represent each group in the array.”

Q: So, they help you count up? “Yeah.”

Interview – Q: Why do they have those dots? “Because they... it's sort of like an array."

Interview - Q: Did you know your 12s coming in? “Sort of. A few of them--that would have been useful to know!"

"No!" he exclaimed quietly when he missed another question.

"Now I'm fighting the big boss, I'm guessing."

"No!" "This is so fun!"

''Holy cow you are good”

“Not another one of these guys”(robot) “This is fun”

Interview - Q:  Is it easy to play the game? “Kind of easy (the multiplication can be hard).”

Interview - Q: Have you played the game before? “Yea, at school.“

Shouting “yea” when she is correct, shouting “oh no” when wrong.

Interview - Q: Why do you throw balls at the gate? “It helps you figure out the right answer so you can get in.”

Interview - Q: Why do you throw balls at the gate? “You have to throw the balls to get in the gate.“

Interview - Q: Why do you throw balls at the gate? “Counts until last one then that is the answer .”

Interview - Q: Have you played this before? “Yes.”

Q: Where? “At school and home. At home it has slugs. I like slugs better.”

Q: Why are there groups with 5 balls on the gate? “Not sure.”

Interview - Q: Why do you hit the spider? “Helps you count to get answer.”

Q: What do you like about the game? “You do not have to ask anyone the answer if you forget.”

Q: Anything else? “It makes me get me answers faster.”

Q: What do you not like about the game? “When I get the wrong answer.”

Claps when right, says “ah sweet”

Interview - Q: Why is it counting by two when you throw the ball? “Because it is 2 x 12, so you have to count by 2’s.”

“I do not like battling the robot. He beats me up.” “Oh no, I killed him” “Not sure how I get to next level. Am I in it?”

Says “phew” when he gets them right.

Interview - Q: Why are there groups of balls on the gate? “They help you figure out the answer.”

Q: How? “Since it is 9X12 you have 9 groups. Wait, I guess I have 9 in every group and then I would have to count twelve of them.”

Interview - Q: Why are there groups of balls on the gate? “To get the answer “

Q: Why is there 7 in each group? “It is 7X12 so since there is 7 balls in each I count by sevens to get the answer.“

 “YES!” at checkpoint, with handpump. 

Interview - Q: Why do you throw the balls?  “Because it counts up and gives the answer if you need it. “

Likes the game “because it's a mix between a game and times tables.”

Likes the game “better than memory cards.”

Game “makes me feel better at math.”

“I just did it!”

“Ah, 12s are hard!”

“Oh, it's . . .; I can't do it . . .” “Yes!"

 “Okay, think.”

 “Yes!” when he found it was right.

“I just have to beat the sixes and I'm done. Can I play the game?”

“Yes! I got it done!”

“Oh! Oh my Gosh!”

“Ah!”

“Turn the other way!”

“Door, another door, weird.” “Yes! Oh, bam bam ...”

“Oh!”

To his friend- “It shows you the ones you need to practice to beat the game.”

“Okay.” “Ah.” “Ch, ch.”

“Oh no!”

“Ah! I forgot that one!”

Gives her friend suggestions- “Just walk off the edge.”

Singing- “I'm going backward; I'm going backward.”

“Oh!” -goes back and laughs. “When is it gonna come and get me?”

“How do I do this part?” “Ahh! I did it!”

“And I was doing good and just figured out how . . .”

“Only got two.”

 “When does this guy [robot] end? When does this guy end? I'm getting annoyed at how long he's taking! Do you know how much longer?” “I couldn't finish it.” -He told me. “How many lives do I have?” -He asked me. “I had like twenty.”-He replied. I asked him if he was glad to be doing something else rather than facing the robot. “Yeah, but I'm gonna have to face him again.” -He said.

“That was hard!”

“It took me 25 minutes just to beat the robot!”

 “Sixty.” (Speaking right answer as typing.)

“Oh come on, no!”

One student, while filling out pre-survey, "Nobody would do it almost everyday!"

Student leaving to play, "I'm not good at Timez Attack. I always get stuck in the dungeon."

"It's a good game."

"Burst all the people"

"I like how it's a fun game and you can learn in a fun game."

"They made it more difficult. Like you can go in alleyways."

"I like how they made a videogame for while you're learning."

"I didn't hate any of it."

"I liked it. It was a fun game."

"There's just SO much spiders! You have to get ALL of them and it takes a long time."

"I liked every bit of it!"

"It kind of takes a long time to figure them out..."

"I LOVED all of it!"

"Well, some of the class didn't like it."


Student Responses to Post-Survey Question 11

The following responses are taken verbatim from the open-ended, post-survey question, "What I love about Timez Attack is..."

It helps you remember timz tables
Asoum
It is a game you can play and learn lots of new stuff
It is fun.
you can learn multiplication in a fun way.
It helps you learn your math facts.
It's a fun game to play and help's you learn multiplication
It make me feel smarter.
they made multiekatshin easy
Everything
how you can chang levels!
I love the green guy.
you had to fite
It helped me learn my math facts
It's a good way to learn.
the same
That you can learn about the math facts!
That you have challenges and match is envolved
It teaches you multiplication facts.
It helps you learn more.
I can learn math while I'm playing computer games
That it teaches you something when you play. So it's a good way to learn.
It's a fun way to learn times tables.
the math.
mutipulcation
it teches me times
it’s a fun way to learn math facts
?
It helps kid learn times tables.
nothing
its fun and it teachs you x tables
It help me think
You can learn and have fun at the same time!
Lrning math facts
it gives you a challage.
It's so fun. It helped me with my multiplucation 12's.
killing the robotst.
nothing
you lern your multiplication
when you colecet the spiders
 
Because its fun and a good way to learn your times quistian
You get to go on the moving elivater thing
the ashin
It's fun and easy and you do not get lost. P.S. the green dude
it is math but it is a fun game

 


Student Responses to Post-Survey Question 12

The following responses are taken verbatim from the open-ended, post-survey question, "What I hate about Timez Attack is..."

?
starting over
nothing!
It is hard.
you have to get the spiders & slugs.
I fall off when I'm trying to go really really fast.
that you don't just get to write the anser then
the spiders
I hatted nothing
They tell you the answer when you get it wrong
nothing!
The robot!
nuthing
the graphikiks
nothing.
the robot
That you fall off when you miss going on the blocks!
nothing
You have to type in the answer before a certain time
Ansers to hard
If you died, you would have to restart the level
I don't hate it but it challenging to go on the transportation you might fall.
the green guy doesn’t have much clothes on.
you die.
doesn't give you enough time to answer
nuthing
NONE
?
nothing!
nothing
nothing!
nothing
It doesn’t give me the time I need to find the numbers on the keybord!
nathing
nothing
I keep pressing rong buttons
when the brige was really skinny.
That once I beat the boss it made me start over
Nothing
the robot!
its hard to get from qustion
You have to throw the dimends.
dus not let me think
Nothing
they make you go back if the robot kills you
 

References

  

Berk, L. (2009). Child Development (8th ed.). Boston, MA: Pearson Education, Inc.


Carter, G. S., & Norwood, K. S. (1997). The relationship between teacher and student beliefs about mathematics. School Science and Mathematics, 97, 62-67.


Fennema, E., & Sherman, J.A. (1976). Fennema-Sherman mathematics attitudes scales: Instruments designed to measure attitudes towards the learning of mathematics by males and females. JSAS Catalog of Selected Documents in Psychology, 6(1), 3b.


Federation of American Scientists. (2006). Harnessing the power of video games for learning. Retrieved from http://fas.org/gamesummit/.


Jackson, C. D., & Leffingwell, J. R. (1999). The role of instructors in creating math anxiety in students from kindergarten through college. The Mathematics Teacher, 92(7), 583-586.


Ke, F. & Grabowski, B. (2007). Gameplaying for maths learning: Cooperative or not? British Journal of Educational Technology, 38(2), 249-259.


Kebritchi, M., Hirumi, A., & Bai, H. (2008). The effects of modern math computer games on learner's math achievement and math course motivation in a public high school setting. [Research brief based on dissertation research for Doctor of Philosophy in Education with an Instructional Technology Specialization from the College of Education at the University of Central Florida (UCF)].


Klawe, M. M. (1998). When does the use of computer games and other interactive multimedia software help students learn mathematics? Retrieved from http://www.cs.ubc.ca/nest/egems/reports/NCTM.doc.


Meece, J. L., Wigfield, A., & Eccles, J. (1990). Predictors of math anxiety and its influence on young adolescents' course enrollment intentions and performance in mathematics. Journal of Educational Psychology, 82(1), 60-70.


Moreno, R. (2002). Who learns best with multiple representations? Cognitive theory implications for individual differences in multimedia learning. Paper presented at World Conference on Educational Multimedia, Hypermedia, & Telecommunications. Denver, CO.

 

National Council of Teachers of Mathematics. (1995). Assessment standards. Retrieved from http://www.standards.nctm.org/.


National Mathematics Advisory Panel. (2008). Foundations for success: The final report of the National Mathematics Advisory Panel. Washington, DC: U.S. Department of Education.


National Research Council. (1989). Everybody counts: A report to the nation on the future of mathematics education. Washington, DC: National Academy Press.


Nichols, J., Cobb, P., Wood, T., Yackel, E., & Patashnick, M. (1990). Assessing students' theories of success in mathematics: Individual and classroom differences. Journal for Research in Mathematics Education, 21(2), 109-122.


Programme for International Student Assessment. (2003). Learning for tomorrow's world: First results from Programme for International Student Assessment 2003. Retrieved from www.pisa.oecd.org.


Rosas, R., Nussbaum, M., Cumsille, P., Marianov, V., Correa, M., Flores, P., et al. (2003). Beyond Nintendo: Design and assessment of educational video games for first and second grade students. Computers & Education, 40(1), 71-94.


Scarpello, G. (2007). Helping students get past math anxiety. Connecting Education & Careers, 82(6), 34-35.


Sedighian, K. & Sedighian, A. S. (1996). Can educational computer games help educators learn about the psychology of learning mathematics in children? 18th Annual Meeting of the International Group for the Psychology of Mathematics Education. Florida, USA.


Shaffer, D. W., Squire, K. R., Halverson, R., & Gee, J. P. (2005). Video games and the future of learning. The Phi Delta Kappan, 87(2), 104-111.


Swetman, D. (1994). Fourth grade math: The beginning of the end? Reading Improvement, 31, 173-176.


Tankersley, K. (1993). Teaching math their way. Educational Leadership, 50, 12-13.

Metaevaluation

The metaevaluation was performed by the evaluation team using the following peer-reviewed instrument:
*Program Evaluations Metaevaluation Checklist (Based on The Program Evaluation Standards), created by Daniel L. Stufflebeam (1999).
 
Utility:
On the subject of Utility, the overall score was 71%. Report Clarity and Information Scope were particularly strong. The educational nature of this evaluation encouraged us to properly identify and clarify stakeholder needs and circumstances. Due to the limited scope of the evaluation, Stakeholder Identification and Timeliness/Dissemination achieved slightly lower (Good) results. Stakeholder Involvement was identified as the main weakness in the evaluation, again due to the restricted timeline of the project. Data were collected from only two classes and at only one point in time. Because of these limitations, we were unable to maintain a more satisfactory level of interaction with the stakeholders. In summary, while we feel we gave adequate attention to the stakeholders' needs, there was simply not enough time to address the questions properly.
 
Feasibility:

Overall, the feasibility of this evaluation was rated “very good” using Stufflebeam's Program Evaluations Metaevaluation Checklist (1999). Our overall rating was a score of 8, which equates to 67%--the low end of the "very good" category.

 

This rating is a result of averaging three areas: Practical procedures, political viability, and cost effectiveness.

 

Practical procedures was our strongest area. While we question our appointment of competent staff (all evaluators were student novices), we believe we made a strong showing in other areas. For example, we tailored our methods and instruments to the client's information requirements, choosing procedures in light of known constraints. We also made use of local resources, and designed the observation to minimize disruption to the teachers and classrooms.

 

Political viability was our weakest area. We failed to maintain communication with, and involvement of, the stakeholders, particularly our client, throughout the evaluation. We did not agree on, or even extensively consider, editorial and dissemination authority. Further, the lack of formality in planning and contracting for the engagement with the client created uncertainty about expectations. On the other hand, we did spend sufficient time considering the positions of different interest groups. We also discarded any corrupted evaluation data as soon as it was encountered. While we were consistently aware of what our client wanted, we maintained objectivity and worked consistently to avoid bias.

 

Considering that the evaluators' time cost nothing, it is not surprising that our evaluation project rated high in terms of cost-effectiveness. However, there were examples of inefficiency. In particular, we had seven observers present during the evaluation (nearly one observer for every three students), and we employed all 12 evaluators in the authoring of the evaluation report. This was largely because the project was an academic exercise for our course, and our instructor wanted each student to have the experience of participating in every step of the evaluation. Our strengths in this area included making use of in-kind services, providing a report that may inform decisions, and minimizing both disruption and time demands on client/program personnel.

 
Accuracy:
Our overall Accuracy score was 68%, or “very good,” based on Daniel L. Stufflebeam's “Program Evaluations Metaevaluation Checklist” accuracy standards. With regard to program documentation, a technical report was produced and a copy was provided for the stakeholders. Limitations lie in that we did not collect descriptions of the intended program from various written sources, and we did not analyze discrepancies between the various descriptions of how the program was intended to function. Context analysis was also a strong area of our evaluation, as we noted and reported important contextual influences and effects. However, we did not analyze how the program's context is similar to or different from contexts where the program might be adopted.

Data collection procedures were specifically identified and tied to the evaluation's purposes. Information was collected from a variety of sources drawn from documented populations, and all data collection instruments were documented, justified, and reported in the evaluation report (in a technical appendix). Key questions for the evaluation were identified, data analysis procedures were identified to address those questions, and information was collected in a valid and reliable manner. Although consistency between multiple observers was not checked during data collection, methods of triangulation were employed during data analysis. Our biggest weakness lay in aspects of impartial reporting: the client was not engaged during the process to ensure impartial reporting, the perspectives of all involved stakeholders were not reported, we did not obtain any outside audits of the report, and we did not report alternative plausible conclusions for our results.
 
Propriety:
Service Orientation: We identified the strengths and weaknesses of the program. Because of the brief nature of this evaluation, we chose to focus on the needs of Big Brainz.
Advance Written Agreements: While we did not have written agreements, we did have verbal agreements.
Rights of Human Subjects: We made clear to the stakeholders that the evaluation would protect the participants' human rights.

Potential Conflicts of Interest:
 
  1. James West (of Big Brainz) is a first cousin of Rick West (the Instructor of the BYU course that undertook the evaluation). Our evaluation team took precautions to avoid a conflict of interest by choosing a Project Manager other than Rick West to lead the interactions with Big Brainz.
  2. Rick West's daughter attends the same school and class where we did the evaluation. Again, our evaluation team took care to minimize this potential conflict of interest.