There is no doubt that technology has revolutionized most aspects of our lives, including what happens in our classrooms. But what are the indicators of quality when it comes to the technology we put in front of students? Educators face the daunting task of keeping up with rapidly evolving edtech products, identifying the best available applications and implementing them effectively in their classrooms. A common barrier for educators trying to incorporate technology is a lack of knowledge about the products and how well they align with course objectives, compounded by a heavy reliance on self-taught product use and integration.
The ISTE Seal of Alignment product certification program has developed a reputation for identifying excellent edtech products that align with the ISTE Standards. Recently, EdSurge talked with the ISTE researchers who managed Teacher Ready, a research-driven project that (1) provided edtech decision-makers with tools to evaluate edtech validly and reliably, and (2) helped expand the ISTE Seal of Alignment certification by integrating user experience and product usability components.
What is the work the Teacher Ready team set out to do?
Brandon Olszewski, the director of research at ISTE, underscores the need to determine the best way to evaluate edtech products. “It’s really about edtech product quality, particularly usability. Did the tool help you accomplish your task effectively and efficiently? Is it easy to use? Will this technology help my students achieve their learning goals? Those are indicators of success.”
Nicole Langford, a research associate at ISTE, describes the purpose of the Teacher Ready project as developing a “framework for teachers to use as they score and evaluate products that they’re bringing into their systems, to help them choose better products from an oversaturated market.”
How did the team embark on this project? Caitlin McLemore, a senior research associate at ISTE, explains that early research activities encompassed a literature review, input from an expert panel, teacher focus groups and “think-aloud” interviews. After developing the framework, McLemore notes that “user testing allowed us to hone in on what specific dimensions and indicators were most important and ensure reliability and validity within the framework.”