Dimensions of Success: A Quality Improvement Observation Tool
What is DoS?
The Dimensions of Success observation tool, or DoS, pinpoints twelve dimensions of STEM program quality in out-of-school time. The tool focuses on understanding the quality of a STEM activity in an out-of-school-time learning environment and includes an explanation of each dimension and its key indicators, as well as a four-level rubric with descriptions of increasing quality.
DoS was designed as a self-assessment observation tool for STEM program administrators and staff. It can also be used by external evaluators or funders to track quality in programs over time, or quality across a city or a state. To use DoS, you must be trained and certified (see the section below). After certification, you can use the tool as often as you would like to measure the quality of STEM activities. When using DoS for program quality improvement, we suggest debriefing the observed activities or lessons with staff, sharing your ratings, and having them join in the process of pinpointing strengths, weaknesses, and next steps for improving quality.
For more information, visit The PEAR Institute’s website here:
The Twelve Dimensions of Success:
DoS measures twelve dimensions that fall into four broad domains: Features of the Learning Environment, Activity Engagement, STEM Knowledge and Practices, and Youth Development in STEM.
The first three dimensions look at features of the learning environment that make it suitable for STEM programming (e.g., do kids have room to explore and move freely, are the materials exciting and appropriate for the topic, is time used wisely and is everything prepared ahead of time?).
The second three dimensions look at how the activity engages students: for example, they measure whether all students get opportunities to participate, whether the activities engage them with STEM concepts or with something unrelated, and whether the activities are hands-on and designed to support students in thinking for themselves rather than being given the answer.
The next domain looks at how the informal STEM activities are helping students understand STEM concepts, make connections, and participate in the inquiry practices that STEM professionals use (e.g., collecting data, using scientific models, building explanations, etc.).
Finally, the last domain assesses student-facilitator and student-student interactions and how they encourage or discourage participation in STEM activities, as well as whether the activities make STEM relevant and meaningful to students’ everyday lives and experiences. Together, these twelve dimensions capture the key components of a STEM activity in an informal afterschool or summer program.
The DoS tool is NOT a grading rubric for educators or afterschool facilitators. Rather, it breaks down afterschool program activities so that programs can identify and address gaps in access, the quality and accuracy of their STEM education, and the preparedness of the site, and can make sure youth are engaged, learning, and have a voice in the programs they are a part of. A program that excels in preparedness might find that its activities aren’t teaching STEM goals and objectives as effectively as they could be, while a program with peerless STEM content might realize it is limiting its youth’s engagement or agency within the program itself.
Whatever the case may be, DoS is NOT a set of punitive new regulations or a performance assessment for students and educators. Rather, it is an observation tool that identifies key areas of success, sets tangible quality standards for afterschool STEM programs, analyzes program performance on-site and in person, and generates reports that can then be used for program quality improvement.
How Do I Use DoS?
To use DoS, a potential observer must complete a certification process. First, they must attend a two-day training (in-person or online) to learn how to define and observe quality in each dimension.
Next, potential observers must complete a set of video simulation exercises to practice their understanding of the tool. PEAR will then review their ratings and evidence from these exercises, and will provide customized feedback at a one-hour calibration session (phone conference). At this session, PEAR trainers will help to address any questions and to provide additional examples that might be needed to clarify use of the tool.
Finally, potential observers arrange to practice using DoS at afterschool sites in their local area. This step allows them to apply the tool in the field and to incorporate the feedback they received on the video simulations. Upon successful completion of all these requirements, observers are DoS certified for two years and can use the tool as often as they would like during that period. After two years, there are opportunities for re-certification if needed.
Training is offered monthly and new dates are posted as scheduled:
August 14th-15th, 2019; 10am-4pm ET. Registration Deadline: August 6th.
September 18th-19th, 2019; 11am-5pm ET. Registration Deadline: September 10th.
October 9th-10th, 2019 (Tentative); 10am-4pm ET
November 13th-14th, 2019; 10am-4pm ET. Registration Deadline: November 5th.
Training Registration: Please follow this link to begin the registration process: https://www.surveymonkey.com/r/DoS_Training_Registration
Please contact STEM Coordinator, Hannah Meisels, for more information about the tool, training/certification process, or to schedule a private training: email@example.com
DoS can empower afterschool and summer STEM program staff to embrace their role in inspiring the next generation to do STEM, be interested in STEM, and understand important STEM ideas that they can carry with them throughout their lives. The tool also provides a common language that program and state administrators, staff, evaluators, and others can use to describe their activities, where they excel, and where they can improve.