RAOP Assessment & Evaluation
RAOP uses a structured assessment and evaluation framework aligned to three program objectives: (1) strengthen educator capability in inquiry-based STEM research workflows, (2) support evidence-based reasoning through simulation-to-hardware activities, and (3) produce classroom-ready lessons and assessments that educators can implement in grades 5–12 settings. The framework prioritizes artifact-based evidence collected every cycle, embeds lightweight formative checks during labs, and rotates one validated standardized instrument per year to maintain rigor while minimizing burden. Evaluation focuses on four areas:
- Growth in inquiry-based experimental design and evidence-based reasoning.
- Growth in scientific literacy skills for interpreting results and evaluating evidence.
- Quality and classroom readiness of educator-produced lessons and assessments.
- Feasibility for adoption in high-need LEAs through clear materials and practical tools.
Primary Outcome Measures
Results are summarized at the cohort level and used to refine future cycles. Measures fall into three groups: artifact-based evidence and formative checks collected every cycle, plus one rotating standardized instrument per year.
Artifact-based evidence (collected every cycle):
- GIBL reflection logs (Predict–Test–Observe–Explain–Reflect) tied to selected modules
- Educator-developed lesson plan/curriculum draft (grades 5–12) derived from RAOP labs
- Classroom adaptation plan (constraints, feasibility, materials, timing)
- Implementation notes (barriers, supports, troubleshooting outcomes) used for annual refinement
These artifacts are the primary evidence base every year and are designed to be low-burden and classroom-relevant.
Embedded formative checks (during labs):
- Short exit tickets/check-ins linked to lab objectives
- Observation checklists for setup readiness and expected outputs (virtual and hardware)
- Lab artifacts (plots, screenshots, brief interpretation prompts) for coaching and reflection
Formative checks are used for instructional improvement and educator support.
Rotating standardized instrument (one per year, pre/post):
- Cohort-level pre/post summaries for the selected instrument (an illustrative summary computation appears after this list)
- Triangulation with artifact-based evidence and educator reflections
- Planned rotation: Year 1, EDAT; Year 2, TOSLS; Year 3, CLASS (or equivalent); Years 4–5 repeat instruments based on findings
- If a different validated instrument is substituted, the same pre/post timing and reporting structure are preserved
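To make the cohort-level pre/post summary concrete, here is a minimal sketch of how the annual instrument's matched scores could be rolled up. The 0–100 scale, the sample scores, and the function name `cohort_prepost_summary` are assumptions for illustration, not part of the RAOP toolkit.

```python
from statistics import mean, stdev

def cohort_prepost_summary(pre, post, max_score=100):
    """Summarize matched pre/post scores for one cohort.

    `pre` and `post` are equal-length lists of scores for the same
    educators on the year's instrument (0..max_score assumed).
    """
    gains = [b - a for a, b in zip(pre, post)]
    # Hake-style normalized gain: gain as a fraction of available headroom.
    norm_gains = [(b - a) / (max_score - a)
                  for a, b in zip(pre, post) if a < max_score]
    # Paired effect size: mean gain divided by the SD of the gains.
    sd = stdev(gains) if len(gains) > 1 else 0.0
    return {
        "n": len(pre),
        "pre_mean": mean(pre),
        "post_mean": mean(post),
        "mean_gain": mean(gains),
        "mean_normalized_gain": mean(norm_gains) if norm_gains else float("nan"),
        "paired_effect_size": mean(gains) / sd if sd > 0 else float("nan"),
    }

# Hypothetical scores for a four-educator cohort (illustration only).
print(cohort_prepost_summary(pre=[52, 60, 45, 70], post=[68, 75, 58, 82]))
```

Normalized gain expresses improvement as a share of the headroom available at pre-test, which eases comparison across years as the instrument rotation changes difficulty.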
Evaluation Components
Evaluation combines four components: implementation fidelity, learning outcomes, deliverable quality (classroom translation), and educator experience. Together these ensure the program delivers measurable value and produces implementable classroom outputs.
Implementation fidelity:
- Structured observation checklists for virtual sessions and on-site labs
- Attendance and participation tracking for required sessions
- Facilitator logs documenting constraints, deviations, and adjustments
Outputs: a cohort-level fidelity summary with lessons learned, plus actionable improvements for the next cycle (pacing, supports, logistics). A minimal roll-up sketch follows this list.
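As a sketch of how the cohort-level fidelity summary could be rolled up from attendance and checklist data, the example below assumes a simple record shape and an 80% checklist threshold; both are illustrative, not program specifications.

```python
from dataclasses import dataclass

@dataclass
class SessionRecord:
    """One educator's record for one required session (illustrative shape)."""
    educator_id: str
    attended: bool
    checklist_met: int    # observation checklist items satisfied
    checklist_total: int  # items on the checklist

def fidelity_summary(records, checklist_threshold=0.8):
    """Cohort-level roll-up: attendance rate and the share of attended
    sessions meeting the (assumed) 80% checklist threshold."""
    n = len(records)
    attended = [r for r in records if r.attended]
    on_track = sum(r.checklist_met / r.checklist_total >= checklist_threshold
                   for r in attended)
    return {
        "sessions_logged": n,
        "attendance_rate": round(len(attended) / n, 2),
        "checklist_on_track_rate": round(on_track / len(attended), 2) if attended else None,
    }

# Hypothetical records for two educators across two sessions each.
records = [
    SessionRecord("ed-01", True, 9, 10),
    SessionRecord("ed-01", True, 7, 10),
    SessionRecord("ed-02", False, 0, 10),
    SessionRecord("ed-02", True, 10, 10),
]
print(fidelity_summary(records))
```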
Learning outcomes:
- Artifact-based evidence collected every cycle (GIBL reflections, lab outputs, interpretation prompts)
- One rotating standardized pre/post instrument per year (EDAT, TOSLS, or CLASS/equivalent)
- Formative checks embedded in modules for coaching and support
Outputs: cohort-level outcome summaries and trends, with evidence mapped to program objectives and planned refinements.
Deliverable quality (classroom translation):
- Rubric-based review of educator deliverables (lesson plan, student materials, assessment tool, implementation notes); an illustrative scoring sketch follows this list
- Peer review and program-team feedback cycles (draft → revise → finalize)
- Twice-yearly advisory board review of cohort-level deliverable quality and adoption constraints
Outputs: a classroom-ready implementation package per educator (lesson + student materials + assessment + implementation notes), and identified strengths/gaps used to improve templates and supports for the next cohort.
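The sketch below illustrates one way rubric-based review could be tallied across reviewers; the criteria names, 4-point scale, and revision threshold are hypothetical stand-ins for the actual RAOP rubric.

```python
# Criteria and the 4-point scale below are hypothetical, not the RAOP rubric.
CRITERIA = ["alignment", "classroom_readiness", "assessment_quality", "feasibility"]

def review_deliverable(scores_by_reviewer, revise_below=3.0):
    """Average each criterion across reviewers; flag criteria that fall
    below the (assumed) revision threshold for the draft -> revise cycle."""
    means = {
        c: sum(s[c] for s in scores_by_reviewer.values()) / len(scores_by_reviewer)
        for c in CRITERIA
    }
    return {
        "criterion_means": means,
        "needs_revision": [c for c, m in means.items() if m < revise_below],
    }

# Two hypothetical reviewers scoring one lesson-plan package (1-4 scale).
scores = {
    "reviewer_a": {"alignment": 4, "classroom_readiness": 3,
                   "assessment_quality": 2, "feasibility": 3},
    "reviewer_b": {"alignment": 3, "classroom_readiness": 3,
                   "assessment_quality": 3, "feasibility": 4},
}
print(review_deliverable(scores))
```

Flagged criteria feed the peer-review and program-team feedback cycle, so a deliverable iterates until no criterion sits below the threshold.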
Educator experience:
- End-of-week pulse surveys (Weeks 1–3) focused on clarity, pacing, and support needs
- End-of-program survey on feasibility and intended classroom use
- Optional follow-up check-in during the academic year (adoption status and needs)
Outputs: a cohort-level experience summary (what worked, what should change) and targeted supports for classroom adoption (time, resources, constraints).
Technical & Advisory Support
Evaluation is supported by continuous technical guidance and external oversight; advisory reviews are conducted twice yearly and feed continuous improvement. The Technical Support Team:
- Provides technical support for both virtual and on-site laboratory environments.
- Assists with setup verification, troubleshooting, and workflow reliability.
- Supports consistent execution of the simulation-to-hardware workflow used in RAOP activities.
Year 1 Cycle (2026) — Assessment & Evaluation Timeline
Evidence is summarized at the cohort level and used for continuous improvement.
| When | Activity | Owner | Evidence collected | Output / decision |
|---|---|---|---|---|
| Jan–Feb 2026 (Recruitment & selection window) | Finalize cohort selection; confirm educator eligibility; distribute onboarding instructions and required accounts. | RAOP Program Team | Eligibility and onboarding confirmations | Confirmed cohort and readiness status for summer cycle. |
| Jun 2026 (Technical onboarding) | Verify software access; orient educators to the virtual workflow; set expectations for artifacts and deliverables; present evaluation overview. | RAOP Program Team + Technical Support Team | Software access verification | Educators ready to begin Week 1 virtual phase with standardized setup. |
| Week 1 (Virtual phase) | Administer pre-assessment for the year's rotating standardized instrument; deliver foundational guided-inquiry modules with formative checks embedded in labs. | Educators + RAOP Program Team | Pre-assessment responses; embedded formative checks | Baseline dataset for the annual instrument and first-week learning evidence for coaching. |
| Week 2 (Virtual phase) | Deeper analysis and design tasks; draft classroom materials (lesson outline + assessment draft); peer review cycle. | Educators + RAOP Program Team | Draft lesson and assessment materials; peer review feedback | Draft classroom-ready package prepared for on-site validation. |
| Week 3 (On-site phase) | Hardware-aligned validation and demonstration; revise lesson/assessment for feasibility. | Educators + RAOP Program Team + Technical Support Team | Performance evidence from hardware validation | Final classroom implementation package per educator. |
| End of Week 3 (Immediate post-program) | Administer post-assessment for the rotating instrument; collect final deliverables and end-of-program feedback survey. | Educators + RAOP Program Team | Post-assessment responses; final deliverables; feedback survey | Cohort outcome summary and consolidated recommendations for refinement. |
Five-Year Program Timeline — Continuous Improvement
High-level repeatable cycle used to refine curriculum and strengthen adoption.
| When | Activity | Owner | Evidence collected | Output / decision |
|---|---|---|---|---|
| Each year (Jan–Mar) | Recruit and select cohort; confirm eligibility and logistics; finalize module subset for the annual experiments. | RAOP Program Team | Eligibility and logistics confirmations | Annual cohort plan aligned to goals and constraints. |
| Each year (Jun) | Technical onboarding and evaluation orientation; confirm access to virtual labs; share expectations for artifacts/deliverables. | RAOP Program Team + Technical Support Team | Virtual lab access confirmations | Consistent starting conditions across cohorts. |
| Each year (Jul) | Deliver the 2-week virtual phase + 1-week on-site phase using a simulation-to-hardware workflow. | Educators + RAOP Program Team + Technical Support Team | Pre/post instrument data, formative evidence, and deliverables | Annual cohort outcomes + classroom-ready curriculum artifacts suitable for adoption in high-need LEAs. |
| Each year (Aug–Oct) | Analyze cohort-level outcomes; synthesize strengths and gaps; revise templates, supports, and recommended module pathways. | RAOP Program Team | Synthesis of cohort strengths and gaps | Refined curriculum and supports for the next cohort. |
Assessment Resources
These resources support consistent assessment implementation across cohorts.