RAOP

Assessment & Evaluation

RAOP uses a structured assessment and evaluation framework aligned to program objectives: (1) strengthen educator capability in inquiry-based STEM research workflows, (2) support evidence-based reasoning using simulation-to-hardware activities, and (3) produce classroom-ready lessons and assessments that educators can implement in grades 5–12 settings. The framework prioritizes artifact-based evidence collected every cycle, embeds lightweight formative checks during labs, and rotates one validated standardized instrument per year to maintain rigor while minimizing burden.

What RAOP measures
  • Growth in inquiry-based experimental design and evidence-based reasoning.
  • Growth in scientific literacy skills for interpreting results and evaluating evidence.
  • Quality and classroom readiness of educator-produced lessons and assessments.
  • Feasibility for adoption in high-need LEAs through clear materials and practical tools.

Primary Outcome Measures

Results are summarized at the cohort level and used to refine future cycles. Artifact-based evidence and formative checks are collected every cycle.

Core Evidence (Collected Every Cycle)
Purpose: Document educator learning and classroom translation using authentic, artifact-based evidence aligned to guided inquiry-based learning (GIBL).
Timing: Throughout Weeks 1–3, with final submission at the end of Week 3.
Evidence collected
  • GIBL reflection logs (Predict–Test–Observe–Explain–Reflect) tied to selected modules
  • Educator-developed lesson plan/curriculum draft (grades 5–12) derived from RAOP labs
  • Classroom adaptation plan (constraints, feasibility, materials, timing)
  • Implementation notes (barriers, supports, troubleshooting outcomes) used for annual refinement
Notes
  • This is the primary evidence base every year and is designed to be low-burden and classroom-relevant.
Formative Checks During Labs (Virtual + On-site)
Purpose: Monitor learning during the simulation-to-hardware workflow and provide timely support without high-stakes testing.
Timing: Throughout Weeks 1–3 (embedded in each selected module).
Evidence collected
  • Short exit tickets/check-ins linked to lab objectives
  • Observation checklists for setup readiness and expected outputs (virtual and hardware)
  • Lab artifacts (plots, screenshots, brief interpretation prompts) for coaching and reflection
Notes
  • Formative checks are used for instructional improvement and educator support.
Rotating Standardized Instrument (One Primary Per Year)
Purpose: Maintain rigorous cohort-level outcome measurement while minimizing educator burden by rotating one validated instrument each annual cycle.
Timing: Pre (start of virtual phase) and Post (end of on-site phase), for the selected instrument in that year.
Evidence collected
  • Cohort-level pre/post summaries for the selected instrument
  • Triangulation with artifact-based evidence and educator reflections
Notes
  • Planned rotation: Year 1—EDAT; Year 2—TOSLS; Year 3—CLASS (or equivalent); Years 4–5 repeat based on findings.
  • If a different validated instrument is substituted, the same pre/post timing and reporting structure are preserved.

Evaluation Components

Evaluation combines learning outcomes, implementation fidelity, and classroom translation evidence to ensure the program delivers measurable value and produces implementable classroom outputs.

Implementation Fidelity (Program Delivery)
Verify that the core program elements are implemented as designed across the two-week virtual phase and the one-week on-site phase.
Methods
  • Structured observation checklists for virtual sessions and on-site labs
  • Attendance and participation tracking for required sessions
  • Facilitator logs documenting constraints, deviations, and adjustments
Outputs
  • Cohort-level fidelity summary and lessons learned
  • Actionable improvements for the next cycle (pacing, supports, logistics)
Learning Outcomes (Educator Growth)
Assess educator learning outcomes aligned to inquiry-based STEM research workflows, simulation-to-hardware validation, and evidence-based reasoning.
Methods
  • Artifact-based evidence collected every cycle (GIBL reflections, lab outputs, interpretation prompts)
  • One rotating standardized pre/post instrument per year (EDAT, TOSLS, or CLASS/equivalent)
  • Formative checks embedded in modules for coaching and support
Outputs
  • Cohort-level outcome summaries and trends
  • Mapped evidence to program objectives and planned refinements
Classroom Translation (Deliverables Quality)
Evaluate the quality and classroom feasibility of educator-produced materials derived from RAOP labs (grades 5–12).
Methods
  • Rubric-based review of educator deliverables (lesson plan, student materials, assessment tool, implementation notes)
  • Peer review and program-team feedback cycles (draft → revise → finalize)
  • Advisory board review of cohort-level deliverable quality and adoption constraints (biannual)
Outputs
  • Classroom-ready implementation package per educator (lesson + student materials + assessment + implementation notes)
  • Identified strengths/gaps used to improve templates and supports for the next cohort
Educator Experience (Satisfaction + Feasibility)
Understand educator experience, workload feasibility, and barriers to classroom adoption in high-need LEAs.
Methods
  • End-of-week pulse surveys (Weeks 1–3) focused on clarity, pacing, and support needs
  • End-of-program survey on feasibility and intended classroom use
  • Optional follow-up check-in during the academic year (adoption status and needs)
Outputs
  • Cohort-level experience summary (what worked, what should change)
  • Targeted supports for classroom adoption (time, resources, constraints)

Technical & Advisory Support

Evaluation is supported by ongoing technical guidance and external oversight. Advisory reviews are conducted biannually and inform continuous improvement.

Technical Support Team
  • Provides technical support for both virtual and on-site laboratory environments.
  • Assists with setup verification, troubleshooting, and workflow reliability.
  • Supports consistent execution of the simulation-to-hardware workflow used in RAOP activities.
External Advisory Board
Dustin J. Tyler
Arthur S. Holden Jr. Professor of Biomedical Engineering
Professor – Electrical, Computer, and Systems Engineering
Director, Human Fusions Institute (HFI)
Case Western Reserve University
Almuatazbellah Boker
Collegiate Assistant Professor
Virginia Tech

Year 1 Cycle (2026) — Assessment & Evaluation Timeline

Evidence is summarized at the cohort level and used for continuous improvement.

When: Jan–Feb 2026 (Recruitment & selection window)
Activity: Finalize cohort selection; confirm educator eligibility; distribute onboarding instructions and required accounts.
Owner: RAOP Program Team
Evidence collected:
  • Roster confirmation and onboarding completion checklist
  • Baseline information required for program logistics
Output / decision: Confirmed cohort and readiness status for summer cycle.

When: Jun 2026 (Technical onboarding)
Activity: Software access verification; orientation to virtual workflow; expectations for artifacts and deliverables; evaluation overview.
Owner: RAOP Program Team + Technical Support Team
Evidence collected:
  • Onboarding completion logs
  • Support tickets (as needed)
Output / decision: Educators ready to begin Week 1 virtual phase with standardized setup.

When: Week 1 (Virtual phase)
Activity: Pre-assessment for the rotating standardized instrument selected for the annual cycle; foundational guided inquiry modules; formative checks embedded in labs.
Owner: Educators + RAOP Program Team
Evidence collected:
  • Pre-assessment submissions (cohort-level analysis)
  • Lab artifacts (plots, screenshots) and short GIBL reflections
  • Pulse survey (clarity, pacing, support needs)
Output / decision: Baseline dataset for the annual instrument and first-week learning evidence for coaching.

When: Week 2 (Virtual phase)
Activity: Deeper analysis and design tasks; draft classroom materials (lesson outline + assessment draft); peer review cycle.
Owner: Educators + RAOP Program Team
Evidence collected:
  • Formative check-ins and lab artifacts aligned to selected modules
  • Draft lesson plan + draft assessment artifact
  • Peer review notes and revision actions
Output / decision: Draft classroom-ready package prepared for on-site validation.

When: Week 3 (On-site phase)
Activity: Hardware-aligned validation and demonstration; revise lesson/assessment for feasibility; capture performance evidence.
Owner: Educators + RAOP Program Team + Technical Support Team
Evidence collected:
  • On-site lab artifacts (validated results and observations)
  • Revised lesson plan + revised assessment with implementation notes
  • Fidelity checklist and facilitator log
Output / decision: Final classroom implementation package per educator.

When: End of Week 3 (Immediate post-program)
Activity: Post-assessment for the rotating standardized instrument selected for the annual cycle; final deliverables submission; end-of-program feedback survey.
Owner: Educators + RAOP Program Team
Evidence collected:
  • Post-assessment submissions (cohort-level analysis)
  • Final deliverables package (lesson + student materials + assessment tool/rubric + implementation notes)
  • End-of-program survey results
Output / decision: Cohort outcome summary and consolidated recommendations for refinement.

Five-Year Program Timeline — Continuous Improvement

High-level repeatable cycle used to refine curriculum and strengthen adoption.

When: Each year (Jan–Mar)
Activity: Recruit and select cohort; confirm eligibility and logistics; finalize module subset for the annual experiments.
Owner: RAOP Program Team
Evidence collected:
  • Recruitment metrics and selection rubric summaries (cohort-level)
  • Annual module selection rationale tied to program outcomes
Output / decision: Annual cohort plan aligned to goals and constraints.

When: Each year (Jun)
Activity: Technical onboarding and evaluation orientation; confirm access to virtual labs; share expectations for artifacts/deliverables.
Owner: RAOP Program Team + Technical Support Team
Evidence collected:
  • Onboarding completion logs
  • Support tickets/resolution summaries
Output / decision: Consistent starting conditions across cohorts.

When: Each year (Jul)
Activity: Deliver the two-week virtual phase and one-week on-site phase using a simulation-to-hardware workflow; collect pre/post measures, formative evidence, and deliverables.
Owner: Educators + RAOP Program Team + Technical Support Team
Evidence collected:
  • Cohort-level pre/post outcome summaries for the rotating standardized instrument
  • Module artifacts and formative check results
  • Final classroom implementation packages
  • Fidelity logs and experience surveys
Output / decision: Annual cohort outcomes + classroom-ready curriculum artifacts suitable for adoption in high-need LEAs.

When: Each year (Aug–Oct)
Activity: Analyze cohort-level outcomes; synthesize strengths and gaps; revise templates, supports, and recommended module pathways.
Owner: RAOP Program Team
Evidence collected:
  • Cohort analysis memo and prioritized improvement actions
  • Versioned updates to templates and supports
Output / decision: Refined curriculum and supports for the next cohort.
