The IRIS EPO portfolio is evaluated through a strategic combination of internal and external evaluation practices, applied throughout the lifecycle of a project, with the goal of maximizing the desired programmatic impact. This approach is based on the Collaborative Impact Analysis Method (IAM; Davis and Scalice, 2015), which combines staff knowledge of programs and products, audiences, and content with the expertise of an outside evaluator. This combination captures effects on the behaviors, attitudes, skills, interests, and/or knowledge of users and program participants, while achieving efficiencies by having IRIS staff conduct much of the development of assessment instruments, data collection, and preliminary data analysis. To ensure success, an external evaluator provides consultation, review and feedback, and/or more robust analysis of the data. Risks are monitored on a project-by-project basis as part of the evaluation process, allowing resources to be reallocated as needed to keep projects on schedule.
Each project in the portfolio is reviewed annually with the external evaluator. Working with the staff lead for the project, the external evaluator scores the robustness of the project's current evaluation against a qualitative rubric based on best practices. Each annual review produces a project score and concrete steps to improve the project's evaluation and impact. In this way, the process delivers the formative and impact data needed to ensure project efficacy and efficiency. Periodically, each project prepares a report on the evaluation's impact on the project going forward. These reports are used for high-level, cross-program analysis and strategic planning. In addition, program activities are monitored regularly by the EPO Standing Committee and, at longer intervals, by a separate external panel and independent evaluator.
Distinguished Lecture Series
Internships
Meetings
Posters
Professional Development
Seismographs in Schools
Social Media
Software
Teachable Moments
Website
Brudzinski, M., M. Hubenthal, S. Fasola, and E. Schnorr (2021). Learning in a Crisis: Online Skill Building Workshop Addresses Immediate Pandemic Needs and Offers Possibilities for Future Trainings, Seismological Research Letters, Vol. 92, No. 5, 3215–3230. https://doi.org/10.1785/0220200472
Sumy, D.S., R. Welti, and M. Hubenthal (2020). Applications and Evaluation of the IRIS Earthquake Browser: A Web-Based Tool That Enables Multidimensional Earthquake Visualization, Seismological Research Letters, Vol. 91, No. 5, 2922–2935. https://doi.org/10.1785/0220190386
Bravo, T., J. Taber, and H. Davis (2020). A Case Study of Highly-Engaged Educators' Integration of Real-Time Seismic Data in Secondary Classrooms, Frontiers in Earth Science, Vol. 8, 180. https://doi.org/10.3389/feart.2020.00180
Hennet, C.B., J.J. Taber, G.E. van der Vink, and C.R. Hutt (2003). Earthquakes in Museums, Seismological Research Letters, Vol. 74, No. 5, 628–634.
Hubenthal, M., T. O'Brien, and J. Taber (2011). Posters that foster cognition in the classroom: multimedia theory applied to educational posters, Educational Media International, Vol. 48, No. 3, 193–207. https://doi.org/10.1080/09523987.2011.607322
Smith, M., J. Taber, and M. Hubenthal (2006). Real-Time Seismic Displays in Museums Appeal to the Public, Eos, Transactions AGU, Vol. 87, No. 8, 85, 91.
Davis, H., and D. Scalice (2015). Evaluate the Impact of Your Education and Outreach Program Using the Quantitative Collaborative Impact Analysis Method (Invited), AGU Fall Meeting, Abstract ED53D-0871.