Evaluate Effectiveness

Find research papers, survey instruments, and results to evaluate how well a solution, approach, or method is working to assess student learning.

Philosophy

AT Evaluation Process

Academic Technology (AT) evaluation consultants collaborate with campus units to contribute to the scholarship of teaching, learning, and research with technology at higher education institutions. The purposes of Academic Technology evaluation are to:


Demonstrate how the effective use of technology adds value to teaching, learning, and research

Measure the effectiveness of technology tools in reaching specific goals or outcomes

Inform refinements to tools and processes

Gather evidence of the user experience to inform instructional technology design and implementation

Determine the state of instructional technology adoption on our campus

Determine current and future instructional technology needs

Align services and support with common teaching and learning challenges on our campus

Collect formative feedback on how to improve tools and methods

AT Reports

2010 Academic Technology Survey Results: The 2010 AT Survey was sent in October 2010 to a random sample of faculty, instructional staff, and teaching assistants. Its purpose was to assess the state of instructional technology use, identify teaching and technology needs and challenges, and gather preferences for learning about and implementing instructional technologies.

Technology Enhanced Collaborative Group Work Report (2009): The 2008 Engage Technology Enhanced Collaborative Group Work (TECGW) Adaptation Award provided 40 instructors with technical and pedagogical support to implement technology-enhanced group projects in their courses. An evaluation of the program was conducted to gather evidence of good practices for facilitating technology-enhanced group projects.

Aligning Collaboration Tools With Academic Needs (ACTWAN): The objective of the Collaboration Tools project is to report on the current use, successes, and challenges of collaboration tools in higher education from both the instructors' and the students' viewpoints.

Learn@UW Milestone Review (2008): This report describes the current state of the Learn@UW program at UW-Madison and provides longitudinal analysis of Learn@UW adoption since its initial implementation in 2003, including usage information, support endeavors, and feedback from users.

2008 Academic Technology Survey: The AT Survey is sent every two years to faculty, instructional staff, and teaching assistants to determine the state of instructional technology adoption on our campus as well as current and future teaching and technology needs. This report describes the results from the 2008 survey.

VITAL Report and Handout (A Look at Grassroots Video Production and Delivery on Campus): Members from the General Library System, College of Letters and Science, School of Education, School of Medicine and Public Health, and DoIT-Academic Technology researched and documented the pedagogical and technological issues around the production and delivery of grassroots video for instruction on the UW-Madison campus.

Methodologies

Focus groups: A focus group is similar to a group interview; the evaluator is trying to gather in-depth information on a specific tool or topic. Focus groups differ from interviews in that they make use of group dynamics to generate data and insights that would be unlikely to emerge otherwise. (NSF 2002 Handbook for Project Evaluation) Academic Technology uses focus groups to gather information on the use, need, and perceived impact of various technologies.

Interviews: Interviews are used for obtaining in-depth information on specific topics. Interviewing methods range from structured to in-depth interviews. Interviews usually yield the richest data, but they are also quite time-consuming and expensive. (NSF 2002 Handbook for Project Evaluation) Academic Technology uses interviews to gather information about an individual's experience with our services or a specific technology.

Observations: Observations are used to gather first-hand data on programs, processes, or behaviors. Observations allow the evaluator to enter into and understand the situation and context, and they provide good opportunities for identifying unanticipated outcomes. (NSF 2002 Handbook for Project Evaluation) Academic Technology uses observation to see how users interact with specific technologies and tools.

Surveys: Surveys are used for obtaining information on a wide range of topics when in-depth probing of responses is not necessary. Surveys are good at gathering descriptive data in breadth, not depth, and tend to consist mostly of closed-ended questions. (NSF 2002 Handbook for Project Evaluation) Academic Technology uses surveys to gather technology use and satisfaction data, program success measures, and event feedback.

Tests: Tests are used to assess the status of knowledge or the change in knowledge over time. (NSF 2002 Handbook for Project Evaluation) Academic Technology uses tests to measure the effectiveness of a specific tool or method in helping students reach stated learning goals or outcomes.

Usability Tests: Usability tests are used to discover errors and areas for improvement in tools under development. Testers are observed using the tool to perform specific tasks. Observers record measures such as time to complete a task, number of errors made, ability to recall information, and the tester's response to the tool. Academic Technology conducts usability tests to gather feedback on, and subsequently refine, technology tools that are under development.
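To illustrate how the observer notes above might be tallied, here is a minimal sketch (not an official AT tool; the session records and field names are hypothetical) that summarizes time on task, error count, and completion rate across several test sessions:

```python
# Illustrative sketch of summarizing usability-test observations.
# Each record is one tester's attempt at one task; the data are hypothetical.
from statistics import mean

sessions = [
    {"tester": "P1", "task": "upload file", "seconds": 95, "errors": 2, "completed": True},
    {"tester": "P2", "task": "upload file", "seconds": 140, "errors": 4, "completed": False},
    {"tester": "P3", "task": "upload file", "seconds": 80, "errors": 1, "completed": True},
]

def summarize(records):
    """Return mean time on task, mean error count, and completion rate."""
    return {
        "mean_seconds": mean(r["seconds"] for r in records),
        "mean_errors": mean(r["errors"] for r in records),
        "completion_rate": sum(r["completed"] for r in records) / len(records),
    }

summary = summarize(sessions)
print(summary)  # e.g. mean_seconds = 105.0 for the sample data above
```

Comparing such summaries across design iterations is one common way to show whether a revision actually reduced errors or task time.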

Evaluation Assistance

Academic Technology assists instructors with planning and implementing the evaluation component of Engage awards as well as other grants and funded projects. For more information, visit the Engage website.