May Newsletter: Evaluating eLearning Experiences

While all learning experiences evaluate learners’ progress and comprehension of intended objectives,

…learning designers also evaluate how learners respond to the designed learning experience.

They complete these evaluations by reviewing:

  • Student Engagement

  • Overall Effectiveness

  • Social Inclusion

  • Flow of Materials

  • Student Responses & Suggestions

  • …the list goes on!

Let’s evaluate a course produced by a team of Instructional Designers.

Course Here: https://sites.google.com/asu.edu/ldt502c-1davincitutorial/home?authuser=0

The expansion of training programs during World War II led to the growth of usable instructional design models. Reiser (2017) reported that during the 1960s and 1970s, instructional technology and design processes began to be discussed apart from multimedia development, which in turn produced more than three dozen distinct instructional design models referenced in the literature between 1970 and 2005 (Branch & Dousay, 2015; Gustafson, 1991; Gustafson & Branch, 1997, 2002). During the instructional design process, evaluating the learning materials is a continuous and necessary step. It allows learning designers to recalibrate and rectify learning materials to promote a higher-level learning experience for all learners. Much like scientists running an experiment, learning designers hypothesize the learner outcome(s) based on decisions made during the design and development process with the information available. Based on the research of Wiley, Strader, and Bodily (2020), each decision is a hypothesis of the form “in the context of these learners and this topic, applying this instructional design approach in this manner will maximize students’ likelihood of learning.” After testing the hypothesis (when learners experience the learning design), learning designers find that the initial hypothesis is seldom correct. As in all experiments, the hypotheses are then refined, producing cyclical improvement.
While evaluating the course Fundamentals of Video Production Using DaVinci Resolve 18 (Jordan et al., 2024), I found the course to be precisely tailored to the intended learners. It was clear that this learning experience was designed for learners with little to no experience with video production software. Each microlearning video scaffolded the tools and skills necessary to begin the video production process with DaVinci Resolve 18. The course distilled the cumulative tools faculty need to begin video production immediately: it walked learners through opening the DaVinci application, importing videos and media, editing videos, fading audio, fading the screen to black, exporting a completed video to YouTube, and embedding the video in Canvas for students to view, and it even offered “pro tips” to help professors accelerate their new skills. In addition, the microlearning videos allowed learners to extend their learning beyond the course: learners could view closed captions, take screenshots, pause, rewind, and fast-forward. Along with usable content, learners were provided a glossary that clarified common terms used in the course, and navigation buttons throughout the course encouraged them to revisit previously learned resources. These tools gave learners a higher level of autonomy to learn at their own pace; they could follow along with their class materials, take their own notes, and easily return to the course (e.g., by bookmarking it), to name a few options. Martin (2009) found that university faculty have historically enjoyed a great deal of autonomy in implementing their own notions about course design and teaching methodology in their courses. To promote that autonomy and independence in the DaVinci Resolve 18 software, learners were encouraged to follow along on their own devices as they progressed through each part of the learning experience. Lastly, the learning design included references to research-based elements. Because compiling and performing research is a requirement of a university professor’s position, professors have multiple opportunities to present their most recent research findings at conferences around the globe. In a 2005 survey of over 700 Chief Academic Officers (CAOs), a faculty member’s publication record was found to carry more weight in performance evaluations than it had ten years earlier (O’Meara, 2005). The course’s grounding in research therefore likely fostered a sense of inclusion and trust during the learning experience.
When Dr. Mott approached the learning design team about developing and delivering an instructional program, his ultimate goal was to elevate the online student learning experience by increasing instructor presence through effective unit introductory videos. While taking the delivered course, I noticed that the learning tasks were intertwined with Merrill’s First Principles of Instruction (Merrill, 2018), as shown by the scaffolded sequence of telling learners the necessary content, showing learners how to complete the tasks in DaVinci Resolve 18, and asking learners to do those tasks during assessments. In addition, the learning designers encouraged learners to follow along on their devices, allowing learners to complete the “do” action independently and at their own pace. Faculty members are accustomed to lecture-style instruction in a tell-ask environment, so this learning experience felt lively and independent by comparison. Merrill (2018) found that much existing instruction can be significantly enhanced by converting Tell-Ask learning events into Tell-Show-Do learning events. This element brought a deeper level of learning, as the assessments mirrored the actions students would take while producing videos. As a result, this learning experience produced a higher level of comprehension and application of the intended objectives.
While the learning design has many positive aspects, improvement is imperative in any experience and is part of the instructional design process. One of the most honest tools for evaluating the effectiveness of any learning design (e.g., unit introductory videos) is reading the student analytics. Had the learning designers of this course added a module on reviewing student analytics, faculty could have gathered information on how to improve their courses based on their student analytics reports. For example, student analytics reveal when students disengage during videos (e.g., by clicking away or closing a window). Suppose a faculty member’s students disengage around the one-minute mark of the unit introductory videos two weeks in a row; a module on student analytics could have taught that faculty member ways to promote engagement during a video activity (a sketch of how such drop-off points might be surfaced follows below). In addition, this course could be improved by placing it on a learning management system (LMS), enabling measurable student analytics reports for evaluating student engagement. Research shows that the data collected during student use of content and assessments of learning can be used to identify specific portions of the instructional materials that are not successfully supporting student learning (Wiley et al., 2020). The second improvement recommendation for this course would be to provide downloadable videos and resources that learners can save to their computers; while learners could bookmark the learning experience, the bookmark may not be on their University-issued device. In addition, adding a thumbnail with the specified learning task to each microlearning video would give learners a visual cue connecting the necessary steps of video production. Research has found that visual elements have the potential to increase motivation and foster improved learning outcomes, but only when the appropriate role of visual messages in the communication of information is taken into account (Sentz, 2020). Lastly, implementing short, graded multiple-choice assessments during the learning experience (in addition to the current assessments) would help learners connect and summarize the provided content and assist faculty in evaluating the alignment of learner outcomes with the intended objectives and goals. Accurate assessment results help instructional designers plan future instruction, adapt current instruction, communicate levels of understanding to students, and examine the comprehensive effectiveness of instruction and course design (Wiley et al., 2020).
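To make the analytics idea concrete, here is a minimal Python sketch of how a team might surface early drop-off points from exported watch-event data. This is purely illustrative: the event format, field names (video_id, seconds_watched), and the 75-second threshold are my own assumptions, since real LMS analytics exports vary by platform.

```python
# A minimal sketch (not from the course) of finding video drop-off points
# from hypothetical LMS watch-event exports. Field names are assumptions.
from collections import defaultdict
from statistics import median

def find_dropoff_points(watch_events, flag_before_seconds=75):
    """Group watch events by video, report the median point at which
    students stopped watching, and flag early disengagement."""
    by_video = defaultdict(list)
    for event in watch_events:
        by_video[event["video_id"]].append(event["seconds_watched"])

    report = {}
    for video_id, stop_times in by_video.items():
        stop = median(stop_times)
        report[video_id] = {
            "median_stop_seconds": stop,
            "early_disengagement": stop < flag_before_seconds,
        }
    return report

# Example: students consistently leaving a unit introduction near 60 seconds.
events = [
    {"video_id": "unit-1-intro", "seconds_watched": 58},
    {"video_id": "unit-1-intro", "seconds_watched": 64},
    {"video_id": "unit-1-intro", "seconds_watched": 61},
]
print(find_dropoff_points(events))
# {'unit-1-intro': {'median_stop_seconds': 61, 'early_disengagement': True}}
```

A report like this would show the hypothetical faculty member exactly where students leave the unit introductory videos, which is the kind of evidence the proposed analytics module would teach them to act on.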

Sources:

Branch, R. M., & Dousay, T. A. (2015). Survey of instructional design models (5th ed.). Bloomington, IN: Association for Educational Communications & Technology.

Chittur, D. (2018). A phenomenological study of professors and instructional designers during online course development leading to enhanced student-centered pedagogy (Doctoral dissertation, Pepperdine University). Theses and Dissertations, 935. https://digitalcommons.pepperdine.edu/etd/935

Gustafson, K. L. (1991). Survey of instructional development models (2nd ed.). Syracuse, NY: ERIC Clearinghouse on Information Resources.

Gustafson, K. L., & Branch, R. M. (1997). Survey of instructional development models (3rd ed.). Syracuse, NY: Syracuse University.

Gustafson, K. L., & Branch, R. M. (2002). Survey of instructional development models (4th ed.). Syracuse, NY: ERIC Clearinghouse on Information & Technology.

Jordan, Lawson, & Motto Mena. (2024, April 23). Fundamentals of Video Production Using DaVinci Resolve 18. https://sites.google.com/asu.edu/ldt502c-1davincitutorial/home?authuser=0

Martin, R. E. (2009). The revenue-to-cost spiral in higher education. Raleigh, NC: John William Pope Center for Higher Education Policy. Retrieved from http://files.eric.ed.gov/fulltext/ED535460.pdf

Merrill, M. D. (2018). Using the First Principles of Instruction to Make Instruction Effective, Efficient, and Engaging. In R. E. West (Ed.), Foundations of Learning and Instructional Design Technology: The Past, Present, and Future of Learning and Instructional Design Technology. EdTech Books. Retrieved from https://edtechbooks.org/lidtfoundations/using_the_first_principles_of_instruction

O'Meara, K. A. (2005). Encouraging multiple forms of scholarship in faculty reward systems: Does it make a difference? Research in Higher Education, 46(5), 479-510. https://doi.org/10.1007/s11162-005-3362-6

Reiser, R. A. (2017). What field did you say you were in? In R. A. Reiser & J. V. Dempsey (Eds.), Trends and issues in instructional design and technology (4th ed., pp. 1–7). New York, NY: Pearson Education, Inc.

Sentz, J. (2020). Using Visual and Graphic Elements While Designing Instructional Activities. In J. K. McDonald & R. E. West (Eds.), Design for Learning: Principles, Processes, and Praxis. EdTech Books.

Wiley, D., Strader, R., & Bodily, R. (2020). Continuous Improvement of Instructional Materials. In J. K. McDonald & R. E. West (Eds.), Design for Learning: Principles, Processes, and Praxis. Retrieved from https://edtechbooks.org/lidtfoundations/instructional_design_models
