Unlocking the Power of Quiz Analytics for Smarter Assessments (Issue 35)


Author: Dr. Jo Ann Smith, University of Central Florida

Editor: Dr. Denise Lowe, University of Central Florida

Dear ADDIE, 

I’ve been using quizzes in my online course to assess student learning, but I’m not sure how to make the most of the data provided by the LMS quiz analytics tools. I want to ensure my quizzes are effective and truly reflective of student understanding. What should I look for in the data, and how can I use this information to improve my quizzes? 

Signed,

Quizzer in Quandary 

The goal of data analytics is to enhance the quality of quiz items and support an overall assessment strategy.

Dear Quizzer, 

What a great and relevant question in today’s data-driven landscape! Learning Management Systems (LMS) like Canvas, Blackboard, and Moodle offer quiz analytics tools that provide useful information about student learning and engagement. By understanding the types of analytics available and how to interpret them, you can enhance the effectiveness of your quizzes and ensure assessments are providing you with meaningful information about your students’ learning (Brown & Race, 2012; Deetjen-Ruiz et al., 2023; Gamage et al., 2019; Kumar et al., 2021). Let’s break down the key types of analytics commonly provided by LMS quiz tools and discuss how you can leverage these insights for quiz improvement.

Using Quiz Analytics to Refine Assessments 

When analyzing quiz data, there are several key statistics that can guide your decision-making and provide valuable insights into student performance and question effectiveness. The goal is to use this data to enhance both the quality of your quiz items and support the overall assessment strategy. Below is a table outlining five common types of analytics provided by LMS quiz tools and how to use these insights to improve your quiz items. Each data type is paired with an example to illustrate how it can inform specific actions for you to take to refine assessments. 

[Table: Data type applications, with item analysis, description, improvement applications, and examples for each type.]

By analyzing these types of data, you can refine your quizzes to ensure that each question is clear, fair, and aligned with your student learning objectives. This iterative process of reviewing and revising assessments based on data helps create more effective learning experiences and ensures that quizzes serve as meaningful tools for both learning and evaluation. 
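The item-analysis statistics most LMS quiz tools report can also be computed yourself from an exported gradebook. Below is a minimal Python sketch, assuming a hypothetical 0/1 correctness matrix (one row per student, one column per question); the function names and sample data are illustrative, not part of any LMS API.

```python
# Item-analysis sketch: difficulty (proportion correct) and an
# upper-lower discrimination index, computed from a hypothetical
# 0/1 correctness matrix (rows = students, columns = questions).

def item_difficulty(responses, item):
    """Proportion of students who answered the item correctly (p-value)."""
    col = [row[item] for row in responses]
    return sum(col) / len(col)

def discrimination_index(responses, item, top_frac=0.27):
    """Upper-lower discrimination: p(top group) minus p(bottom group)."""
    totals = [sum(row) for row in responses]
    order = sorted(range(len(responses)), key=lambda i: totals[i], reverse=True)
    k = max(1, round(top_frac * len(responses)))
    top = [responses[i][item] for i in order[:k]]
    bottom = [responses[i][item] for i in order[-k:]]
    return sum(top) / k - sum(bottom) / k

# Six students, three questions (1 = correct, 0 = incorrect)
responses = [
    [1, 1, 0],
    [1, 1, 1],
    [1, 0, 0],
    [0, 1, 0],
    [1, 0, 1],
    [0, 0, 0],
]

print(item_difficulty(responses, 0))       # proportion correct on Q1
print(discrimination_index(responses, 0))  # high value: Q1 separates strong from weak students
```

As rules of thumb from the item-analysis literature, difficulty values near 0 or 1 suggest items that are too hard or too easy, and discrimination values near zero (or negative) flag items worth reviewing.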

You could also experiment with adaptive tests that adjust question difficulty based on student performance. Heat maps are another useful visual for quickly spotting problem items: each row represents a question, and each column represents a student. A question might show mostly green cells with a few patches of red, suggesting that while most students answered quickly and correctly, a few took longer and answered incorrectly. That visual cue could prompt a review of the question to determine whether it was unclear or too difficult for those students.
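The heat-map logic can be sketched in a few lines. This is an illustrative toy, not an LMS feature: the correctness and timing data, the 60-second threshold, and the text-based "G"/"R" coloring are all assumptions standing in for a real color-coded chart.

```python
# Toy heat map: rows are questions, columns are students; a cell is
# "G" (green) if the student answered correctly within the time
# threshold, "R" (red) if slow and/or incorrect. All data is made up.

correct = [  # correct[q][s]: did student s answer question q correctly?
    [True, True, True, True, False],
    [True, False, True, False, False],
]
seconds = [  # seconds[q][s]: time student s spent on question q
    [20, 25, 18, 30, 95],
    [40, 120, 35, 110, 130],
]

SLOW = 60  # seconds; anything above this is flagged as "slow"

rows = []
for q in range(len(correct)):
    cells = ["G" if correct[q][s] and seconds[q][s] <= SLOW else "R"
             for s in range(len(correct[q]))]
    rows.append(f"Q{q + 1}: {' '.join(cells)}")
    print(rows[-1])
```

A row that is mostly green with scattered red (like Q1 here) points at a few struggling students, while a row with widespread red (like Q2) suggests the item itself may be unclear or misaligned.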

I also like to use data from open-ended responses to understand the depth of student understanding beyond what multiple-choice questions can provide. Additionally, you could explore the impact of quiz timing (e.g., open-book vs. closed-book quizzes) and provide immediate feedback to further enhance student learning and engagement. 

We’d love to hear your thoughts on innovative uses of LMS quiz tools to enhance learning outcomes. What other ideas or plans for the use of data analytics have you applied, or are you exploring, at your higher education institution? Please share your thoughts with our TOPkit community on LinkedIn!

References 

Brown, S., & Race, P. (2012). Using effective assessment to drive student learning. Higher Education Academy.

Deetjen-Ruiz, R., Esponda-Pérez, J. A., Haris, I., García, D. S., Osorio, J. L. Q., & Tsarev, R. (2023). Evaluating the reliability of tests used in LMS Moodle for e-learning. In Proceedings of the Computational Methods in Systems and Software (pp. 1-8). Cham: Springer Nature Switzerland.

Gamage, S. H. P. W., Ayres, J. R., Behrend, M. B., et al. (2019). Optimising Moodle quizzes for online assessments. International Journal of STEM Education, 6, 27.

Kumar, D., Jaipurkar, R., Shekhar, A., Sikri, G., & Srinivas, V. (2021). Item analysis of multiple choice questions: A quality assurance test for an assessment tool. Medical Journal Armed Forces India, 77, S85-S89.

Considerations of Generative AI for Content Creation (Issue 32)


Author: Anastasia Bojanowski, University of Central Florida

Editor: Dr. Denise Lowe, University of Central Florida

Dear ADDIE,

Some of my colleagues are leveraging Generative AI for content creation. I have content that needs to be updated and would like to use AI platforms. However, I quickly became overwhelmed when I searched for options. Further, I have reservations regarding the accuracy and reliability of AI platforms. Any suggestions?

Lost in the Matrix

Dear Lost in the Matrix,

I can sympathize with feeling overwhelmed when considering AI platforms for content creation; the options are plentiful. You are also wise to question the accuracy and reliability of generated content. Indeed, the rate at which Generative AI hallucinates, or invents information, is estimated to be between 3% and 27% (Metz, 2023). Further, the output can contain biases and misinformation (Zewe, 2023). However, approached with a healthy level of pragmatism, Generative AI has a place in content creation if users follow the ASSURE and ADDIE models and include design and learning theory in their prompts.

When updating content, consider using the design features of platforms and applications. For example, older PowerPoints can be refreshed with a newer theme, stock images, and the “Designer” feature native to the program. Additionally, presentations can be uploaded into free versions of Gamma, Adobe Express (webpage), and Canva.

Generative AI prompts should follow design and learning theory models for content creation.

Another option is to use platforms such as Microsoft’s Copilot or OpenAI’s ChatGPT to generate an initial draft of anything from lesson plans to grading rubrics. Consider experimenting with each by using the same prompt and vetting the results. Whichever platform you select, the prompt should be composed to include instructional design and learning theory. Christie (2024) recommends constructing prompts that include Gagné’s (1985) events of instruction:

  • Gaining attention
  • Informing the learner of the objective
  • Stimulating recall of prerequisite learning
  • Presenting the stimulus material
  • Providing learning guidance
  • Eliciting the performance
  • Providing feedback
  • Assessing the performance
  • Enhancing retention and transfer
Prompt using Gagné’s events of instruction:

You are an expert in creating an exercise supporting experiential and active learning. The activity requires peer collaboration for an asynchronous online course and is based on the creation and review of a digital portfolio for professional purposes. Students will be sorted into groups of 3 for peer review. Upon completion of the activity, students will create a digital portfolio that includes a skills-based resume, statement of philosophy, projects, and community service. Afterward, students will be asked to provide feedback on design and content. The activity is a 3-week activity.

Please create a brief yet compelling description of the activity that can be posted in Canvas as an introduction. Create step-by-step directions to create a digital portfolio that includes content pages for the following: skills-based resume, personal statement, projects, community service, and contact. Include any resources that can help create a college student digital portfolio. Create a peer review worksheet that students can use to evaluate design and content. Make sure that the worksheet can be filled out online. Generate a grading rubric that evaluates design, skills-based resume, alignment of skills-based resume with projects, and peer review. Consider each part of the instructions given and generate the most engaging lesson.
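If you assemble prompts like this often, it can help to template them around the nine events. The sketch below is a hypothetical helper, not a documented feature of any platform; the role, task, and event details are placeholders to fill in for your own lesson.

```python
# Hypothetical helper: assemble a Generative AI prompt structured
# around Gagné's nine events of instruction. Names are placeholders.

GAGNE_EVENTS = [
    "Gaining attention",
    "Informing the learner of the objective",
    "Stimulating recall of prerequisite learning",
    "Presenting the stimulus material",
    "Providing learning guidance",
    "Eliciting the performance",
    "Providing feedback",
    "Assessing the performance",
    "Enhancing retention and transfer",
]

def build_prompt(role, task, event_details):
    """Compose a role line, the task, then one bullet per Gagné event."""
    lines = [f"You are {role}.", task,
             "Structure the lesson around these nine events:"]
    for event in GAGNE_EVENTS:
        detail = event_details.get(event)
        lines.append(f"- {event}: {detail}" if detail else f"- {event}")
    return "\n".join(lines)

prompt = build_prompt(
    role="an expert instructional designer",
    task="Create a 3-week asynchronous digital-portfolio activity with peer review.",
    event_details={"Gaining attention": "Open with a striking example portfolio."},
)
print(prompt)
```

The same template can then be pasted into different platforms with identical wording, which makes it easier to compare and vet their outputs side by side.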

Keep in mind that the generated lesson may require several iterations to achieve the desired output. Further, the output is an initial draft that requires vetting for accuracy and integrity, along with the addition of legitimate resources. Output also requires careful review to support universal design principles, protect privacy, and avoid bias.

One platform that lists the sources used to generate output is Microsoft’s Copilot. Even so, a follow-up prompt asking for specific sites or resources to be included yields mixed results. Consequently, the problem of legitimacy and accuracy persists.

Moving forward, content creators should require Generative AI platforms to safeguard privacy and provide robust bias reviews, even in the free versions of their platforms. Just as textbook companies were forced to ensure universal design to accommodate learners with disabilities in order to remain competitive, Generative AI platforms should protect privacy and minimize bias to be viable candidates for content creation.

What other ideas or plans for the inclusion of AI have you applied, or are you exploring, at your higher education institution? Please share your thoughts with our TOPkit community on LinkedIn!

References

Christie, B. [Alchemy]. (2024, January 17). AI for the new year: Integrating AI into academic work [Video]. YouTube. https://www.youtube.com/watch?v=Lx0gGJ18MlI

Gagné, R. M. (1985). The conditions of learning and theory of instruction (4th ed.). Holt, Rinehart & Winston.

Metz, C. (2023, November 6). Chatbots may ‘hallucinate’ more often than many realize. The New York Times. https://www.nytimes.com/2023/11/06/technology/chatbots-hallucination-rates.html

Zewe, A. (2023, November 9). Explained: Generative AI: How do powerful generative AI systems like ChatGPT work, and what makes them different from other types of artificial intelligence? MIT News. https://news.mit.edu/2023/explained-generative-ai-1109

The Road to Instructional Designer Credibility (Issue 19)


Author(s): Dr. Rohan Jowallah

Editor: Dr. Denise Lowe

Dear ADDIE,

I am new to the instructional design profession. I got this job, which I love, because I taught online, used flipped classroom strategies, and redesigned my course several times, not because I have any training or education in this field. I also did not have the support I needed from my first supervisor (who has since left). I’m looking to reimagine myself because my faculty see me and automatically think, “Oh, you just want me to teach online!” or “You demanded I take a survey. Who are you to tell ME what to do?”

What do you suggest? I really want to help my faculty move toward the 21st Century in higher education.

Signed,

Reformation in Progress

Dear Reformation,

Thanks for your question! First of all, I hope you are keeping yourself safe. Even in the best of times, the scenario you present is challenging – yet quite commonly encountered. During the additional challenges posed by the COVID-19 situation, it can be even more difficult to engage faculty in quality online instruction. If ever there was a time to consider yourself as a central figure in your institution, it is now. Instructional designers have been called upon to assist and play a vital role in supporting faculty in teaching remotely. There are a few strategies that can be used to navigate various organizational cultures.

More than ever, instructional designers are central figures for institutional teaching and learning goals.

#1 Locate a faculty member who is an advocate for online teaching and learning, and make every effort to build a healthy relationship. This online champion will open the door for you to provide needed instructional guidance and will connect you with others who may be interested. Remember, any cultural shift takes time.

#2 I would also recommend that you use the online Faculty Development Decision Guide (FDDG) to assess your organizational needs. Doing so will also provide you with a pathway for developing initiatives for supporting faculty.

#3 Since you have taught online and developed various courses yourself, model for faculty members what is possible in online teaching and learning. The most convincing way to show your skills is to demonstrate them in practice.

#4 Finally, I would also recommend that you start a series of communications with your faculty members. These conversations could focus on current practices and research in online teaching and learning, the tools and technologies used in online courses, and your own research. Importantly, I recommend hosting some informal sessions, which could take the form of one-to-one or group meetings. Your ultimate aim is to build rapport with faculty. Doing so will require time, understanding, support, engagement, and effective communication.

I’m sure there are many more tips that others in the community have found useful. What strategies do you use for effectively engaging faculty in the online course design process – especially during such challenging times as we are currently facing? Please share your thoughts with our TOPkit community on LinkedIn!