The Influence of Revising an Online Gerontology Program on the Student Experience

Posted 2021/04/08

Acknowledgements

We acknowledge the support of the University of Utah Teaching and Learning Technologies, the University of Utah College of Nursing, and the University of Utah Consortium for Families and Health Research.

Funding

Program revisions were funded through a University of Utah Teaching and Learning Technologies Online Program Development Grant.

Declaration of Interest

We have no conflicts of interest to declare.

Abstract

The recent adoption of gerontology competencies for undergraduate and graduate education emphasizes the need for national standards to enhance and unify the field of gerontology. The Gerontology Interdisciplinary Program at the University of Utah revised all of its gerontology course offerings to align with the Association for Gerontology in Higher Education's (AGHE) Gerontology Competencies for Undergraduate and Graduate Education (2014), while also making improvements in distance instructional design. In this study, we examined student course evaluation scores and written comments in six Master of Science in Gerontology core courses (at both the 5000 and 6000 levels) before and after alignment with the AGHE competencies and online design changes. Data included evaluations from the two semesters prior to and the two semesters following course revisions and were assessed using paired t-tests and thematic analysis. No statistically significant differences were found between pre- and post-revision evaluations. Post-revision qualitative comments did, however, show an increased focus on interactive and engaging technology. These findings will be used for course and program quality improvement initiatives, including enhanced approaches to documenting and assessing competency-based education.

Keywords

Competency-based education, course evaluation, course revision, distance education

Background

Competency-based education (CBE) is growing in popularity and demand (Burnette, 2016; McClarty & Gaertner, 2015). Gerontology curriculum development has moved towards CBE, with national standards developed to enhance and unify the field of gerontology (Association for Gerontology in Higher Education [AGHE], 2014; Damron-Rodriguez et al., 2019). AGHE approved the Gerontology Competencies for Undergraduate and Graduate Education (AGHE, 2014), designed to serve as a curricular guide for undergraduate (i.e., majors, minors, certificates) and master's degree level programs. Benefits of using competencies for curricular revisions include shifting focus to measurable outcomes (Burnette, 2016; Damron-Rodriguez et al., 2019; Wendt, Peterson, & Douglass, 1993), increasing program accountability for learning outcomes (Burnette, 2016; Damron-Rodriguez et al., 2019; McClarty & Gaertner, 2015), preparing students to graduate with necessary skills (McClarty & Gaertner, 2015), and training the gerontological workforce by bridging the gaps between aging services and gerontology education (Applebaum & Leek, 2008; Damron-Rodriguez et al., 2019).

As CBE has grown, online teaching and learning have also become more accessible and in demand (Means, Toyama, Murphy, Bakia, & Jones, 2010; Woldeab, Yawson, & Osafo, 2020). For programs looking to enhance curriculum and program accessibility, both CBE and distance course design are vital considerations. Quality course design for courses incorporating CBE emphasizes opportunities for student application and practice, active learning strategies, and timely instructor response and feedback (Krause, Dias, & Schedler, 2015). In a previous paper (Dassel, Eaton, & Felsted, 2019), we described an approach to program-wide revisions intended to align with the AGHE competencies and to meet current recommendations in cyber-pedagogy. The University of Utah Gerontology Interdisciplinary Program (GIP) was in a position to enhance both CBE and online instructional design using a course/credit model that embeds competencies within a traditional approach to higher education offering credit hours towards a degree (Council of Regional Accrediting Commissions [C-RAC], 2015). The University's Teaching and Learning Technologies (TLT) office released a funding opportunity for programs wanting to move completely online. The GIP applied for these funds with two purposes: 1) transition the Master of Science program into a completely online format, and 2) improve the quality and consistency of existing gerontology courses through a full curriculum review with the experts at TLT. The goal was to make the fully online transition in a manner that allowed for dynamic online learning and to incorporate CBE within the program. In 2015, the GIP began the work of revising all program courses to meet best practices of online learning and mapping program curricula to the national Gerontology Competencies for Undergraduate and Graduate Education (AGHE, 2014).

Course revisions were completed in 2017. We then applied for and received official UOnline Program status and accreditation as a fully online program through the Northwest Commission on Colleges and Universities (2020). The University is also a member of the National Council for State Authorization Reciprocity Agreements (NC-SARA), which reduces the number of individual state regulations that must be continually monitored, making the authorization process more efficient. Through NC-SARA, the GIP is able to offer and expand educational opportunities to students both in and out of the state of Utah (National Council for State Authorization Reciprocity Agreements, 2020). In 2017, we were also awarded Program of Merit (POM) status from AGHE at the master's degree level. The process of curriculum review, competency mapping, and online revision and planning facilitated our application, review, and award of the POM.

Course revision and development followed a model that incorporated best practices in teaching pedagogy and online learning. These practices incorporated Fink's (2003) approach to designing college courses, using the DREAM exercise, situational factors exercise, course alignment grid, and taxonomy of significant learning. A backward design approach (Wiggins & McTighe, 2005) helped faculty begin with competencies and learning objectives, followed by identifying assessments to measure those objectives. Bloom's (1984) taxonomy was used to design assessments that accurately evaluate the learning experiences, and active learning principles (Bonwell & Eison, 1991; Prince, 2004) guided choices to facilitate dynamic online learning. Instructional designers met individually with instructors to work through, enhance, and redesign courses.

Upon completion, the program continued to assess student learning using individual course assessments, grades, progress towards graduation, annual and exit student interviews, and alumni surveys. However, we wondered about the student experience and reaction to the changes before and after revision of the entire curriculum. As this process spanned four years and many courses, we became interested in whether existing data might offer a better understanding of the student experience pre- compared to post-revision.

The purpose of this paper is to compare student course evaluations from six core courses of the Master of Science in Gerontology program before and after alignment with the AGHE competencies and online design changes. The objective of this study is to analyze pre- and post-revision qualitative and quantitative student evaluations in order to assess indicators of program quality and improvement. We hypothesize that course evaluations will improve from pre- to post-revision. Testing of this hypothesis occurred through two aims:

Aim 1: Assess the changes in numerical course ratings provided by students pre- to post-course revision.

Aim 2: Assess the changes, pre- to post-course revision, in student open-ended feedback submitted with course evaluations.

Methods

Course Selection

For the purpose of the current study, we compared de-identified, anonymous student course evaluations in six of our Master of Science core courses before and after the course revision and alignment. The six core courses required in our Master of Science program are: 1) GERON 5001/6001: Introduction to Aging, 2) GERON 5370/6370: Health and Optimal Aging, 3) GERON 5002/6002: Services Agencies and Programs for Older Adults, 4) GERON 5500/6500: Social and Public Policy in Aging, 5) GERON 5604/6604: Physiology and Psychology of Aging, and 6) GERON 5003/6003: Research Methods in Aging (Note: 5000- and 6000-level courses are considered graduate level by the University of Utah). Two additional core courses, GERON 5990/6990: Gerontology Practicum and GERON 6970/6975: Gerontology Thesis/Project, were omitted from this study because they were newly created in an online format, are mentor-based (one instructor to one or two students), and do not receive evaluations due to the small course size.

These six core courses underwent significant redesign across three consecutive semesters. Each instructor worked one-on-one with an instructional designer provided through the UOnline grant mechanism. Instructional designers associated with the University of Utah's TLT aided course instructors in updating their courses with the latest technological media to provide online content in innovative and effective ways.

Course Evaluations

Faculty were guided in assessing and revising courses through the use of the AGHE competencies (2014) and Fink's (2003) and Bloom's (1984) taxonomies. The AGHE competencies were first mapped across all gerontology courses, identifying redundancy, overlap, and missing content; this process is described in detail in Dassel et al. (2019). Faculty noted recommended revisions based on the competencies, specific to objectives and modified content, and incorporated these as they worked with instructional designers on their assigned courses. Next, instructors used the framework of the taxonomies to redesign the student learning experience for an active online format. Fink's taxonomy is a non-hierarchical model that defines six major domains that need to be present for a complete learning experience: foundational knowledge, application, integration, human dimensions, caring, and learning to learn (Fink, 2003). Bloom's taxonomy, revised posthumously by a group of cognitive psychologists in 2001, is a hierarchical model that defines and distinguishes six categories of learning (Bloom, 1984; Anderson & Krathwohl, 2001). Bloom's six categories, each intended to be individually mastered before moving to the next, are remember, understand, apply, analyze, evaluate, and create. These designations allow accompanying assessments to be designed to accurately evaluate the learning experience by level.

A request to analyze student course evaluations was submitted to and reviewed by the Institutional Review Board (IRB) at the University of Utah. The IRB determined that oversight was not required, as this work does not meet the definition of Human Subjects Research. All student evaluations are completed anonymously and are used as a quality improvement tool to assess course outcomes and faculty instruction. In order to obtain a representative sample of student evaluations, we assessed evaluations from the two consecutive semesters immediately prior to the course revision and the two consecutive semesters immediately following the course revision.

Course evaluations were emailed to students during the last month of the semester, and students were asked to complete the anonymous evaluations voluntarily. The data, consisting of numerical scaled response options and open-ended comment sections, were summarized and provided to course instructors at the end of the semester once grades had been submitted. From the full list of course evaluation questions, we selected the 10 quantitative questions that we felt were most relevant to course revision: 1) Overall course evaluation, 2) The course objectives were clearly stated, 3) The course objectives were met, 4) The course content was well organized, 5) The course materials were helpful in meeting course objectives, 6) Assignments and exams reflected what was covered in the course, 7) I learned a great deal in this course, 8) As a result of this course, my interest in the subject increased, 9) Course requirements and grading criteria were clear, and 10) I gained an excellent understanding of concepts in this field. Response options were based on a 6-point Likert scale ranging from 1 = strongly disagree to 6 = strongly agree. Open-ended questions asked students to comment on: 1) course effectiveness, 2) the online components of the course, and 3) comments intended for the instructor.
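As a rough illustration of how this kind of evaluation data could be summarized, the minimal sketch below computes mean ratings per course, semester, and question. The file name and column names are hypothetical; the actual summaries were produced by the university's evaluation system.

```python
import pandas as pd

# Each row is one anonymous response: course, semester, question, rating (1-6).
# "course_evaluations.csv" and these column names are illustrative only.
responses = pd.read_csv("course_evaluations.csv")

# Mean rating and response count per course, semester, and question --
# the kind of summary provided to instructors after grades are submitted.
summary = (
    responses.groupby(["course", "semester", "question"])["rating"]
    .agg(["mean", "count"])
    .reset_index()
)
print(summary.head())
```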

Data Analysis

Data analysis occurred in two phases. Phase one focused on quantitative data from the course evaluations. Pre- and post-revision data were aggregated for each course. Because students do not take a course multiple times, analyzing pre- to post-revision data by individual student is impossible. Rather than focus on the individual student as the unit of analysis, we therefore assessed pre- and post-revision evaluations using the course as the unit of analysis. The mean of each sample was calculated for each of the course evaluation questions (e.g., overall course rating, course objectives) as a proxy for evaluating the effectiveness of curriculum revision and course mapping. We used univariate statistics to describe frequencies and mean responses for each evaluation question. Paired-samples t-tests were conducted on the course means to examine score changes from pre- to post-course revision. Each course was compared separately, and data were then pooled across all courses to assess program change over time (a simplified sketch of this test appears below).

For the qualitative portion of this study, we compiled and organized all of the open-ended student responses from the course evaluations by course and semester. Data were uploaded into NVivo (QSR International, 2018) and assessed in a two-phase process. First, each comment was read and coded into four a priori codes: 1) pre-commendations, 2) pre-recommendations, 3) post-commendations, and 4) post-recommendations. The second phase of coding used thematic analysis to assess the main themes presented by students (Saldaña, 2009). This allowed us to assess potential changes in student thoughts from pre- to post-revision.
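The following is a minimal sketch of the paired-samples t-test described above, using Python's scipy library. The per-course mean values shown are hypothetical placeholders, not data from this study.

```python
from scipy import stats

# Hypothetical per-course mean scores for one evaluation question,
# aggregated over the two pre- and the two post-revision semesters.
pre_means = [5.1, 4.8, 5.3, 4.9, 5.0]
post_means = [5.3, 5.0, 5.2, 5.1, 5.4]

# Paired-samples t-test with the course, not the student, as the unit of
# analysis; each pre/post pair belongs to the same course.
t_stat, p_value = stats.ttest_rel(post_means, pre_means)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```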

Results

Data are anonymous, and demographics were not gathered as part of student evaluations. However, we do have a general picture of student demographics within the GIP. During a recent fall semester, we had 189 unique students enrolled in gerontology courses. Students represented 6 master's degree programs and 3 doctorate programs, with 9 students undeclared and 4 nonmatriculated. The average age of students was 29; 137 were female (72.5%), 50 male (26.5%), and 2 unknown (1.05%). The majority of students were white (67.72%), with others identifying as Hispanic/Latino (13.76%), Asian (7.40%), unknown ethnicity (4.23%), multi-racial (3.70%), international (1.59%), Black/African American (1.05%), and Native Hawaiian or other Pacific Islander (0.53%).

A summary of the t-test results is found in Table 1. Some data were unavailable due to too few responses. One course, GERON 5500/6500, did not have sufficient data for analysis (fewer than two observations per class), as it was a newly developed course and lacked pre-revision data; this course was retained in the overall pre- to post-revision comparison. Paired t-tests comparing overall course ratings pre- and post-revision revealed a trend toward improvement in GERON 5001/6001: Introduction to Aging (t = 4.09; p = .05). Examination of aggregate data from all courses on individual evaluation questions showed trends toward improvement in two areas: 1) "The course objectives were met" (t = 1.47; p = .09), and 2) "I learned a great deal in this course" (t = 1.36; p = .09). There were no statistically significant differences on overall or individual course evaluation questions from pre- to post-revision.

Table 1. Assessment of Course Evaluation Questions Pre- to Post-Revision

Open-Ended Student Comments

Qualitative analysis summarized both the overall numbers of commendations and recommendations and the content of comments to assess change from pre- to post-revision. A total of 298 codes were documented pre-revision (see Table 2). Of these, 71% were commendations, focusing on positive feedback about course content, online teaching, and instructor efficacy. Comments focusing on recommendations for change comprised 29% of the total pre-revision codes. These recommendations centered on issues with course content, technology, and instruction, and included both negative reviews and constructive ideas for change. Post-revision comments were coded 257 times, of which 73% were commendations and 27% recommendations (Table 2). The proportions are very similar, indicating that the overall balance of positive and negative comments changed little from pre- to post-revision.

Table 2. Overall Pre to Post Coding of Course Evaluation Qualitative Comments

The second phase of qualitative analysis assessed the content of the comments to understand the topics of focus pre- versus post-revision. Student comments were evaluated for each course; pre-revision comments were analyzed first, followed by the post-revision comments. After identifying themes within the pre-revision comments, a summary was written of the main ideas. Post-revision comments for the same course were then read and coded, and a summary was written of the main post-revision themes. Representative quotes were included in each summary to present examples of themes. The pre- and post-revision summaries were then compared for each course, and any major thematic changes were noted in a final course comparison summary. Once this process was complete for each course, all course comparison summaries were re-read and coded for similarities and differences across the group of courses (a simplified sketch of this comparative logic appears after Table 3). Table 3 includes a summary of each course, including representative quotes.

Table 3. Analysis of Student Comments by Class
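As a rough illustration of the comparative logic described above, the sketch below tallies theme codes pre- versus post-revision. The actual analysis was conducted manually in NVivo with written summaries; the course label and theme names here are invented for illustration.

```python
from collections import Counter

# Hypothetical coded comments as (course, phase, theme) tuples.
coded = [
    ("GERON 6001", "pre", "course content"),
    ("GERON 6001", "post", "instructor connection"),
    ("GERON 6001", "post", "organized content"),
]

# Tally theme frequencies separately for pre- and post-revision comments.
pre_themes = Counter(theme for _, phase, theme in coded if phase == "pre")
post_themes = Counter(theme for _, phase, theme in coded if phase == "post")

# Report how often each theme appears before versus after revision;
# themes whose counts shift flag candidate changes in student focus.
for theme in sorted(set(pre_themes) | set(post_themes)):
    print(f"{theme}: {pre_themes[theme]} pre -> {post_themes[theme]} post")
```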

Summary of Qualitative Comments Pre to Post Revision

The following summarizes overall findings from qualitative analysis of student open-ended course evaluation comments. Student comments increased in two main areas post-revision when compared to pre-revision: 1) connection to the instructor, and 2) organized content.

Connection to the Instructor. Students expressed not wanting all of the extra technological features integrated into courses, such as screen- and video-recorded PowerPoint lectures, interactive quizzes, and movie creation apps. A wide variety of apps (e.g., Flipgrid, Lucidchart, Pathbrite) led to confusion and overwhelmed students. However, students emphasized the importance of technology in helping them maintain a connection with the instructor. For example, one student stated, “I especially liked the introduction videos before each module because it felt like the instructor was in constant communication with the class.” The adoption of video was particularly useful in helping students feel this connection.

Organized Content. Comments emphasized the importance of balancing assignments, content, and the amount of work. Students noted that spreading assignments throughout the semester helped them disperse their stress; this was most often mentioned when a course had multiple assignments due the last week of the semester. One student commented, “Assign one of the larger projects to be due at mid-term, to space out the stress.” Students value learning, and in an online environment this requires incorporating moments of accountability to help students interact with the content. Students emphasized wanting these opportunities for accountability, and when a course lacked them, students acknowledged their own reduced interaction with the course: “I have mixed feelings about the assignments. On the one hand, I feel that the small amount of assignments was nice, but also allowed for me to be less involved in the course than perhaps I should have.”

Discussion

In this mixed-methods, multi-year study examining student evaluations from pre- to post-course revision, quantitative analysis did not produce statistically significant differences in mean course evaluation scores. This may be attributed to the small sample size, the use of aggregate data rather than individual data points, missing data, and little variation in scores, with most courses receiving high mean scores.

The qualitative analysis of student evaluations yielded useful information. We found that students value technology that augments their connection to the instructor and course organization. Some students do not want all the extra features that come with a wide variety of technology (e.g., external sites to create blogs, mini podcasts, video creation). Students noticed video introductions, video lectures, and video summaries, often stating these made them feel connected to the instructor. This aligns with the quality indicators for CBE online courses, which identify technology and navigation as one of seven recommended areas for measurement (Krause et al., 2015).

Students want to learn, and learning online necessitates the incorporation of one or more forms of accountability, which students themselves requested. In addition, students desire forms of accountability throughout the semester, rather than just at semester's end. The balance of assignments, content, and amount of work matters to students. Instructional design is vital to quality online courses, and accountability should be an area in which faculty and instructional designers collaborate to enhance quality in online CBE. Two quality indicators related to accountability are 1) assessment and evaluation, and 2) competence and learning activities (Krause et al., 2015).

We also observed an increase in student comments on a given topic each time a major adjustment occurred, whether pre- or post-revision. This could be an outcome of the “growing pains” of trying something new. Similar to piloting research, faculty pilot-testing teaching strategies often need student feedback to refine changes in a manner that actually works for students. Checking in with students demonstrates the quality indicator of learner support and allows faculty to assess and evaluate their course as part of quality assurance (Krause et al., 2015).

The information obtained from this study is relevant to course and program quality improvement. Strengths include the mixed-methods format and multi-year analysis. One limitation is that the data did not allow for pre- and post-revision comparisons from the same students, as students cannot be required to take a course twice. In some cases, there were not sufficient data for analysis, as t-tests require at least two observations per class (e.g., GERON 5500/6500). This insufficiency was attributed to new course development and to changes in the student evaluation questions that occurred across the University of Utah, which meant that questions differed from pre- to post-revision for some courses. In addition, conducting a technology revision simultaneously with competency revisions makes it difficult to tease out changes due to course format versus curriculum. Instructors need to remind students which competencies are being covered and how students will be expected to interact with this content during the course; clear learning outcomes and student comprehension of the proficiencies they are working on enhance CBE (Burnette, 2016).

Mapping the entire GIP curriculum to the AGHE competency guidelines (Dassel et al., 2019) prepared us to apply for and receive Program of Merit designation through AGHE. This Program of Merit status has provided the foundation for a future application for accreditation through the Accreditation for Gerontology Education Council (AGEC), which requires that programs under review align with the AGHE Gerontology Competencies for Undergraduate and Graduate Education (2014). Students from all health science disciplines participate in the undergraduate and graduate level certificates available through our program.

Programs should build on CBE by developing measures to assess student achievement of competencies; this process can be used to improve the quality of the student learning experience (Damron-Rodriguez et al., 2019; McClarty & Gaertner, 2015). Our program is developing a tool that will allow faculty to assess program learning outcomes and AGHE competencies within each class. Data will be gathered every three years and will be used to track progress at both the course and program levels. Tools such as this can be shared in an effort to develop tool-kits for other gerontology programs to build quality models of competency-based education (Damron-Rodriguez et al., 2019). Our goal is to enhance the ability of graduates to demonstrate the competencies and skills they have gained through high quality gerontology education as they work with employers and older adults. We will enhance our approach to CBE by assessing the paths alumni take and their use of competencies to communicate their knowledge, skills, and contributions within the workforce. Advancing CBE in gerontology needs to happen through organizational leadership (Damron-Rodriguez et al., 2019). Our program benefits from being housed within a College of Nursing that follows a CBE model and process for accreditation; we can learn from its processes of documentation, tracking, assessment, and quality improvement to enhance the rigor of our approach to CBE in gerontology programs. Finally, we plan to share our CBE strategies, assessment tools, and models with gerontology programs in the Utah State Gerontology Collaborative.

The results of this study have implications beyond the Gerontology Interdisciplinary Program to the larger Health Sciences campus where our program and college are housed. Many interprofessional health science students enroll in our courses. Thus, improving program quality and demonstrating efficacy ultimately strengthens students’ ability to work effectively with older adults in a variety of settings.

References

Anderson, L. W., & Krathwohl, D. R. (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom’s taxonomy of educational objectives. New York: Longman.

Applebaum, R. & Leek, J. (2008). Bridging the academic/practice gap in gerontology and geriatrics: Mapping a route to mutual success. Annual Review of Gerontology and Geriatrics, 28, 131-148. doi: 10.1891/0198-8794.28.131

Association for Gerontology in Higher Education [AGHE] (2014). Gerontology competencies for undergraduate and graduate education. Washington, DC: Association for Gerontology in Higher Education. Retrieved from: https://www.geron.org/images/gsa/AGHE/gerontology_competencies.pdf

Bloom, B. S. (1984). Taxonomy of educational objectives: The classification of educational goals. New York: Longman.

Bonwell, C. C., & Eison, J. A. (1991). Active learning: Creating excitement in the classroom. ASHE-ERIC Higher Education Report. Washington, DC: School of Education and Human Development, George Washington University.

Burnette, D. M. (2016). The renewal of competency-based education: A review of the literature. The Journal of Continuing Higher Education, 64, 84-93. doi: 10.1080/07377363.2016.1177704

Council of Regional Accrediting Commissions [C-RAC]. (2015, June 2). Framework for competency-based education [Press release]. Retrieved from https://download.hlcommission.org/C-RAC_CBE_Statement_6_2_2015.pdf

Damron-Rodriguez, J., Frank, J. C., Maiden, R. J., Abushakrah, J., Jukema, J. S., Pianosi, B., & Sterns, H. L. (2019). Gerontology competencies: Construction, consensus and contribution. Gerontology & Geriatrics Education, 40(4), 409-431. doi: 10.1080/02701960.2019.1647835

Dassel, K., Eaton, J., & Felsted, K. (2019). Navigating the future of gerontology education: Curriculum mapping to the AGHE competencies. Gerontology & Geriatrics Education, 40(1), 132-138.

Fink, L.D. (2003) Creating significant learning experiences: An integrated approach to designing college courses. San Francisco: Jossey‐Bass.

Krause, J., Dias, L. P., & Schedler, C. (2015). Competency-based education: A framework for measuring quality courses. Online Journal of Distance Learning Administration, 18(1). Retrieved from https://www.westga.edu/~distance/ojdla/spring181/krause_dias_schedler181.html

McClarty, K. L. & Gaertner, M. N. (2015). Measuring mastery: Best practices for assessment in competency-based education. AEI Series on Competency-Based Higher Education. Washington, DC: Center on Higher Education Reform & American Enterprise Institute for Public Policy Research.

Means, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K. (2010). Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies. U.S. Dept. of Education, Office of Planning, Evaluation and Policy Development, Policy and Program Studies Service website. Retrieved from https://www2.ed.gov/rschstat/eval/tech/evidence-based-practices/finalreport.pdf

National Council for State Authorization Reciprocity Agreements [NC-SARA]. (2020). About NC-SARA. Retrieved from https://nc-sara.org/about-nc-sara

Northwest Commission on Colleges and Universities. (2020). Accreditation. Retrieved from https://www.nwccu.org/accreditation%20/

Prince, M. (2004). Does active learning work? A review of the research. Journal of Engineering Education, 93(3), 223-231.

QSR International Pty Ltd. (2018). NVivo qualitative data analysis software (version 12) [Software]. Retrieved from https://www.qsrinternational.com/nvivo-qualitative-data-analysis-software/home. Accessed May 17, 2020.

Saldaña, J. (2009). The coding manual for qualitative researchers. Thousand Oaks, CA: SAGE.

Wendt, P. F., Peterson, D. A., & Douglass, E. B. (1993). Core principles and outcomes of gerontology, geriatrics, and aging studies instruction. Washington, DC: Association for Gerontology in Higher Education and the University of Southern California.

Wiggins, G.P., & McTighe, J. (2005). Understanding by design. (2nd Ed.).  Alexandria, VA: Association for Supervision and Curriculum Development.

Woldeab, D., Yawson, R. M., & Osafo, E. (2020). A systematic meta-analytic review of thinking beyond the comparison of online versus traditional learning. E-Journal of Business Education & Scholarship of Teaching, 14(1), 1-24.




The Influence of Revising an Online Gerontology Program on the Student Experience by Jacqueline Eaton, PhD, Kara Dassel, PhD & Katarina Friberg Felsted, PhD