We need new quality guidelines for career-focused, competency-based, technology-forward online programs. For example, the Quality Matters Higher Education Rubric includes eight General Standards and 43 Specific Review Standards to evaluate the design of online and blended courses, specifically addressing objectives, assessment, instructional materials, course activities, learner interaction, and course technologies (Quality Matters, 2015).
However, in a world of quickly evolving jobs, where industries have made entire professions obsolete and created demand for new knowledge, skills, and abilities, additional tools and evaluations are needed. Disruptive technologies and practices are also having a profound effect, which necessitates a flexible workforce that can be quickly retrained.
Further, because online learning correlates with team-based collaboration and distributed workplaces, delivery options are also critical. Learning analytics, which include quality assessments, must now address a fairly wide range of programmatic attributes that are not addressed in more traditional instruments such as the OLC Scorecard or QM's Rubric.
Interestingly, there has been a renewed emphasis on education provided by professional societies in addition to colleges and universities. Part of the impetus has been major shifts in the student population and their reasons for pursuing education. Further, there have been major changes in higher education, as for-profit providers and institutions with high student loan default rates have come under fire.
Finally, while online programs have been in place for 20 years, the constant development of new mobile technologies, along with the expansion of high-speed internet and Wi-Fi networks, has profoundly altered the way that learners pull information, interact with others, and participate in knowledge sharing. Further, it has changed how learners can approach content that requires problem-solving, creative solutions, collaboration, and hands-on projects. A renewed focus on outcomes, together with collaborative, mobile, "information pull" (rather than "data push") approaches, has profoundly affected the learning process.
Learning Analytics
Learning analytics, which incorporate educational data mining, process analytics, and data visualization, can be used to address some of the new concerns and focal areas in educational programs. An effective approach was employed by Scheffel et al. (2014) to analyze learning analytics for hybrid and online programs. In developing quality indicators for learning analytics, Scheffel et al. made specific assumptions about the main elements to include in an instructional program, and they also assumed that both student and instructor perceptions were uniformly valid.
In Scheffel et al.'s (2014) meta-analysis and ultimate determination of quality indicators for learning analytics, a matrix emerged with five criteria, each with four quality indicators:
- Objectives (Awareness, Reflection, Motivation, Behavioral Change)
- Learning Support (Perceived Usefulness, Recommendation, Activity Classification, Detection of Students at Risk)
- Learning Measures and Output (Comparability, Effectiveness, Efficiency, Helpfulness)
- Data Aspects (Transparency, Data Standards, Data Ownership, Privacy)
- Organizational Aspects (Availability, Implementation, Training of Educational Stakeholders, Organizational Change)
Scheffel et al.'s work is at an early stage; the next step will be to apply the criteria and quality indicators to application-focused educational programs.
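To make the "Detection of Students at Risk" indicator concrete, the rule below is a minimal sketch only: the field names, thresholds, and flagging logic are illustrative assumptions, not drawn from Scheffel et al. (2014).

```python
# Hypothetical sketch of a "students at risk" detector.
# Field names and thresholds are illustrative assumptions only.

def at_risk(students, min_logins=5, min_avg_score=60.0):
    """Flag students whose weekly LMS logins or average assessment
    scores fall below the (illustrative) thresholds."""
    flagged = []
    for s in students:
        if s["weekly_logins"] < min_logins or s["avg_score"] < min_avg_score:
            flagged.append(s["name"])
    return flagged

records = [
    {"name": "Avery", "weekly_logins": 2, "avg_score": 71.0},  # low activity
    {"name": "Blake", "weekly_logins": 9, "avg_score": 88.5},  # on track
    {"name": "Casey", "weekly_logins": 7, "avg_score": 54.0},  # low scores
]
print(at_risk(records))  # ['Avery', 'Casey']
```

In practice, such rules would be calibrated against institutional data and validated by instructors, which is exactly the kind of quality check the Scheffel et al. indicators call for.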
Student-Driven Metrics: Return on Investment (ROI)
With the increasing cost of education, combined with the profound economic changes that occurred in the years after 2007-2008, learners have come to expect a positive return on investment (ROI) from their education.
However, there is no clear consensus on how to measure an educational ROI, particularly across disciplines.
- Job-focused, competency-based education (ROI on the investment in education)
- Technology for applied knowledge (mobile / collaborative)
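One possible formulation, offered here only as a sketch of the measurement problem rather than a consensus metric, treats education like any other investment: net gain over cost across a fixed horizon. The dollar figures and five-year horizon are hypothetical.

```python
def education_roi(total_cost, annual_salary_gain, years=5):
    """One illustrative ROI formulation: (gain - cost) / cost over a
    fixed horizon. Inputs and horizon are hypothetical assumptions."""
    gain = annual_salary_gain * years
    return (gain - total_cost) / total_cost

# e.g., a $30,000 program yielding a $10,000/yr salary gain over 5 years
roi = education_roi(30_000, 10_000, years=5)
print(f"{roi:.0%}")  # 67%
```

Even this simple formula shows why consensus is elusive: the result depends heavily on the chosen horizon, and it ignores discipline-specific factors such as opportunity cost and non-salary benefits.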
New Instructional Strategy Focal Points and Areas for Quality Assessment
Technological advances in mobile devices, along with enhanced infrastructure, have created the need for ubiquitous access to cloud-based assets. While it may not yet be possible to achieve universal and continuous access to the cloud, an increasing number of cloud-hosted applications facilitate constant updating of information, as well as collaboration and information sharing. These applications often form the cornerstone of enhanced learning opportunities for professional development and competency-building for new jobs.
Additional focal points for quality assessment include:
*E-texts with collaborative capability: cloud-based access to e-texts, with a focus on collaborative annotation and instructor guidance. Both the relevance of the texts and the robustness of the collaborative capability should be assessed.
*Applications: mobile applications that facilitate information sharing. How effectively are the applications being used? Do they facilitate the achievement of outcomes? Some applications foster engagement and deeper learning through immediate feedback (Kovach et al., 2012).
*Learning Management System transition: more organizations are using a "light" version of an LMS and focusing on content management in the cloud.
*Collaboration: competency-based education often requires teamwork, so educational and training programs should include a capstone as well as collaborative activities that reflect the types of activity learners will need to perform in professional and career settings (Huss et al., 2015).
*Engagement: students who desire enhanced access to employment opportunities, as well as the chance to diversify and expand their abilities, quickly lose interest if their coursework seems irrelevant, outdated, or disconnected from the marketplace.
*Persistence: persistence is tied to engagement as well as motivation. Persistence (course completion) is critical, particularly in a context where education is expensive and industries are transitioning, requiring workforces to retool.
*Career competencies: one clear measure of quality (and of relevance and utility for students) is competencies. Competency rubrics differ based on overall goals and outcomes. The development and validation of competency models has been particularly impressive in the healthcare field (Garman & Scribner, 2011).
Single-course competencies: often developed in response to compliance needs; they require an assessment at the end of the course.
Competency clusters: often tied to career paths, especially those being disrupted by new technologies or contexts; they involve multiple courses, each of which includes an assessment, and there is often a summative assessment at the end (Boahin et al., 2014).
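The distinction between a single-course competency and a competency cluster can be sketched as a simple pass rule: a cluster is met only when every per-course assessment and the final summative assessment clear a cutoff. The cutoffs and structure below are hypothetical, not taken from Boahin et al. (2014).

```python
def cluster_passed(course_scores, summative_score,
                   course_cutoff=70, summative_cutoff=75):
    """A competency cluster is met only if every course assessment
    AND the summative assessment clear their (illustrative) cutoffs."""
    return (all(score >= course_cutoff for score in course_scores)
            and summative_score >= summative_cutoff)

print(cluster_passed([82, 74, 91], summative_score=80))  # True
print(cluster_passed([82, 65, 91], summative_score=80))  # False (one course below cutoff)
```

A single-course competency would be the degenerate case: one score, one cutoff, no summative component.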
*Integrated / multi-disciplinary capstones and/or supervised practice and internships: education programs that claim to place their graduates on a viable career path generally require a problem-based capstone that is often multi-disciplinary and integrative. Internships and supervised practice are also often required (McKnight, 2013).
*Project-based / task-based outcomes: seamless incorporation of prior learning and experiential learning is highly desirable in career-focused professions and higher education. A project-based activity should therefore require a literature review, analysis of a problem, creative problem-solving, and an evaluation of different methods. Collaboration and teamwork are often highly desirable, particularly if the career itself involves significant teamwork (King & Spicer, 2009).
A View to the Future
It is important to continue the quality assessment processes that have been implemented with success for online and blended courses and programs. The standards remain relevant, and they allow a degree of standardization in expectations and practice.
However, gaps in assessment have emerged due to the factors discussed earlier, including a focus on careers and the need to incorporate new technologies.
Learning analytics can be used to assess new and emerging areas of instruction and to assure the validity of the quality assurance process. Assessment can be performed by means of quality assurance instruments; it can also be performed by onsite trainers and evaluators, as in ADCO's approach to oil and gas professional training (Dawoud, 2014).
References
Boahin, P., Eggink, J., & Hofman, A. (2014). Competency-based training in international perspective: Comparing the implementation processes towards the achievement of employability. Journal of Curriculum Studies, 46(6), 839-858. DOI: 10.1080/00220272.2013.812680

Chico State University (2015). Exemplary Online Instruction. http://www.csuchico.edu/eoi/

Chico State University (2015). Rubric for Online Teaching. http://www.csuchico.edu/eoi/facultyrecognition/index.shtml#/csuchico/www/roi/the_rubric

Chico State University (2015). Online Teaching and Learning Tool. http://www.csuchico.edu/eoi/

Garman, A., & Scribner, L. (2011). Leading for quality in healthcare: Development and validation of a competency model. Journal of Healthcare Management, 56(6), 373-382.

Huss, J. A., Sela, O., & Eastep, S. (2015). A case study of online instructors and their quest for greater interactivity in their courses: Overcoming the distance in distance education. Australian Journal of Teacher Education, 40(4), Article 5.

King, K. N., & Spicer, C. M. (2009). Badgers & Hoosiers: An interstate collaborative learning experience connecting MPA students in Wisconsin and Indiana. Journal of Public Affairs Education, 15(3), 349-360.

Kovach, J., Miley, M., & Ramos, M. (2012). Using online studio groups to improve writing competency: A pilot study in a quality improvement methods course. Decision Sciences Journal of Innovative Education, 10(3), 363-387.

McKnight, S. (2013). Mental health learning needs assessment: Competency-based instrument for best practice. Issues in Mental Health Nursing, 34(6), 459-471.

Online Learning Consortium (2015). Online Quality Scorecard. http://onlinelearningconsortium.org/consult/quality-scorecard/

Quality Matters (2015). Quality Matters Higher Education Rubric. https://www.qualitymatters.org/rubric

Scheffel, M., Drachsler, H., Stoyanov, S., & Specht, M. (2014). Quality indicators for learning analytics. Journal of Educational Technology & Society, 17(4), 117-132.