Posts tagged inservice
Improving Our Self-Directed Professional Development in Moodle
Moodle has become the primary vehicle for our online learning opportunities. Our Inservice portal was moved off the outdated Moodle server to the newer one currently used by teachers and students as an online classroom management tool. The district will pilot its first entirely online course for K-12 students in the near future, but we already have a number of self-directed courses available. Instructional topics include using AESOP (our online leave reporter), MyStudent (our administrative desktop application), the Video Portal for playing licensed videos (only accessible inside the district), and others. The content in most of these courses is divided into multiple sections (Beginner, Intermediate, and Advanced). Participants typically watch one or more videos showing the features of the tool, after which they must pass a quiz with a score of 80% or greater. If they pass, they receive a printable certificate recognizing their completion of the course and earn certification points that can be applied toward CACTUS re-licensure credit.
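To make the completion rule concrete, here is a minimal sketch of the pass/certificate logic described above. Only the 80% threshold comes from the course requirements; the field names and the example participant are hypothetical, and nothing here touches Moodle itself.

```python
# Sketch of the section-completion rule: pass the quiz at 80% or better
# to earn the printable certificate. Fields are illustrative placeholders.
from dataclasses import dataclass

PASSING_PERCENT = 80.0  # threshold stated in the course requirements

@dataclass
class SectionResult:
    participant: str
    course: str
    section: str          # e.g. "Beginner", "Intermediate", "Advanced"
    quiz_percent: float   # best quiz score as a percentage

    def earns_certificate(self) -> bool:
        return self.quiz_percent >= PASSING_PERCENT

result = SectionResult("jdoe", "MyStudent", "Beginner", 85.0)
if result.earns_certificate():
    print(f"{result.participant} passed {result.course} ({result.section}); issue certificate.")
```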
We also use Moodle to deliver the annual instruction, required for all district employees, on Blood Borne Pathogens, hazardous materials in the workplace, and the district’s policy on harassment and discrimination. Moodle will continue to be the chosen means to deliver these forms of online professional development.
However, I think we can do better. I question the effectiveness of these self-directed courses, for a few reasons:
- There is no measurement of preexisting skills and knowledge to use as a baseline for evaluating learning.
- The short 10-question assessment accompanying each course section does not adequately measure the effectiveness of the course. Users are free to retake the quiz as many times as they need to pass, with no penalty for retaking it.
- We have no data on the impact of these courses. From personal experience, I can vouch that many participants who passed these courses are later stumped by very basic aspects of the tools in question. An adequate instructional program should minimize these instances, but more importantly, we should be able to measure how many people are experiencing these post-training difficulties (a sketch of the kind of attempt-level data we could start collecting follows this list).
- Some of the courses haven’t been revised in over 3 years.
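As a starting point for the data problem above, here is a rough sketch of the attempt-level analysis I have in mind. It assumes a hypothetical CSV export of quiz attempts with columns user, course, attempt_number, and percent; those column names and the file name are placeholders, not an actual Moodle export format.

```python
# Summarize retakes and first-attempt pass rates from a hypothetical
# CSV export of quiz attempts (columns: user, course, attempt_number, percent).
import csv
from collections import defaultdict

def summarize_attempts(path, passing=80.0):
    attempts = defaultdict(list)  # (user, course) -> [(attempt_number, percent), ...]
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            key = (row["user"], row["course"])
            attempts[key].append((int(row["attempt_number"]), float(row["percent"])))

    for (user, course), rows in sorted(attempts.items()):
        scores = [percent for _, percent in sorted(rows)]  # in attempt order
        print(f"{user} / {course}: {len(scores) - 1} retake(s), "
              f"passed on first attempt: {scores[0] >= passing}")

summarize_attempts("quiz_attempts.csv")
```

Even something this simple would tell us how often people are retaking quizzes until they pass, which the current setup hides entirely.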
Without proper assessment, a self-directed course isn’t much different from simply posting the tutorials on WeberTube. There is value in self-directed learning, but we need to take a more active role in evaluating these courses. Participants need to be given the chance to put their knowledge into practice. Follow-up surveys could be conducted to determine whether the training affected their behavior one month, three months, or even six months down the road, and revisions to the courses should be frequent and grounded in the data gathered.
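Scheduling those follow-ups is straightforward; here is a small sketch of generating survey dates at one, three, and six months after a participant finishes a course. The completion date is a placeholder, and a month is approximated as 30 days to keep the example simple.

```python
# Generate follow-up survey dates at roughly 1, 3, and 6 months after completion.
from datetime import date, timedelta

FOLLOW_UP_MONTHS = (1, 3, 6)

def follow_up_dates(completed_on):
    return [completed_on + timedelta(days=30 * m) for m in FOLLOW_UP_MONTHS]

for d in follow_up_dates(date(2012, 1, 15)):
    print("Send impact survey on", d.isoformat())
```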
Evaluating Online Professional Development
I’ve been musing over how to revise the online professional development offerings in our district for a while. Our district is getting close to the point where we can start implementing changes that ensure meaningful learning. I’ve been studying different aspects of evaluation, namely the basics of how to conduct evaluations and collect data.
A goal-based evaluation would be ideal for our online inservice. This type of evaluation measures efficiency (the timeliness with which the learning is conducted), effectiveness (whether the participants actually learned the material), and impact (how their behavior is affected long-term). Both qualitative and quantitative measurement tools have value, and the data we gather should include both.
It’s important to understand the stakeholders involved as well. I would like to tie our online inservice to curricular standards, particularly if any online learning is extended to students and not just employees. We already allow our teachers to earn state CACTUS credit through our inservice portal, but without proper assessments I don’t think the credit awarded demonstrates actual learning.
It’s strange that we have overlooked evaluation in so much of our online professional development. It seems obvious now: we should set clear goals and objectives outlining what we wish to accomplish. Evaluation should occur every step of the way, through both formative and summative assessments. Self-directed courses should be kept to a minimum, since it is more difficult to collect formative assessment data in that venue. In directed courses, the instructor can observe how learners interact with the material and take notes. I tend to favor project-oriented learning, so I don’t necessarily prefer quiz-based summative assessments. Instead, final projects that demonstrate the material learned in the online class could be constructed and assessed with a rubric. Another assessment, perhaps conducted through observation alone, should provide a means to determine the impact of the training one, three, or six months down the road. Has the material been applied to the participant’s instructional practices? Has their behavior changed? For example, if they participated in introductory blog training, are they now actively using their blog for instruction and parent outreach?
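For the rubric-scored final project idea, here is a bare-bones sketch of how a weighted rubric total might be computed. The criteria, weights, and four-point scale are hypothetical examples for illustration, not an adopted district rubric.

```python
# Hypothetical weighted rubric for a final project; each criterion is scored 0-4.
RUBRIC = {
    # criterion: weight (weights sum to 1.0)
    "Applies tool features covered in the course": 0.4,
    "Connects the tool to classroom practice": 0.4,
    "Clarity and completeness of the project": 0.2,
}

def rubric_score(ratings):
    """Return a weighted score on a 0-4 scale for one participant's project."""
    return sum(RUBRIC[criterion] * rating for criterion, rating in ratings.items())

example = {
    "Applies tool features covered in the course": 3,
    "Connects the tool to classroom practice": 4,
    "Clarity and completeness of the project": 3,
}
print(f"Weighted project score: {rubric_score(example):.2f} / 4")
```

A rubric like this could be reused for the later observation-based impact check, which would let us compare scores over time instead of relying on a one-time quiz.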
I’m still unclear on exactly how to construct these assessments, and I struggle with framing the questions in an evaluation and knowing what to ask. I would like to get more practice determining and writing questions that lead to clear process descriptions and goal statements.