By Stefanie Panke
Editor, Social Software in Education
Dr. Curtis P. Ho is professor and former chair of the Educational Technology department at the University of Hawai’i at Manoa. His research interests comprise collaborative learning through distance education, instructional design strategies for teaching online, instructional technology standards in teacher education, and the integration of technology into the curriculum. Dr. Ho serves on the executive committee for the AACE E-Learn conference (forthcoming in October 2013), where we co-chair a new Special Interest Group on Designing, Developing and Assessing E-Learning. Since we have not yet met in person, I asked Curtis to comment on six statements related to assessment.
Statement 1: Authentic assessment is the “silver bullet” for deep, transfer-oriented learning – if only we knew how to do it right.
Curtis: Yes, I like the term “transfer-oriented learning” to define how we need to shape assessment. This is the gold standard for learning outcomes. After all, enhancing retention and transfer is the ninth of Robert Gagné’s Nine Events of Instruction. The challenge will be to create and implement authentic learning in an online course. How authentic can learning be if we are confining it to a 15-week semester at a distance?
Stefanie: I find David Jonassen’s work on problem solving (i.e., Jonassen 2011) a great starting point to think about the instructional design of assessment.
I am particularly interested in the design of assessment that fosters mastery orientation and offers gratification to performance-oriented learners. How can we make students want to improve and push themselves, and give them opportunities to shine and prove what they can do?
Statement 2: Assessment in online learning needs to move beyond multiple-choice quizzes in PowerPoint modules.
Curtis: I would generally agree. The ideal is to have authentic assessment at all times. However, multiple-choice quizzes may be useful in reinforcing short-term learning, and I see a role for them as self-checks or practice. They can also scaffold lower-level learning.
Stefanie: I am currently enrolled in the MOOC “Metadata: Organizing and Discovering Information” by Dr. Jeffrey Pomerantz from the University of North Carolina at Chapel Hill. It includes some great examples of non-multiple-choice assignments that work in an online environment. Mostly, these involve finding snippets of information in catalogues or providing meta-information in conformity with Dublin Core. This made me think of the WebQuest concept and other inquiry-based methods.
Statement 3: When we expect students to learn collaboratively, we should not grade them individually.
Curtis: Again, I would generally agree. In collaborative learning, the group should be accountable for its performance as a group. Of course, there is often a question of fairness, especially if not everyone contributes equally to the project. I have experimented with using peer evaluation as a way for group members to hold each other accountable, with mixed results.
Stefanie: How we assess group work, peer feedback and discussions needs to go beyond quantitative measures like “post two comments to the board by next Tuesday.” It should include creative as well as reflective tasks that center around problem solving.
Statement 4: Badges are an important trend that will impact assessment – in particular for open educational practices.
Curtis: Badging is definitely a trend we are seeing in recognizing and verifying learning, especially in professional development. I am curious how it will impact assessment in formal learning.
Stefanie: I fear that badges can actually distract from learning if they are offered and collected like Halloween candy. However, I really like how they introduce game-like incentives into the learning process. I have experienced this myself on the platform OpenStudy – I was thrilled to unlock the next “level” by answering questions. Without offering an actual, tangible reward, the environment is able to create a flow experience. Obviously, there is also the “professional development” aspect to badges. They are a nice way to certify that someone participated in a course, conference or learning activity.
Statement 5: Portfolios are a natural fit for assessing networked learners in their personal learning environments.
Curtis: I think portfolios are absolutely essential in assessing networked learners, especially when applying authentic learning. Collaborative projects, case studies, and reflections are some of the portfolio artifacts that can be used effectively in assessing learning.
Stefanie: I think it is important to distinguish between the marketing-oriented “look-at-me” portfolio and the learning portfolio that includes metacognitive reflection. Oftentimes we tend to “showcase” our products instead of reflecting on the journey. I am currently working with the Carolina MPA program on the design of a portfolio class. It is very challenging and interesting, and I am looking forward to learning from the students.
Statement 6: The instructional design of assessment rubrics is an under-researched topic.
Curtis: Interestingly, this is true. Assessment is part of the instructional design process, but I don’t think much thought has been given to how rubrics are designed and how effective they are in measuring outcomes.
Stefanie: Well-designed rubrics give guidance to formal and informal learners and scaffold the learning process. Poorly designed rubrics lead to robotic, checklist-oriented output. We need more research to distinguish the former from the latter.
Making the complex clear, turning difficult into intriguing, raising intellectual curiosity, supporting creative problem solving, posing authentic challenges, building effective scaffolds – contemporary e-learning goes beyond the latest tools and technologies. The special interest group “Designing, Developing, and Assessing E-Learning” aims to bring together researchers and practitioners to debate and collaborate on a variety of instructional design themes.
Other special interest groups launched at this year’s E-Learn conference are “E-Learning in Developing Countries” and “E-Learning Trends and Innovations.” The special interest groups will promote activities for future E-Learn conferences, including suggestions for invited speakers and symposia. We hope they will provide a marketplace for joint research projects and plan to offer publication opportunities in the form of special issues and edited volumes.
Apart from the inaugural meetings of the three special interest groups, AACE E-Learn 2013 features a first-rate symposium on “MOOCs and Open Education Around the World.” Viva Las Vegas!
Filed under: Instructional Design