This article is about assessment in peer learning, and it is also an exercise in assessment, as we put our strategy into practice by evaluating the Peeragogy Handbook itself.
Adapting strategies for learning assessment to the peer-learning context
In "Effective Grading: A Tool for Learning and Assessment," Barbara E. Walvoord and Virginia Johnson Anderson have outlined an approach to grading. They address three questions:
Who needs to know, and why?
Which data are collected?
How does the assessment body analyze data and present findings?
The authors suggest that institutions, departments, and assessment committees should begin with these simple questions and work from them towards anything more complex. These simple questions provide a way to understand -- and assess -- any strategy for assessment! For example, consider "formative assessment" (in other words, keeping track of how things are going). In this context, the answers to the questions above would be:
Teachers need to know about the way students are thinking about their work, so they can deliver better teaching.
Teachers gather a lot of these details on learning activities by "listening over the shoulders" of students.
Teachers apply analysis techniques that come from their training or experience -- and they do not necessarily present their assessments to students directly, but rather feed them back in the form of improved teaching.
This is very much a "teacher knows best" model! In order to do something like formative assessment among peers, we would have to make quite a few adjustments.
At least some of the project participants would have to know how other participants are thinking about their work, as well as analyze their own progress. We are then able to "deliver better teaching" and work together to problem-solve when difficulties arise.
It may be most convenient for each participant to take on a share of the work (e.g. by maintaining a "learning journal" which might be shared with other participants). This imposes a certain overhead, but as we remarked elsewhere, "meta-learning is a font of knowledge!" Beyond persistent self-reflection, details about others' learning can sometimes be gleaned from their contributions to the project ("learning analytics" is a whole topic unto itself).
If a participant in a "learning project" is bored, frustrated, feeling closed-minded, or for whatever other reason "not learning," then there is clearly a problem. But for whom? For the person who isn't learning? For the collective as a whole? We may not have to ponder this conundrum for long: if we go back to the idea that "learning is adaptation," someone who is not learning in a given context will likely leave and find another context where they can learn more.
This is but one example of an assessment strategy: in addition to "formative assessment", "diagnostic" and "summative" strategies are also quite popular in mainstream education. The main purpose of this section has been to show that when the familiar roles from formal education devolve "to the people," the way assessment looks can change a lot. In the following section, we offer and begin to implement an assessment strategy for evaluating the peeragogy project as a whole.
Case study in peeragogical evaluation: the Peeragogy project itself
We can evaluate this project partly in terms of its main "deliverable," the Peeragogy Handbook (which you are now reading). In particular, we can ask: Is this handbook useful for its intended audience? If so, in what ways? If not, how can we adapt? The "intended audience" could potentially include anyone who is participating in a peer learning project, or who is thinking about starting one. We can also evaluate the learning experience that the co-creators of this handbook have had. Has working on this book been a useful experience for those involved? These are two very different questions, with two different targets for analysis -- though the book's co-creators are also part of the "intended audience". Indeed, we might start by asking "how has working on this book been useful for us?"
A methodological interlude: "Follow the money"
The metrics for learning in corporations are business metrics based on financial data. Managers want to know: Has the learning experience enhanced the workers' productivity? When people ask about the ROI of informal learning, ask them how they measure the ROI of formal learning. Test scores, grades, self-evaluations, attendance, and certifications prove nothing. The ROI of any form of learning is the value of changes in behavior divided by the cost of inducing the change. Like the tree falling over in the forest with no one to hear it, if there's no change in behavior over the long haul, no learning took place. ROI is in the mind of the beholder, in this case, the sponsor of the learning who is going to decide whether or not to continue investing. Because the figure involves judgment, it's never going to be accurate to the first decimal place. Fortunately, it doesn't have to be. Ballpark numbers are solid enough for making decisions.
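To make the arithmetic concrete, here is a minimal sketch in Python of the ROI formula just described: the value of changes in behavior divided by the cost of inducing the change. The figures are entirely hypothetical, in keeping with the point that ballpark numbers are solid enough for making decisions.

```python
def learning_roi(value_of_behavior_change: float, cost_of_change: float) -> float:
    """ROI of learning: value of changed behavior / cost of inducing the change."""
    return value_of_behavior_change / cost_of_change

# Hypothetical ballpark figures for one learning initiative (both in dollars):
value = 120_000.0  # e.g. the sponsor's estimate of annual productivity gains
cost = 45_000.0    # e.g. facilitation time, tooling, and participant hours

print(f"ROI: {learning_roi(value, cost):.1f}x")  # -> ROI: 2.7x
```

Whether 2.7x counts as "worthy of reinvestment" is exactly the kind of threshold the sponsor should agree to before the investment is made, as described next.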
The process begins before the investment is made. What degree of change will the sponsor accept as worthy of reinvestment? How are we going to measure that? What's an adequate level of change? What's so low that we'll have to adopt a different approach? How much of the change can we attribute to learning? You need to gain agreement on these things beforehand; Monday-morning quarterbacking is not credible.

It's counterproductive to assess learning immediately after it occurs. You can see whether people are engaged or whether they're complaining about getting lost, but you cannot assess what sticks until the forgetting curve has ravaged the learners' memories for a few months. Interest helps, but it doesn't guarantee results: without reinforcement, people forget most of what they learn in short order.

It's beguiling to try to correlate the impact of learning with existing financial metrics like increased revenues or better customer service scores. Done on its own, this approach rarely works, because learning is but one of many factors that influence results, even in the business world. Was today's success due to learning, or the ad campaign, or weak competition, or the sales contest, or something else?

The best way to assess how people learn is to ask them. How did you figure out how to do this? Who did you learn it from? How did that change your behavior? How can we make it better? How will you? Self-evaluation through reflective practice can build both metacognition and self-efficacy in individuals and groups. Too time-consuming? Not if you interview a representative sample. For example, interviewing fewer than 100 people out of 2,000 yields an answer within 10% nineteen times out of twenty -- a higher confidence level than most estimates in business. Interviewing about 150 people yields an answer within the same margin 99% of the time.
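The sampling figures quoted above can be double-checked with the standard formula for estimating a proportion in a finite population. We are assuming that this (a normal approximation with a finite-population correction) is how the figures were derived, but the sketch below does reproduce them:

```python
import math

def sample_size(population: int, margin_of_error: float, z: float, p: float = 0.5) -> int:
    """Sample size needed to estimate a proportion in a finite population.

    Normal approximation with a finite-population correction; p = 0.5 is the
    worst case, i.e. the largest sample any true proportion would require.
    """
    n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2  # infinite-population size
    return math.ceil(n0 / (1 + (n0 - 1) / population))  # finite-population correction

# "Nineteen times out of twenty" is 95% confidence (z = 1.96); 99% is z = 2.576.
print(sample_size(2000, 0.10, z=1.96))   # -> 92, fewer than 100, as claimed
print(sample_size(2000, 0.10, z=2.576))  # -> 154, roughly the 150 cited
```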
Roadmaps in Peer Learning
We have identified several basic and more elaborate patterns that describe "the Peeragogy effect". These patterns have shaped our thinking ever since. We think the central pattern is the Roadmap, which can apply at the individual level, as a personal learning plan, or at the project level. As we've indicated, sometimes people simply plan to see what happens: alternative versions of the Roadmap might be a compass, or even the ocean chart from The Hunting of the Snark. The roadmap may just be a North Star, or it may include detailed reasons "why," further exposition about the goal, indicators of progress, a section for future work, and so forth. Our initial roadmap for the project was the preliminary outline of the handbook; as the handbook approached completion at the "2.0" level, we spun off additional goals into a new roadmap for a Peeragogy Accelerator. Additional patterns flesh out the project's properties in an open "agora" of possibilities. Unlike the ocean, our map retains traces of where we've been, and what we've learned. In an effort to document these "paths in the grass," we prepared a short survey for Peeragogy project participants.
We asked people how they had participated (e.g., by signing up for access to the Social Media Classroom and mailing list, joining the Google+ Community, authoring articles, etc.) and what goals or interests motivated their participation. We asked them to describe the Peeragogy project itself in terms of its aims and to evaluate its progress over the first year of its existence. As another measure of "investment" in the project, we asked, with no strings attached, whether the respondent would consider donating to the Peeragogy project. This survey was circulated to 223 members of the Peeragogy Google+ community, as well as to the currently active members of the Peeragogy mailing list. The responses outlining the project's purpose ranged from the general -- "How to make sense of learning in our complex times?" -- to the much more specific:
Anonymous Survey Respondent 1: Push education further, providing a toolbox and techniques to self-learners. In the peeragogy.org introduction page we assume that self-learners are self-motivated, that may be right but the Handbook can also help them to stay motivated, to motivate others and to face obstacles that may erode motivation.
Considering motivation as a key factor, it is interesting to observe how various understandings of the project's aims and its flaws intersected with personal motivations. For example, one respondent (who had only participated by joining the Google+ community) was: "[Seeking] [i]nformation on how to create and engage communities of interest with a shared aim of learning." More active participants justified their participation in terms of what they get out of taking an active role, for instance:
Anonymous Survey Respondent 2: "Contributing to the project allows me to co-learn, share and co-write ideas with a colourful mix of great minds. Those ideas can be related to many fields, from communication, to technology, to psychology, to sociology, and more."
The most active participants justified their participation in terms of beliefs or a sense of mission:
Anonymous Survey Respondent 3: "Currently we are witnessing many efforts to incorporate technology as an important tool for the learning process. However, most of the initiatives are reduced to the technical aspect (apps, tools, social networks) without any theoretical or epistemological framework. Peeragogy is rooted in many theories of cooperation and leads to a deeper level of understanding about the role of technology in the learning process. I am convinced of the social nature of learning, so I participate in the project to learn and find new strategies to learn better with my students."
Or again:
Anonymous Survey Respondent 4: "I wanted to understand how peer production really works. Could we create a well-articulated system that helps people interested in peer production get their own goals accomplished, and that itself grows and learns? Peer production seems linked to learning and sharing - so I wanted to understand how that works."
Respondents also expressed criticism of the project, some implying that they felt rather powerless to make the changes that would correct its course:
Anonymous Survey Respondent 5: "Sometimes I wonder whether the project is not too much 'by education specialists for education specialists.' I have the feeling peer learning is happening anyway, and that teens are often amazingly good at it. Do they need 'learning experts' or 'books by learning experts' at all? Maybe they are the experts. Or at least, quite a few of them are."
Another respondent was more blunt:
Anonymous Survey Respondent 6: "What problems do you feel we are aiming to solve in the Peeragogy project? We seem to not be sure. How much progress did we make in the first year? Some... got stuck in theory."
But, again, it is not entirely clear how the project provides pathways for contributors to turn their frustrations into changed behavior or results. Additionally, we need to be sure that we are indeed breaking new ground with our work: if there are proven peer learning methods out there that we have not examined and included in our efforts, we need to find and address them. Peeragogy is not about reinventing the wheel. Nor is it clear whether excited new peers will find pathways to turn their excitement into shared products or processes. For example, one respondent (who had only joined the Google+ community) had not yet found a way to introduce their current, fascinating projects publicly:
Anonymous Survey Respondent 7: "I joined the Google+ community because I am interested in developing peer to peer environments for my students to learn in. We are moving towards a community-based, place-based program where we partner with community orgs like our history museum for microhistory work, our local watershed community and farmer's markets for local environmental and food issues, etc. I would love for those local efforts working with adult mentors to combine with a peer network of other HS students in some kind of cMOOC or social media network."
Responses such as this highlight our need to make ourselves available to hear about exciting new projects from interested peers, while simultaneously giving them easier avenues to share. Our work on developing a Peeragogy Accelerator, described in the next section, is an attempt to address this situation.
Summary
We can reflect back on how this feedback bears on the main sections of this book with a few more selected quotes. These motivate further refinement of our strategies for working on this project, and help build a constructively critical jumping-off point for future projects that put peeragogy into action.
How can we build strong collaboration?
"A team is not a group of people who work together. A team is a group of people who trust each other."
How can we build a more practical focus?
"The insight that the project will thrive if people are working hard on their individual problems and sharing feedback on the process seems like the key thing going forward. This feels valuable and important."
How can we connect with newcomers and oldcomers?
"I just came on board a month ago. I am designing a self-organizing learning environment (SOLE) or PLE/PLN that I hope will help enable communities of life long learners to practice digital literacies."
How can we be effective and relevant?
"I am game to also explore ways attach peeragogy to spaces where funding can flow based on real need in communities."
Conclusion
We can estimate individual learning by examining the real problems solved by the individual. It makes sense to assess the way groups solve problems in a similar way. Solving real problems often happens very slowly, with lots of practice along the way. We've learned a lot about peer learning in this project, and the assessment above gives a serious look at what we've accomplished, and at how much is left.
Subpatterns
Researching Peeragogy
Researching Peeragogy talks about how students (and others) can do research on learning.
Technologies
Technologies is relevant when figuring out whether you have the right tool for the job.
Next steps
"Usefulness" is an appropriate metric for assessment in peeragogy, where we're often concerned with devising our own problems rather than solving the problems that have been handed down by society. We use the idea of return on investment (the value of changes in behavior divided by the cost of inducing the change) to assess the Peeragogy project itself, as one example.