The Help Desk as a Community-Building Tool for Online Professional Development

Abstract

COVID-19 safety measures have forced professional development programs to pivot to online environments, which affects how participants interact and collaborate. When the University of Rhode Island hosted its annual, week-long teacher professional development event as a fully online program, the staff of the Summer Institute in Digital Literacy provided an online, real-time help desk service, knowing that some participants would benefit from targeted, individualized support. Using evidence from the help desk incident log and post-event qualitative interviews, this research deepens understanding of what teacher professional development can look like in online environments. Through personalized, real-time assistance that created a relationship between participant and staff member, those who used the Help Desk reduced their feelings of isolation, increased their sense of connectedness, and demonstrated agency as co-learners in a professional development learning experience. By providing intrapersonal, technical, and navigational support, the help desk deepened a sense of community connectedness in an online professional development program for educators who faced a dramatic pivot to online learning as a result of the COVID-19 pandemic.

Closures and physical distancing measures due to COVID-19 have shifted the way we interact, forcing many organizations to eliminate programs in teacher professional development (TPD) or move them to online platforms for the first time. In this shift, educators have faced some obstacles and adjustments. Although online learning is not a new model for digital literacy education, the COVID-19 pandemic has changed how and to what extent educators are expected to utilize online platforms for learning and community, bringing with it challenges to and opportunities for growth.

Given this backdrop, we look to understand how current research in TPD translates to fully online experiences, exploring principles of community-building to understand the affordances of online learning. Importantly, our work seeks to understand whether known, effective in-person practices can be successfully applied to online learning and professional development. This study documents a key feature of the 2020 Summer Institute in Digital Literacy (SIDL), a TPD program affected by COVID-19 restrictions. In its eighth year, SIDL was held completely online for the first time, gathering around 150 participants—mostly from the United States but including more than two dozen from 10 countries around the world. Educators, school leaders, researchers, librarians, and media literacy advocates come together annually for the week-long intensive program to learn about digital literacy, practicing skills and instructional techniques that support student learning via digital platforms (Hobbs and Coiro 2019; 2016). When pandemic restrictions emerged in March, program planners decided to combine synchronous and asynchronous learning, using a learning management system and video conference meeting rooms along with flexible scheduling.

Because of the intensive nature of the program, with its focus on hands-on media production activities and the activation of digital literacy competencies, they also decided to add an online help desk component to act as a support mechanism. The help desk would rely on a dedicated Zoom video conference room and text service (Google Voice) staffed continuously to offer hands-on, real-time support throughout the six-day, 42-hour event. By visiting the Lounge/Help Desk, participants could hang out and engage in informal dialogue but also get questions answered or receive individualized coaching.

In this study, we aim to better understand the value of the SIDL Lounge/Help Desk as a component of a teacher professional development program. We wondered whether the provision of personalized, real-time assistance, which creates a relationship between participant and staff member, could replace the “elbow-to-elbow” support that the program embodies in face-to-face learning contexts, where faculty and participants work side-by-side to create to learn (Hobbs and Coiro 2016).

Literature Review

The academic scholarship most relevant to this work focuses on the characteristics of professional learning environments that address the identity of teachers as learners and the role of help desks in community-building for both face-to-face and online learning contexts.

Teachers as co-learners

COVID-19 restrictions have required educators to adopt online teaching methods not as an option but as a necessity, and the suggestion that “what works in effective traditional learning environments may or may not work in online environments” has proven true in the forced remote learning of the 2020 pandemic (McCombs and Vakili 2005, 1582). In these unusual circumstances, teachers must “unlearn” traditional concepts in order to be receptive to new approaches that work better in online settings. While some teaching and learning habits are useful, they can also be detrimental, especially in unpredictable and unstable moments in time. Not only must educators learn new forms of social engagement, they must also “unlearn habits that have been useful in the past but may no longer be valuable to the future” (McWilliam 2008, 263).

One of the most dynamic settings where a teacher can embrace the identity of the learner is a TPD program. Ann Lieberman (1995, 592) argued for teachers to be actively involved in their own learning, noting that “the ways teachers learn may be more like the ways students learn than we have previously recognized.” When teachers actively learn from each other, they may create communities of practice where participants share, reflect on, and build new knowledge (Darling-Hammond et al. 2017; Desimone 2009).

During professional development, educators are placed in student roles, where they may enter a “troubling zone” that can also be described as discomfort; it is this discomfort that helps build the critical inner reflection leading to openness and empathy (Fasching-Varner et al. 2019).

In online learning contexts, the ability to critically reflect on the identity of the learner is crucial for the design of effective TPD (Baran, Correia, and Thompson 2011). A profound learning opportunity can be created by the temporary disequilibrium caused by switching from “expert” to “learner” (O’Mahony et al. 2019). An aggregate review of how to improve TPD for online and blended learning confirms this, stating that teachers must have “the opportunity to reflect on the roles that they ascribe to themselves and their students in (online) environments” (Philipsen et al. 2019, 1157). This empathy for learners creates critical awareness that can be used during times of acute situational adjustments, such as with COVID-19.

Help desks as spaces for online community building

Online learning creates many opportunities for communities to form. Smith (2013) notes that community is variously developed by place, interest, and communion and is built through tolerance, reciprocity, and trust. But community doesn’t make itself: Connections are made through interaction, thus enabling people to build those communities (Smith 2013).

So what does one do when interaction becomes virtual, as occurred during the COVID-19 pandemic? Coryell (2013) contextualizes collaborative and comparative inquiry in cross-cultural adult learning by framing learning as participation (partaking in knowledge) rather than learning as acquisition (possessing it). Drawing on Sfard’s (1998) work, she argues that in the learning-as-participation mode, learners recognize knowledge as an interactional journey (Coryell 2013).

Community cannot exist without shared experience, and TPD programs must activate a sense of community if they are to be successful. A sense of community informs the formation of collective identity, which “is demonstrated when group members work interdependently with a shared purpose and responsibility for collective success” (Vrieling et al. 2018, 3).

Help desks may be spaces that support collaborative learning. In the field of information science, computer help desks located in universities have been studied to understand their organizational or technical functions, with a focus on staffing, training, and other issues. Some researchers have explored how help desk activity is used to create, manage, and share knowledge (Halverson et al. 2004). But there is limited research on collective identity, participation, or co-learning in help desk scenarios. Only one study is especially relevant to our work: it examined what kind of learning takes place between those who need support and those who offer it. In that research, a consistent sequence of four phases emerged to support communication, learning, and engagement in a face-to-face help desk: introduction, knowledge establishment, conceptual change, and agency. Findings showed that these interactions (between two professionals of different expertise) activated metacognition, a type of reflection, leading to learner agency and personal fulfillment (O’Mahony et al. 2019).

With this understanding of community-building via help desks, we can consider the unique opportunities and challenges of online learning environments, including for TPD. As a result of the rise of social media, digital interaction has become normative for most people around the world. Yet for many educators, online learning has been thought to be inferior to face-to-face learning. For example, researchers who conducted a meta-analysis of various TPDs and how they affect student outcomes found that TPDs with online components yielded lower student achievement than programs that were entirely face-to-face. Yet, in that same study, several online learning practices were associated with gains, including having space to “troubleshoot and discuss implementation” of digital tools (Hill et al. 2020, 54).

To prepare teachers for online learning, online TPD may be a powerful treatment. But an understanding of the full potential of online TPD is still in development. Based on participant comments regarding collaborative and face-to-face engagement in Collins and Liang’s (2015) study of online TPD, little advancement in both the approach and implementation of these programs seems to have occurred. They report:

A number of individuals expressed they did not find OTPD as effective or meaningful as traditional face-to-face protocols…hardly anyone mentioned the online environment as engaging or encouraging participation through support or collaboration. A high number explicitly expressed that interaction was lacking … and many reported that even though they appreciated online delivery and its accessibility … they still missed the dialogue and collaboration of face-to-face PD. (Collins and Liang 2015, 28–29)

Online learning pedagogies are still primarily viewed through a prism of limitations when it comes to community-building. But scholars and practitioners are beginning to reimagine the use of technology and digital devices for collaborative learning. Bhati and Song (2019) conceptualize the creation of a dynamic learning space (DLS) in combination with mobile collaborative experiential learning (MCEL) as a means to encourage “high-level learning” and personalization. To our knowledge, approaches that extend these experiences by drawing on the collaborative value of peer-to-peer synergy, a dynamic instrumental to successful social learning, have not yet been studied.

Research Methods

The purpose of this study was to understand the role of the help desk in online TPD as a form of informal learning and community building. Because this is exploratory research, we asked: How did adult learners experience the value of an online help desk in the context of teacher professional development?

Participants and program context

The Lounge/Help Desk was fully integrated into the Summer Institute in Digital Literacy (SIDL), a six-day, 42-hour TPD program that included 135 participants and fifteen staff members. SIDL is an established program with a long history (Hobbs and Coiro 2019; 2016), but 2020 was the first time it was offered fully online. Thus, many features of the program required adaptations that were new to the event organizers, faculty, staff, and returning participants.

The SIDL Lounge/Help Desk was conceptualized as an informal gathering space where participants could go to get help but also to interact with other participants and staff. Describing the Help Desk as a lounge was an intentional design choice meant to reduce the stigma of asking for help. Participants were reminded of the Lounge/Help Desk every day: each morning of the six-day program, they received an email with links to the learning management system, where links to the Zoom video conference rooms and the Lounge/Help Desk were provided. The first and second authors were responsible for staffing the Lounge/Help Desk Zoom room, and the third author served as their supervisor.

The Lounge/Help Desk was both a synchronous and asynchronous communication channel for program participants and faculty, open to join at any time throughout event hours (9 AM–5 PM). Participants joined the Zoom Room or sent texts or emails, and these were handled throughout the day as the TPD program was in operation. Program faculty also participated in the Lounge/Help Desk, joining the online Zoom room for 1–3 hour shifts. In cases where the staff could not answer questions, one member would reach out to program organizers via a private Signal chat, which was used as a backchannel tool, in order to gain information needed to answer questions or solve problems.

As Lounge/Help Desk staff members, we gradually came to recognize that we were teachers in the TPD program and that our role was truly educational. We were not just providing a transactional service: through our interaction, we were demonstrating the depth of community building that is at the heart of the SIDL program (Hobbs and Coiro 2019). People came to the Lounge/Help Desk needing different kinds of personalized support. Some were clearly beginners in their use of technology, while others had considerable expertise. But each of these individuals was someone we had a chance to interact with and learn from; we sometimes encountered them during other components of the program, particularly in small breakout groups and informal discussions. Indeed, it was the awareness of our own experience as co-learners with the participants that inspired our interest in this research project.

Data collection and analysis

Incident Log

During the program, we logged every visit to the Help Desk in an incident log to identify each time a participant visited the Zoom room or interacted via Google Voice text messages. During the real-time TPD program, this practice helped event organizers to understand participant pain points for particular learning activities that involved digital media and technology. It also functioned to help staff contact participants when reaching out to those whose questions could not be resolved in real time. The log documented: who contacted the help desk; who assisted them; what the question or problem was; and how the resolution occurred. The incident log was not initially designed for research, as we merely imagined its function as a tool for formative assessment during the program implementation.

During the program, the Help Desk Zoom room was accessed 76 unique times by 41 different participants, and fourteen text messages were sent to Help Desk staff. In our first phase of data analysis, the first and second authors used data from the incident log to categorize our encounters with participants. We worked independently to develop categories accounting for the variety of interactions, in order to increase divergent interpretations and reduce confirmation bias. By reviewing the categories created by each researcher using simple description, we identified emerging themes such as “emotional support needed after confusion caused by new platforms,” “tech glitches,” and “wanting to be told what to do.”

Interviews

After the event, we reached out to the 41 participants who had used the Help Desk; eight agreed to participate in a research interview: one man and seven women. In terms of race and ethnicity, six participants were White, one Black, and one Latina. Seven participants were from the United States, while one was from Great Britain. Participants’ ages ranged from 40 to 65. The demographics of the interviewees closely mirror those of SIDL as a whole, where the majority of participants are White, female, and based in the United States.

Interviews were conducted over Zoom three weeks after the event and included ten scripted questions regarding each participant’s experience using the Help Desk. Participants were asked to describe what led up to their decision to access the Help Desk, the emotions they could recall before, during, and after its use, and how the experience compared to other help desk services they may have used in the past. The University’s institutional review board approved the research, and participants gave permission for audio recording.

In the second phase of data analysis, we analyzed both the transcribed interview data from the individual interviews and the incident log data collected during the TPD program. The interview data helped to more deeply contextualize the documentation in the incident log. For example, interviews suggested that areas first coded as “tech glitches” may also relate to “confusion,” and that participants who we initially perceived to be “needing to be told what to do” were navigating the social loss of community interaction.

Findings

Three themes emerged from this work that give insight into how informal learning was experienced in the context of using a Help Desk during an online teacher professional development program. Participants came with a variety of very specific questions and problems during the week-long program. Of the 76 visits to the Help Desk, many involved questions that were easy to answer, requiring only a few minutes: finding a link to a Zoom room, recalling a password, or checking the day’s agenda and schedule. These were often merely a matter of visiting a web page and clicking a link.

But some questions required an additional form of co-learning, as Help Desk staff needed to model a learning process with a participant in order to answer. Some of the questions that participants asked could not be easily answered by Help Desk staff. For example, one participant needed help learning how to edit a post on Wakelet, a digital curation tool, while another wanted a tutorial on ThingLink, a visual annotation tool. Neither staff member was familiar with these digital tools, but both were able to co-learn with participants to answer the question or solve the problem. Another participant struggled to fix the microphone on her laptop, which had suddenly stopped working. In each case, the Help Desk staff invited the participant to share their screen and used coaching that enabled the participant to solve their own problem with scaffolded support from a member of the staff. For questions that Help Desk staff could not solve on their own, they explained and modeled how they reached out for help from the larger faculty team; in those cases, staff were able to find answers within an hour or two of the request being made. Considering the nature of the help provided in the context of the participant interviews, we found that many of the Help Desk encounters created a rich interpersonal relationship between participant and staff member that functioned to reduce isolation, deepen a sense of community, and increase learner agency.

Co-learning as a journey borne of isolation

The Lounge/Help Desk reinforced the perception that the TPD program was a co-learning journey that involved the participants and the staff as collaborators. Many participants (and program faculty) were experiencing online TPD for the first time; it was a new experience for everyone.

While describing initial feelings and the scenarios leading up to accessing the Lounge/Help Desk, participants mentioned experiencing “confusion,” “nervousness,” and “anxiety.”[1] For example:

  • “Before [coming to the help desk], it was confusion and a little bit of…I wouldn’t go as far as to say panic, but close.”
  • “I was a little lost a couple of times in terms of where I was supposed to be going.”

In the TPD program, the novelty of a fully online event was made even more intense by the expectation that participants would be practicing the use of new digital tools, including the Pathwright LMS, Adobe Spark, Padlet, and many other platforms. This may have exacerbated concerns that participants naturally have in new learning scenarios: instead of being able to turn organically to the person next to them and ask questions, participants were, in that moment, alone.

Interview data clearly reveal that awareness of a sense of isolation was a precipitating factor. Participants noted feeling confused about “where” to go and when, unsure of which “Zoom room” they belonged in. At various points during the week, there was uncertainty regarding task details and/or deadlines for completion. These uncertainties are common in learning environments, and the accessibility of the Help Desk acted as a bridge in lieu of the missing opportunity to “turn to your neighbor,” thus helping participants stay involved and engaged.

Some veteran SIDL participants (attending for a second or third time) hesitated to reach out to the help desk out of concern for others, downplaying their own need for support. Feelings of demoralization and inadequacy were also referenced in the moment of realizing help was needed.

  • “[Y]ou think, ‘should I know the answer to this—is this something I can figure out myself?’ … my hesitancy was that people might need [the Help Desk] more than I did.”
  • “Everybody sort of doesn’t want to take time away from other people or you don’t want to bother people. So there’s always that, but I felt more comfortable using it after I used it the first time…”
  • “The feeling before I joined the lounge was ‘I’m “supposed” to be doing this, but I can’t.’”

One participant said that she felt much more comfortable coming to the help desk when she realized she knew one of the staff members. Clearly, such relationships and bonds can support not only successful learning but also continued community development.

Co-learning as a journey to connect

The decision to share Google Voice numbers with participants offered additional options to connect with the Lounge/Help Desk staff by calling or texting. One participant noted this as particularly helpful: as a non-native English speaker, it was easier for her to write her question. Because the help desk was continuously available during the six days of the program, it created a sense of immediacy, efficiency, and effectiveness. Participants saw how the help desk embodied the empathy of the program’s tagline, “Everyone Learns from Everyone,” a phrase that made adult learners feel welcomed as peers (Hobbs and Coiro 2019). For example, participants noted:

  • “The people there were very helpful and compassionate … about leading me through where something was and actually, one time, the assistant was confused as well. They didn’t quite know where to go. So we were learning together—how to navigate the site. So it felt like a very welcoming place.”
  • “I was very reassured. I was helped immediately; I wasn’t kept waiting … and I felt as though my concerns were being dealt with.”

Many participants had experienced help desks at their workplace or school. There, they encountered a generally asynchronous system: submit query, wait for response, hope for solution. But the SIDL Lounge/Help Desk was different. Participants who reached out for help mentioned appreciating the immediacy and liveliness of the help desk interaction. The help desk was an online “place” for congregation; after all, it doubled as The Lounge. Participants noted:

  • “Having a real person to talk to is a bonus. It’s better than either a chatbot or talking with somebody online—having somebody to actually talk to and have working through it is definitely a good thing.”
  • “When you contact a regular help desk, you feel like you’re just lost—your request is out there; you may or may not hear from anybody. That wasn’t the case here.”

During the interaction, some participants realized their initial confusion was a result of inattention. In being able to focus and talk through a concern and visualize it on a shared screen with the help desk staff, participants gained awareness of what they had overlooked. As they worked together, the missing piece of information would often be noticed by participants themselves. The sense of pleasure in solving a problem transformed the sense of isolation into a shared experience.

Co-learning as a journey toward agency

Interview subjects described the calm and confident feelings they experienced upon resolving their questions or concerns through the help desk interaction. Important to supporting this sense of agency was the ability for both staff and participants to share their screens. Screen-sharing enabled help desk staff to model the iterative process of learning to use digital platforms, and the shared experience of confronting and solving a problem together built trust and independence for the participants. For example, participants noted:

  • “I could see things that I needed to see and know that I wasn’t missing anything.”
  • “Afterward, I had very clearly seen where to go. So it was a sense of relief that now I could do that by myself.”
  • “I learned that it wasn’t as complicated as I thought it to be. And that there was more than one way to approach the issue we were having.”

Almost all interviewees noted how their own struggles aligned with what their students may experience in online learning. In fact, contrary to Collins and Liang’s (2015) suggestion that honoring the adult is part of effective PD (the idea that while learners, they are first and foremost experienced adults and professionals), we found that embracing the role of learner—complete with the requisite insecurities, needs, problems, and questions—gave participants the opportunity to deepen empathetic connections to their own learners. This is one way to understand how an online help desk can provide value to adult learners in the context of teacher professional development.

We found that three forms of support—intrapersonal, technical, and navigational—all contributed to increased participant agency as co-learners. Intrapersonal support occurred as participants entered the Lounge/Help Desk with strong feelings: the full range of emotions that manifest when something does not work as expected or when obstacles occur, varying from frustration to panic. Sometimes these feelings emerged from self-imposed expectations; in other cases, external pressures like time constraints activated strong emotion. Feelings often intertwined with informational and technical scenarios, as when a distracted or flustered participant forgets simple things like how to log in. The ability to acknowledge and validate participant concerns in real time provided an immediate sense of relief to participants—even when a solution wasn’t immediate.

Technical support included hardware, software, and online platform glitches, as well as password problems. During the week-long program, a variety of forms of basic IT support were provided, such as updating software, changing passwords, checking settings, and restarting computers. In one instance, the Help Desk assisted a participant who was experiencing prohibitive technical problems (e.g., a poor network connection) by emailing PDF copies of online content. Participants learned more about their digital devices from the transparent way in which these forms of support were modeled by staff.

Navigational support helped participants find what they needed in the learning management system, which was unfamiliar to them. Help desk staff demonstrated how to find specific information, and in the process they recognized that some of the challenges participants were experiencing were the result of errors made by program staff, including mislabeled or broken links and poorly worded instructions. Help desk visits thus enabled the TPD faculty to recognize weaknesses in their own explanations of program activities. In one instance, a set of Zoom links was presented in a red font, which caused them to be easily overlooked on a page full of text, even though the red color was intended to make them stand out visually. Help desk staff thanked participants for calling attention to the problem—but participants were equally grateful, expressing feelings of relief as they realized the problem was not “their fault.”

By supporting participants emotionally, technically, and navigationally, feelings of community emerged, because despite the lack of face-to-face encounter in this fully online TPD program, participants felt taken care of. As one experienced participant put it:

all the things that I think made Summer Institute special for me (in-person in past years) … were present this year … And the Help Desk was part of that. So the Help Desk was an even bigger part because without it, SIDL couldn’t have flowed—somebody could get lost.

Through the provision of personalized, real-time assistance, those who used the Lounge/Help Desk reduced their feelings of isolation, increased a sense of connectedness, and demonstrated agency as co-learners in an online professional development learning experience.

Discussion

Our findings provide strong support for the ability of help desks to function as vital components of online teacher professional development programs. SIDL’s Lounge/Help Desk enabled participants to move through an arc of learning-as-participation that not only supports but enhances learning. Rather than serving as a merely transactional experience, the help desk at the 2020 Summer Institute in Digital Literacy functioned as a meaningful part of the overall learning experience.

Of course, this study has several limitations: the small sample size and the potential for respondent and researcher bias must be considered, given the researchers’ own roles as staff during the TPD. We aimed to minimize this limitation by developing the initial analyses of the incident log separately, in order to increase divergent interpretations and minimize confirmation bias. We recognize that our ideas of community-building in TPD are framed through an American, Westernized cultural lens, though effort was made to review work from across the globe. The research reviewed for this study also draws mostly on abled/neurotypical interactions involving spoken or auditory communication, potentially limiting its reach and input.

This research makes a unique contribution to new knowledge by re-framing the online help desk as a novel feature of teacher professional development. Because the online help desk was available throughout the TPD, it engaged participants much as face-to-face interactions do, qualifying it as a space to troubleshoot and discuss implementation, a category found to be successful in creating student learning gains from teachers’ TPD learning (Hill et al. 2020).

Key features of the Help Desk design were critical for its use as such an informal learning space: it was called the Help Desk/Lounge, and it was designated specifically as a hangout place online, thus reducing the stigma of being perceived as a place for “people who need help.” For those educators with insecurities about their digital competencies, there was no shame associated with visiting the Help Desk. It thus connected to and strengthened the program’s core value of “Everyone Learns from Everyone” (Hobbs and Coiro 2016).

The potential to build personalized engagement is another feature needed for a help desk to be part of successful TPD. As designed and implemented, the Help Desk provided the situational context needed to question and solve problems immediately and in real time, running in parallel to the formal program. It also exposed pain points in the event and platform infrastructure, offering a form of continuous evaluation of the TPD experience and enabling event producers to make adjustments during the event itself, further enhancing the program’s overall quality. This tailored approach, so aligned with teacher needs and experiences during COVID-19, enhanced the TPD’s sense of relevance for participants, a requisite dimension of effective training (Stein et al. 2011). The Lounge/Help Desk contributed to this sense of relevance by engaging one-on-one with individuals on the emotional, technical, and navigational challenges they were likely to face as educators heading into an unparalleled 2020–21 school year. The process of engaging with a help desk that provided individualized support gave participants the opportunity to develop an understanding of possible hiccups they might encounter in their own classes, and the confidence to troubleshoot these problems themselves. This finding aligns with research that demonstrates the value of helping educators critically reflect on how they approach their work and consider their roles in the educational dynamics of learning (Baran et al. 2011).

While some researchers claim that TPD support must come “from an educational technologist or an expert within the field” (Philipsen et al. 2019, 1155), we found that a help desk intentionally staffed as a peer-supported environment was effective in modeling how to investigate problems together. In such paradigms, trust helps to bridge the implied power dynamics between the helper and the “helped.” Because the help desk staff positioned themselves as participants and partners in the process, they offered the support for collaboration so valued as a critical ingredient for teacher learning (Bates and Morgan 2018; Darling-Hammond et al. 2017). As Bates and Morgan (2018, 623) point out, “a co-learner stance” ultimately contextualizes and personalizes support, guaranteeing “that actual problems are addressed.” The question moves from an individual, isolating concern to a social learning opportunity, something Vygotsky (1978) addresses as essential to meaning-making.

If we view an online help desk as a shared learning experience with value as a programmatic feature of TPD, we will need to consider how it could be adapted in post-pandemic times, as teacher professional development returns to face-to-face contexts. The help desk offers the value of providing that “in the moment” experience for individualized grappling with and reflecting on problems, helping to meet the needs of every learner. Because the online format was new to everyone involved, including the help desk staff, the co-learning journey of finding answers offered value to faculty, staff, and participants alike. Although it was intended to provide individualized support for those experiencing technology problems, the Lounge/Help Desk actually became part of the overall TPD experience, a programmatic feature that extended the value of the Summer Institute in Digital Literacy as a genuinely collaborative learning experience.

Notes

[1] The quotations in this section come from research interviews with 2020 SIDL participants (names withheld), conducted by Salome Apkhazishvili and Serene Arena in August 2020.

Bibliography

Baran, Evrim, Ana-Paula Correia, and Ann Thompson. 2011. “Transforming Online Teaching Practice: Critical Analysis of the Literature on the Roles and Competences of Online Teachers.” Distance Education 32, no. 3: 421–439.

Bates, Celeste C., and Denise N. Morgan. 2018. “Seven Elements of Effective Professional Development.” The Reading Teacher 71, no. 5 (Mar/Apr): 623–626. https://doi.org/10.1002/trtr.1674.

Bhati, Abhishek, and Insu Song. 2019. “New Methods for Collaborative Experiential Learning to Provide Personalized Formative Assessment.” International Journal of Emerging Technologies in Learning (iJET) 14, no. 7. http://doi.org/10.3991/ijet.v14i07.9173.

Collins, Linda J., and Xin Liang. 2015. “Examining High Quality Online Teacher Professional Development: Teachers’ Voices.” International Journal of Teacher Leadership 6, no. 1 (Fall): 18–34.

Coryell, Joellen E. 2013. “Collaborative, Comparative Inquiry and Transformative Cross-Cultural Adult Learning and Teaching: A Western Educator Metanarrative and Inspiring a Global Vision.” Adult Education Quarterly 63, no. 4: 299–320.

Darling-Hammond, Linda, Maria E. Hyler, and Madelyn Gardner, with assistance from Danny Espinoza. 2017. Effective Teacher Professional Development. Palo Alto: Learning Policy Institute.

Desimone, Laura. 2009. “Improving Impact Studies of Teachers’ Professional Development: Toward Better Conceptualizations and Measures.” Educational Researcher 38, no. 3: 181–199.

Fasching-Varner, Kenneth J., Michaela P. Stone, Roberto M. Mella, Francisco O. Henriquez, and Macarena Y. Palma. 2019. “‘…4542 Miles from Home…’: Repositioning English Language Learners as Power Brokers and Teachers as Learners in the Study Abroad Context.” Education Sciences 9, no. 2: 1–13. http://dx.doi.org/10.3390/educsci9020146.

Halverson, Christine A., Thomas Erickson, and Mark S. Ackerman. 2004. “Behind the Help Desk: Evolution of a Knowledge Management System in a Large Organization.” In Proceedings of the 2004 ACM Conference on Computer Supported Cooperative Work, 304–313.

Hill, Heather C., Kathleen Lynch, Kathryn E. Gonzalez, and Cynthia Pollard. 2020. “Professional Development that Improves STEM Outcomes.” Phi Delta Kappan 101: 50–56.

Hobbs, Renee, and Julie Coiro. 2016. “Everyone Learns from Everyone: Collaborative and Interdisciplinary Professional Development in Digital Literacy.” Journal of Adolescent & Adult Literacy 59, no. 6: 623–629.

Hobbs, Renee, and Julie Coiro. 2019. “Design Features of a Professional Development Program in Digital Literacy.” Journal of Adolescent & Adult Literacy 62, no. 4: 401–409.

Lieberman, Ann. 1995. “Practices that Support Teacher Development.” Phi Delta Kappan 76, no. 8: 591–596.

McCombs, Barbara, and Donna Vakili. 2005. “E-Learner-Centered Framework for E-Learning.” Teachers College Record 107, no. 8: 1582–1600.

McWilliam, Erica. 2008. “Unlearning how to Teach.” Innovations in Education and Teaching International 45, no. 3: 263–269.

O’Mahony, Timothy, Jason Petz, Jonathan Cook, Karen Cheng, and Marco Rolandi. 2019. “The Design Help Desk: A Collaborative Approach to Design Education for Scientists and Engineers.” PLoS ONE 14, no. 5: e0212501. https://doi.org/10.1371/journal.pone.0212501.

Philipsen, Brent, Jo Tondeur, Natalie Pareja Roblin, et al. 2019. “Improving Teacher Professional Development for Online and Blended Learning: A Systematic Meta-Aggregative Review.” Educational Technology Research and Development 67: 1145–1174. https://doi.org/10.1007/s11423-019-09645-8.

Smith, Mark K. 2001, 2002, 2013. “Community.” The Encyclopedia of Pedagogy and Informal Education. https://infed.org/mobi/community/.

Stein, Sarah J., Kerry Shephard, and Irene Harris. 2011. “Conceptions of E-Learning and Professional Development for E-Learning Held by Tertiary Educators in New Zealand.” British Journal of Educational Technology 42, no. 1: 145–165. https://doi.org/10.1111/j.1467-8535.2009.00997.x.

Vrieling, Emmy, Antoine van den Beemt, and Maarten de Laat. 2019. “Facilitating Social Learning in Teacher Education: A Case Study.” Studies in Continuing Education 41, no. 1: 76–93.

Vygotsky, Lev S. 1978. Mind in Society: The Development of Higher Psychological Processes. Cambridge, Massachusetts: Harvard University Press.

About the Authors

Salome Apkhazishvili is a media and communication researcher from the country of Georgia where she coordinates the media and digital literacy program for the conflict-affected youth in the South Caucasus. She is a Fulbright communication graduate from the University of Southern Indiana. Apkhazishvili is a communications officer at the European Communication Research and Education Association Children, Youth, and Media section and a staff member of the Media Education Lab.

Serene Arena is a communication design expert focused on language use and collaborative development in communication and social systems. She has a Masters in Civic Media from Columbia College Chicago, where she studied social power dynamics and informal social spaces as foundations for community and personal identity.

Renee Hobbs is a professor of communication studies and director of the Media Education Lab at the University of Rhode Island’s Harrington School of Communication and Media. She has offered professional development to educators on four continents and authored 12 books and more than 150 scholarly publications on digital and media literacy.


Trauma-Informed Pedagogy in the Digital Media Pandemic Classroom

Abstract

After CUNY suspended in-person instruction during the COVID-19 pandemic, I started teaching a half-semester-long digital media production course. Rapidly migrating a digital media production course to remote learning creates problems specific to our software-based classrooms, as many of our students lack access to the fixed technology used in the course. As I negotiated these problems, I sought first and foremost to reduce the harm this course would cause my stressed students. To promote care, and minimize harm, I made several decisions that prioritized students’ needs and limits, without sacrificing the rigor that would prepare them for the subsequent courses in the program. These decisions included: delivering asynchronous lessons via Blackboard with flexible assignment deadlines and using two Adobe web app clones, Photopea and Designer.io, rather than Adobe software itself. This essay articulates these decisions as a trauma-informed pedagogy of care. This theoretical framework builds on feminist ethics of care, public health principles of harm reduction, and social welfare’s trauma-informed practice. This approach allowed me to destigmatize illness and late assignments, and reduce the stress that this course would have on the already traumatized lives of my students, colleague, and myself.

As the COVID-19 pandemic took hold of New York City, the City University of New York moved all of its courses online. One of my courses was a half-semester-long digital media arts course which was scheduled to begin in the middle of March, after the transition to distance learning. As I converted the course, I made several key decisions that prioritized students’ needs and limits, and minimized complexity. Because I knew many of my students wouldn’t have access to fixed technology, I used two Adobe web app clones, Photopea and Designer.io, rather than Adobe software itself. Anticipating that COVID-19 would prevent students from participating consistently, I taught the course through asynchronous lessons, with a flexible timeline. Despite Blackboard’s many problems (Lapowsky 2015), I used it because I knew my students’ other classes would use this official CUNY platform and I didn’t want them to learn anything new or to remember any extra passwords.

In all of these decisions, I sought first and foremost to reduce the harm this course would cause my stressed students. In that mid-March moment, all signs indicated that this recently-designated pandemic would get really bad, and I didn’t want this class to make it worse. I balanced two competing pedagogical principles: the imperative to make this process as easy as possible and not produce unnecessary stress for the students, for the other adjunct instructor who would be using my materials, and for myself; and the need to ensure that the students actually learned the material well enough to succeed in the courses that follow this class. Or to put it more bluntly: I tried to prioritize care and reduce harm, without sacrificing rigor. The approach seemed to work, as more students successfully completed the class than during a regular semester.

In this essay I will articulate a theoretical framework for how I was thinking about these decisions as I made them and how I have come to understand these decisions in retrospect. At the time, I framed my pedagogical choices through a feminist lens as decisions about care and harm. Additionally, my familiarity with harm reduction principles gave me a loose framework to assess my decisions, destigmatizing illness and late assignments, and reducing the stress that this course would have on the already traumatized lives of my students, colleague, and myself. In retrospect, I will frame these decisions through trauma-informed pedagogy, a practice I only recently learned of.

Care and Harm

In recent years, many differently motivated organizations, movements, and individuals have deployed or theorized discourses of care, caregiving, and self-care. Just weeks before the pandemic took hold, both Social Text and The Sociological Review published special issues on radical care (Hobart and Kneese 2020; Silver and Hall 2020). Writing in the intro to Social Text, Hi‘ilei Julia Kawehipuaakahaopulani Hobart and Tamara Kneese juxtapose these competing claims:

On the one hand, self-care is both a solution to and a symptom of the social deficits of late capitalism, evident, for example, in the way that remedies for hyperproductivity and the inevitable burnout that follows are commoditized in the form of specialized diets, therapies, gym memberships, and schedule management. On the other hand, a recent surge of academic interest in care … considers how our current political and sociotechnical moment sits at the forefront of philosophical questions about who cares, how they do it, and for what reason.

Care means something very different for childcare worker advocates and Gwyneth Paltrow’s Goop brand. Of course, COVID-19 only further exacerbated these tensions between corporate carewashing and the strain on essential care workers (Chatzidakis et al. 2020).

Care has an extended relationship to pedagogy. Nel Noddings articulated the feminist ethics of care philosophy to argue that care is a core element and value in pedagogical relationships between teachers and students (Noddings 1984). Her pedagogy of care has been very influential, especially in early childhood education, and has also been critiqued for its gender essentialism (Monchinski 2010). Others have explored the ways in which the theory would need to be transformed to be applicable to online education, with its shifts in contexts and relationships (Rose and Adams 2014).

My own engagement with care comes out of my work on Art+Feminism, an international community that strives to close the information gap about gender, feminism, and the arts on Wikipedia. Taking inspiration from Audre Lorde’s statement that “Caring for myself is not self-indulgence, it is self-preservation, and that is an act of political warfare” (Lorde 1988), we design our events to support our participant’s minds, bodies, and psyches. We do this by constructing welcoming and accessible trainings, providing food and childcare at our events, and by maintaining a friendly spaces policy (Evans, Mabey, and Mandiberg 2015). We do this because we know that activism takes physical and emotional energy and is often met with resistance. We seek to care for the participants and to reduce the potential of any harm that may come to them (Tamani et al. 2020).

My work with Art+Feminism has caused me to think a lot about how to reduce the harm that our participants experience. It may be unconventional, but in that intense moment in March, I used my admittedly surface-level understanding of harm reduction as a loose framework to assess my decisions. Originally articulated during the 1980s HIV epidemic to describe needle exchanges, harm reduction is a public health theory that eschews an abstinence-only approach to risk and disease in favor of practices that minimize negative outcomes (Des Jarlais 2017). To be clear, I am in no way equating taking a digital media course in a pandemic with opioid addiction; the concept of harm reduction can be implemented in different circumstances. While harm reduction remains most frequently discussed in terms of drug and alcohol addiction or sex education, society has widely adopted many other harm reduction strategies: seat belt laws and rest stops reduce traffic deaths endemic to automotive travel; hard hats and bicycle, motorcycle, hockey, and football helmets reduce serious brain injuries; life vests and fences around pools help prevent drowning; and sunscreen mitigates the danger of skin cancer inherent in being outdoors (UNAIDS 2017). During the pandemic, societies have encouraged social distancing, hand washing, and wearing masks to help reduce the likelihood of contracting COVID. These are forms of harm reduction: we accept that it is not possible for most people to completely abstain from interacting with other people and, for those who continue this inherently risky behavior, certain practices can help reduce the potential for physical harm (Kutscher and Greene 2020). Of course, these measures have not mitigated the pandemic’s significant mental health impact (Choi et al. 2020; Abbott 2021).

While it may be unconventional to apply harm reduction principles to the mental health impacts of the pandemic classroom, a small number of practitioners have discussed applying harm reduction principles to mental health (Krausz et al. 2014). While writing this essay I learned about social welfare’s use of trauma-informed practice to provide services that are sensitive to their clients’ traumatic histories. Trauma-informed practice is built around five principles: ensuring safety, establishing trustworthiness, maximizing choice, maximizing collaboration, and prioritizing empowerment (Fallot and Harris 2001). Educators have adapted these principles into a trauma-informed pedagogy in K–12 education (Thomas, Crosby, and Vanderhaar 2019), and more recently in post-secondary education—first in social welfare (Carello and Butler 2015), and then in other disciplines in the wake of the pandemic (Imad 2020). At their core, these practices seek to minimize the potential for retraumatization and maximize students’ emotional and cognitive safety.

While I was motivated by discourses of care and harm, trauma is probably the more precise term. I always enter the classroom with the knowledge that my students have already experienced trauma, as people living in an unequal society whose divisions, imbalances, and punishments are marked by the intersections of race, gender, and class. I knew that many of my students would be physically vulnerable, and the rest would be economically vulnerable. Most of my students hold jobs, many of them full time. I knew that most of my students either work in parts of the service sector that would be deemed essential or in retail jobs that would be laid off or furloughed. They live at home with their parents, who are similarly vulnerable.

Thus, as I redesigned the course, I centered care by reducing the potential for harm and trauma that this course might cause my already traumatized students. I knew that the COVID-19 crisis would amplify and transform a task that would have previously been a productive challenge into a debilitating barrier to completing an assignment or the course. We had to make our way through the semester amidst a public health crisis, and I wanted to make sure I removed as many barriers as possible and reduced the stress that this course would have on my students, colleagues, and myself.

Digital Foundations Online

COM 115 Introduction to Media Environments is a one-credit, 7 ½-week course that introduces students to the basics of digital media production. It is required of all students in the Communications major at the College of Staten Island and is the prerequisite for all courses in the Design and Digital Media specialization. COM 115 is the only course in our program with a standardized syllabus used across all sections and instructors. I typically teach two to three sections a year, while the other six to eight sections are taught by adjunct faculty. During the second half of the Spring 2020 semester there was only one other section, taught by an adjunct instructor.

I developed the course alongside the Digital Foundations textbook I co-authored with xtine burrough (burrough and Mandiberg 2008) and maintain the wiki version of Digital Foundations, which is kept up to date with Adobe software releases. Like Digital Foundations, COM 115 integrates historical examples and the design principles of the Bauhaus into an introduction to digital media production. For example, in COM 115 students use the Josef Albers color theory exercises in order to understand the Color Picker tool, integrating history, aesthetics, and technique in the same lesson. While the course emphasizes design principles and techniques over software training, the class does function as the “Intro to Adobe” for our department. The Adobe Creative Cloud has developed a monopoly on design and digital imaging software in the creative industries and in the classrooms of students who aspire to enter those industries. Like all monopolies, Adobe extracts a hefty price—one that has become more unavoidable since they shifted to a subscription-only model.

In March, faculty from across CUNY converged on the usually quiet Media, Arts, and Technology Discipline Council group on the Academic Commons to participate in a thread titled “Moving production courses online for COVID // the Adobe problem.” Though media arts courses are well suited for online delivery, rapidly migrating a digital media production course to remote learning creates problems specific to the tools and techniques used in our software-based classrooms. This is not just a problem for this course, or for all digital media arts instructors, but for all courses that rely on fixed technology—especially those at underfunded public institutions like CUNY, whose students have difficulty accessing fixed technology (Andre Becker, Bonadie-Joseph, and Cain 2013; Smale and Regalado 2017). As opposed to mobile technology like smartphones, fixed technology refers to desktop computers, printers, and other resources that are often only available to our students in computer labs.

My own experience piloting an online version of the class in 2012 confirmed this challenge. Because of a failure in the CUNY First registration software, very few of the students realized they were registering for an online course. Roughly half of the students dropped the class, and many of those that remained struggled to succeed because they were unable to access the necessary Adobe software. I made my decisions to ensure my students would not experience this kind of trauma as a result of my course.

Avoiding Adobe

Two side-by-side screenshots of the Photopea and Adobe Photoshop user interfaces, with the mouse cursor hovering over the Transform menu item in the Edit menu; the Transform sub menu is almost identical in both screenshots.
Figure 1. A comparison of the nearly identical Free Transform menu options in Photopea and Adobe Photoshop.

I decided not to use Adobe software, opting instead for two Adobe web app clones: Photopea and Designer.io. I made this decision in keeping with my goal to prioritize care and reduce harm, without sacrificing rigor. I can say with confidence that this was the right decision.

At the time that I made the decision to use web apps, we did not know if our department’s Mac lab or the library labs would stay open—they did not. Nor did we know whether or not students would be able to access the Adobe software on their own computers—they were, because Adobe granted a special license, but we only learned this a week after the course began. I knew that if Adobe didn’t make that special license available, most of my students would not be able to secure their own copies of the software because of its subscription model’s substantial cost. Once it was made available by the school several weeks into the course, only a small percentage of the students installed it—the majority used the web apps, including all the students who came to video office hours.

Many of our students do not have easy access to fixed technology, and those who do have access to a desktop or laptop may not be comfortable installing software, and their computers may not be powerful enough to run the resource-intensive Adobe software effectively. Because so many of our students lacked access to computing, CUNY made an emergency purchase of 25,000 Chromebooks and 25,000 Android tablets to ensure our students would be able to access online learning. Many of my students used these Chromebooks for the course. I chose to avoid the open source Inkscape and GIMP because of the difficulty of installing software, because of the uncertainty about the capabilities of the Chromebooks, and because their interfaces diverge from the Adobe software more than the two web apps do.

Two side-by-side screenshots of the Designer.io and Adobe Illustrator user interfaces, with the mouse cursor clicking on the Shape Tool, showing the tool options; the tool icon and options are similar, but not identical, and they are located in different areas of the interface.
Figure 2. A comparison of the Shape Tool in Designer.io and Adobe Illustrator.

The user interface for Photopea is almost identical to that of Photoshop, and Designer.io follows the principles of Illustrator, though its interface diverges more. In Photopea, the menu item names are almost all exactly the same, in the same place, with almost identical iconography; in Designer.io the tools are very similar, if located in slightly different places. Photopea even exports the PSD format with layers. They aren’t exactly identical: for example, Designer.io has a slightly different pathfinder tool, and intermediate tools such as unsharp mask are missing in Photopea. Neither has the kind of advanced features that the Adobe software has, but these are sufficient for an introductory class at the 100 or 200 level. Most importantly, these tools scaffold directly into using the Adobe software: the tools, menus, and concepts are that similar.

Using these web apps was the right decision. I experienced very few difficulties with students getting set up to use the software, and it worked on macOS, Windows, and Chrome OS. While neither is set up for mobile use, both officially work on tablets, though they were a bit glitchy in my testing. Most importantly, I am confident that the students who will continue on to other 200-level courses in the program will be able to seamlessly move into the Adobe software.

A caveat: I cannot predict the longevity of these two websites. I couldn’t find much information about Gravit, the for-profit Canadian company that develops Designer.io. Confusingly, Photopea has a GitHub repository for “bug reports and general discussion” without any code, but “Photopea is not fully open-source.” I don’t know what their business models are nor if they have sufficient resources to continue keeping these tools up and running.

Another caveat: both Photopea and Designer.io are freemium web apps. They include advertisements, and up-sell pitches for their premium versions. Designer.io requires you to create a login. I’m conscious of the adage that “if you’re not paying for something, you’re not the customer; you’re the product being sold” (blue_beetle 2010). Working within my principles of harm and care, I felt confident that a few more ad targeting cookies were a lesser harm than not being able to access any software.

Asynchronous instructional design and learning experiences

Knowing that the pandemic was certain to destabilize my students’ lives, and their schedules, I chose to design an asynchronous course in order to prioritize care and minimize harm. I knew that some students’ lives would be completely derailed by the pandemic, but if I structured enough flexibility into the course so students could complete the assignments when their pandemic timelines allowed, I could reduce the potential that they would fail to complete the course. I used Blackboard as the platform for asynchronous videos, pairing these with video office hours during the regularly scheduled two-hour class period. As much as I detest Blackboard, I knew that my students would be using it in their other classes, and would be most comfortable there. In the best of times, I substantially but productively challenge my students when I use Wikipedia or the CUNY Academic Commons as the course platform (Davis 2012). In these worst of times, I feared a new interface would be harmful.

Video demonstrating the Mac OS computer interface, with an image of an Egon Schiele painting in a small window at the left, and a larger Gravit Designer window with a composition of rectangles based on the painting.
Figure 3. First video demonstration: Dynamic and Static Compositions.

Using the web apps, I made video demonstrations of each of the exercises we cover from Digital Foundations. I shared these videos with my students and the adjunct instructor via a public Dropbox folder, which also includes the course syllabus and the text of each of the assignments; additionally, I posted a Study Guide in preparation for the exam. In keeping with the principle of reducing strain for everyone, including my adjunct colleague, I shared all of my assessment and communication materials via a private folder. These materials included the quizzes and exam for Blackboard, the exam text and individual images, and all emails and announcements. I decided to make the weekly quizzes for practice only, rather than grading them, in an effort to lower the stakes.

Course dynamics were starkly different from teaching this course in person or as a synchronous online course. Essentially, each student worked on their own, with almost no interaction with the others. Trauma-informed pedagogy emphasizes communication between students, collaboration, and peer support, all of which were entirely absent from this model (Imad 2020). And yet, students were able to finish the course. Of the fifteen students in the course, one dropped and one never completed any assignments, but the remaining thirteen completed the majority of the assignments and all of them passed. Of these, five came to my video office hours: one student came every week, two came every other week, and the other two popped in once. Four of the students who never came to office hours were very self-directed and highly motivated, and may have had some previous experience with digital imaging. The remaining four, who struggled and never made it to office hours, made clear that they were impacted by COVID-19.

In retrospect, I recognize that I fostered the trust and safety that trauma-informed pedagogy advocates by encouraging my students to keep their video off if they wanted, which they did. During video office hours I only saw one student’s face, briefly, when they pressed the wrong button while trying to share their screen. While some of my colleagues actively complained that they couldn’t see their students, I knew my students needed privacy. They have a right not to let their classmates and their professors see the inside of their messy bedroom, or the closet or bathroom where they retreated from their other family members to find quiet and privacy. I found I was able to build rapport with the five who came to office hours despite the absence of video; and maybe I succeeded precisely because I didn’t ask them for video.

Challenges

The course was not without challenges specific to the online format and the larger pandemic context. The main instructional challenge I faced was in demonstrating resolution. We typically do this by scanning objects: we set the resolution on the scanner and analyze the image in Photoshop. Knowing the students would not have access to scanners (I didn’t even have a scanner at home), I reframed the exercise around photographic composition, with an emphasis on printing the image at multiple resolutions so they could see the different print sizes. I should have seen this coming, but I falsely assumed that they would have access to printers, so I reframed the printing process again, this time with an emphasis on taking screenshots of the print preview interface, which shows how big the print is in relation to the paper size. There were more hiccups: the “print actual size” option is not available in the default Windows tool, something I didn’t know because I don’t have access to a Windows computer at home. I worked with one of the students to figure out a workaround, and she made a video demonstrating it. Unfortunately, we did not find a workaround for Chromebooks.
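The underlying relationship the exercise demonstrates, that the same image file prints at different physical sizes depending on the resolution it is printed at, can be sketched as a quick calculation. This is a hypothetical illustration, not part of the course materials; the pixel dimensions and resolutions are arbitrary examples:

```python
# Sketch of the pixels-vs-resolution relationship: print size in inches
# is simply pixel dimensions divided by resolution (pixels per inch).

def print_size_inches(pixels_wide, pixels_high, ppi):
    """Physical print dimensions for an image printed at a given resolution."""
    return (pixels_wide / ppi, pixels_high / ppi)

# A hypothetical 3000 x 2000 pixel photograph at three common resolutions:
for ppi in (72, 150, 300):
    w, h = print_size_inches(3000, 2000, ppi)
    print(f"At {ppi} PPI: {w:.1f} x {h:.1f} inches")
```

The same file that fills a poster at 72 PPI shrinks to a postcard at 300 PPI, which is the contrast the print preview screenshots were meant to make visible.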

The larger challenge, as expected, was that half of my students were mostly disengaged from the course. At the start of the course, I told the students that the course was self-paced, but that they should try to do one chapter a week and have the first three chapters done by the halfway mark. At the halfway mark only half of the students had completed those three chapters, and I was worried. I spent a lot of time writing to encourage them to complete the work, and relied heavily on a COVID-specific “EDUcares” team from College of Staten Island Student Affairs that succeeded in reaching the students I could not engage. EDUcares’ mandate included checking in on unresponsive students, performing a hybrid wellness check/late homework reminder. I shared with EDUcares a list of students who had not responded to my emails, and they emailed the students at their non-CUNY email addresses and/or called them at home, and in some cases on their cell phones. They got responses from all but one of the students, and all but that one student (who never responded throughout the course) completed the first three chapters shortly after. During these exchanges I learned what I had suspected: many of them did not have internet access, were without a computer until they received a CUNY loaner Chromebook, or were sharing a computer with other members of their family.

Outcomes

Certain aspects of the course (and the knowledge they produce) were simply not possible in this format. When I teach color theory in person, we spend fifteen minutes of class looking at and describing the colors of the clothes that everyone in the class is wearing. By the end of those fifteen minutes, they understand that there really is no such thing as black or white: they start to see the blues and purples in the very dark grey they previously would have called black, and the yellows and oranges in the 5 percent grey they would have called white. That simply isn’t possible to do as a group in this online format. It isn’t really possible to do it with colors on a screen, either, as these are so removed from their lived experience, and each person’s screen will have a different color profile. I tried to do it with the one student who came to office hours every week, and it took us thirty minutes of one-on-one discussion; it is very strange asking a student to describe the color of the computer they are working on and persuading them that it isn’t actually dark grey, as they claim, but rather a very low-saturation dark blue.
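The observation that an apparent “black” is really a very dark, low-saturation blue can also be checked numerically. As a hypothetical illustration (not an exercise from the course), converting a near-black RGB sample to hue/saturation/value with Python’s standard colorsys module shows a blue hue with low saturation and low value; the sample values below are invented:

```python
import colorsys

# A hypothetical pixel most people would call "black": very dark, slightly blue.
r, g, b = 28 / 255, 29 / 255, 33 / 255

# colorsys works in the 0-1 range; hue comes back as a fraction of a turn.
h, s, v = colorsys.rgb_to_hsv(r, g, b)
print(f"hue: {h * 360:.0f} deg, saturation: {s:.0%}, value: {v:.0%}")
# -> hue: 228 deg, saturation: 15%, value: 13%
# A hue near 228 degrees sits in the blue range: this "black" is a very
# dark, low-saturation blue, just as the in-person exercise reveals.
```
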

To speak more broadly, it seems very hard to do radical pedagogy online. The software is structured around the banking model of education (Freire [1970] 2000), except instead of a human instructor at the front of the classroom depositing knowledge into the students’ presumed empty minds, it is a video of the instructor. Paulo Freire would be sad to see this (Boyd 2016). When I teach this course again, I will try to work against that as much as possible. This is one place where I might have sacrificed too much rigor in favor of care and reducing harm.

On the other hand, I feel that more of the students demonstrated baseline competency in the techniques we covered. More specifically: in a typical in-person class of fifteen students, two to four students fall behind and never catch up because they always come twenty minutes late, missed the first class, couldn’t complete assignments on time, and so on. This format alleviated some of this problem. In a typical in-person section, those two to four students sit for the final exam and still fail the course, but in this format everyone who took the final passed; the only person who did not pass the course never completed a single assignment and earned a WU (Withdrew Unofficially) grade. The students who struggled did show their limits: on the final exam they performed better than the students who sit for but fail the final in a typical in-person class, but worse than the students who had steadily completed assignments throughout the course. Overall, the cohort did as well as or better than most in-person classes on the hands-on section of the final exam.

This was an emergency effort. None of these students expected to take an online class. My adjunct colleague and I never expected to teach one. Despite this strained context, my decision to prioritize care in order to minimize harm helped us all get through the semester, while preparing the students to succeed in future courses. Though our classrooms will “return to normal” at some point, or settle into whatever our new normal will be, our students will carry the trauma of this pandemic. I hope to continue this trauma-informed pedagogy of care, finding a new balance between reducing harm and maintaining rigor in what will hopefully be a less traumatic post-pandemic teaching environment.

Bibliography

Abbott, Alison. 2021. “COVID’s Mental-Health Toll: How Scientists Are Tracking a Surge in Depression.” Nature 590, no. 7845: 194–95. https://doi.org/10.1038/d41586-021-00175-z.

Becker, Danielle Andre, Ingrid Bonadie-Joseph, and Jonathan Cain. 2013. “Developing and Completing a Library Mobile Technology Survey to Create a User-Centered Mobile Presence.” Library Hi Tech 31, no. 4: 688–99. https://doi.org/10.1108/LHT-03-2013-0032.

blue_beetle. 2010. “User-Driven Discontent.” MetaFilter (blog). August 26, 2010. https://www.metafilter.com/95152/Userdriven-discontent#3256046.

Boyd, D. 2016. “What Would Paulo Freire Think of Blackboard: Critical Pedagogy in an Age of Online Learning.” The International Journal of Critical Pedagogy 7.

burrough, xtine, and Michael Mandiberg. 2008. Digital Foundations: Intro to Media Design with the Adobe Creative Suite. Berkeley: New Riders in association with AIGA Design Press.

Carello, Janice, and Lisa D. Butler. 2015. “Practicing What We Teach: Trauma-Informed Educational Practice.” Journal of Teaching in Social Work 35, no. 3: 262–78. https://doi.org/10.1080/08841233.2015.1030059.

Chatzidakis, Andreas, Jamie Hakim, Jo Littler, Catherine Rottenberg, and Lynne Segal. 2020. “From Carewashing to Radical Care: The Discursive Explosions of Care during Covid-19.” Feminist Media Studies 20, no. 6: 889–95. https://doi.org/10.1080/14680777.2020.1781435.

Choi, Kristen R., MarySue V. Heilemann, Alex Fauer, and Meredith Mead. 2020. “A Second Pandemic: Mental Health Spillover From the Novel Coronavirus (COVID-19).” Journal of the American Psychiatric Nurses Association 26, no. 4: 340–43. https://doi.org/10.1177/1078390320919803.

Davis, LiAnna. 2012. “Digital Media Professor Gives Students Real-World Experiences through Wikipedia Assignment.” Wikimedia Foundation Blog (blog). January 4, 2012. https://diff.wikimedia.org/2012/01/04/design-professor-gives-students-real-world-experiences-through-wikipedia-assignment/.

Des Jarlais, Don C. 2017. “Harm Reduction in the USA: The Research Perspective and an Archive to David Purchase.” Harm Reduction Journal 14, no. 1: 51. https://doi.org/10.1186/s12954-017-0178-6.

Evans, Siân, Jacqueline Mabey, and Michael Mandiberg. 2015. “Editing for Equality: The Outcomes of the Art+Feminism Wikipedia Edit-a-Thons.” Art Documentation: Journal of the Art Libraries Society of North America 34, no. 2: 194–203. https://doi.org/10.1086/683380.

Fallot, Roger D., and Maxine Harris. 2001. “Creating Cultures of Trauma-Informed Care: A Self-Assessment and Planning Protocol.” https://doi.org/10.13140/2.1.4843.6002.

Freire, Paulo. (1970) 2000. Pedagogy of the Oppressed. 30th anniversary ed. New York: Continuum.

Hobart, Hi‘ilei Julia Kawehipuaakahaopulani, and Tamara Kneese. 2020. “Radical Care: Survival Strategies for Uncertain Times.” Social Text 38, no. 1 (142): 1–16. https://doi.org/10.1215/01642472-7971067.

Imad, Mays. 2020. “Leveraging the Neuroscience of Now.” Inside Higher Ed (blog). June 3, 2020. https://www.insidehighered.com/advice/2020/06/03/seven-recommendations-helping-students-thrive-times-trauma.

Kutscher, Eric, and Richard E. Greene. 2020. “A Harm-Reduction Approach to Coronavirus Disease 2019 (COVID-19)—Safer Socializing.” JAMA Health Forum 1, no. 6: e200656. https://doi.org/10.1001/jamahealthforum.2020.0656.

Lapowsky, Issie. 2015. “The Reeducation of Blackboard, Everyone’s Classroom Pariah.” Wired, July 21, 2015. https://www.wired.com/2015/07/blackboard-reinvention/.

Lorde, Audre. 1988. A Burst of Light: Essays. Ithaca, New York: Firebrand Books.

Krausz, Reinhard M., Gregory R. Werker, Verena Strehlau, and Kerry Jang. 2014. “Applying Addictions Harm Reduction Lessons to Mental Healthcare.” Advances in Dual Diagnosis 7, no. 2: 73–79. https://doi.org/10.1108/ADD-01-2014-0003.

Monchinski, Tony. 2010. Education in Hope: Critical Pedagogies and the Ethic of Care. Counterpoints, v. 382. New York: Peter Lang.

Noddings, Nel. 1984. Caring, a Feminine Approach to Ethics & Moral Education. Berkeley: University of California Press.

Rose, Ellen, and Catherine A. Adams. 2014. “‘Will I Ever Connect with the Students?’ Online Teaching and the Pedagogy of Care.” Phenomenology & Practice 7, no. 2: 5–16. https://doi.org/10.7939/R3CJ8803K.

Silver, Dan, and Sarah Marie Hall, eds. 2020. “Radical Care.” Special issue. The Sociological Review, March. https://www.thesociologicalreview.com/radical-care-as-the-foundation-for-a-better-world/.

Smale, Maura A., and Mariana Regalado. 2017. Digital Technology as Affordance and Barrier in Higher Education. Cham: Palgrave Macmillan. https://doi.org/10.1007/978-3-319-48908-7.

Tamani, Melissa, Michael Mandiberg, Jacqueline Mabey, and Siân Evans. 2020. “What We Talk About When We Talk About Community.” In Wikipedia @ 20: Stories of an Incomplete Revolution, edited by Joseph M. Reagle and Jackie L. Koerner. Cambridge, Massachusetts: The MIT Press.

Thomas, M. Shelley, Shantel Crosby, and Judi Vanderhaar. 2019. “Trauma-Informed Practices in Schools Across Two Decades: An Interdisciplinary Review of Research.” Review of Research in Education 43, no. 1: 422–52. https://doi.org/10.3102/0091732X18821123.

UNAIDS. 2017. “Explaining Harm Reduction with Hard Hats, Seatbelts and Sunscreen.” UNAIDS (blog). June 23, 2017. https://www.unaids.org/en/resources/presscentre/featurestories/2017/june/20179623_harm-reduction.

About the Author

Michael Mandiberg is an interdisciplinary artist who created Print Wikipedia, edited The Social Media Reader (NYU Press), and co-founded Art+Feminism. Their work has been exhibited at Los Angeles County Museum of Art, The Whitney Museum of American Art, and Musée d’Art Moderne de la Ville de Paris, amongst others. Mandiberg is Professor of Media Culture at the College of Staten Island, CUNY and Doctoral Faculty at The Graduate Center, CUNY.


Reflecting on Reflections: Using Video in Learning Reflection to Enhance Authenticity

Emma J. Rose, University of Washington Tacoma
Jarek Sierschynski, University of Washington Tacoma
Elin A. Björling, University of Washington Tacoma

Abstract

Reflection is commonly used in the classroom to encourage students to think about and articulate what they have learned. However, when students produce reflections, they typically create a written text for the instructor, outside of the classroom, as a summative retrospective account of learning. In this paper, we present the details of how we implemented Ecological Momentary Reflection (EMR), a video-enabled reflection practice within the classroom environment that helps students assess their perceptions of self and learning across time. We recount how we implemented EMR in an informal learning environment and provide our own assessment of its effectiveness. We argue that using video makes the reflection experience more authentic and meaningful for both student and teacher.

Introduction

Reflection is commonly used in the classroom to encourage students to articulate what they have learned and to aid them in thinking about how they have learned. Traditionally, students reflect on their learning process through the act of writing. According to Yancey (1998), written reflections benefit students by helping them remember details of how they completed an assignment, by serving as a generative process that creates meaning for future writing, and by developing authority and expertise. While written reflection has its strengths, it also has inherent limitations. Written reflection is typically geared toward oneself yet is often produced as a text for the audience of the instructor, perhaps limiting the student’s authenticity.

Moreover, writing is a form of culturally constructed expression with its own peculiarities (see Chafe 1991; Chafe and Tannen 1987) that simultaneously differentiate and distance written texts from more direct or immediate forms of communication such as speech, sign, or gesture. Even though texts are a profound means of representing human thought and introspection, the process of writing a text can become an impediment to self-expression. For example, when writing skills are underdeveloped, unavailable, or stymied by other factors, writing can be limiting rather than productive. Additionally, much of the writing process relies on drafting and revising, an iterative process aimed at clarifying expression that distances the writer from the initially captured “raw” and momentary expression. At the same time, it can be argued that the strength of writing as a reflective tool lies precisely in this symbolic and temporal chasm between the individual and the experience, a distance that nurtures reflection.

Given the benefits of the reflection process, and the inherent downsides of written expression, we wanted to explore a mode of reflection that could be incorporated authentically into the context of science learning in an informal setting, in this case a summer STEM (Science, Technology, Engineering, Mathematics) camp for teens. We ground our use of the term authenticity in Buxton’s (2006) framework for authenticity in science education. He conceptualizes a youth-centered model focused on the lives and learning of underserved and marginalized youth, and thus on equity and social justice (see Medin and Bang 2014; Barron, Mertl, and Martin 2014; Barton 1998; Barton and Young 2000; Nasir and Cooks 2009). These youth-centered models view authentic science learning and knowledge as a sociocultural process situated in lived experiences, such as cultures, identities, communities, homes, and the wide range of informal environments where learning occurs. Implementing a reflection method that leverages the experience of the community and captures reflection in context was synergistic with the authentic, youth-centered model of learning at the heart of the summer STEM camp experience we were investigating.

To create an integrated and authentic notion of reflection in the learning environment, we introduced an exercise that asked students to record a series of videos. Adding video to the reflection process helped students see that their thoughts about themselves and the STEM subjects had changed. This activity also layered an additional element of shared, community-based reflection onto the learning experience. Furthermore, the video reflections provided instructors and program directors with an authentic representation of the students’ struggles and triumphs throughout the duration of the camp. These factors helped students see their own learning and helped instructors get feedback on the course to inform future improvements of the camp.

In this article, we reflect on the process of introducing a new method of reflection into a learning environment. Our aim is to introduce the concept of Ecological Momentary Reflection (EMR) and to recount its effective use in an informal STEM learning environment. We propose that adding momentary (that is, in-the-moment) video to capture real-time student reflections in the classroom provides an authentic reflective practice leading to valuable insights for both learner and instructor. First, we articulate the context of the learning environment where we implemented EMR. Second, we define reflection as a pedagogical practice, discuss how it is used in writing, and consider how video can support reflection on practice. Third, we provide details of how we implemented video reflection in the summer camp and invite educators and researchers to consider this method of reflection in their own teaching environments.

Context: Informal Learning in a STEM Summer Camp

Every summer at a Pacific Northwest university, middle and high school students come together for a summer camp focused on learning about STEM (science, technology, engineering, and math). The STEM camp’s mission is to encourage and increase diversity in STEM fields by providing informal learning experiences for students (grades 7-12) from groups that remain underrepresented in the sciences, including low-income, minority, female, and potential first-generation college students, among others. The campers who attend the STEM camp tell us they are drawn to it year after year because it is fun and “you get to do things.” Participants build robots, design video games, wade out into the muddy banks of local waterways to collect water samples, and more. It is a break from school and existing social pressures; it is a safe place. Students make new friends and deepen existing relationships as they interact with their peers, some of whom return each year.

Informal learning is a broad concept that refers to any learning that occurs outside of the formal realm of school (Dierking et al. 2003). Informal learning includes people engaging with their environment in a variety of contexts and settings. Learning experiences that are designed for broad audiences (e.g., museums and summer camps) are considered types of informal learning, both inside and outside of the STEM disciplines. In these settings, informal learning tends to be momentary, unplanned, problem-based, learner-centered, and driven by individual interests (National Research Council 2009). Many STEM summer camps can be categorized as informal learning environments in that they promote experiential learning and exist outside of the realm of formal schooling. The camp instructors include current college students and professionals such as educators or scientists from the local community. The authors of this article were involved with the STEM camp as faculty mentors to the instructors.

It is within this informal learning setting that we implemented EMR as both a pedagogical tool and a research method, aimed at enabling students to reflect on their changing identities as well as their relationship to STEM subjects. In the summer of 2015, we conducted an IRB-approved research study in which we used EMR with 9th-grade participants in the STEM camp. The students spent three weeks designing a video game using Kodu, a visual programming language. In this paper, we focus on the promise of EMR as a pedagogical tool in the classroom.

Reflection as a Pedagogical Practice

Reflection is a common pedagogical practice where students are asked to think about and articulate what they have learned. Reflection has long been viewed as synonymous with thinking and learning (Dewey 1933). Moreover, reflection is considered a core element of metacognition. Metacognition, a multifaceted term connected with reflection, refers to knowledge about, and the regulation of, cognitive processes such as self-regulated learning (Brown, Bransford, Ferrara, and Campione 1983; Flavell 1979; Zimmerman 2002). In other words, metacognition is a student’s awareness of how to learn and also an awareness of herself as a learner. Metacognition is also connected to students’ ability to transfer their learning across contexts (Bransford, Brown & Cocking 2000). In K-12 settings, both elements of metacognition, knowledge of strategies associated with specific academic tasks (such as reading, writing or math) and self-regulatory strategies (such as self-monitoring or self-evaluation) are commonly used in teaching and learning tasks. There is a rich variety of established pedagogical approaches that apply metacognitive strategies for learning. For instance, in writing, students use think-aloud and self-questioning strategies (Scardamalia, Bereiter & Steinbach 1984). In reading, students self-monitor to check for comprehension through questioning, summarizing or making predictions about a text (Palincsar & Brown 1984). In math, students can use self-assessment to evaluate their own mathematical capabilities (Schunk 1996). These strategies have been associated with increased achievement and also with higher self-efficacy (see Schunk 1996). Furthermore, reflection often occurs as an important process in the development of expertise. Looking at how expert practitioners engage in their work, Schön observes: “Reflection tends to focus interactively on the outcomes of action, the action itself, and the intuitive knowing implicit in the action.” (Schön 1982, 56).

According to Yancey (1998), reflection is both a process and a product, and the product that is created is available to the world and is therefore a social act. She states, “because it works both inside and outside, reflection-in-presentation is personal, but it’s social as well” (Yancey 1998, 94). However, in the writing classroom, a reflection tends to be a written text constructed by a student for the instructor, and it often disregards the social aspect Yancey refers to. When produced for the sole audience of the teacher, written reflections can pressure students to attempt to perform the type of writing expected by the teacher: demonstrating what they should have learned rather than reporting on what they actually learned (Jenson 2010).

Ecological Momentary Reflection

Our goal in implementing video-enabled reflection in the classroom was twofold. We wanted to see how students’ ideas about and in relation to STEM were affected by their experience in the STEM camp, but we also wanted students to see how their own perceptions and attitudes may have changed over time. We wanted the reflections to be as natural, immediate, and embedded as possible within the practices of the camp, and as close to the learning experience as possible, both in the timing of the reflections and in where they would take place. In other words, we wanted them to be momentary (i.e., quick and timely) and also ecologically valid (i.e., situated within the environment where learning takes place). The rationale for this embedded aspect of the reflections was driven both by our research focus and by past experience.

The design of our reflection method is drawn from an approach used in behavioral health, medicine, and psychology known as Ecological Momentary Assessment, or EMA (LaCaille et al. 2013). In EMA, research participants provide feedback on symptoms, feelings, or other measures in real time, and these assessments are often repeated over time. This real-time reporting is enabled by a variety of technologies, such as mobile phones. As proponents of EMA report, its strength lies in the authentic context where the research takes place and the ability to capture data as it happens (Shiffman, Stone, and Hufford 2008). Additionally, EMA has proven to be an effective method for capturing change within individuals while avoiding the “pitfalls and limitations of reliance on autobiographical memory” (Shiffman et al. 2008, 7).

In our previous experience at the STEM camp, we had limited success with interviewing students. Although we had seen the students progress in a variety of ways, their own assessment of their experience did not express an awareness of these changes. We also felt that the interview environment seemed superficial and separate from the classroom activities, likely undermining the authenticity of the students’ responses.

As a result, we designed our methodology to be informed by the concept of reflection while retaining the ecological and momentary characteristics of EMA. Because we use video to capture student reflections in the moment, we named this method Ecological Momentary Reflection (EMR).

Implementing EMR in the Classroom

In the summer of 2015, we worked closely with the 9th-grade cohort of the STEM camp program and their instructors to implement the EMR method. We explained to students that they were creating the videos for themselves but also for each other, as a way to reflect on their learning and to capture their experiences at summer camp. Therefore, students were aware there was a larger audience for the reflections. Students were also told that highlights of the reflections would be compiled and that they would watch this compilation together on the last day of the camp. They were given digital copies of their personal reflections, their group videos, and the final edited compilation to take home as a keepsake from their camp experience.

Students created three video reflections during the three-week summer camp: an introduction, a mid-point reflection, and a final reflection. For the first reflection, students created an introductory video: they were asked to introduce themselves, talk about their hobbies or interests, and reflect on how they felt about STEM and about themselves. For the second video reflection, students were given two photographs of themselves from previous days at camp that captured them engaging in one of the main STEM camp practices, in this case coding a video game on a computer. Students were asked to reflect on what they were doing in the photo, how they felt looking at themselves, and what the photographs reflected about them as individuals. For their final reflection, which took place at the beginning of the third week of the camp, the procedure was slightly different: students watched the previous two reflection videos and were then asked to respond via video to the experience of watching themselves and to how they had changed over the course of the summer camp. The specific wording of the prompts is shown in Table 1 below.

Table 1: Topics, timing, and prompts for the three video reflections.

Video reflection 1: Introduction (Day 1)

1. Who are you: What do you like to do? What makes you special?

2. You and technology: Do you think of yourself as a technical or computer person? Why or why not? Do you think other people in your life (friends or family) see you as a technical or computer person? Why or why not?

3. Complete this sentence: “By the end of STEM camp this summer I expect…”

Video reflection 2: Photo reflection (Day 8)

During one of the early days when students start coding, they are photographed while they are working. The photographs serve as part of the prompt:

1. How would you describe what you are doing in the photographs? How does this fit into the rest of your life?

2. Can you talk a little about what you feel and think when you look at these photographs?

3. What do these photos reflect about who you are?

4. What can somebody looking at these photos learn about you?

5. CHALLENGE: Come up with your own prompt (question for self) related to the photos and try to answer it.

Video reflection 3: Wrap-up (Day 14)

After watching the video of yourself from the start of the program, answer these questions:

1. What do you think after watching that video?

2. Do you see yourself any differently from when you started STEM camp?

3. Have you learned anything new about yourself?

4. What were the best and worst parts of STEM camp?

5. What surprised you about this experience?

6. Please complete the following sentence: “After participating in STEM camp this year, I feel that I…”

To create an appropriate space for the video reflections, we used a small, quiet, private room just outside of the main classroom where students were spending their days. The room was equipped with a GoPro Hero 4 camera, and students could move or adjust the camera based on their comfort level (Figure 1 shows the room setup).

Figure 1: An image of a small room with two empty chairs and a table. On the table is a video camera on a tripod, and a list of questions that contain the prompts for the video.

Figure 1: Video reflection room set up with camera and prompts.

 

Our motivation for creating a private space adjacent to the classroom was to give students a place to quietly reflect while still being close to the camp setting. Giving the students a private space, but one still connected in time and space to the learning environment, maintained the ecological soundness of this method. Figure 2 shows a still photo from a student’s video reflection, showing the setting where students made their recordings.

Figure 2: An image of a student sitting in front of a camera with hands clasped together in front of her face.

Figure 2: A still from a student’s video reflection.

 

In addition, we had anticipated, and hoped, that this mode of reflecting (speaking to a video camera) might emulate current, culturally familiar practices. We took our inspiration from the many examples of young people posting reflections or product reviews on YouTube from their bedrooms. We often referred to the small room where the videos were being made as our “reality show confession booth.” This idea seemed to resonate with the students, and they seemed very comfortable expressing themselves in front of the camera.

Assessing EMR

To retrospectively assess how EMR worked within this setting, our team applied thematic analysis (Guest et al. 2011) to the following qualitative data: (1) student video reflections, (2) field notes, memos, and reflections from the research team, and (3) data from personal interviews with the two instructors of the camp. The qualitative data were reviewed, coded, and discussed by the research team to uncover common themes. These themes were then discussed in relation to the researchers’ experiences of using standard textual reflections.

Theme 1: Initial Reticence, Overall Enthusiasm

During the creation of the first video reflection in week one, some students mentioned that they felt a little awkward creating the video diaries. In the last video reflection, by contrast, students commented that they looked awkward in the first video or remembered feeling awkward at the time. Despite this initial reticence about the first recording, students described how much more comfortable they were recording their last reflection, and most were, in retrospect, overwhelmingly enthusiastic about the experience of having made the videos. They mentioned how they enjoyed using the cameras, interviewing one another, and taking the cameras on their field trips. This enthusiasm was palpable when the students were shown the highlight compilation of their summer experience.

“Anyway, I really liked that video. I’m feeling good because it’s kinda like the whole three weeks packed into one little video and it kinda shows my progress, like what I thought before and what I think now. And it’s kinda different to think that like in the beginning, I didn’t think I could do it, and I know how to get it done.” – P6

As one instructor noted, “we even asked them if they liked [it]… and they all responded with an overwhelming ‘YESSSS!!’” (Solis-Bruno 2015).

The instructors in the program, while supportive of EMR, had a variety of questions concerning the feasibility of this technology. Nevertheless, they helped to creatively embed the video reflections into the environment and the curriculum. They were also helpful in communicating the purpose of the videos in a student-centered way by referring to the project as “the documentary.” Like the students, the instructors grew to value this novel method. By the end of the program, they reflected that they felt EMR was highly valuable for the students, and both said they would incorporate video reflection in this way in future classes (Solis-Bruno 2015; Jordan 2015).

Theme 2: Seeing Themselves

In the previous year of the STEM summer camp, we had asked students in a series of semi-structured interviews to think about how they had changed over the three-week experience. Overall, students in previous years did not report seeing much change in themselves over the short camp experience. In contrast, students who used EMR were able to visibly see themselves in retrospect and comment on the changes in their learning. They compared their feelings about technology, coding, and engineering over the three-week period and were able to see for themselves how their thoughts had changed. These “a-ha” moments were most visible during the second and third video reflections, when students were looking at pictures of themselves or looking back at their previous videos.

During the second video reflection, we gave students photographs of themselves at work during the camp. Figure 3 shows an example of a photograph that was used as a prompt. In this video reflection in particular, students expressed surprise and amazement at seeing themselves from this outside perspective.

Figure 3: A photograph of a student sitting at a computer designing a video game, used as a prompt in the second video reflection.


Many said they had never seen themselves in this way before. They described seeing themselves as focused or that they looked like someone who was programming. Many mentioned that their families would be surprised to see the person they saw in this picture: someone who was focused and working hard.

“I think that I look determined. I feel- I feel pretty good with um the fact that I can do this … like knowing that I can do this kind of stuff, that’s cool.” – P2

This reflection in particular points to a strength of the EMR method: when combined with photographs, it gives students an external, third-person view of themselves.

In the third and last video reflection, many students had revelations and moments of surprise as they looked back on their previous videos. Several provided very clear and impassioned reflections on how this experience had fundamentally changed the way they saw themselves in relation to STEM topics. One student said that before the camp she had seen herself as an “artsy” person, and now she saw she was equally strong in areas like engineering. Given the goals of this STEM summer camp, this student’s shift is encouraging.

“After watching the video that I made [I] felt really confident in myself and I felt like …[I’m] doing what I’m supposed to in MSL.” – P3

“I see myself as more of a techy person I guess I… I realized that I really like technology and I really enjoy programming these games that we’ve been doing.” – P1

Several students noted during their reflections that they had been struggling with some aspects of the coding tasks in the camp. These struggles were temporary frustrations, only moments in time: all of the students completed a working, playable video game during their time in the camp. Watching themselves talk about these struggles in the video reflections allowed students to see how they had overcome them, and thus to speak about their own resilience. Seeing that they could and did overcome challenges helps students recognize that, with hard work and by asking for help, they can succeed, which supports the development of a growth mindset (Dweck 2006) and grit (Duckworth et al. 2007).

Theme 3: Broadening the Notion of Audience

In contrast to other types of classroom reflection, which tend to be focused solely on writing, the addition of video helped to broaden the notion of audience. The expectation and format of the videos implied an audience broader than just the student and instructor. This expectation had been communicated as part of the video project, and it was clear that students were thinking broadly about audience. Students mentioned their family members in the reflections and also used the pronoun “you” to invoke the audience of ‘us’ (the faculty advisors, instructors, and their peers in the course). One student’s final reflection seemed like a dedication to his peers, as he proclaimed “how cool you guys are.” According to the instructors, some students wanted an even broader audience and were disappointed that the compilation video was not played at the end-of-camp celebration for their family and friends (Jordan 2015). Evidently, they were proud not just of the products of the summer camp, but also of the process of discussing their learning through the video diaries.

Theme 4: Logistics, Implementation and Technical Challenges

When introducing any new pedagogical method that incorporates technology in the classroom, there is much to learn for future iterations. We learned a great deal about implementing EMR and about areas for improvement, both for the STEM summer camp context and beyond. One of our concerns at the beginning of the study was that students entering and exiting the main classroom to record the video diaries would disrupt the learning environment. However, the instructors stated that they did not find the activity disruptive and that, from their perspective, the process of making and viewing the videos was highly valuable for the students (Jordan 2015; Solis-Bruno 2015). The lack of disruption could be attributed to the nature of the informal learning environment, which can be less structured than formal school-based settings. Nevertheless, we assert that the EMR method would complement any learning environment that is project- or inquiry-based.

One technical challenge we encountered in the study was audio quality. We used GoPro Hero 4 cameras, and while their video quality is very high, the audio quality was not; with that camera in particular, an external microphone would greatly improve the audio. Battery life and the size of the video recordings were also limiting factors, as was the storage of video files: it is important to set up an appropriate technical infrastructure in which video files are securely stored yet still accessible to students and instructors.

The Promise of EMR

As we reflect on our experience with EMR, we turn to its strengths and its promise as an authentic reflection tool for augmenting and making visible the learning that occurs in informal and formal settings. We assumed that EMR would be congruent with teens’ “selfie” culture. While students were at first reticent to film themselves, they grew more comfortable over time, especially in the impromptu videos such as those made on the field trips. Given its basis in Ecological Momentary Assessment, EMR is a fitting and even attractive tool for student engagement: it captures learning both in the environment where it takes place and at the moment it happens. In this way, EMR captures a fleeting moment of the student’s experience and enables reflection on that otherwise inaccessible moment, allowing students to witness their thinking across time.

EMR appears to be an effective tool for student reflection. The strong theme of Seeing Themselves supports the use of EMR as an effective reflective process in a learning environment, as it allows students to see themselves from the perspective of an outside observer. EMR also overcomes some of the limitations of written reflection, which can lead students to perform in an academic manner and to conceptualize the teacher as the sole audience for the reflection.

One of EMR’s strengths is broadening the audience of reflection beyond the idea of a reflection produced by one student for one teacher. Requiring students to produce reflections not only for themselves but also for their peers strengthens the learning environment. The majority of students were interested in keeping their videos, as well as the photographs from the prompts, as keepsakes. Consequently, using student-centered technology, methods, and artifacts as tools for thinking not only provides students with more meaningful learning experiences, but also promotes recurring and persistent practices of reflection.

The benefits of EMR as a technology far outweigh its drawbacks. It leverages a technology that is familiar, yet novel or unexpected in a classroom setting, and the video camera engages students in a way that is low risk yet high reward. While video presents its own challenges, so does written reflection, such as students’ varying literacy skills. We conclude that the technology is congruent with tools and technologies that many adolescents are already familiar and comfortable with.

In conclusion, the common themes that emerged from our data highlight how using EMR in the classroom can support authentic reflection that enhances students’ learning experience and educators’ assessment of student learning and the learning environment. EMR departs from static written reflections and instead provides students a way to see and reflect on their own thinking and learning as it is happening.

Thus, EMR is a promising method for reflection in any complex learning environment: it captures real-time learning, maintains ecological validity, and allows for authentic and powerful reflection. We highly encourage others to explore this approach in their classrooms.

Bibliography

Barron, Brigid, Véronique Mertl, and Caitlin K. Martin. “Appropriating the process: Creative production within informal interactions and across settings.” B. Barron, K. Gomez, N. Pinkard, & CK Martin, The Digital Youth Network: Cultivating new media citizenship in urban communities (2014): 167-190.

Barton, Angela Calabrese. “Teaching science with homeless children: Pedagogy, representation, and identity.” Journal of Research in Science Teaching 35, no. 4 (1998): 379-394.

Barton, Angela Calabrese, and Kimberley Yang. “The culture of power and science education: Learning from Miguel.” Journal of Research in Science Teaching 37, no. 8 (2000): 871-889.

Bransford, John D., Ann L. Brown, and Rodney R. Cocking. How people learn: Brain, mind, experience, and school. National Academy Press, 1999.

Brown, A. L., Bransford, J. D., Ferrara, R. A., & Campione, J. C. “Learning, remembering, and understanding.” In J. H. Flavell & E. M. Markman (Eds.), Handbook of child psychology, Vol. 3: Cognitive development (1983): 77-166.

Buxton, Cory A. “Creating contextually authentic science in a “low‐performing” urban elementary school.” Journal of Research in Science Teaching 43, no. 7 (2006): 695-721.

Chafe, Wallace L. “Sources of Difficulty in the Processing of Written Language,” Center for the Learning and Teaching of Literature, University at Albany, State University of New York (1990).

Chafe, Wallace, and Deborah Tannen. “The relation between written and spoken language,” Annual Review of Anthropology (1987): 383-407.

Dewey, John. How we think: A restatement of the relation of reflective thinking to the educative process. Lexington, MA: Heath, 1933.

Dierking, Lynn D., John H. Falk, Léonie Rennie, David Anderson, and Kirsten Ellenbogen. “Policy statement of the ‘informal science education’ ad hoc committee.” Journal of Research in Science Teaching 40, no. 2 (2003): 108-111.

Duckworth, Angela L., Christopher Peterson, Michael D. Matthews, and Dennis R. Kelly. “Grit: perseverance and passion for long-term goals,” Journal of personality and social psychology 92, no. 6 (2007): 1087.

Dweck, Carol. Mindset: The New Psychology of Success. New York: Ballantine Books, 2006.

Guest, Greg, Kathleen M. MacQueen, and Emily E. Namey. Applied thematic analysis. Thousand Oaks, California: Sage, 2011.

Jensen, Kyle. “The Panoptic Portfolio: Reassessing Power in Process-Oriented Writing Instruction.” JAC (2010): 95-141.

Jordan, Stephanie. E-mail message to authors, October 8, 2015.

LaCaille, Lara, Anna Maria Patino-Fernandez, Jane Monaco, Ding Ding, C Renn Upchurch Sweeney, Colin D Butler, Colin L Soskolne, et al. “Ecological Momentary Assessment.” In Encyclopedia of Behavioral Medicine, New York, NY: Springer New York. (2013): 647–48.

Medin, Douglas L., and Megan Bang. Who’s asking?: Native science, western science, and science education. MIT Press, 2014.

Nasir, Na’ilah Suad, and Jamal Cooks. “Becoming a hurdler: How learning settings afford identities.” Anthropology & Education Quarterly 40, no. 1 (2009): 41-61.

National Research Council. Learning Science in Informal Environments: People, Places, and Pursuits. Committee on Learning Science in Informal Environments. Philip Bell, Bruce Lewenstein, Andrew W. Shouse, and Michael A. Feder, Editors. Board on Science Education, Center for Education. Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press. 2009.

Palincsar, Annemarie Sullivan, and Ann L. Brown. “Reciprocal teaching of comprehension-fostering and comprehension-monitoring activities.” Cognition and Instruction 1, no. 2 (1984): 117-175.

Scardamalia, Marlene, Carl Bereiter, and Rosanne Steinbach. “Teachability of reflective processes in written composition.” Cognitive science 8, no. 2 (1984): 173-190.

Schön, Donald A. The reflective practitioner: How professionals think in action. Vol. 5126. New York, Basic books, 1983.

Schunk, Dale H. “Goal and self-evaluative influences during children’s cognitive skill learning.” American Educational Research Journal 33, no. 2 (1996): 359-382.

Shiffman, Saul, Arthur A. Stone, and Michael R. Hufford. “Ecological momentary assessment.” Annual Review of Clinical Psychology 4 (2008): 1-32.

Solis-Bruno, Luis. E-mail message to authors, October 9, 2015.

Veenman, Marcel VJ, Bernadette HAM Van Hout-Wolters, and Peter Afflerbach. “Metacognition and learning: Conceptual and methodological considerations.” Metacognition and learning 1, no. 1 (2006): 3-14.

Yancey, Kathleen Blake. Reflection in the Writing Classroom. Utah State University Press, Book 120, 1998.

Acknowledgments

We would like to express our deep gratitude to Amanda Figueroa and DJ Crisostomo in the Student Transition Programs at the University of Washington Tacoma for their leadership and for making the MSL program such a transformative learning experience for the students of our community. We also wish to thank Luis Solis-Bruno and Stephanie Jordan, the instructors of the 9th grade cohort of MSL in 2015, who were so welcoming to us and embraced the idea of using videos. In addition, this work would not have been possible without the amazing teens in the MSL program who shared their experiences with us through their video reflections. Finally, we would like to thank special issue editors Tyler Fox and Carlos Hernandez for bringing this special issue to fruition.

About the Authors

Emma J. Rose, Ph.D. is an assistant professor in the School of Interdisciplinary Arts & Sciences at University of Washington Tacoma. Her research is motivated by a commitment to social justice and a belief that the way technologies are designed ultimately shapes our world. Her research interests include the practice of user experience, how people use expertise to overcome resource constraints, and the development of technical identity. She tweets @emmarosephd.

Jarek Sierschynski, Ph.D. is a learning scientist and assistant professor in Education at University of Washington Tacoma. His work examines definitions of STEM, scientific practices and technology integration by focusing on complexities inherent in cultural tools used by historically marginalized communities. Recently, he has been investigating how students think about their identities in relation to science and technology. His current project involves the design of an informal learning environment in which technology serves youths as an identity, cultural and scientific resource.

Elin A. Björling, Ph.D. holds both a professional research scientist position for the Office of Research and a clinical faculty position in the school of Nursing and Healthcare Leadership at University of Washington Tacoma. Over the past two decades, Elin has studied adolescent health utilizing mixed-methods in community based project designs. Her recent research has focused primarily on using an Ecological Momentary Assessment approach to study stress in adolescents. She tweets @elinbjorling.


Interactive Technology for More Critical Service-Learning?

Possibilities for Mentorship and Collaboration within an Online Platform for International Volunteering

Willy Oppenheim, Omprakash
Joe O’Shea, Florida State University
Steve Sclar, Omprakash EdGE

Abstract

International service-learning programs have rapidly expanded in higher education in recent years, but there has been little examination of the potential uses of interactive technology as a pedagogical tool within such programs. This paper explores a case study of an interactive digital platform intended to add more reflexivity and critical rigor to the learning that happens within international service-learning programs at colleges and universities. The digital platform under consideration, Omprakash EdGE (www.omprakash.org/edge), facilitates collaboration between students, international grassroots social impact organizations, and a team of mentors that supports students before, during, and after their international experiences. The authors represent both sides of a collaboration between Omprakash EdGE and a program at Florida State University which works to help students find affordable, ethical, and educational opportunities for international engagement. The paper begins with an overview of the troubled landscape of international service-learning within higher education, and an explanation of the authors’ rationale for collaborating to develop a new program model revolving around a digital platform. Then it discusses the ways in which the authors have sought to cultivate international learning experiences that are dialogical, reflexive, personal, and experiential, and it explains how a digital platform has been central to this effort by enabling students to build relationships with host organizations, engage in pre-departure training, and receive support from mentors. It then explains some of the challenges and successes the authors have encountered in their collaboration thus far, and concludes with reflections on the pedagogical constraints and possibilities for interactive technology within programs aiming to generate critical consciousness through international engagement.

I. Introduction

Within the broader trend of internationalization sweeping through colleges, universities, and even some high schools in the United States and elsewhere (Gacel-Avila 2005; Harris 2008), the phenomenon of international service-learning raises a number of interesting pedagogical and programmatic questions. As educational institutions in resource-rich countries (the so-called “Global North”) increasingly endorse opportunities for students to travel to resource-poor countries (the so-called “Global South” or “developing world”) to volunteer or intern in settings that include schools, clinics, orphanages, and community centers, what forms of student learning are they hoping to promote, and how do they assume that this learning actually unfolds? What ethical and pedagogical principles, if any, inform the design and implementation of international service-learning programs?

It is well-established that young people are leaving their home countries to volunteer abroad at an unprecedented rate (Dolnicar and Randle 2007; Hartman et al. 2012; McBride and Lough 2010, 196; Ouma and Dimaras 2013). Some aspects of this trend are not new: its roots reach back at least as far as the founding of the United States Peace Corps in 1961 and the United Kingdom’s Voluntary Service Organisation (VSO) in 1958, and are entwined with older trends of faith-based international mission work. Yet regardless of these various historical precedents, researchers agree that the trend has spiked dramatically in recent decades, spurred on by both government programs and a huge range of program offerings in the private sector (Rieffel and Zalud 2006; Leigh 2011, 29). Recent data suggest that over 350,000 individuals aged 16–25 engage in some form of international volunteering each year (Jones 2005). Within the United States, tens of thousands of young people per annum volunteer through non-profit organizations such as churches and charities and through for-profit companies that chaperone group volunteer trips or “place” volunteers with foreign “community partners” (Rieffel and Zalud 2006). Recent reports estimate the value of this emergent “voluntourism” industry at anywhere from $150 million to over $1 billion per annum (Mintel 2008; Stein 2012).[1] Meanwhile, whether under the banner of creating global citizens, preparing students to compete in a global knowledge economy, or fostering intercultural competence, colleges and universities are seeking new partnerships, developing new programs, and mobilizing new discourses that all celebrate the value of immersive, non-traditional educational experiences in international settings. Within this context, programs that revolve around international service-learning have become increasingly popular, and such programs have been the subject of a considerable amount of recent academic research (e.g. Crabtree 2008; Green and Johnson 2014; Hartman et al. 2014).

Against this backdrop, academics and mainstream media outlets alike have recently put forth well-justified criticism of international volunteering (e.g. Biddle 2014; Hickel 2013; Zakaria 2014). Some authors (e.g. Ausland 2010) have usefully delineated between various forms of this phenomenon within and beyond universities—distinguishing, for example, between mission trips, slum tourism, middleman companies that “place” individual volunteers, and faculty-led group service trips. Many argue that the practice of sending untrained, unskilled young people into sensitive foreign contexts on short trips for the purpose of “serving” is a paternalistic impulse that smells of neocolonialism (e.g. Crossley 2012; Simpson 2004). At the same time, a growing body of peer-reviewed research has argued for the socially and personally transformative potential of student volunteering through university service-learning programs, especially when those programs take an explicitly critical stance and explicitly orient themselves towards the pursuit of social change (e.g. Crabtree 2008, 2013; Hartman and Kiely 2013; Mitchell 2008).

The authors of this paper represent a collaboration between the director of Florida State University’s Center for Undergraduate Research and Academic Engagement and the directors of Omprakash EdGE, a web platform that connects prospective volunteers with autonomous grassroots social impact organizations and provides intensive volunteer training and mentorship via an online classroom. We share many of the same concerns and hopes described above, but our aim here is not to restate the common refrain that “good intentions are not enough,” nor to offer another aspirational but abstract vision of what service-learning programs “should” achieve. Instead, we identify three common characteristics of service-learning programs that we find to be deeply troubling, and then explain our ongoing attempt to confront and improve upon these programmatic features via an innovative model that revolves around an interactive digital platform. By sharing the case study of our own experience, we aim to raise new questions about the educative capacity of interactive technology within the sphere of international service-learning, and to generate further debate and collaboration in this direction.

Our collaboration grew out of a shared concern that many, if not most, organizations in the business of selling or facilitating volunteer opportunities meet one or more of the following three conditions: 1) they act as a middleman; that is, they “place” volunteers with organizations or in communities from which they are distinctly separate; 2) they charge high fees for this service and more or less guarantee a placement to those who pay these fees; and 3) they promote their work by insisting that a) volunteers will be “making a difference” regardless of their background or qualifications, b) even a little bit of help is “better than nothing,” and therefore c) no significant pre-departure training or preparation is necessary (see Ausland 2010; Citrin 2011; Hartman et al. 2012). We contend that the convergence of these common program features is deleterious to student-volunteers and the organizations they purport to serve.

This paper centers on our attempt to develop an interactive digital platform that enables alternatives to these trends, and its central question is whether this model is indeed a viable one. Circling around this question are many others: If international service-learning is inherently a distance-based and loosely-defined educational experience, then how do we track learning, and what can be the role of technology in this tracking? How can a digital platform be used to remediate many of the broader problems of service-learning and ‘voluntourism’? How can an interactive digital classroom and mixed-media curricula be integrated toward that end? What role can a trained mentorship team play in facilitating learning before, during, and after students’ international trips? And most crucially, in a world characterized by stark inequalities, is it possible to use an interactive digital platform as a vehicle for critical pedagogy that sparks social and personal transformations?

In what follows, we attempt to answer these questions by sharing data and reflections from our own experience. We begin by elaborating our guiding pedagogical principles and then describing the online volunteer-matching platform, classroom, and mentorship system that are central to our program. Then we offer qualitative and quantitative data to illustrate some of the challenges and successes we have encountered thus far. We conclude by reflecting on the possibilities for interactive technology as an avenue towards more critical and transformative service-learning.

II. Programmatic Origins and Pedagogical Principles

Founded in 2004, Omprakash is an interactive digital platform that enables vetted international partner organizations to build profiles, post positions, and recruit volunteers. Prospective volunteers search and apply for positions posted by Omprakash partners, and partners have full autonomy to determine when and if they offer a particular position to a particular applicant. Volunteers pay for their own travel and in-country living expenses, but pay no program fee to Omprakash in exchange for the connective services offered by the Omprakash platform. In early 2012, Omprakash administrators launched Omprakash EdGE (Education through Global Engagement) as an attempt to actively confront the most problematic aspects of the service-learning industry described above: namely, that volunteers are often provided with little to no pre-departure training and mentorship, and that the learning half of “service-learning” is often a disconcerting grey area. The EdGE program couples volunteer trips with a 12-week pre-departure online classroom, a dedicated mentor, and a required field-based inquiry that culminates in a Capstone Project documenting local perspectives about the social issue(s) that the volunteer’s host organization is working to confront. The program is tuition-based, and one of the motivations for its design was to create sustainable, not-for-profit revenue to support the broader Omprakash platform. Omprakash sought university collaborators for the pilot year and found a strong partner in Florida State University (FSU).

In the fall of 2012, Omprakash and Florida State University’s Center for Undergraduate Research and Academic Engagement (CRE) partnered to create an FSU Global Scholars program (http://cre.fsu.edu/Students/Global-Scholars-Program) that would offer a combination of online training and immersive international volunteer opportunities to several dozen FSU students per year. A particular focus of the program is to recruit participants who are from low-income backgrounds and are first-generation college students, as this population is often underrepresented in these types of experiences and stands to benefit greatly (Finley and McNair 2013). In the first iteration (’12-’13 academic year), 37 students were selected to be Global Scholars by CRE administrators. These students each participated in the EdGE online classroom during the spring semester and then spent at least two months during the summer with one of Omprakash’s international partner organizations. In the second iteration (’13-’14 academic year), 28 students participated, and the online classroom was complemented with weekly in-person meetings among the Global Scholars on the FSU campus. At the time of writing, we are in the midst of the third iteration of the EdGE/Global Scholars collaboration, with 49 students involved.

Our work together has revolved around four pedagogical principles. First, we believe that the learning in international service-learning should be dialogical: learning should emerge via interactions with others and exploration of different perspectives, and various “truths” should be uncovered and interrogated in an ongoing process of exploration, rather than received as static “facts.” Second, learning should be reflexive: it should encourage students to reflect on their own positionality, to recognize and share their own biases, and to understand the process of learning about others as inextricable from a process of learning about selfhood and subjectivity. Third, learning should be personal, meaning that it should emerge through human relationships characterized by empathy, camaraderie, compassion, and humor. Finally, learning should be experiential, meaning that it should be grounded in empirical inquiry and exploration, and that students’ international experiences should recursively inform each other’s ongoing learning.

We make no claim to the originality of these guiding principles—indeed, we readily acknowledge the extent to which our own work has drawn inspiration from the broader trends of constructivist epistemology and critical pedagogy, in particular the work of Paulo Freire (1970). Yet our unique challenge has been to apply these principles to the creation of interactive technology intended to facilitate and support international engagement. The next section provides further details about why and how we have attempted to do so.

III. The Omprakash EdGE Digital Platform: Rationale, Functionalities, and Possibilities

Rationale for Using Interactive Technology

The prominent role of interactive technology within the EdGE/Global Scholars program is a response to several key contextual points. The first contextual point is one of geography and logistics: a digital platform is the most obvious solution to the parallel challenges of enabling students to connect directly with potential host organizations around the world, and also enabling students to maintain contact with each other and to maintain some semblance of intellectual continuity before, during, and after their field positions. Likewise, the chronological flexibility of digital learning means that a wider range of students can find ways to integrate the EdGE pre-departure curriculum into busy schedules.

The second contextual point concerns program costs and accessibility to a diverse group of students: in contrast to chaperoned “voluntourism” trips, the Omprakash digital platform can operate at scale for relatively minimal overhead costs, and thus Omprakash experiences are financially accessible to students whose less-privileged backgrounds might render them unable to afford more expensive “voluntourism” trips.[2] The key difference is that Omprakash does not spend administrative resources on placing volunteers or chaperoning trips; instead, its digital platform allows individuals and organizations to connect organically and arrange their own plans via direct communication. Omprakash invests time and resources into the initial vetting of its partner organizations to ensure a degree of quality and reliability, but the ongoing vetting process is largely driven by users’ reviews of their experiences, and this is another example of the ways in which Omprakash has been able to expand and strengthen its network without incurring expenses that must be passed on to users.

The third contextual point concerns the institutional and bureaucratic inertia faced by administrators at FSU and many other universities: despite a genuine intent to integrate rigorous academic content with students’ international experiences, universities often lack the funding and institutional will to incentivize or allow faculty to teach accredited interdisciplinary courses that explicitly prepare students to approach international service-learning with intellectual seriousness (Crabtree 2013; Hartman and Kiely 2014). Against this contextual backdrop, it made sense for FSU’s CRE to use the Omprakash EdGE online platform instead of developing a new on-campus course.

Browsing and Applying for Volunteer / Internship Positions

Omprakash administrators developed their volunteer matching platform as a deliberate alternative to the dominant “placement” model in which middlemen restrict direct contact between volunteers and their host organizations prior to arrival, and volunteers are not required to apply for specific positions. The basic rationale for this platform is that it provides greater power and autonomy to host organizations that tend to be marginalized within the dominant “placement” model.

In the dominant model, middleman organizations have little incentive to allow for direct dialogue between volunteers and hosts, because doing so might allow the volunteer to sidestep the middleman and avoid paying the middleman’s fees. Consequently, host organizations possess little to no autonomy to determine which volunteers might (or might not) be a good fit for their organization’s needs, values, and specific position openings. Likewise, volunteers have limited to no opportunity to learn more about different potential hosts and decide which one might be the best fit for their specific skills and interests.

The Omprakash model reverses this pattern by empowering partner organizations with full autonomy to solicit applications for specific positions, and to accept or reject applicants as they see fit. In addition, this programmatic feature is also an important component of students’ learning experiences: by requiring students to apply for specific positions and communicate directly with Omprakash partners, Omprakash challenges the embedded paternalistic assumptions that NGOs working in resource-poor contexts are desperate for foreign help, and that “anyone can do it.”

The EdGE Classroom

Course content in the EdGE digital classroom is divided into separate weeks that are sequentially accessible. Weeks are clustered into thematic sections, and each week is oriented around a single essential question. For example, the theme of Weeks 1–3 is “Good Intentions and Unintended Consequences,” and the essential question of Week 1 is “What might be wrong with international volunteering?”

Each week is divided into three sections: Learn, Respond, and Browse. The Learn section (see Figure 1) of each week is further divided into slides in which Omprakash administrators arrange learning content: a reading excerpt, an embedded video, a photo collage, a public service announcement from the Omprakash narrator, or any combination of the above. At the base of each slide, students are able to write observations and browse the observations of their peers. The Learn section of each week contains anywhere from ten to twenty slides and is designed to require 1–2 hours to complete. Upon completing the Learn section of a given week, students enter the Respond section (see Figure 2) and submit a written reflection or recorded video to a prompt related to the week’s essential question and associated content. After submitting their weekly response, students enter the Browse section (see Figure 3), where they explore and comment upon the responses of their peers. The end result is three ways for students to interact with classroom content and each other on a weekly basis: observations in the Learn section, responses in the Respond section, and comments in the Browse section. All participants are notified with an email whenever their response receives a comment, and mentors are notified with an email whenever their mentees post an observation or response.


Figure 1. Cropped screenshot of a slide within the Learn section of last year’s EdGE/Global Scholars classroom.


Figure 2. Screenshot of Respond section from last year’s EdGE/Global Scholars classroom.


Figure 3. Cropped screenshot from the Browse section of last year’s EdGE/Global Scholars classroom.

EdGE Mentorship

In contrast to a typical Massive Open Online Course (MOOC), we sought to ensure that student experiences within our online classroom would involve a significant degree of personalized mentorship and instruction. With this in mind, Omprakash administrators solicited applications and built a team of EdGE Mentors who would work with FSU Global Scholars as they progressed through the EdGE online classroom. The EdGE Mentor team is composed mostly of graduate students and young professionals with deep experience as researchers and practitioners in fields such as international development, public health, gender studies, and anthropology. EdGE Mentors are geographically dispersed—the current team of seventeen Mentors is spread across locations including Atlanta, Berlin, New York, Oxford, Quito, Port au Prince, and Toronto—but collaborate with students and with each other via the EdGE digital classroom. For each cycle of the Global Scholars program, each mentor is matched with a handful of mentees and is expected to maintain contact with his or her mentees before, during, and after the mentees’ field positions. Mentors are compensated on a per-student basis.

The mentorship team has multiple nodes of contact with mentees. Firstly, mentors make themselves available to mentees via email and video calls. As students progress through the online coursework, mentors schedule “office hours” with their mentees, usually via Skype or Google+. Omprakash administrators request that mentors hold office hours at least four times throughout the 12-week pre-departure classroom, and mentors use this time to answer questions and build personal rapport with their mentees. Secondly, within the classroom itself, mentors are engaged in almost exactly the same way as their mentees: each week, mentors write observations, responses, and comments. Within the Browse section, mentors are required to provide a substantial comment to each of their mentees’ responses. Mentors are welcome to provide comments to any student, even if the student is not one of their designated mentees. While these comments are public to all users, the online platform also affords mentors the opportunity to send private weekly feedback to their mentees.

Blended Learning

Upon acceptance into the Global Scholars program, students are enrolled in a one-credit, pass-fail course during the spring semester. The on-campus course is facilitated by CRE administrators and meets weekly. These meetings are usually devoted to answering logistical questions and giving students time to discuss content encountered in the EdGE classroom thus far. These meetings constitute a key feature of the collaboration between Omprakash EdGE and the CRE: while the EdGE digital classroom provides space in which students can explore content and discuss with mentors and each other, the weekly on-campus meetings add another layer of personal interaction to the experience.

EdGE Curriculum and Capstone Projects

Rooted in Paulo Freire’s notion of conscientization (“raising critical consciousness”), the EdGE curriculum is designed to help students move beyond the superficial urge to ‘help others,’ and to work towards more holistic and reflexive understandings of the intersecting contexts in which they and their host partners are situated. The curriculum begins by challenging students to reflect upon the intentions and assumptions that underlie their desires to volunteer abroad. It dedicates a week to deconstructing the catch-all term “culture.” Another week is devoted to exploring the complexities of conflicting local interest groups and power dynamics that are often obscured by overly-romanticized notions of “helping the community.” Three weeks explore intersections of social, economic, and environmental inequality, and thereby help students locate themselves and their host organizations in relation to global configurations of power. The latter part of the curriculum teaches research methods—particularly the tools of ethnographic observation and community-based participatory research (CBPR)—so that students can complete observer-activist Capstone Projects which document the roots of a complex social issue and are meant to be shared with all members of the Omprakash network as well as other audiences back home.

Monitoring and Evaluation

The Omprakash digital platform allows Omprakash administrators to easily track each student’s observations, responses, and comments throughout the duration of the twelve-week pre-departure curriculum, and to qualitatively assess how a given student’s understandings seem to shift (or not shift) over time. Likewise, the platform also allows Omprakash administrators, EdGE Mentors, and FSU administrators to easily answer quantitative questions such as which pieces of classroom content elicit the most student responses, which mentors have the most consistent back and forth dialogue with their mentees, and how trends in classroom participation vary between students with differing background characteristics.
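As a rough illustration of the kind of tally this makes possible, the sketch below counts each user’s observations, responses, and comments from an activity log. This is a minimal sketch under stated assumptions: the log format and every field name are hypothetical, since the paper does not describe the actual EdGE data model.

```python
from collections import Counter

# Hypothetical activity log; the real EdGE data model is not public,
# so every field name here is an illustrative assumption.
activity_log = [
    {"user": "student_a", "kind": "observation", "week": 1},
    {"user": "student_a", "kind": "response",    "week": 1},
    {"user": "student_b", "kind": "response",    "week": 1},
    {"user": "mentor_x",  "kind": "comment",     "week": 1},
    {"user": "student_a", "kind": "comment",     "week": 2},
]

def participation_counts(log):
    """Tally each user's observations, responses, and comments."""
    counts = {}
    for entry in log:
        counts.setdefault(entry["user"], Counter())[entry["kind"]] += 1
    return counts

counts = participation_counts(activity_log)
print(counts["student_a"])  # Counter({'observation': 1, 'response': 1, 'comment': 1})
```

The same per-user tallies, grouped additionally by week or by mentor–mentee pairing, would answer the quantitative questions mentioned above (which content draws the most responses, which mentors sustain dialogue, and so on).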

To supplement these data sources, we also administer surveys to our students on a periodic basis. Students complete a pre- and post- survey that builds upon established survey instruments, such as the Global Perspectives Inventory (https://gpi.central.edu/) and the College Senior Survey (HERI), and also includes original questions and constructs. In addition, students offer qualitative feedback and reflections on the quality of our program via surveys administered at the midpoint and conclusion of the pre-departure curriculum and upon returning home from their field positions. Finally, Omprakash administrators also solicit feedback from Omprakash partners about the contributions of each student-volunteer.

All of this is to say that we have managed to accumulate abundant data reflecting student experiences within our program. However, we will be the first to acknowledge that conducting meaningful analysis of this data is much more challenging than simply collecting it. In the next section, we attempt to draw some inferences from the various forms of data we have collected thus far.

IV. What We’ve Learned—Challenges and Successes

Having provided an explanation of the roots and structure of our program and collaboration, we now offer a deeper analysis of the challenges and successes we have encountered during our collaboration thus far. We focus on three core aspects of our digital platform: the volunteer matching system, the EdGE classroom, and the remote mentorship model.

Matching Volunteers and Partner Organizations

By requiring potential volunteers to apply for positions, establish dialogue, and build rapport with potential host organizations before they leave home, we encourage a bridging of the real and imagined gulfs that separate the two parties. In addition to reducing our administrative burden and thus making our program more affordable and accessible to students from a wide range of backgrounds, this aspect of our program actualizes a key dimension of our ethos: by allowing volunteers and hosts to engage autonomously, exchange information freely, and establish a preliminary relationship, we enact a preventive strategy to discourage distorted relations, perverse incentive structures and perpetuated biases.

Omprakash has been refining this aspect of its platform for a decade, and the platform has facilitated thousands of fruitful collaborations between volunteers and partner organizations. However, our experience with this platform has also uncovered some troubling ironies related to digitally-based dialogue and collaboration. To state the obvious: technologies intended to connect people do not always result in increased connectedness or in successful collaborations. Prospective volunteers can be very fickle when communicating with partners: in many cases, they delay in answering emails; they forget about scheduled meetings—whether due to time zone confusion or other distractions—and they express themselves in casual, lackadaisical terms which partners sometimes interpret as immature or unprofessional. Given that Omprakash partners are real organizations confronting real social issues and are not pop-up projects that exist only to facilitate feel-good volunteer experiences, this sort of digital interaction with prospective volunteers can be disconcerting or even offensive.

Omprakash prides itself on its commitment to providing an alternative to the dominant “placement” model, and collaborators at FSU and other universities share this commitment. However, at times it seems as though some students would be much more comfortable if Omprakash would just “place” them on a volunteer trip and save them the trouble of needing to browse real organizations and apply for specific positions. Likewise, some parents and university administrators balk at the lack of “on-site supervision” within the Omprakash model. Of course, all partners provide their own “on-site supervision,” but it seems that the embedded concerns of some parents and administrators will not be soothed unless supervision comes in the form of a well-credentialed American or European chaperone. The Omprakash digital platform is meant to facilitate direct collaboration between diverse people and organizations, but some prospective volunteers, parents, and university administrators seem to prefer paying a high premium for a guaranteed “placement” rather than grappling with the complexities and uncertainties of building relationships with locally-run social impact organizations that may or may not actually want their help. The irony here is that volunteering abroad is ostensibly a process of collaboration, but many prospective volunteers seem intimidated by the fundamentally collaborative ethos that underpins the Omprakash platform and would prefer crisply packaged “voluntourism” products designed for mass consumption.

The EdGE Classroom

On the whole, FSU Global Scholars have interacted deeply and positively with the Omprakash EdGE online classroom. There was marked improvement in the level of engagement from the first year of the program (’12–’13) to the second (’13–’14). We attribute this to two main factors. First, FSU administrators facilitated on-campus weekly meetings in the second iteration of the program. Providing this structure seemed to help spur participation. Second, Omprakash administrators gave the EdGE curriculum a thorough makeover before the second year based on feedback from first year students. Omprakash administrators added a great deal of new content, removed content that had not resonated, and developed new tactics for structuring the material in the Learn section. For example, in the first year only one piece of content (reading, video, etc.) was put onto each slide, which meant that some weeks had over 20 slides. In the second year, to whatever extent possible, slides were crafted to deliver a specific message and multiple pieces of content were arranged on a single slide to tell that story.

The post-course evaluation completed by 82% (23/28) of the 2013-2014 Global Scholars provides a clearer snapshot of students’ experience in the online classroom. Only one student disagreed with the statement “I found the classroom intuitive to navigate,” and all students agreed that “the classroom was well-organized.” Nineteen of 23 (83%) students agreed that the weekly content was “stimulating” and 20 (87%) agreed that “the flow of the course from week to week was logical.” Nineteen (83%) agreed with the statement “I valued the opportunity to engage with peers and mentors in the weekly forums.” In response to a request for general feedback, one student wrote:

I liked how it had a curriculum set up that consisted of a learning, responding, and browsing stage. It really makes me feel engaged with my peers and administration. I liked how it felt like we were in a live class. It was enjoyable to learn things online at our own pace.

With regard to self-reported learning in this evaluation, 100% of students agreed with the statement “I am a better prepared international volunteer because of this course” (14 of them, or 61%, “strongly” agreed). This is encouraging, but even more encouraging is the overwhelmingly positive written feedback, such as:

I really loved how Omprakash opened my eyes to a whole new world of international aid, public health, anthropology, and research that I’ve never known about.

This program changes your perspective on international volunteering and issues like no other. It helps you reevaluate any prejudices and biases you may not even be aware you have, and learn how to best be an informed and engaged intern. It gives you extremely valuable resources and a network of people to help along the way.

I’m sure that if I were left to my own devices, I would have been more likely to literally wait until possibly even now to START preparing. It made me consider a lot of things that I wouldn’t have considered, and throughout taught me several things I will use over the course of my volunteering experience.

IT was the absolute BEST program I could ever recommend to anyone looking to volunteer abroad. It was the most eye opening experience of my undergraduate experience.

It is exciting to find students expressing this level of appreciation for a web-based learning platform. The final quoted comment reads like a typical testimonial about a ‘life-changing’ experience in another country, and thus is all the more fascinating given that it was written weeks before the volunteer even left home.

Feedback like this makes us confident about the depth of learning that occurs in our online classroom, but we still see a great deal of room for improvement. The structure of our Learn section requires students to click through each slide, but it is difficult to gauge how carefully or carelessly students are engaging with weekly content unless they submit observations or responses that blatantly demonstrate a lack of understanding. Such instances are not rare and are certainly disheartening, but it is worth noting that the student tendencies of taking shortcuts and skimming are hardly problems unique to digital learning environments.

One of the most common critiques about digital learning is the high rate of attrition, estimated to be 93.5% for MOOCs such as Coursera (Jordan 2014). In this credit-bearing collaboration, we do not face this problem. If Global Scholars do not participate in the EdGE classroom, they will fail a course that appears on their FSU transcripts. But because it is a pass-fail course, our challenge is making sure that students are doing more than the bare minimum to pass.

Online Mentorship

EdGE Mentors are a crucial component of our effort to ensure that our online classroom is dialogical, reflexive, and personal. Our expectation that mentors provide thoughtful comments on each of their mentees’ weekly responses is the cornerstone of our strategy to ignite dialogue in the weekly forums. In ideal circumstances, every student’s weekly response will garner comments from their mentor and peers. In reality, however, most, but not all, responses in any given weekly forum will spur this level of dialogue.

At the conclusion of the most recent Global Scholars session, we reviewed each mentor’s engagement with his or her mentees and provided substantial quantitative and qualitative feedback to each mentor. We analyzed mentor engagement and coded it according to seven possible categories: supportive, e.g. “This is a great post”; affirmative, e.g. “I liked when you said…”; follow-up, e.g. “How would you explain…?”; own opinion, e.g. “My perspective on this is…”; personal anecdote, e.g. “When I was in grad school…”; refer-to-material, e.g. “Freire would say…”; and for-further-reading, e.g. “Check out this article about…”. A single comment could carry more than one code, so percentages can sum to more than 100. This coding system allowed us to identify major trends in each mentor’s style of engagement. For example, one mentor’s evaluation notes that 93% of her comments were ‘supportive’ and ‘follow-up,’ while 40% were ‘affirmative’ and 7% were ‘own opinion.’ In addition, 27% of her comments resulted in a ‘back and forth’ (the student responded to her comment at least once).
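To make the arithmetic concrete, here is a minimal sketch of how such per-mentor percentages could be computed. The comment log, its fields, and the sample codes are all hypothetical; because one comment may carry several codes, the code percentages can legitimately sum past 100.

```python
# Hypothetical coded comment log for one mentor. Field names and the
# multi-code convention are assumptions, not the actual EdGE schema.
coded_comments = [
    {"codes": {"supportive", "follow-up"},   "back_and_forth": True},
    {"codes": {"affirmative"},               "back_and_forth": False},
    {"codes": {"supportive", "own-opinion"}, "back_and_forth": False},
    {"codes": {"follow-up", "affirmative"},  "back_and_forth": True},
]

def code_percentages(comments):
    """Percent of comments carrying each code. Sums can exceed 100%
    because a single comment may carry several codes."""
    total = len(comments)
    tally = {}
    for c in comments:
        for code in c["codes"]:
            tally[code] = tally.get(code, 0) + 1
    return {code: round(100 * n / total) for code, n in tally.items()}

def back_and_forth_rate(comments):
    """Percent of comments that drew at least one student reply."""
    return round(100 * sum(c["back_and_forth"] for c in comments) / len(comments))

print(code_percentages(coded_comments)["supportive"])  # 50
print(back_and_forth_rate(coded_comments))             # 50
```

Run over each mentor’s full comment history, percentages of this kind yield exactly the profile quoted above (share of comments per code, plus the back-and-forth rate).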

We believe our mentorship system is one of the most vital aspects of the EdGE program. It provides the personal touch that keeps students honest and involved. In the post-course evaluation mentioned earlier, all 23 respondents agreed that their mentor “makes himself/herself available to answer my questions”; 21 (91%) agreed that their mentor was “helpful” (two were neutral), and 20 (87%) agreed that their mentor “provided good comments on my weekly responses.” Nineteen (83%) Global Scholars agreed that mentorship was a “very valuable” aspect of EdGE and “plan to maintain communication with my mentor during and after my field position” (the remaining four were neutral to both questions).

V. Using Interactive Technology to Raise Critical Consciousness?

Interactive technology and service-learning are both on the rise within higher education, but there is little reason to assume that either will be a driver of social change rather than social reproduction. Despite the hype about egalitarianism and democratization that surrounds emergent digital learning platforms, we worry that the implementation and evaluation of such technologies are sometimes directed towards the goals of increasing efficiency and profit margins at the expense of student learning and transformation. Likewise, despite the buzzwords of global citizenship and collaborative partnerships that surround the proliferation of service-learning programs, we worry that many such programs lack substantive pedagogical vision and are oriented around placement models and paternalistic narratives that are intrinsically disempowering to those they purport to serve (Baillie Smith and Laurie 2011). The authors of this paper have sought to integrate the two trends of interactive technology and service-learning with the explicit aim of going beyond the buzzwords to cultivate critical inquiry and authentic collaboration in pursuit of social change. Our pedagogical vision derives from Freire’s notion of ‘raising critical consciousness’: a conviction that ‘knowing the world’ through dialogue and reflection is the first step towards creating change. The question, then, is whether or not such a vision can be actualized through an interactive digital platform—or at all.

It is surely too early to attempt any conclusive answer to this question, but the case study offered in this paper suggests that interactive technology might indeed be a useful tool for facilitating the sort of learning and collaboration that “critical service-learning” would seem to require. Further research should investigate not just what students are learning via the digital platform, but also how they translate this learning into their work on the ground while volunteering abroad and into the rest of their lives upon returning home.

Acknowledgements

Willy and Steve would like to thank the Omprakash EdGE Mentorship team, without which this article would not exist: Alex Frye, Eric Dietrich, Kalie Lasiter, Emily Hedin, Mayme Lefurgey, Miyuki Baker, Kit Dobyns, Shelby Rogala, Anabel Sanchez, Laura Stahnke, Matt Smith, Barclay Martin, Nathan Kennedy, Meredith Smith, Devi Lockwood, Nina Hall and Mary Jean Chan. W&S would also like to thank Lacey Worel for making sure the Omprakash trains run on time and Sonu Mahan and Adarsh Kumar for being brilliant web developers who can turn stale mockups into truly interactive technology. Finally, W&S would like to thank the other half of this great collaboration: Joe, Latika Young and Kim Reid. Joe would also like to thank Latika and Kim for making sure the FSU Global Scholars program runs so well.

Bibliography

Ausland, Aaron. 2010. Poverty Tourism Taxonomy 2.0. From <http://stayingfortea.org/2010/08/27/poverty-tourism-taxonomy-2-0/>. Accessed 31 May, 2014.

Baillie Smith, Matt, and Nina Laurie. 2011. “International volunteering and development: global citizenship and neoliberal professionalisation today.” Transactions of the Institute of British Geographers. 36 (4). OCLC 751323473.

Biddle, Pippa. 2014. The Problem with Little White Girls (and Boys): Why I Stopped Being a Voluntourist. From <http://pippabiddle.com/2014/02/18/the-problem-with-little-white-girls-and-boys/>. Accessed 31 May, 2014.

Citrin, David M. 2011. “Paul Farmer made me do it”: a qualitative study of short-term medical volunteer work in remote Nepal. Thesis (M.P.H.), University of Washington. OCLC 755939202.

Crabtree, Robbin D. 2008. “Theoretical Foundations for International Service-Learning.” Michigan Journal of Community Service Learning. 15 (1): 18-36. OCLC 425540415.

Crabtree, Robbin D. 2013. “The Intended and Unintended Consequences of International Service-Learning.” Journal of Higher Education Outreach and Engagement. 17 (2): 43-66. OCLC 854574208.

Crossley, E. 2012. “Poor but Happy: Volunteer Tourists’ Encounters with Poverty.” Tourism Geographies. 14 (2): 235-253. OCLC 792841012.

Dolnicar, S., and M. Randle. 2007. The international volunteering market: market segments and competitive relations. International Journal for Non-Profit and Voluntary Sector Marketing, 12(4), 350-370. OCLC 826185553.

Finley, Ashley P., and Tia McNair. 2013. Assessing underserved students’ engagement in high-impact practices. OCLC 872625428.

Freire, Paulo. 1970. Pedagogy of the oppressed. [New York]: Herder and Herder. OCLC 103959.

Gacel-Avila, Jocelyne. 2005. “The Internationalisation of Higher Education: A Paradigm for Global Citizenry.” Journal of Studies in International Education 9 (2): 121-136. OCLC 424733796.

Green, Patrick, and Matthew Johnson, eds. 2014. Crossing Boundaries: Tensions and Transformation in international service-learning. Sterling, VA: Stylus. OCLC 877554267.

Harris, Suzy. 2008. “Internationalising the University.” Educational Philosophy and Theory 40 (2): 346-357. OCLC 4633544389.

Hartman, Eric, Richard C. Kiely, Jessica Friedrichs, and Judith V. Boettcher. 2013. Building a Better World The Pedagogy and Practice of Ethical Global Service Learning. Stylus Pub Llc. OCLC 866938358.

Hartman, Eric, Cody Morris Paris, and Brandon Blache-Cohen. 2012. “Tourism and transparency: navigating ethical risks in volunteerism with fair trade learning.” Africa Insight 42 (2): 157-168. OCLC 853073233.

Hartman, Eric, and Richard Kiely. 2014. “A Critical Global Citizenship.” In Green, Patrick, and Matthew Johnson. Crossing boundaries: tension and transformation in international service-learning. OCLC 877554267.

Hickel, Jason. 2013. “The ‘Real Experience’ industry: Student development projects and the depoliticisation of poverty.” Learning and Teaching 6 (2): 11-32. OCLC 5528846526.

Jones, A. 2005. Assessing international youth service programmes in two low income countries. Voluntary Action: The Journal of the Institute for Volunteering Research 7 (2): 87-100. OCLC 658807900.

Jordan, Katy. 2014. “Initial trends in enrolment and completion of massive open online courses.” The International Review of Research in Open and Distance Learning 15(1). OCLC 5602810303.

Leigh, R. 2011. State of the World’s Volunteerism Report. United Nations Development Programme. OCLC 779540815.

McBride, A., and Lough, B. 2010. “Access to International Volunteering.” Nonprofit Management & Leadership 21 (2): 195-208. OCLC 680823597.

Mintel. 2008. Volunteer Tourism – International. London: Mintel International Group Limited.

Mitchell, Tania D. 2008. “Traditional vs. Critical Service-Learning: Engaging the Literature to Differentiate Two Models.” Michigan Journal of Community Service Learning 14 (2): 50-65. OCLC 425540125.

Ouma, B., and H. Dimaras. 2013. “Views from the global south: exploring how student volunteers from the global north can achieve sustainable impact in global health.” Globalization and Health 9 (32): 1-6. OCLC 855505685.

Rieffel, L., and S. Zalud. 2006. International Volunteering: Smart Power. Washington, DC: The Brookings Institution. OCLC 70134511.

Simpson, Kate. 2004. “‘Doing development’: the gap year, volunteer-tourists and a popular practice of development.” Journal of International Development 16 (5): 681-692. OCLC 5156622715.

Stein, N. 2012. “Is 2012 the year of the volunteer tourist?” From <http://www.travelmole.com/news_feature.php?news_id=1151074>. Accessed 31 May 2014.

Zakaria, Rafia. 2014. “The White Tourist’s Burden.” Al Jazeera, 21 April 2014.

[1] The neologism “voluntourism,” though lacking a precise definition, is generally used in a pejorative manner to describe programs that combine volunteering with tourism. Throughout this essay, we use the terms “service-learning,” “volunteering,” and “voluntourism” somewhat interchangeably—not because we are unaware that many commentators have attempted to delineate between them, but rather because we believe that such delineations often obscure more than they illuminate, and that our criticisms and suggestions are applicable to programs that fall into all of these categories as well as the grey area between them.

[2] To offer one example of the sort of high-cost “voluntourism” program against which we work to offer an alternative: an organization branding itself as “the gold standard of global engagement” sells ten-day trips to Uganda under the slogan of “short term trips; long term impact.” Customers pay a program fee of $1,990 plus airfare, and travel in groups of at least eight. In contrast, a student participating in Omprakash EdGE and working with an Omprakash partner in Uganda for sixty days would pay a total of roughly $1,350 plus airfare ($750 for the EdGE program fee, and $10 per day in country), and would receive pre-departure training and mentorship that is unavailable in the typical “voluntourism” model.

About the Authors

Willy Oppenheim is the founder and Executive Director of Omprakash, a web-based nonprofit that connects volunteers, interns and donors directly with social impact organizations in over 40 countries. Willy received a BA from Bowdoin College, where he completed a self-designed major in religion, education and anthropology. In 2009, he received a Rhodes Scholarship and is currently pursuing a Ph.D. in education from the University of Oxford. As an educator and educational researcher, Willy has worked in classrooms in the United States, India, Pakistan and China, and in the wilderness as an instructor for the National Outdoor Leadership School. For over 10 years, Willy has been working through Omprakash to transform the field of international service-learning to make it more affordable, more ethical, and more educational for everyone involved.

Joe O’Shea serves as the Director of Florida State University’s Center for Undergraduate Research and Academic Engagement and is an adjunct faculty member in the Department of Philosophy. He received a BA in philosophy and social science from Florida State University, where he served as the student body president and a university trustee. A Truman and Rhodes Scholar, he has a master’s degree in comparative social policy and a Ph.D. in education from the University of Oxford. Joe has been involved with developing education and health-care initiatives in communities in the United States and Sub-Saharan Africa. His research and publications are primarily focused on the civic and moral development of people, and his recent book, Gap Year: How Delaying College Changes People in Ways the World Needs, was published by Johns Hopkins University Press. Joe serves on the board of the American Gap Association and as an elected Councilor for the Council on Undergraduate Research, the leading national organization for the promotion of undergraduate research and scholarship.

Steve Sclar is the co-founder and Program Director for Omprakash EdGE (Education through Global Engagement). Steve received a BBA from the College of William & Mary, where he majored in Marketing and Environmental Science. He is completing an MPH in the Global Environmental Health department at Emory University's Rollins School of Public Health. Previous volunteer and work experience in Tibet, Ghana and Iceland led Steve to his current role with Omprakash.
