
More Than Assessment: What ePortfolios Make Possible for Students, Faculty, and Curricula

Abstract

To disrupt the notion that ePortfolios are primarily an assessment tool, this article-as-ePortfolio invites readers to consider what is made possible when ePortfolio initiatives lead with student learning in their structure and implementation. In addition to descriptions of both faculty and student support, we offer extended examples of ePortfolio implementation in three disparate programs at our university: Biosystems Engineering, English Education, and Nursing. To conclude, we reflect on the pedagogical challenges and opportunities that have emerged as a result of the structure and implementation of our ePortfolio initiative. Ultimately, we aim to demonstrate what is made possible for students, faculty, and curricula when ePortfolio initiatives prioritize student learning.

Editor’s Note

Lesley Erin Bartlett, Heather Stuart, Justin K. Owensby, and Jordan R. Davis have created an ePortfolio about ePortfolios. At JITP, we celebrate such confluence of form and content. After exploring various options for rendering this work on our site, we found the iframe to be the best solution. We recognize that an iframe may not render the contents of this piece correctly on all devices and apologize for any inconvenience; for a full-screen experience, please see https://jitp.commons.gc.cuny.edu/wp-content/static/JITP-Bartlett-et-al/.

About the Authors

Lesley Erin Bartlett is Assistant Director of University Writing at Auburn University. Her primary responsibility is the ePortfolio Project, which is housed in the Office of University Writing. She received her PhD in Composition and Rhetoric with a specialization in Women’s and Gender Studies from the University of Nebraska-Lincoln in 2014. She has developed and taught courses in composition, rhetorical theory, literature, and women’s and gender studies. Her research interests include composition theory and pedagogy, inclusive pedagogies, feminist rhetorical theory, and rhetorical performance.

Heather Stuart is the Program Administrator for the ePortfolio Project at Auburn University. In her current role she provides support for students by facilitating workshops, teaching classes, creating resources, and advising students in the ambassador program. She received her M.Ed. in Administration of Higher Education.

Justin K. Owensby is a graduate assistant for the ePortfolio Project in the Office of University Writing at Auburn University. His work with the ePortfolio Project involves facilitating ePortfolio presentations and workshops and focuses on visual and ethical literacy in ePortfolios. He is also a PhD candidate in the department of Health Outcomes Research and Policy, where he is interested in integrating mobile health technology into healthcare. He is also interested in teaching, faculty development, and technologies (such as ePortfolios) associated with student learning.

Jordan R. Davis is a graduate assistant for the ePortfolio Project in the Office of University Writing at Auburn University. He is currently a candidate for the Master of Technical and Professional Communication, where he has found an interest in the use of rhetoric to capture the attention of readers. His current research focuses on the reliability of product reaction cards as a data collection instrument for usability tests. He tweets @courageousdavis.

Figure 9. Treemap of All ePortfolio Pages

RePort_Bot: A Computational Approach to ePortfolios and Reflection

Abstract

This article is an experimental effort to add a new dimension to the priority of reflection, which characterizes the crafting of both print and ePortfolios (see Yancey 2009; Bourelle, et al. 2015): the concept of rhetorical velocity. Rhetorical velocity (Ridolfo and DeVoss 2009; Ridolfo and Rife 2011) refers to how reusable texts are in acts of circulation and recomposition. In this article, I outline a computational approach to measuring rhetorical velocity of ePortfolio textual content, visualizing findings, and using these findings to assist reflection and refinement of ePortfolio composing. This computational approach revolves around a Python “bot” script with the portmanteau RePort_Bot (Report + ePortfolio + Bot). The RePort_Bot crawls web pages, scrapes textual content and metadata such as page titles, analyzes the ePortfolio, and returns results to users in the form of descriptive statistics and visualizations.

Introduction

This article is an experimental effort to add a new dimension to the priority of reflection, which characterizes the crafting of both print and ePortfolios (see Yancey 2009; Bourelle et al. 2015): the concept of rhetorical velocity. Rhetorical velocity (Ridolfo and DeVoss 2009; Ridolfo and Rife 2011) refers to how reusable texts are in acts of circulation and recomposition. Media boilerplate texts represent an exemplar genre of texts predicated on maximizing rhetorical velocity because these texts are designed to populate multiple contexts and to be easily resituated. The application of a heuristic based on rhetorical velocity to acts of reflection can take the form of close readings of ePortfolios by students or instructors. However, in this article, I will outline a computational approach to measuring the rhetorical velocity of ePortfolio textual content, visualizing findings, and using these findings to assist reflection and refinement of ePortfolios. This computational approach revolves around a Python “bot” script with the portmanteau RePort_Bot (Report + ePortfolio + Bot). The RePort_Bot crawls web pages, scrapes textual content and metadata such as page titles, analyzes the ePortfolio, and returns results to users in the form of descriptive statistics and visualizations.

Robot Rationale

The RePort_Bot (and its emphasis on rhetorical velocity) is galvanized by two related theoretical questions: how does the digital nature of ePortfolios affect composing and reflection practices and how can we leverage the digital affordances of ePortfolios in the service of better writing and design?

One obvious difference that the digital nature of ePortfolios makes manifest is the ability to incorporate diverse content as a showcase of skills, experiences, and knowledge-bases. This includes hyperlinking to resources or hosting a range of media such as PowerPoint presentations, YouTube videos, audio tracks (Yancey 2013, 26–27; Rice 2013, 42), and social media content (Klein 2013). Reynolds and Davis (2014) also observe that ePortfolios represent dynamic means of storing and displaying work (Kilbane & Milman 2003, 8–9). With the expanded technical avenues that ePortfolios bring to writers come the corresponding obligations to manage ePortfolio content and plan for its live deployment (Barrett 2000, 1111).

While not solely targeting ePortfolios (although ePortfolio assignments do factor heavily into her theorizing), Silver (2016) offers a handy gloss of what the digital (hyperlinking, video and audio sharing, social networking) means for the act of reflection in the writing classroom. To paraphrase Silver (2016, 167), reflection with and through digital media platforms becomes easier and more enjoyable as an activity, fosters collaboration and dialogue among peers, becomes more visible and measurable, and enables students to enter into more robust self-dialogues, thereby increasing their awareness of their own rhetorical actions and their ability to self-correct (Silver 2016, 169). Silver cites Meloni (2009) on the use of GoogleDocs revision history to provide automatic feedback to students on the trajectory of their writing, which then led to more detailed and data-supported reflections at the end of the assignment sequence.

While I would concur with all the points made by Silver (2016) above, it is the last two claims that scaffold the development of the RePort_Bot’s scope and goals. First, concomitant with the ability to house multimedia content such as video and audio is the fact that the presentation of many ePortfolios in HTML or XML makes these ePortfolios available to automated content extraction, indexing, and computational analysis. Thus, reflections as products of ePortfolios and reflections as driven by the digital material of ePortfolios can be subject to or mediated by computational entities such as crawlers, bots, or text mining programs.

Of course, the fact that ePortfolios can be plumbed by bots or computer algorithms does not necessarily mean that it would be productive to do so. I argue that the contributions a bot like the RePort_Bot could make to the crafting of ePortfolios align with Silver’s (2016) valuation of self-dialogue in the course of reflection and Meloni’s (2009) practical use of GoogleDocs revision history for portfolio creation. At the same time, I would pose a different interpretation of Silver’s reading of Meloni’s article. Rather than view student utilization of GoogleDocs revision history as a case of a student using a tool to promote recall and reflection, we might also characterize the GoogleDocs platform as a semi-autonomous agent that provided students feedback that aided reflection. While not a bot in the conventional sense of the term, the GoogleDocs revision history feature does work automatically to record each version of user-generated content in a way that is denatured from the original act of writing. The GoogleDocs revision history feature re-presents the arcs of student composing—arcs which only exist in retrospect after drafting and revision have occurred. The RePort_Bot seeks a similar relationship with writers of ePortfolios. The RePort_Bot returns individualized and cumulative measures of the rhetorical velocity of ePortfolio content to writers in a way that humans would have difficulty replicating. These measures, generated, in a sense, behind a screen of automated number-crunching, would then provide an anterior perspective on that ePortfolio content.

The advantages of such anterior perspectives have been argued for most saliently in the field of digital humanities. Jockers (2013) offers “macroanalysis” as a complement to traditional “close reading” methodologies employed in literary analysis. Jockers (2013) takes pains to explain that macroanalysis and microanalysis have a shared goal to gather data and derive insights. The key principle of macroanalysis is that it inquires into details that are generally inaccessible to human readers (Jockers 2013, 27). Ramsay (2003) extends this point in his paragonic comparison between close reading exegesis and algorithmic criticism, musing that machines can open new pathways to analysis by allowing scholars to reformulate texts and uncover new patterns of information and organization (171). Drawing on Samuels and McGann (1999), Ramsay (2011) has elsewhere referred to the process of reformulating a text for alternative interpretations as “deformance” (33). Samuels and McGann (1999) describe deformance as a critical operation that “disorders” a text, upsetting conventional readings and placing the reader in a new relationship with the text. Embodying deformative criticism is a proposal by Emily Dickinson to read her poetry backwards as a means to unfurl aspects of language that may be obscured by conventional approaches. Ramsay (2011) extends this to the work of digital humanists, arguing that the subjection of alphabetic texts to word frequency counts or semantic text encoding is similarly deforming (35). The creation of a text concordance removes grammar and syntax from analysis, providing scholars a view of a text in which the meaning or significance of a word depends less on its contextual deployment and more on the number of times it occurs in a text. The RePort_Bot hews to this notion of deformance by providing users an unfamiliar reading of a text based on word frequency counts, text normalization, and the application of the concept of rhetorical velocity.

Rhetorical Velocity

Rhetorical velocity (see Ridolfo and DeVoss 2009; Ridolfo and Rife 2011) is generally applied to those texts that have been designed for re-use, easy repurposing, and mass circulation. Consequently, rhetorical velocity has often been associated with boilerplate writing such as press releases or media images and video. One example is Amazon’s business summary. This summary conveys basic information about the who, what, and where of Amazon and appears with little deviation across multiple financial websites such as Bloomberg, Reuters, and Yahoo Finance. By boilerplating its public profile, Amazon is able to present a unified brand image to its multiple audiences and to facilitate the distribution of its brand identity by giving media outlets a plug-and-play textual patch for other artifacts such as news stories or blog posts.

A rhetorical concept that governs boilerplate writing seems far removed from the type of particularity of language and topic selection that ePortfolios demand. However, there are good reasons why rhetorical velocity is specifically applicable to ePortfolios. The first reason is inflected by the question of genre. Designers may be crafting a professional ePortfolio to reflect their career fields or, for undergraduates, their prospective career fields. In this case, designers must be attentive to the disciplinary codes of their chosen professions so that their ePortfolio can be recognized by practitioners of that field. In this case, ePortfolio content will likely concentrate on a few key and readily discernible thematic foci. Thus, each page of the ePortfolio is likely to have replicated language to signal the focus or theme of the ePortfolio. Indeed, certain fields such as content strategy recommend this “templatey” approach (see Kissane 2011). Tracking the rhetorical velocity of ePortfolios reflective of this case can reveal if such a focus has been linguistically achieved. Theme words would appear more regularly as structuring rhetorical elements of each page, and would be considered to possess higher rhetorical velocity because they are being reused more frequently. Conversely, the antinomy of boilerplate writing can also be tracked because rhetorical velocity is a relative measure. For certain terms to be considered “fast,” other terms would need to be considered “slow.” These “slow” terms could serve as an index of an ePortfolio’s variety in a way that is much more informative than measuring word frequency. Sheer counts may provide people with a sense of content and topicality; however, the application of rhetorical velocity to this issue allows people to measure diversity of words and style against prevailing words (i.e., words that lessen the variability of the text).

Another reason that rhetorical velocity is an appropriate analytic for ePortfolios involves sustainability. While certain ePortfolio assignments may ask people to add content as if from scratch, others, maximizing the affordances of the media, may seek to re-use content or rhetorical modes across pages/content areas. Assessing the rhetorical velocity of an ePortfolio can help writers self-audit existing content from a fresh perspective and cue people to the best approaches to extending their ePortfolios.

No definitive means of calculating or rendering rhetorical velocity has been established. The RePort_Bot offers a computational approach to foregrounding textual features cognate with the idea of rhetorical velocity based on the calculation of a text’s word frequency h point. Formulated by Jorge Hirsch (2005; 2007; see also Ball 2005; Bornmann and Daniel 2007), the h point or h index attempts to weigh the impact an author has in his/her discipline in a single value. This single value or h point is the point at which a publication’s rank in terms of citation count is equal to or less than the number of citations for that particular publication:

h point = (rank of citations, number of citations)

Let us say that Author A has 5 publications. The ordering of these publications is given by the rank-frequency of their respective citations. The number of citations for each paper is given from highest to lowest: 10, 8, 7, 4, 1. Consequently, the h point for Author A is 4 because the fourth-ranked publication has 4 citations. The conclusion one would draw about Author A’s publication output is that Author A has at least 4 publications with at least 4 citations each.
Figure 1. Author A h point Scatterplot

The h point attempts to balance comparison between the works of 2 or more scholars in a field above blunt metrics such as sheer citation count or number of articles. Consider the case of Author B, who has five publications with the following citation counts from highest to lowest: 25, 4, 3, 1, 0. The h point for Author B is 3 despite the fact that Author B’s first-ranked publication possesses a citation count exceeding any by Author A. Thus, a case could be made that the breadth of Author A’s work is more impactful than that of Author B, who may be noted for one item.
Figure 2. Author B h point Scatterplot
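The h point calculation behind these two examples can be sketched in a few lines of Python. This is a minimal illustration of the formula only, not the RePort_Bot’s own implementation; the function name h_point is my own.

```python
def h_point(counts):
    """Return the h point of a list of citation (or frequency) counts.

    The h point is the highest rank r (counting from 1, most cited first)
    such that the r-th ranked item has at least r citations.
    """
    h = 0
    for rank, citations in enumerate(sorted(counts, reverse=True), start=1):
        if citations >= rank:
            h = rank
        else:
            break
    return h

print(h_point([10, 8, 7, 4, 1]))  # Author A: prints 4
print(h_point([25, 4, 3, 1, 0]))  # Author B: prints 3
```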

As a bibliometric, the h point functions to correct for wide disparities in citation counts among authors and to take a broader accounting of citation histories. The h point describes a fixed point in a discrete distribution of quantities upon which to sort those quantities, adjusting for the highest and lowest values. The RePort_Bot’s application of the h point relies on its basic formula, but departs from its bibliometric aims. Instead, the RePort_Bot uses the h point to model the rhetorical velocity of ePortfolio content. The theory behind this approach is directly attributable to Popescu and Altman (2009), who apply the h point method to the word frequencies of corpora. Words in a corpus, like citations in the original bibliometric version of the h point, are ranked by their frequency. The h point refers to that word whose rank and frequency are the same or whose frequency is nearest to its rank.

On the face of things, word rank-frequencies and citation rank-frequencies are quite different. The latter relies on the circulation of discrete documents; the former relies on the selection of words, which are more fungible than a published paper. However, if we consider the h point as a means to trace the syndication of objects, we can focus our attention on how objects on either side of the h point are delivered. The upper bound of the h point distribution indicates those publications that have been reused the most. The lower bound of the h point describes publications that have been reused the least. When applied to word frequencies, the upper bound of the h point distribution indicates those words that have been used the most. The lower bound indicates those words that have been reused the least. According to Popescu and Altman (2009, 18), modeling a text on its word rank-frequency h point can sort the vocabulary of a text into its “synsemantic” (words that require other words for their meaning) and “autosemantic” (words that can be meaningful in isolation) constituents. Synsemantic words include function words such as prepositions, auxiliaries, articles, pronouns, and conjunctions. These are the words that make up the connective tissue of a text; thus, they are used with relatively greater frequency. Autosemantic words occur less frequently, but they include most of the meaningful content words (e.g., those that announce the topic of the text or employ the specialized jargon of the writing genre). Popescu and Altman (2009) compare the operation of synsemantic and autosemantic words to the movement of high velocity and low velocity gas particles (19). Articles, prepositions, and pronouns move faster than content words, which can be deployed with less frequency and higher effect because they have more semantic weight.

Zipf (1949) makes a similar point to that of Popescu and Altman (2009). Zipf (1949) offers the “Bell Analogy” for understanding how the “work” of words is divided. Zipf (1949) asks us to imagine a row of bells, equally spaced apart along the length of a board. A demon is positioned at one end of the board and is charged with ringing a bell every second. After each bell is rung, the demon must return to the starting point and record the bell ringing on a blackboard. This results in a round trip after each bell ringing. For the bell closest to the starting point, the demon need only make a short round trip. For the bell at the farthest end of the board, the demon will make a considerably longer trip. The metaphoric mapping of gas clouds (Popescu and Altman 2009) to bells rung by a demon (Zipf 1949) is not seamless, but the notion of movement animates both. The nearest bells in the “Bell Analogy” are easiest to ring and are equivalent to the fast-moving synsemantic words that we expect to prevail at the top of the word frequency ranks. The bells at the farthest end of the board take more effort to ring and are equivalent to the autosemantic words on the bottom reaches of the h point distribution.

Why doesn’t the demon simply opt to ring only the closest bells? Why aren’t Popescu and Altman’s (2009) gas clouds only comprised of high velocity particles? The warrant for both of these ideas is partly grounded in the truisms of language. We know that language cannot be constituted only by synsemantic function words. By definition, synsemantic words require autosemantic content words to form intelligible messages (e.g., the synsemantic term “for” requires an autosemantic object to form a meaningful expression). Our experiences with language teach us that there will be a mix of synsemantic and autosemantic vocabularies, so the bells on the far end of the board will be rung for any given text. We can illustrate Popescu and Altman’s (2009) and Zipf’s (1949) arguments through a simple writing assignment. The task is to make a declaration about the strengths of an ePortfolio that you have created and then revise that declaration to be more concise:

Declaration: “As you can see from my CV page, I have gone to many conferences where I talked about my research.”

Revision: “As you can see from the CV page, I have presented my research at numerous conferences.”

The initial declaration and revision are making the same point. The Declaration is 20 words long; the Revision is 16 words long. The mean word length of the Declaration is 3.89. The mean word length of the Revision is 4.38. Shorter words may be easier to type taken individually (indeed, one might even say that the use of function words such as “as” and “from” require little thought and energy at all), but one needs to use more of them. Words such as “presented” may be longer and take more energy to write, but that single word accounts for two concepts in the Declaration (“have gone to” and “talked about”) with more expediency. Thus, the demon may have to walk farther down the board to ring “presented” as opposed to “gone” or “talked,” but the demon doesn’t have to make as many trips.
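A short Python sketch can reproduce these counts. Note that the exact mean word lengths depend on how a tokenizer treats punctuation, so the decimals it reports may differ slightly from the figures above; the helper word_stats is illustrative only.

```python
import string

def word_stats(sentence):
    """Return (word count, mean word length) with punctuation stripped."""
    words = [w.strip(string.punctuation) for w in sentence.split()]
    words = [w for w in words if w]
    return len(words), sum(len(w) for w in words) / len(words)

declaration = ("As you can see from my CV page, I have gone to many "
               "conferences where I talked about my research.")
revision = ("As you can see from the CV page, I have presented my "
            "research at numerous conferences.")

print(word_stats(declaration))  # 20 words
print(word_stats(revision))     # 16 words, with a higher mean word length
```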

The hop from Zipf’s (1949) Bell Analogy and Popescu and Altman’s (2009) gaseous synsemantic and autosemantic words to Ridolfo and DeVoss’s (2009) concept of rhetorical velocity is a short one. Beyond the preoccupation with metaphors of speed and movement, all three theories are concerned with (1) how a text is framed, (2) how this frame manages variability, and (3) how this management impacts the delivery of a text. The h point delineation of a text between its fast- and slow-moving components is another way to represent those linguistic units that are being reused and those linguistic units that are reused sparingly—if at all. While tracking degrees of reuse through a representation of an h point distribution may not give us a single numerical measurement, it can reveal the structural and rhetorical priorities of a text, which could involve the emergence of a global gist or a leitmotif or point to a text’s variability and vocabulary richness.

Before moving forward, I must make one defining point about my use of the categories synsemantic and autosemantic: in understanding the h point and velocity of a text rhetorically, I take the synsemantic and autosemantic labels as a general speed rating. Thus, content words such as nouns, which are generally classed as autosemantic in linguistics and word frequency studies, will appear in the synsemantic category after RePort_Bot processing because those content words will be circulating “faster” than those words that appear beneath the h point. Put another way, for this article, I am more interested in the velocity of meaning-making, as evidenced by word frequency, than in tracking fixed linguistic categories.

The RePort_Bot Procedure

The RePort_Bot procedure proceeds as a combination of input/output, text normalizations, and statistical modeling steps, culminating in an HTML report. I deal with the theoretical and analytical underpinnings for each step in turn. The code for the RePort_Bot procedure and instructions on how to implement the script are provided here: https://github.com/rmomizo/RePort_Bot/tree/gh-pages.

While the code as offered functions on ryan-omizo.com, I should note that modifications are needed if the script is to work with other ePortfolios due to the variegated naming conventions used for HTML and CSS selectors. The Python code that extracts textual content from ryan-omizo.com targets the CSS class “entry-content” of my customized WordPress theme. The current RePort_Bot script will not return useful results from pages that lack a CSS class called “entry-content” or that employ the class “entry-content” for other content sections. Readers should treat the code as functional pseudo-code that invites modification. Browser Inspector tools can help users identify the proper DOM elements to use in their own scraping efforts (more instructions on editing the code can be found in the README.md file at https://github.com/rmomizo/RePort_Bot).

Input

In order to analyze ePortfolios living on the web, the RePort_Bot incorporates the Python library Scrapy (“Scrapy | A Fast and Powerful Scraping and Web Crawling Framework” 2016) for opening URLs and scraping web data. The scraper “bot” opens preselected URLs from ryan-omizo.com and captures the following elements based on their HTML and CSS identifiers: links (all anchor tags with the href attribute); title (all page titles based on the presence of a <title> tag); and textual content found within the div element of the class “entry-content.” The RePort_Bot outputs this information as a JSON file for further processing.

The JSON file stores the information from the ePortfolio as a series of keys with colon-separated values according to the following data model:

{“content”: “Welcome to my website!”,
“links” : [“http://www.example.com”],
“title” : “Example”}
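As a rough illustration of this extraction step, the following Python 3 sketch fills the same data model from raw HTML using only the standard library. The RePort_Bot itself uses Scrapy; the class name PageExtractor and the sample HTML below are hypothetical stand-ins for a scraped ePortfolio page.

```python
import json
from html.parser import HTMLParser

class PageExtractor(HTMLParser):
    """Fill the RePort_Bot data model (content, links, title) from HTML."""

    def __init__(self):
        super().__init__()
        self.record = {"content": "", "links": [], "title": ""}
        self._in_title = False
        self._content_depth = 0  # > 0 while inside div.entry-content

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        if tag == "a" and attrs.get("href"):
            self.record["links"].append(attrs["href"])
        if self._content_depth:
            self._content_depth += 1  # track nesting inside the target div
        elif tag == "div" and "entry-content" in (attrs.get("class") or ""):
            self._content_depth = 1

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False
        if self._content_depth:
            self._content_depth -= 1

    def handle_data(self, data):
        if self._in_title:
            self.record["title"] += data
        elif self._content_depth:
            self.record["content"] += data

# Hypothetical page standing in for a scraped ePortfolio URL.
html = ('<html><head><title>Example</title></head><body>'
        '<div class="entry-content">Welcome to my website! '
        '<a href="http://www.example.com">a link</a></div></body></html>')

parser = PageExtractor()
parser.feed(html)
print(json.dumps(parser.record))
```

In a real deployment, Scrapy's selectors would do this work against live URLs; the point here is only the shape of the JSON record.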

Text Normalization

The quantitative analysis the RePort_Bot conducts relies on counting the words used in the ePortfolio. We must tally what is present and use these counts as a jumping-off point for further analysis. In order to take meaningful counts, we must “normalize” the natural language text extracted from the ePortfolio page, which, in this case, is all of the textual data stored as “content” in the JSON file. This “norming” session smoothes subtle but less significant variations in the textual data so that we can take proper word counts. For example, in the ePortfolio of a writing and rhetoric professor, the word “students” may occur several times. For a field concerned with pedagogy, we might assume that the word “students” offers significant indices with which to analyze the rhetorical content of the portfolio. The rate of occurrence for the word “students” may suggest the content priorities of this hypothetical portfolio. Further, this rate of occurrence may incline us to count all the occurrences of the word “students.” The complicating factor here is that the word “students” may occur in both upper- and lower-case spellings. While “Students” and “students” may convey the same semantic meaning to readers, they will be counted as separate words by the computer because of the difference in case. For this reason, one text normalization step would be to convert all words scraped from the ePortfolio into lowercase so that all instances of “Students” and “students” will be counted together.

The RePort_Bot applies the following text normalization processes to reduce the signal noise that the natural lexical variability of written language supplies:

  • HTML tag removal – removes HTML tags that persist in the web scraping procedure using the Python package bleach (see https://github.com/mozilla/bleach)
  • Lowercase – converts all string data into lower case
  • String tokenization – splits natural language text strings into individual word units called “tokens.” These tokens are stored in a Python list.
  • Stopword removal – deletes what are often considered function words such as articles and prepositions and pedestrian constructions involving helping verbs and verbs of existence (the stopword list used in the RePort_Bot is present in the code under the variable ‘stopwords’).
  • Lemmatization – reduces words to their root dictionary representations; the most significant application of lemmatization is the conversion of plural words to their singular roots (e.g., “wolves” to “wolf” or “eggs” to “egg”).
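The pipeline above can be sketched with the standard library alone. The actual script uses bleach for HTML removal and an NLTK lemmatizer; the regex tag-stripper, abbreviated stopword list, and plural-only lemmatizer below are simplified stand-ins:

```python
import re

STOPWORDS = {"the", "a", "an", "is", "are", "was", "were", "to", "of",
             "and", "in", "my", "i", "have"}  # abbreviated stand-in list

def lemmatize(token):
    # Plural-only stand-in for NLTK's WordNetLemmatizer.
    if token.endswith("ves"):
        return token[:-3] + "f"        # "wolves" -> "wolf"
    if token.endswith("s") and not token.endswith("ss"):
        return token[:-1]              # "eggs" -> "egg"
    return token

def normalize(html_text):
    text = re.sub(r"<[^>]+>", " ", html_text)           # 1. strip HTML tags
    text = text.lower()                                 # 2. lowercase
    tokens = re.findall(r"[a-z']+", text)               # 3. tokenize
    tokens = [t for t in tokens if t not in STOPWORDS]  # 4. remove stopwords
    return [lemmatize(t) for t in tokens]               # 5. lemmatize

print(normalize("<p>The Students watched the wolves and their eggs.</p>"))
# -> ['student', 'watched', 'wolf', 'their', 'egg']
```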

The processing steps above are sourced from natural language processing and information retrieval. However, as Beveridge (2015) argues in his description of “data scrubbing”:

There is no universal or always-correct methodology for how data janitorial work should proceed. Data scrubbing is always a relative triangulation among a particular dataset, a project’s goals, and the analyses and visualizations that a project eventually produces.

I would further argue that each step in the text normalization protocol represents an analytical intervention that will greatly influence the final form the text assumes; these steps are not innocent by virtue of their conventionality. For example, among text processing methods, an analyst could choose to either lemmatize words (as I have) or stem words. Stemming reduces words to their most basic alphabetic root. Depending on the stemmer, the word token “wolves” would be abbreviated to “wolv.” This reduces variability, but it does not return a real word. The RePort_Bot returns real words in an attempt to strike a balance between reducing variability and retaining the integrity of the original text.

Modeling

The RePort_Bot builds models of ePortfolio pages by ranking word frequencies and dividing words into their synsemantic and autosemantic categories. Recall from the discussion of Text Normalization that the RePort_Bot eliminates what would likely constitute synsemantic function words as part of its preprocessing step. Consequently, we are not sorting between function words and content words. We are sorting only content words (i.e., semantically meaningful words) into their fast and slow types. We can then use the relatively fast-moving content words to achieve a greater sense of the key topics and actions that are holding the text together and the relatively slow-moving content words to perhaps see where the writer of the ePortfolio wishes his/her readers to linger. The RePort_Bot diagnoses the h point profiles for each page in the ePortfolio and for all page content combined.
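The modeling step can be sketched by combining word frequency ranking with the h point: words at or above the h point form the “fast” (synsemantic, in this article’s loose sense) vocabulary, and words below it form the “slow” (autosemantic) vocabulary. The function name velocity_profile and the sample tokens are illustrative, not the RePort_Bot’s own:

```python
from collections import Counter

def velocity_profile(tokens):
    """Rank word frequencies and split the vocabulary at the h point."""
    ranked = Counter(tokens).most_common()  # [(word, freq), ...], rank 1 first
    h = 0
    for rank, (_, freq) in enumerate(ranked, start=1):
        if freq >= rank:
            h = rank
        else:
            break
    fast = [word for word, _ in ranked[:h]]  # at or above the h point
    slow = [word for word, _ in ranked[h:]]  # below the h point
    return h, fast, slow

# Hypothetical normalized tokens from an ePortfolio page.
tokens = ["eportfolio", "eportfolio", "eportfolio",
          "reflection", "reflection", "student", "student", "velocity"]
h, fast, slow = velocity_profile(tokens)
print(h)  # prints 2
```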

Because the RePort_Bot requires customization to be used for specific ePortfolio sites, the next section offers a tutorial for gathering, installing, editing, and executing the RePort_Bot Python script.

RePort_Bot Tutorial

Introduction

Those already familiar with the Python language and with installing Python libraries/dependencies may visit https://github.com/rmomizo/RePort_Bot/tree/gh-pages for condensed instructions.

Below, you will find an illustrative walkthrough of installing the RePort_Bot script on your computer, installing its dependencies, customizing the RePort_Bot for your needs, and displaying the results for analysis. Note that there are multiple paths to using the RePort_Bot, but this walkthrough focuses on the most basic. For more generalized installations (e.g., installing Python on your machine), I refer readers to existing guides. Lastly, the figures depicting command line code and the results of executing that code were created using Mac’s Terminal program.

The RePort_Bot script does not possess a generalized user interface at the time of this writing. One might call it a functional proof of concept. The modules found within the RePort_Bot script proceed in a stepwise fashion. For example, the Scrapy module will generate a JSON file. The ePortfolio script will then read this JSON file and return analytical results. The virtue of this approach is that users can customize the script to inspect any ePortfolio (or any website). Indeed, as we shall see, some XPath selectors will need to be modified to match the HTML of a given ePortfolio site.

Tools/Materials

  • Command line tool (Command Prompt, PowerShell, or Terminal)
  • Plain Text Editor or Python Interpreter
  • Web browser

Procedure

1. The RePort_Bot script is written in Python. Recent Mac computers already have a working version of Python pre-installed. Windows users can download an executable installer here:

https://www.python.org/downloads/release/python-279/

2. The RePort_Bot script requires the following Python dependencies to run:

pip
virtualenv
Scrapy==1.1.0rc3
beautifulsoup4==4.3.2
bleach
lxml==3.4.1
nltk==2.0.4
numpy==1.8.0
pyOpenSSL==0.15.1
python-dateutil==2.2
pyzmq==14.3.1
requests==2.7.0
requests-oauthlib==0.5.0

3. Installing the above dependencies requires a command line tool or Python interpreter. For this tutorial, we will work with command line tools because nearly all computers arrive pre-packaged with command line software. On Windows, the tool is called Command Prompt (or PowerShell); on Mac, it is called Terminal.

4. To install the dependencies listed above, open your command line tool (see Figure 3).

Figure 3. Command line window (Terminal for Mac)


5. First, we install pip. The pip library is an automatic package manager that will collect and install resources on your computer. We will use the default package manager easy_install to download it. In your command line, enter the following code:

$ easy_install pip

6. A successful installation will resemble the following (see Figure 4).

Figure 4. Successful pip installation


7. Next, we need to install the virtualenv dependency using pip. A virtualenv will allow us to install further dependencies in an insulated directory. Ultimately, we will run the RePort_Bot script from this “virtual envelope” on your machine. To install the virtualenv package, type and then execute the following in your command line tool:

$ pip install virtualenv

8. The successful installation of the virtualenv will resemble the following (see Figure 5):

Figure 5. Successful virtualenv installation


9. We can now create a virtual envelope to insulate our work with the RePort_Bot script. We are creating a directory that has its own version of Python installed. The Python installed within your computer’s framework will not be touched. Using your command line tool, navigate to your Desktop. You can place this virtual envelope anywhere you wish, but for expediency, I am placing the virtual envelope for this tutorial on my Desktop.

The command will follow this basic sequence:

$ virtualenv [name_of_envelope]

I will be calling the virtual envelope for this tutorial venv. The code is:

$ virtualenv venv

10. Navigate inside venv via the command line:

$ cd venv

11. Activate the venv virtual environment by entering:

$ source bin/activate

You will see a change to the command line interface. The name of our virtual environment now leads the shell prompt (see Figure 6).

Figure 6. Active Python virtual envelope


Note: To deactivate your virtual environment, enter deactivate.

12. With the venv active, we can install the remaining dependencies using pip. For this tutorial, we will manually install each of the dependency packages listed above using the following syntax:

$ pip install [package_name]

For a concrete example:

$ pip install Scrapy==1.1.0rc3

Do this for each package listed above to ensure proper installation. You will see a range of feedback in your command line interface; this output indicates that pip is working to download and install the required Python libraries to the virtual environment (see Figure 7).

Figure 7. pip installation of Scrapy to virtual environment


Note: there are means to install a list of Python libraries using pip and an external .txt file. Handy instructions for this process can be found in this Stack Overflow thread: http://stackoverflow.com/questions/7225900/how-to-pip-install-packages-according-to-requirements-txt-from-a-local-directory.

13. With the packages listed above installed, the virtual envelope venv is ready to run the RePort_Bot script. Download or clone the entire RePort_Bot-gh-pages repository from Github here:

https://github.com/rmomizo/RePort_Bot/tree/gh-pages

14. Once downloaded, unzip the RePort_Bot-gh-pages repository in the venv directory we created for this walkthrough. The RePort_Bot-gh-pages repository contains several directories that are necessary for the operation of the RePort_Bot script. These directories and the files placed therein can be edited to alter the scope of the RePort_Bot analytic and the appearance of the results. For this walkthrough, I will focus only on editing and executing those files that will return the type of results featured in this article.

15. Locate the settings.py file in venv > RePort_Bot-gh-pages > RV > portfolio > portfolio.

16. Open this file in your plain text editor of choice or Python Interpreter. You should see the following Python code:


# Scrapy settings for portfolio project
# 
# For simplicity, this file contains only the most important settings by
# default. All the other settings are documented here:
#
#     http://doc.scrapy.org/en/latest/topics/settings.html

BOT_NAME = 'portfolio'

SPIDER_MODULES = ['portfolio.spiders']
NEWSPIDER_MODULE = 'portfolio.spiders'

# Crawl responsibly by identifying yourself (and your website) on the user-agent
USER_AGENT = 'portfolio (+http://www.ryan-omizo.com)'

17. For this step, you will replace the USER_AGENT variable with the name of your website (if you have one). This will identify your bot to the ePortfolio sites that you wish to scrape and analyze. Replace the current URL with your own website’s URL, leaving the + in place. If you do not have a personal website, you may skip this step.

USER_AGENT = 'portfolio (+http://www.ryan-omizo.com)'

18. Save settings.py.

19. Locate the crawler.py file in venv > RePort_Bot-gh-pages > RV > portfolio > portfolio > spiders and open crawler.py in your plain text editor. You should see the following Python code:


import scrapy
from scrapy.spiders import CrawlSpider, Rule
from scrapy.linkextractors import LinkExtractor
from portfolio.items import PortfolioItem
from scrapy.selector import HtmlXPathSelector
from scrapy.contrib.spiders import CrawlSpider, Rule
import bleach

class PortfolioSpider(scrapy.Spider):
    name = "portfolio"
    allowed_domains = ["ryan-omizo.com"]

    def start_requests(self):
        yield scrapy.Request('http://ryan-omizo.com/', self.parse)
        yield scrapy.Request('http://ryan-omizo.com/cv-page/', self.parse)
        yield scrapy.Request('http://ryan-omizo.com/research-page/', self.parse)
        yield scrapy.Request('http://ryan-omizo.com/teaching-page/', self.parse)
        yield scrapy.Request('http://ryan-omizo.com/experiments-blog-page/', self.parse)

    def parse(self, response):
        item = PortfolioItem()
        item['start_url'] = response.request.url
        item['title'] = response.xpath('//title/text()').extract()
        item['content'] = response.xpath('//div[@class="entry-content"]').extract()
        item['links'] = response.xpath('//a/@href').extract()
        yield item

20. The above Python code imports the required dependencies to run crawler.py. Notice the URL information in the def start_requests(self) function:


yield scrapy.Request('http://ryan-omizo.com/', self.parse)
yield scrapy.Request('http://ryan-omizo.com/cv-page/', self.parse)
yield scrapy.Request('http://ryan-omizo.com/research-page/', self.parse)
yield scrapy.Request('http://ryan-omizo.com/teaching-page/', self.parse)
yield scrapy.Request('http://ryan-omizo.com/experiments-blog-page/', self.parse)

These URLs point to different pages in my ePortfolio hosted at http://ryan-omizo.com. To apply the RePort_Bot script to a different ePortfolio, replace the URLs in crawler.py with those matching the targeted ePortfolio.

21. Save crawler.py with your plain text editor or Python interpreter.

22. The next edit to make in crawler.py is to the def parse(self, response) function. This function parses the HTML elements of each page. For ryan-omizo.com, the div class entry-content contains the primary page content for all pages. For best results, you should target the div that contains most of the text in the ePortfolio. You can find this div by using the inspector tools in browsers such as Firefox or Chrome, or by viewing the page source in the browser.

def parse(self, response):

        item = PortfolioItem()
        item['start_url'] = response.request.url
        item['title'] = response.xpath('//title/text()').extract()
        item['content'] = response.xpath('//div[@class="entry-content"]').extract()
        item['links'] = response.xpath('//a/@href').extract()

        yield item

To target the div id or class specific to an ePortfolio, replace the XPath selector associated with the item['content'] variable:

item['content'] = response.xpath('//div[@class="entry-content"]').extract()

Note: HTML selectors can vary greatly. It may be necessary to target an id rather than a class, or a default HTML element such as <body>. For a reference on using selectors with Scrapy, see https://doc.scrapy.org/en/latest/topics/selectors.html.
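To see what the different selector choices target, here is a standard library sketch. The markup is invented, and ElementTree’s find supports only a subset of the XPath that Scrapy’s response.xpath accepts, so this is an approximation rather than the crawler’s own behavior.

```python
import xml.etree.ElementTree as ET

# Invented, well-formed page markup with content in three places.
html = (
    '<html><body>'
    '<div class="entry-content"><p>Teaching philosophy.</p></div>'
    '<div id="main"><p>Research notes.</p></div>'
    '</body></html>'
)
root = ET.fromstring(html)

# Target a class attribute, as the stock crawler.py does:
by_class = root.find(".//div[@class='entry-content']")
# Target an id attribute instead:
by_id = root.find(".//div[@id='main']")
# Fall back to the whole <body> element:
body = root.find(".//body")

print(by_class.find("p").text)  # prints "Teaching philosophy."
print(by_id.find("p").text)     # prints "Research notes."
```

Whichever selector you settle on, the goal is the same: capture the element that wraps the bulk of the ePortfolio’s prose.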

23. Save crawler.py.

24. With the RePort_Bot script customized, we can now execute it from within our virtual envelope. Using your command line tool, navigate into the spiders directory. Because you should currently be in the venv virtual envelope, you can use the following code:

$ cd RV/portfolio/portfolio/spiders

25. Run the scrapy spider by entering the following code through the command line interface:

$ scrapy crawl portfolio -o items.json

26. This command will generate an items.json file in your spiders directory. The JSON file contains all of the HTML content scraped by Scrapy. You can consider this the “raw” data for the RePort_Bot analytic.
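As a sketch of what this raw data looks like, the hypothetical record below mirrors the fields defined in crawler.py’s parse() method (start_url, title, content, links). The regex tag-stripping is only a dependency-free stand-in for the bleach/BeautifulSoup cleaning the RePort_Bot performs.

```python
import json
import re

# One invented Scrapy item, shaped like the output of parse().
raw = ('[{"start_url": "http://example.com/", '
       '"title": ["Home"], '
       '"content": ["<div class=\\"entry-content\\"><p>Hello world</p></div>"], '
       '"links": ["/cv-page/"]}]')

items = json.loads(raw)  # items.json holds a list of such records
for item in items:
    # Crude tag stripping: replace every HTML tag with a space.
    text = re.sub(r"<[^>]+>", " ", " ".join(item["content"]))
    tokens = text.split()

print(tokens)  # prints ['Hello', 'world']
```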

27. Activate the Python interpreter in your command line interface by entering the following code:

$ python

Note: if you are using a Python interpreter such as PyCharm or Anaconda, you may skip to step 28.

28. With Python active, we can import the Python file that will apply the RePort_Bot analytic to the scraped content found in items.json by entering the following commands:

>>> import ePortfolio
>>> from ePortfolio import *
>>> make_report('items.json')

29. The above code will analyze the items.json content and generate an HTML file called report.html, which contains the results of the analysis. You can open this file in your browser with CSS styles and jQuery interactivity already applied.

Note: The CSS and jQuery script for report.html can be found in venv > RePort_Bot-gh-pages > portfolio > portfolio > spiders as report.css and jquery.tipsy.js, respectively. You can edit these files to customize the appearance and interactivity of the RePort_Bot results.

30. See sample results here:

http://rmomizo.github.io/RePort_Bot/report

For the interpretation of the above results, see the Output/Analysis section below.

Output/Analysis

In this section, I illustrate the RePort_Bot output via sample runs on my own ePortfolio site: ryan-omizo.com. The site is divided into five main pages: a landing page, CV, Teaching, Research, and an Experiments blogroll. I elaborate on the data visualizations through close, reflective analysis of these results. Summative statements about the applicability of the RePort_Bot can be found in the Use Cases/Conclusions section below. The actual report generated by the RePort_Bot on ryan-omizo.com can be accessed here: http://rmomizo.github.io/RePort_Bot/report. Readers can interact with the charts as an accompaniment to the following analysis.

I have chosen to deploy the RePort_Bot on my own professional portfolio because this type of self-reflective work is a primary entailment of composing ePortfolios and fundamental to ePortfolio pedagogy. While writing within a classroom is a social action that enrolls participants at all stages of the process, students must also refine their ability to deliberate and respond as individual actors, which includes making decisions about their own texts after self-review or adopting the persona of an expert when reviewing others. Abrami et al. (2009) refer to this concern as “self-regulation” (5). Yancey’s (1998) concept of “reflection-in-action” offers additional support to this position and is worth quoting at length:

Through reflection we can circle back, return to earlier notes, to earlier understandings and observations, to re-think them from time present (as opposed to time past), to think how things will look to time future. Reflection asks that we explain to others, as I try to do here, so that in explaining to others, we explain to ourselves. We begin to re-understand.

Reflection-in-action is thus recursive and generative. It’s not either a process/or a product, but both processes and products. (24)

Consequently, by applying the RePort_Bot script to my own ePortfolio, I am conducting the self-regulating or reflection-in-action exercises that we ask students to conduct as they work to populate and make sensible their own ePortfolios. And this self-regulation or reflection-in-action is predicated on the RePort_Bot remaking my own content so that I can approach it from an unfamiliar perspective.

The RePort_Bot outputs an HTML page featuring treemap visualizations that graph the h point distributions of word tokens for the entire portfolio and for individual pages.[1] The synsemantic, h point, and autosemantic terms are color coded so that users can discern the breakpoint between “fast” and “slow” terms in the treemap (see Figure 8). Hovering reveals a tooltip with term and rank-frequency information. Figures 9–12 display data culled from ryan-omizo.com.

Figure 8. Treemap Color Codes


Figure 9. Treemap of All ePortfolio Pages, with 'first' as the h-point (13, 13) and 'author' with the highest rank-frequency (1, 78)

Taking ryan-omizo.com as a test case for the RePort_Bot, we can begin reconciling results with the rhetorical aims of the website/ePortfolio. The function of ryan-omizo.com is to establish my web presence in academia, specifically in rhetoric and composition, professional writing, and the digital humanities. The primary means of grounding my professional academic presence are descriptions of myself, descriptions of page content, and blog posts that present examples of computational work, a lengthy portion of which supplements an off-site publication (see Omizo and Hart-Davidson 2016 for further context). All of this would be considered the why and what of this professional, academe-anchored ePortfolio. The RePort_Bot findings focus on how I am implementing this strategy and crafting an ethos.

The first treemap visualization (Figure 9) accounts for the total term frequencies and ranks in the entire ePortfolio. The h point (‘first’, 13, 13) seems rather innocuous as a separator of synsemantic and autosemantic terms in the ePortfolio’s vocabulary. Terms such as (“writer”, 14) and (“rhetoric”, 17) that fall in the “fast” synsemantic category parallel the avowed ethos of a rhetoric and composition academic. However, the “fast” synsemantic region of the rank-frequency distribution also contains more idiosyncratic entries. Terms such as (“author”, 1, 78) and (“sentence”, 2, 41) seem to relate to the discipline of rhetoric and composition, but their high rank-frequency suggests that something more is occurring. Indeed, these terms appear with such high rank-frequency because they are part of an extended blog post that contains a supplement to a print article. Of course, determining the provenance of a term’s rank-frequency does not by itself convey higher order rhetorical information, especially to the writer of the ePortfolio under inspection. The information in this treemap does suggest that the primary content of ryan-omizo.com (defined by length and vocabulary richness) is skewed toward a single blog post in the Experiments section. The insight here implicates the arrangement of the ePortfolio as opposed to the topics considered.

Figure 10. Treemap visualization of landing page, with 'writing' as the h-point (2,2) and 'rhetoric' and 'university' as the top rank-frequency (1,3)

Figure 11. Treemap visualization of Research page, with 'graph' as the h-point (2,2) and 'human,' 'computational,' 'how,' and 'research' as the top rank-frequency (1,3)

The second (Figure 10) and third (Figure 11) treemaps also seem to return the expected results. The brevity of both pages translates into scant distinctions between synsemantic and autosemantic terms. Moreover, the landing page features “fast” terms such as “rhetoric,” “composition,” “writing,” and “university,” all of which reiterate institutional affiliation and academic training. However, it is the fourth treemap, which visualizes the Teaching page (Figure 12), that returns surprising results. The highest ranked term on that page is “must,” suggesting that its content emphasizes prescription: the “oughts” of what teachers and students should do to make learning happen, or a univocal stance toward pedagogy. As a point of personal reflection, this is not the pedagogical stance that I wish to communicate in my ePortfolio, so this result surprised me.

Figure 12. Treemap visualization of Teaching page, with 'writer' as the h-point (3,3) and 'must' as the top rank-frequency (1,5)

In all, the rhetorical velocity information, framed as the reuse of terms above and below the in-page h point of ryan-omizo.com, has suggested the following: (1) the linguistic content of the ePortfolio is dominated by a single post in the Experiments blog, which functions as an online resource for readers of the Journal of Writing Research; (2) the landing, Research, and CV pages return results typical of the genre in that the terms circulating on those pages are discipline-specific and seem to enunciate my positionality within the field of rhetoric and composition; (3) the Teaching page suggests the use of a prescriptive vocabulary that runs counter to my actual teaching philosophy.

Use Cases/Conclusions

In this section, I offer four use cases for the RePort_Bot and make an extended point about how the use of the RePort_Bot can inform pedagogy and classroom practices. To make this latter point, I include a possible ePortfolio assignment that leverages the theoretical preoccupations of the RePort_Bot.

The first use case builds upon the Output/Analysis section above and poses the RePort_Bot as an instrument that fosters self-reflection/self-regulation. This use case is based on the assumption that writers do not always have a global perspective on their own writing (hence the need for peer and instructor feedback). Though blunt, the RePort_Bot provides an alternative perspective on the ePortfolio. This perspective does not offer definitive answers, but obliges ePortfolio writers to revisit their writing and attempt to integrate the results with their composing strategies. This act represents what Graves (1992) describes as a “nudge.” This nudge can encourage the writer to re-read his/her writing and learn to make individual course corrections.

The second use case for the RePort_Bot involves making the ePortfolio a site for increased functional literacy (Selber 2004). The steps required to update the RePort_Bot files to make it viable for an individual’s ePortfolio can serve as a brief introduction to Python. More germane to an ePortfolio course, however, is the reconceptualization of the ePortfolio that is required before processing. Editing the RePort_Bot obliges the user to think of his/her ePortfolio not simply as a holistic showcase of exemplary work, curation, and reflection but as live web assets, accessible by human and non-human agents. Students would learn that their digital artifacts operate within an ecosystem of web crawlers and bots that are reading their ePortfolios and already repurposing their content. The RePort_Bot’s Python configuration can illuminate what this process involves. In this way, the RePort_Bot is similar in purpose to Ridolfo’s (2006) (C)omprehensive (O)nline (D)ocument (E)valuation heuristic. The CODE guide asks students to evaluate online sources in the context of the sources’ digital fingerprint (e.g., domain name, IP address, ISP, geographic location, and version history), much of which is neglected when undergraduates select materials for research. Revising the RePort_Bot for the needs of a class or individual user can perform the same type of unpacking, making students aware of their own digital fingerprint.

The third use case involves using the RePort_Bot as a research tool for a competitive review of like-minded ePortfolio writers. In a competitive review session, a writer can use the RePort_Bot to compare his/her site with a peer’s ePortfolio in order to determine how his/her ePortfolio aligns with others in the field according to the use of “fast” and “slow” terms. A student who is majoring in public relations and is designing a professional ePortfolio may look to the sites of industry experts and see how those experts are managing their rhetoric with the use of stabilizing and recurrent synsemantic terms and stickier, more idiosyncratic autosemantic terms. Put another way, this application of the RePort_Bot can be instructive in teaching practitioners how to mine ePortfolios for genre cues at the level of situation and syntax (Campbell and Jamieson 1978, 19; Jamieson and Campbell 1982, 146). Designers of ePortfolios can use the RePort_Bot to discover the linguistic elements that recur within known disciplinary examples in order to determine the types of elements that are characteristic of their fields.

The fourth use case construes the RePort_Bot as a research tool for a broader genre analysis of ePortfolios. This genre research coheres with Miller’s (1984) definition of genre as goal-driven action that renders communication socially intelligible. Miller (1984) makes it clear that genre exceeds style and formal rules, indicating an “acting together” without necessary closure. While the RePort_Bot operates as a function of the presence or absence of terms in an ePortfolio, the heuristic nature of its output does engage in the type of open inquiry for which Miller advocates. The conventions and taxonomies read from the treemap visualizations are meant to complement human judgment. Moreover, the treemaps oblige a bottom-up interpretation, beginning with the frequency of word units that lay the foundation for higher-order rhetorical analysis. The empirical results of the RePort_Bot can lead to the establishment of more general rules and tendencies, but these rules and tendencies are generated by the constitution of textual inputs as opposed to imposing macro-classification schemes onto texts. In the arena of computation, the results generated by the RePort_Bot might be considered unsupervised, meaning that human coding of content has not been used to “teach” the algorithm how to label data. Consequently, a sub-use case could be made for using the processing and labeling (“fast” vs. “slow”) as a means to annotate data for other computational tasks like supervised machine learning or classification tasks.
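This sub-use case can be sketched as follows; the helper name, threshold handling, and sample tokens are all hypothetical rather than the RePort_Bot's own API.

```python
from collections import Counter

def label_terms(tokens, h_rank):
    """Label each distinct term 'fast' if its position in the descending
    rank-frequency list is at or above the h point rank, else 'slow'."""
    ranked = Counter(tokens).most_common()
    return [(word, "fast" if rank <= h_rank else "slow")
            for rank, (word, _freq) in enumerate(ranked, start=1)]

# With a hypothetical h point at rank 2, the two most frequent
# terms come out "fast" and the rest "slow".
labels = label_terms(["rhetoric"] * 4 + ["writing"] * 3 + ["treemap"],
                     h_rank=2)
print(labels)
# prints [('rhetoric', 'fast'), ('writing', 'fast'), ('treemap', 'slow')]
```

Pairs like these could then seed a supervised classifier or serve as annotations for other computational tasks.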

These four use cases are presented as starting points for future elaborations. That said, communicating rhetorical concepts such as rhetorical velocity and bibliometric concepts such as the h point to students might still present challenges for instructors. To help students use the RePort_Bot in their cycles of composing, reflecting, and revising, I am including a worksheet predicated on reconciling the rhetorical goals of student users with the RePort_Bot results. The tasks, as you will see, are meant to position students within a feedback loop between their drafts, their RePort_Bot results, and their goals and expectations for their ePortfolio projects. Students are obliged to first isolate data from the ePortfolio as a whole and from individual pages. They then frame the entire ePortfolio and each page in terms of their motivations. After scraping and analyzing their ePortfolio content with the RePort_Bot script, they are asked to recursively compare the measurements of the RePort_Bot with their own perceptions and then explain areas of coherence or disjunction.

Notes

[1] The treemap visualization used is an adaptation of Bostock’s treemap example: https://bl.ocks.org/mbostock/4063582. The RePort_Bot’s visualization is greatly indebted to this published code sample.

RePort_Bot Student Worksheet

Name: __________

Date: ___________

Using this sheet, you will record the goals for your ePortfolio as a whole and for each page or content area. Additionally, you will describe the tone you wish to convey through the written content of the ePortfolio. You will list at least 3 key words that communicate the primary topics of your ePortfolio as a whole and 3 key words for each page or content area.

Once you have completed the sheet, run the RePort_Bot script on your ePortfolio and compare your reading of the results to your initial goals, tone descriptions, and key words.

ePortfolio URL: ____________________

Please describe the primary goals of your ePortfolio in the space below. Who is the audience of your ePortfolio? What do you wish to communicate to this audience? How are you communicating with this audience?

Please describe the tone you wish to convey in your ePortfolio and how you believe you are establishing this tone in the space below.

Please list 3 key words that sum up the main topics of your ePortfolio:

1.
2.
3.

After running the RePort_Bot, please compare the interpretation of your results to the goals description above. Is the RePort_Bot classifying the terms you wish to emphasize as “fast” or “slow” in rhetorical velocity? Is the RePort_Bot confirming what you expected to see? Why or why not? What do you think is influencing the RePort_Bot’s judgment?

ePortfolio page title: ______________

URL: ____________________

Please describe the primary goals of the listed ePortfolio page in the space below. What is the function of this page? How does the content of this page relate to the goals of the whole ePortfolio?

Please describe the tone you wish to convey in this ePortfolio page and how you believe you are establishing this tone in the space below.

Please list 3 key words that sum up the main topics of this ePortfolio page:

1.
2.
3.

After running the RePort_Bot, please compare the interpretation of your results to the goals description above. Is the RePort_Bot classifying the terms you wish to emphasize as “fast” or “slow” in rhetorical velocity? Is the RePort_Bot confirming what you expected to see? Why or why not? What do you think is influencing the RePort_Bot’s judgment?

Bibliography

Abrami, Philip, Anne Wade, Vanitha Pillay, Ofra Aslan, Eva Bures, and Caitlin Bentley. 2009. “Encouraging Self-Regulated Learning through Electronic Portfolios.” Canadian Journal of Learning and Technology/La revue canadienne de l’apprentissage et de la technologie 34, no. 3. https://www.cjlt.ca/index.php/cjlt/article/view/26414/19596

Ball, Philip. 2005. “Index Aims for Fair Ranking of Scientists.” Nature 436, no. 7053: 900. doi:10.1038/436900a

Barrett, Helen. 2000. “Electronic Teaching Portfolios: Multimedia Skills + Portfolio Development = Powerful Professional Development.” ERIC Database (IR020170). http://eric.ed.gov/?id=ED444514

Beveridge, Aaron. 2016. “Looking in the Dustbin: Data Janitorial Work, Statistical Reasoning, and Information Rhetorics.” Computers and Composition Online Fall 2015–Spring 2016. http://cconlinejournal.org/fall15/beveridge/

Bourelle, Andrew, Tiffany Bourelle, and Natasha Jones. 2015. “Multimodality in the Technical Communication Classroom: Viewing Classical Rhetoric Through a 21st Century Lens.” Technical Communication Quarterly 24, no. 4: 306–327. http://dx.doi.org/10.1080/10572252.2015.1078847.

Campbell, Karlyn Kohrs, and Kathleen Hall Jamieson. 1978. “Form and Genre in Rhetorical Criticism: An Introduction.” In Form and Genre: Shaping Rhetorical Action, edited by Karlyn Kohrs Campbell and Kathleen Hall Jamieson, 9–32. Falls Church: Speech Communication Association. ERIC Database (CS502046). http://eric.ed.gov/?id=ED151893

Graves, Donald. 1992. “Help Students Learn to Read Their Own Portfolios.” In Portfolio Portraits, edited by Donald H. Graves and Bonnie S. Sunstein, 85–95. Portsmouth: Heinemann.

Hirsch, Jorge E. 2005. “An Index to Quantify an Individual’s Scientific Research Output.” Proceedings of the National Academy of Sciences of the United States of America 102, no. 46: 16569–16572. doi: 10.1073/pnas.0507655102.

———. 2007. “Does the h index have predictive power?” Proceedings of the National Academy of Sciences 104, no. 49: 19193–19198. doi: 10.1073/pnas.0707962104

Jamieson, Kathleen Hall, and Karlyn Kohrs Campbell. 1982. “Rhetorical hybrids: Fusions of generic elements.” Quarterly Journal of Speech 68, no. 2: 146–157. http://dx.doi.org/10.1080/00335638209383600.

Jockers, Matthew L. 2013. Macroanalysis: Digital Methods and Literary History. Champaign: University of Illinois Press.

Kilbane, Clare R., and Natalie B. Milman. 2003. The Digital Teaching Portfolio Handbook: A how-to guide for educators. New York: Pearson College Division.

Kissane, Erin. 2011. The Elements of Content Strategy. New York: A Book Apart. eBook.

Klein, Lauren F. 2013. “The Social ePortfolio: Integrating Social Media and Models of Learning in Academic ePortfolios.” In ePortfolio Performance Support Systems: Constructing, Presenting, and Assessing Portfolios, edited by Katherine V. Wills and Richard Aaron Rice, 53–71. Fort Collins: WAC Clearinghouse. http://wac.colostate.edu/books/eportfolios/willsrice.pdf

Kopple, William J. Vande. 1985. “Some Exploratory Discourse on Metadiscourse.” College composition and communication: 82–93. doi: 10.2307/357609

Meloni, Julia. 2009. “Getting Started with GoogleDocs in the Classroom.” Chronicle of Higher Education: ProfHacker, Accessed April 12, 2016. http://www.chronicle.com/blogs/profhacker/getting-started-with-google-docs-in-the-classroom/22641.

Miller, Carolyn R. “Genre as Social Action.” Quarterly Journal of Speech 70 (1984): 151–167.

Omizo, Ryan and Hart-Davidson, William. 2016. “Finding Genre Signals in Academic Writing.” Journal of Writing Research 7, no. 3: 485–509. doi: 10.17239/jowr-2016.07.03.08

Popescu, Ioan-Ioviț, and Gabriel Altmann. 2009. Word frequency studies. Berlin: Mouton de Gruyter.

Ramsay, Stephen. 2011 Reading machines: Toward an algorithmic criticism. Champaign: University of Illinois Press.

———. 2003. “Special section: Reconceiving text analysis toward an algorithmic criticism.” Literary and Linguistic Computing 18, no. 2: 167–174. doi: 10.1093/llc/18.2.167

Reynolds, Nedra, and Elizabeth Davis. 2013. Portfolio teaching: A guide for instructors. New York: Macmillan Higher Education.

Rice, Rich. 2013. “The Hypermediated Teaching Philosophy ePortfolio Performance Support System.” in. ePortfolio performance support systems: Constructing, presenting, and assessing portfolios, edited by Katherine Willis and Richard Aaron Rice, 37–51. Fort Collins: WAC Clearinghouse. http://wac.colostate.edu/books/eportfolios/willsrice.pdf

Ridolfo, Jim. 2006. “(C).omprehensive (O).nline (D).ocument (E).valuation.” Kairos: A Journal of Rhetoric, Technology, and Pedagogy 10, no. 2. http://kairos.technorhetoric.net/10.2/praxis/ridolfo/

Ridolfo, Jim, and Dànielle Nicole DeVoss. 2009. “Composing for Recomposition: Rhetorical Velocity and Delivery.” Kairos: A Journal of Rhetoric, Technology, and Pedagogy 13, no. 2. http://kairos.technorhetoric.net/13.2/topoi/ridolfo_devoss/

Ridolfo, Jim, and Martine Courant Rife. 2011. “Rhetorical Velocity and Copyright: A Case Study on Strategies of Rhetorical Delivery.” in Copy(write), edited by Martine Courant Rife, Shaun Slattery, Danielle Nicole DeVoss, 223–243. Fort Collins: WAC Clearinghouse. eBook pdf.

Samuels, Lisa, and Jerome J. McGann. 1999. “Deformance and interpretation.” New Literary History 30, no. 1: 25–56. doi: 10.1353/nlh.1999.0010

“Scrapy | A Fast and Powerful Scraping and Web Crawling Framework.” Scrapy | A Fast and Powerful Scraping and Web Crawling Framework. Accessed April 12, 2016. http://scrapy.org/.

Selber, Stuart. 2004. Multiliteracies for a digital age. Carbondale: SIU Press.

Silver, Naomi. 2016. “Reflection in Digital Spaces Publication, Conversation, Collaboration.” in Rhetoric of Reflection, edited by Kathleen Blake Yancey, 166–200. Logan: Utah State University Press. http://www.jstor.org/stable/j.ctt1djmhfg

Yancey, Kathleen Blake. 2004. “Made not only in words: Composition in a new key.” College Composition and Communication 56, no. 2: 297–328.

———. 2013. “Postmodernism, Palimpsest, and Portfolios: Theoretical Issues in the Representation of Student Work.” in. ePortfolio performance support systems: Constructing, presenting, and assessing portfolios, edited by Katherine Willis and Richard Aaron Rice,15–36. Fort Collins: WAC Clearinghouse. http://wac.colostate.edu/books/eportfolios/willsrice.pdf

———. 2009. “Reflection and Electronic Portfolios.” in Electronic portfolios 2.0: Emergent research on implementation and impact, edited by Darren Cambridge, Barbara Cambridge, and Kathleen Blake Yancey, 5–15. Sterling: Stylus Publishing, LLC..

———. Reflection in the writing classroom. 1998. Logan: Utah State University Press.

Zipf, George Kingsley. 1949 Human behavior and the principle of least effort: An Introduction to Human Ecology. Cambridge: Addison-Wesley Press.

About the Author

Ryan Omizo is an Assistant Professor in the Department of Writing and Rhetoric at the University of Rhode Island. His research interests include computational rhetoric, Asian-American rhetoric, and the digital humanities. His work has appeared in The Journal of Writing Research and Enculturation: Journal of Rhetoric, Writing, and Culture. He is the co-editor of The Rhetoric of Participation: Interrogating Commonplace In and Beyond the Classroom, which is currently under review by Computers and Composition Digital Press.


Interview: Bret Eynon, Joseph Ugoretz, Laura M. Gambino

Editorial Note

This interview took place on July 7th, 2016 at LaGuardia Community College in Long Island City, NY. The transcript below has been edited for readability and length, and the 90-minute conversation has been edited down to a 66-minute recording. For those who want to skip to specific topics in the interview, we have provided section headers and time markers to allow for more efficient listening. However, we also wanted to give our readers and listeners the opportunity to experience both the recorded interview and the written transcript in their entirety.


Link to archived content on archive.org

Getting Started: Launching ePortfolio at CUNY

Dominique Zino:
I’d like you to start with a brief introduction to your institution and how you’ve used ePortfolio–your campus ePortfolio history.
Bret Eynon:

I’ve been at LaGuardia since 2000. In 2001, I helped Paul Arcario write the first grant that supported our ePortfolio work. When we got that grant, we first focused on pedagogy, influenced in part by the work I’d done with the New Media Classroom Project[1] and the Visible Knowledge Project[2]. In those projects we emphasized linking constructivist pedagogies with multimedia authoring as a way to help students take ownership of their learning and develop their identities as learners. As our ePortfolio work at LaGuardia grew, we started taking on assessment. We had done little around assessment for the first eight years, but around 2009, as our Middle States visit approached, Paul asked us to strengthen the assessment side of ePortfolio, so it grew in importance.

The development of our new First Year Seminar has in some ways shifted the balance back to pedagogy, and has added new emphasis on advisement. Each year 10,000 to 14,000 LaGuardia students are active on their portfolios. ePortfolios are used in some way in most credit programs, and to some extent in Adult and Continuing Ed.

In 2008 we held our first national conference. That same year we launched the first of six years of grant-funded work through our Making Connections National Resource Center. We worked with 70 to 80 colleges and universities nationwide around ePortfolio pedagogy and practice and built the Catalyst for Learning site (http://c2l.mcnrc.org/).

Joseph Ugoretz:

When I came to Macaulay in 2007, I found a really unique institutional structure. It’s a consortial program where the students are on eight different CUNY campuses. We don’t have our own faculty. We draw faculty from the different campuses. We don’t have our own curriculum except for four seminars. So it was an unstructured and flexible setup that required an unstructured and flexible ePortfolio solution. I came with the same orientation Bret mentioned from the Visible Knowledge Project; I was focused on pedagogy and on reflection for students, and especially on students taking ownership of their own learning and being able to design the representation of their learning. And that was a nice match for the Honors College. So we launched with all our incoming freshmen in 2008, 500 at once, and grew from there.

What has characterized ePortfolio at Macaulay is this flexibility, opening to many different pathways. It’s rhizomatic in a way, because we weren’t able to impose a structure or a plan. As a result, we’ve seen the flowering of things we didn’t expect and that may not even fit a strict definition of ePortfolio. “What is an ePortfolio?” is an interesting question that we ask all the time.

It’s been fun to see the way that students have taken hold of the concept in order to create different approaches: literary journals, travel blogs, curriculum plans, career placement portfolios, poetry collections, etc. The other exciting thing is the number of students who use the platform for group or collaborative work. There are many class ePortfolios, as well as small groups within a class, clubs, and student groups that put together shared portfolios based on their interests. Many voices can be included, with a consensus on overall design.

I’ll say one more thing: From the beginning, we felt that visual graphic design was critical, that students express their individuality that way, and that they make the portfolio itself a representation. A paper portfolio tends to be a black binder. But then we see someone putting stickers on it. They want it to be more than a container, to be a thing that has meaning in itself, as an object.

Laura Gambino:

I’m at Guttman Community College, which opened in 2012 as the New Community College. I joined the college two weeks before we opened. One of the great opportunities we had was to rethink the community college experience for urban students. We built on a lot of the great work that we saw at LaGuardia and Macaulay and elsewhere across CUNY. And one of the things we did was to implement ePortfolio at scale, right from the start. So on the second day of our Summer Bridge program for our first cohort of students, every student created an ePortfolio. Because we’re a little newer, we focused on all different aspects of ePortfolio at once. So we’re very focused on the pedagogy and having students create a learning portfolio that spans their entire academic experience from Bridge right through commencement, connecting their curricular, co-curricular, and experiential learning activities that take place out in the city. It’s also the primary vehicle for our assessment of student learning.

All students submit their portfolios at three milestones: the end of Bridge, the end of the first year, and the end of the second year. And all faculty engage in assessment where we look at not just the artifacts of student work in the portfolios, but also the reflections that go along with them. That’s the most engaging piece for faculty: the reflections. ePortfolio is also our course content delivery system. (We use Blackboard only in limited ways.) Most faculty maintain professional ePortfolios. Our student clubs and student government use it. But we also use it as an institutional repository: course teams share assignments and materials through ePortfolio. So, we’ve had that same experience where its use has blossomed in ways we didn’t foresee. But it’s safe to say ePortfolio is central to our learning culture, and that presents great opportunities and great challenges.

Joseph Ugoretz:
Laura makes a good point. People are hungry for ways to bring things together and publish and share them. When you give people something that allows them to do that, they seize it and run with it.
Laura Gambino:
In Bridge, students create a group ePortfolio. They’re doing a group research project, and they construct a portfolio together, synthesizing different components into one, collaborating and sharing it. And it really engages them. It doesn’t just introduce them to ePortfolio; it engages them and shows them what the Guttman learning experience will be like.

Pedagogy first, platform second

(11:50)

Bret Eynon:

I want to pick up on a couple of things. First, Laura and Joe’s emphasis on collaboration shows how our practice has changed over the past 10 to 15 years. The notion of a social pedagogy for ePortfolio has transformed our notion of portfolio practice. When we first encountered ePortfolio, it was private and individual. People were wary about going public. That has really changed. We now know that the more portfolios are used as a site of exchange, communication, conversation, and collaboration, the more powerful they are. I also want to comment on Joe’s point about visual expressiveness. I’ve long been a believer in the importance of visual design in ePortfolios, and have seen that as crucial for student ownership. But it’s now more difficult than it used to be.

Our first platform required all students to learn HTML authoring. They created visually stunning portfolios, but it took a huge amount of work to train them, and that was a barrier to broad adoption. So we moved to a platform that emphasized ease of use and promised the possibility of interaction–but that required templates and came, to some extent, at the cost of visual expressiveness.

There are complex factors around the platform, related to visual expressiveness, ease of use, interaction and organization for assessment. As a result, there is no such thing as the ideal platform.

Laura Gambino:
When we speak at conferences, the question we always get is: What platform should my campus use? As Bret was saying, there is no perfect platform, and there will never be a perfect platform. ePortfolio is so much more than that technology. But that’s what people get stuck on.
Joseph Ugoretz:
The question of which platform to use is sort of a “how long is a piece of string” question. The point is to start with the philosophical goals, and the pedagogical goals, and really think those through. And then see what kinds of technologies are going to best meet those needs. We’re all saying the same thing here, that the solution that’s best for one college is not best for another.
Bret Eynon:
I’ve seen an interesting tension as institutions take on portfolio. To understand your goals for ePortfolio is difficult without experience. I’m a deep Deweyan and I believe in experiential learning. But you can’t get the experience you need without a platform. So it’s a chicken and egg conundrum. I guess the solution is to jump in some place and be willing to adapt.
Joseph Ugoretz:
One of the things I admire about LaGuardia is that you started with a model and weren’t afraid to radically alter the approach as you learned. And I think that’s really critical. We don’t have to commit everything to one approach or one platform and then stick with it. But we do have to have something to work on. And then we can learn from it and see what the next best step will be.

What’s unique about ePortfolio as an educational technology?

(18:18)

Dominique Zino:
What exactly have you seen change in terms of the technology? What have been the most beneficial changes? Do you see other changes on the horizon in terms of platforms and technology?
Laura Gambino:

ePortfolio technology is different from other educational technologies or learning management systems because of its ability to span courses, semesters, and co-curricular spaces. It can bring together the entire student learning experience. Most educational technology is course-centric. But a student can create an ePortfolio and carry it with them across all of their learning experiences, and see the way those experiences fit together and integrate. And they can reflect on the integration of all of those different pieces. As far as I know, no other software or educational technology is designed to do that. That, to me, is what makes it so unique and so valuable to the student learning experience.
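Gambino’s point about spanning courses and semesters can be made concrete with a small sketch. This is an illustrative data model only, written for this interview; the class names, course codes, and fields are hypothetical, not the schema of any actual ePortfolio platform.

```python
from dataclasses import dataclass, field

@dataclass
class Artifact:
    title: str
    course: str      # artifacts keep their course of origin...
    semester: str    # ...but accumulate in a single portfolio

@dataclass
class EPortfolio:
    owner: str
    artifacts: list = field(default_factory=list)

    def add(self, artifact):
        self.artifacts.append(artifact)

    def courses_spanned(self):
        # Unlike a course shell in an LMS, the portfolio outlives
        # any single course or semester.
        return sorted({a.course for a in self.artifacts})

portfolio = EPortfolio("student_a")
portfolio.add(Artifact("Lab report", "BIO 101", "Fall 2016"))
portfolio.add(Artifact("Clinical reflection", "NUR 210", "Spring 2017"))
print(portfolio.courses_spanned())
```

The design point is simply that the portfolio, not the course, is the unit of persistence: artifacts from different courses and semesters live side by side and can be queried across the whole record.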

Joseph Ugoretz:
And I’ll double confirm that. It’s always infuriated me that our learning management systems think of learning as something that starts in September and ends in December, and then you can never see that course again ever. That’s not the way that people learn. It’s not the way people want to learn. Learning goes across courses, across time, and across institutions.
Bret Eynon:

Over the last fifteen years we’ve seen growing recognition, across higher education, that we must help students build their abilities to integrate their learning. Fifteen years ago, I don’t remember anybody talking about that. But now it’s a common discussion. We were just down at the US Department of Ed for this big national symposium on the future of higher education, and everyone in the room was talking about it: how do we help students connect their learning? How do we help them use what they’re learning? How does education add up to more than a set of discrete, isolated experiences? It’s a big change. That’s one reason why I think the moment for ePortfolios is arriving. To some extent, at our three schools we’ve been ahead of the curve on this. But for most educators, the need for the portfolio, the need for integration, is only now becoming clear. That puts us at a really interesting moment. It’s not to say that the ePortfolio will solve all problems–the challenge of integration is not just a matter of the software. It even goes beyond the pedagogy and practice of individual faculty members. Integrative education requires a different level of institutional integration and collaboration, a common envisioning of the educational project.

We have a long way to go on that front. But integrative ePortfolio technology can help. And, as more educators recognize the need for integration, I think our integrative portfolio practice can grow increasingly sophisticated and powerful.

Laura Gambino:
One of the biggest technological changes is related to what Bret was talking about earlier, the rise of social pedagogy. The ability to comment on other folks’ portfolios, to have conversations in portfolios, and connect to other social media–we didn’t see that ten years ago. So that’s emerged and it’s great because it has helped facilitate a new and effective classroom pedagogy.
Joseph Ugoretz:

As the web has developed, we’ve seen a move to curation, pulling in lots of different sources, evaluating them and mashing them up. ePortfolios are a place where that can happen. That includes curating videos and images quickly and easily. Students want to be able to put something up right away and maybe reflect on it later. Portfolios are places where we can, through comments and interaction with peers and professors, push them to go beyond the first response to the deeper learning.

Laura Gambino:
Another facet that’s evolved is the assessment system. Think about assessment systems ten years ago. Now, in many platforms, you can easily capture snapshots of student portfolios, store them for as long as you want, distribute them to different faculty, score them with rubrics, and get the data easily and quickly–wow, I’m sounding like a real geek! That’s a huge advantage.
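The assessment workflow Gambino describes (capture snapshots, distribute them to faculty, score with rubrics, pull the data) can be sketched as a minimal data model. Everything below, including the rubric dimensions and the 1–4 scale, is a hypothetical illustration rather than any platform’s real schema.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class RubricScore:
    dimension: str   # e.g. "integration", "reflection"
    score: int       # e.g. on a 1-4 scale

@dataclass
class PortfolioSnapshot:
    student: str
    milestone: str               # "bridge", "year_one", "year_two"
    scores: list = field(default_factory=list)

    def add_score(self, dimension, score):
        self.scores.append(RubricScore(dimension, score))

def dimension_average(snapshots, dimension):
    # Aggregate one rubric dimension across many scored snapshots.
    values = [s.score for snap in snapshots for s in snap.scores
              if s.dimension == dimension]
    return mean(values) if values else None

snap = PortfolioSnapshot("student_a", "bridge")
snap.add_score("reflection", 3)
snap.add_score("integration", 2)
print(dimension_average([snap], "reflection"))
```

The snapshot is the key idea: scoring a frozen copy of the portfolio at each milestone lets faculty assess a stable artifact while the student keeps revising the live portfolio.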
Joseph Ugoretz:
We’ve moved to a broader and deeper view of assessment. The portfolio platform we developed at BMCC focused on filling in boxes for specific tasks. You get the box filled in, boom, that’s a positive assessment. That’s no longer an interesting form of assessment for anyone–if it ever was.
Bret Eynon:

I’d mention two other technology developments. One is the intersection of badging and portfolios. Badging seeks to capitalize on the motivational, skill-building, level-conquering processes found in digital gaming. It can also create discrete pathways into the portfolio for external viewers. The badge lives in the portfolio, and provides a way to spotlight a particular part of the portfolio, a salient skillset, let’s say, for a particular audience. We’re now experimenting with this at LaGuardia. In the digital learning environment, students have the opportunity to “learn everywhere”: take a course on another campus, or online. The key is, don’t leave it disconnected. Connect it with a common repository of learning. I think that’s part of what the portfolio is going to offer, that place of intersection, where diverse learning experiences are connected. Badges and ePortfolios could be a part of facilitating and representing that integration.

The other thing to mention is the connection between portfolios and analytics. Clearly, learning analytics and algorithmic analysis of student data is upon us. But as of now, it’s happening at a pretty superficial level–how many clicks here and how many clicks there. That doesn’t actually tell us very much.

I see the potential for a more interesting form of analytics, linked to portfolios. Doing a machine reading of portfolios, and surfacing data patterns could support a more integrative understanding of analytics. That could be a very interesting area to explore in the future. But we have a ways to go on this.
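As a rough illustration of the “machine reading” Eynon imagines, here is a minimal sketch that counts markers of reflective language across milestone reflections, a signal richer than the click counts he criticizes. The reflection excerpts and the marker list are invented for illustration; a real system would pull texts from the platform and use a validated lexicon or a trained model rather than this toy word list.

```python
from collections import Counter

# Hypothetical reflection excerpts; a real system would pull these
# from the portfolio platform.
reflections = {
    "bridge": "I learned that I can revise my draft and improve it.",
    "year_one": "Looking back, I realized my process changed because I planned ahead.",
    "year_two": "I connected what I learned in biology to my clinical placement.",
}

# Crude markers of reflective/integrative language (an assumption
# made for this sketch, not a validated instrument).
MARKERS = {"learned", "realized", "connected", "changed", "looking", "because"}

def marker_counts(text):
    words = [w.strip(".,").lower() for w in text.split()]
    return Counter(w for w in words if w in MARKERS)

for milestone, text in reflections.items():
    print(milestone, dict(marker_counts(text)))
```

Even this crude pass surfaces a pattern over time (description in Bridge, causal and integrative language later) that raw activity metrics cannot show, which is the kind of portfolio-linked analytics Eynon points toward.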

Asking Students to Make Choices about their Digital Identities

(31:05)

Joseph Ugoretz:

My friend Patrick Masson says “I’ve been enrolled in this really fascinating MOOC for the past 20 years. It’s called the internet.” People don’t necessarily go to institutions to learn anymore. They don’t necessarily go to sources that the academy decides are credible. Many of our students go to YouTube as their first source of information. And often they learn more from YouTube than they do from college. So the important skill for students going forward is learning the right kinds of questions to ask, and how to evaluate answers; how to look for sources, and how to judge sources. That’s really critical. If I’m in a science fiction mind, the idea of intelligent agents assisting that process is both brilliantly utopian and a bit frightening.

I don’t want a Siri to give me the answers based on what I’ve always found before. Because the loss in such directed searching is the serendipity and discovery we used to get by browsing library shelves. For our students, the skill to learn is how to ask questions and how to judge sources.

The other skill is becoming aware that these are choices. That what Google gives you is not a transparent, unmotivated choice. Help them think about who’s controlling their information and how they can control their own information.

When we start with our incoming first-year students and introduce portfolios, we do an exercise about digital identity, because students come to us at 17, 18 years old, and they’re nervous about being out there. And portfolios can help them learn that the best way to have a positive digital identity is to make your own digital identity. You don’t keep yourself off the web. That can’t be done. But you make yourself searchable in ways that you approve of by putting yourself out there in ways that you approve of.

Laura Gambino:

We do the same thing, and we ask: What do you want that identity to look like? How might your academic identity be different from your Facebook identity or your Instagram identity? Your ePortfolio is the place where you get to design and construct and develop who you are as a learner and a scholar. And that can be a powerful thing for many of our students. And that’s a key part of the portfolio: they own it. It’s not institutionally owned. They’re the owners. It comes back to that idea of customization and ownership. The goal is helping students not just own the portfolio, but become owners and agents of their own learning process.

Bret Eynon:

We mentioned digital identity and learning identity. Both are increasingly salient needs for today’s students. For me, the idea of digital identity goes back to the early ’90s, wanting students to be more than consumers of knowledge, wanting them to be creators of knowledge; more than consumers of the web, creators of the web. That idea–linking constructivist learning and multimedia authoring—was part of what was behind the portfolio. Now we also understand that your identity as a learner is crucial to the learning process. We’ve seen a broad shift in our understanding of learning: first it was just content acquisition; then it was cognitive skills development; and now we increasingly understand that affective dimensions are involved as well. The whole person is engaged in the learning processes.

As students learn, they craft a new sense of self, engaged with the world, engaged with the knowledge of the world. We now realize that we must help them develop their capacity to do that throughout their lives. That’s increasingly clear as a goal. That’s another reason we may be on the cusp of an ePortfolio moment.

ePortfolio practice is well suited to address this need. ePortfolio practices can help students engage in this critical process: making choices about who I am, who I want to be, what I need to know, how to learn it. How to become an adaptive critical thinker and always coming back to that sense of self: the deeper question of who I am and how I take ownership of myself.

The scholar who helped me think about that is Marcia Baxter Magolda, who talks about purposeful self-authorship. All three of those words are important: purposeful, self, and authorship. That concept helped me think about what is going on at the most sophisticated level of portfolio practice.

Maybe it’s a LaGuardia thing, or CUNY community colleges, or community college more broadly; or maybe it’s CUNY, I don’t know. But when you work with first-generation students, you see how the process of self-authorship is crucial for their success. It makes a big difference for students. There is very solid evidence on this.

Our LaGuardia data–and data from other campuses–shows that the ePortfolio process has a significant positive impact on retention and student success. Other factors matter, too: pedagogy, peer mentoring, etc. But I think that the process of self-authorship is a powerful factor for our students.

Joseph Ugoretz:
It’s a good point. We’ve all had the experience of being at a national conference or at places where students have a different level of home education, experience with college, academic skills and privilege in the world. It’s sometimes hard for people on other campuses to understand the reality of CUNY and CUNY students.
Laura Gambino:
When we introduce ePortfolio to students, as I mentioned before we intentionally use that language: This is the space to construct and develop and create who you are as learners and scholars. And it’s powerful, because our students often don’t see themselves that way. The ePortfolio becomes the space for them to do that.

Specific examples of faculty and student work

(40:59)

Dominique Zino:
Can you talk about a case where you’ve seen this sense of self emerge? Maybe in a bridge program, or maybe in a capstone course, where a student has done something with portfolio that fulfills this vision?
Bret Eynon:

I’d point to our Graduation Plan. In our First Year Seminar, we use ePortfolio to introduce all students to a process of self-examination, self-assessment, and purposeful planning. It starts with simple steps–identifying and reflecting on your goals in life, your values, strengths and interests. Then we ask them to connect that to real choices, thinking about career, transfer, and ultimately, “What courses should I take next semester? How am I going to move forward, building my education in concordance with my goals and values?” The ePortfolio template gives faculty and peer mentors a set of prompts to use with students. It’s a practical exercise in purposeful self-authorship. Done well, this process is very powerful. And some faculty go further. In Natural Sciences, after doing the Graduation Plan, Preethi Radhakrishnan has her students create a digital story video. And the videos students create are incredible. In their own voices, students connect their education to their new sense of self. This opens the opportunity for recursive examination, as students revisit this process, looking at how they’re changing over time.

Joseph Ugoretz:

So while we were talking about this, I looked up one student’s ePortfolio, called “The Utopia of Daniel” (http://macaulay.cuny.edu/eportfolios/utopiaofdaniel/). So Daniel is the student. If you look across the top, Daniel has his categories for what is in his portfolio: Home, In The News, Cars, College, Gaming, Technology, TV and Movies, and Random. And so you notice, college is one tiny little piece of him and his learning. And then he’s got this rich list of subcategories for each one, and experiences that led him to internships. Daniel was interested in cars and went to Texas to visit the DeLorean factory. And boom, he got an internship with DeLorean, in part because he had been thinking about how his interest in cars matched his interest in filmmaking and digital production and computer science.

Looking at his ePortfolio you get a real picture of Daniel as a person. Lots of movie reviews, and technology reviews, like which external hard drive is the best for video editing. And he gets questions from people interested in buying technology.

Here’s one: “How to Upgrade Your Ram on Your Mac Book Pro: Super Simple and Dirt Cheap.” That’s not something that he learned in school, but it connects to everything he’s interested in learning and doing. The idea is to collect and integrate all these different pieces of student learning and then make that available to a wider audience, so that you can engage with that audience. That’s been a key advantage for our students.

Another example is a student who was assigned to write a review of a play without seeing it, just by reading other critics’ reviews. She did it, she posted it, and the playwright saw it and left a comment saying, “Wow, I’m really interested in your ideas. There will be two tickets waiting for you at the box office. Please come talk to me afterwards so we can discuss what you thought.” So that changes the learning–it’s very different from just learning in an insulated way. It makes the learning process permeable to the outside world. That’s very powerful for students.

Laura Gambino:

I’ll tell you two quick stories. First, I had a student a couple of years ago whose initial goal in Bridge–he was very, very clear–was to get his degree and become an auto mechanic. That’s just what he thought he should do, and what his family thought he should do. He took our First Year Seminar, our Ethnographies of Work course, and learned that he could be a successful student. And we got him to delve into his goals a little more, to do some career exploration. Meanwhile, he started to share his drawings in his ePortfolio. We started to see them, and then I saw him doing them in class too. And we started to talk about that. Students in his class, his advisor, and I encouraged him to talk to someone in the art field about what his possibilities could be.

Over the course of the year, as he was doing his journaling in his portfolio, and getting all this feedback, he did this research project in his portfolio about what art careers are. And he realized that art was something you could actually pursue as a career. He’s now finishing his Bachelor’s in Graphic Design.

And here’s a second story that’s a little different. Back when I was at Tunxis Community College in Connecticut, I taught an Intro to Programming class, a very tough class. One year I had a very quiet young woman who would sit in the back. She never really spoke in class. It seemed like she wasn’t sure if this was what she wanted to do. In their ePortfolios, I had them do reflective journaling on their design process; I had them use the portfolio to share with each other and comment on how they solved particular problems.

I remember looking at this student’s portfolio–she had changed the color; she put up all this comic artwork. It was one of the most beautiful portfolios I’ve ever seen. She’d done this all on her own.

So they all did their group portfolio review process. And I come in the next day, and there are all of these students sitting around her going, “Show us how to do that.” “How did you do that?” “That is like the coolest thing I have ever seen.” And you could see her mindset shift: I can do this, I can be one of them, I can hold my own, I can be a computer scientist. It was an amazing experience that helped craft her sense of self and her sense of purpose in this field. And I will never forget that moment walking in and seeing that, and then just watching her grow and move through the program all because of what she had done on her portfolio and the social interaction it supported.

And I guess there’s ways you can replicate that without the portfolio. But here, the portfolio was the vehicle that helped her craft her identity, because she had taken ownership and put that sense of herself in her portfolio. That helped her see how she could connect to the work she was doing, connect to a place in the field.

Key ePortfolio practices: making, sharing, integrating

(51:13)

Dominique Zino:
What pedagogy produced such rich experiences and products? If you had to name two or three tenets of ePortfolio pedagogy, what would you point to?
Joseph Ugoretz:

That’s tough for us because our ePortfolios are 100 percent student driven, student motivated, student designed. It’s sort of “build it and they will come.” Or let them build it, and then they’re there. It’s a respect for the individual student as a creator and as a person and as a learner. And whether that happens in the classroom or somewhere else, the message that comes across is that you can make something here. I think one of the things that makes us human beings is that we like to make things. It’s like if you give a child a set of Legos, they want to make something with it. So it’s making that set of Legos available and usable. And then maybe showing them a few examples of what other people have made.

Laura Gambino:

I would add reflection to that, the reflective pedagogy. To me, that’s where students articulate their learning and see it for themselves. One of my students talked about this in their portfolio more eloquently than I can–I wish I could quote him here– but he said what he loved about ePortfolio was its ability to combine process and product. Doing the work is interesting, but reflecting on it and thinking about how he got that product and the process he went through to get there, that’s much more interesting. Carol Rodgers and others say reflection is where learning takes place. We need to stop and think and to articulate our learning process. Portfolio practice builds the ability to do that, to reflect and share reflection. Reflection is not always a solitary process because we need to reflect in community and share as well.

Joseph Ugoretz:
Like Laura said, it’s not just sitting alone and reflecting on your work. It’s saying, “Bret, let me show you this,” and then Bret asking questions, and then me answering these questions. And that’s really where a social pedagogy for ePortfolio motivates and multiplies the reflection.
Bret Eynon:

The core of what’s pedagogically valuable about portfolios is the way they make learning visible. They make the learning visible to students themselves, supporting the reflective process. They also make the learning visible to peers, for a social pedagogy. And they make the learning visible to faculty for individual and collective assessment. Engaging in this process collectively, using ePortfolios can help faculty better understand that what students learn in my course connects to what they learn in your course, and how it all adds up. Finally, by making learning visible, it opens the way for integrative pedagogy. That’s the frontier we’re working on. Seeing how students learn over time, across courses and disciplines, how what’s happening in one area connects to another. Integration between the learning experience and the evolving sense of self.

Integrative pedagogy points to all those connections. Of course, it’s a deeply challenging pedagogy. It’s an unfamiliar pedagogy, in that it asks faculty to step outside their area of expertise and think about all these other things happening outside their classroom.

There’s a learning curve for faculty, for students, and for everyone else, too. It’s a slow learning curve, but an essential one.

Integration won’t happen for students unless faculty scaffold and support it, help them see the connection: how does this class relate to you? To who you are? To who you’re becoming?

It takes faculty a while to learn how to do that. It’s kind of scary, risky. It requires us to provide scaffolding and support to faculty as they think it through, learn how to do it, step by step. We need to encourage it. Reward it. Support faculty as they learn from others and try it out themselves. Get used to looking at students’ portfolios as they enter the class, seeing who they are, what they bring to the class, what they’re doing in other classes. It’s really building a culture where it’s widely understood – yeah, this is what we do here. We as a campus are working together to support the learning and growth of these complex individuals, our students, who we value so much. That takes time and practice.

That’s why building meaningful ePortfolio practice is challenging. The technology is simple. But the pedagogy is not easy to learn. It requires an extended process of engagement and professional learning, a sustained collective engagement.

That’s difficult to do in higher education. Our institutions are not set up to do this. We’re not set up as integrative institutions. Faculty are not familiar with or prepared for an integrative approach. And students aren’t accustomed to it, either. It’s a paradigm shift, across the board, that requires a high degree of institutional support, institutional intentionality, institutional integration. That’s what we’re working on now. We’re all inventing a new practice for higher education, and it takes time.

Integrating ePortfolio pedagogy and assessment

(1:00:15)

Dominique Zino:
Is there a tension between that kind of pedagogy and the need for assessment?
Joseph Ugoretz:
There doesn’t have to be, no. There isn’t if the assessment is authentic and really feeds back into the teaching and learning enterprise.
Laura Gambino:

I don’t see them in tension at all. In fact, I think it’s just the opposite. They complement each other nicely. When we have our Assessment Days, where all of our faculty are getting together, it’s not just looking at artifacts of the work. The work is great. But faculty find the greatest value in the reflections, and looking for integration. Because that’s one of our core outcomes. We want students to integrate and apply and critically think. Because we have the portfolio, because we have not only the artifacts, but also the reflections and all the different pieces of their learning, we’re getting a much richer picture of the student learning experience. And we’re able to see what students are learning and how they’re learning it, which enables us to do a better assessment of our learning outcomes.

Bret Eynon:

When we think about assessment, it’s important to distinguish between assessment for accountability and assessment that’s designed to enhance student learning, faculty learning, and institutional learning. When we talk about assessment at LaGuardia we’re talking about that latter category. Assessment for accountability can be done more simply in ways that are ultimately not very satisfying or meaningful. But we’re talking about assessment that has a deeper purpose–to help us work more collectively and effectively with students to transform the learning experience.

Grounding assessment in the real work of the classroom makes it more authentic, makes it easier to connect the insights generated by the assessment back to the practice that generated them. “Okay, here’s the work, here’s the assignment, here’s the result–let’s go back and redesign the assignment. Let’s go back and look at the work again so that we’ll see a real adaptive cycle.”

The portfolio can facilitate that by making the artifacts available. And it can facilitate the next step of that, which I think Laura is pointing to, which is thinking about those discrete components of the work as part of a larger picture with the student. That’s powerful for educators–seeing the whole student, seeing the evolution of the whole student.

Laura Gambino:
The ePortfolio brings the student into the process. I heard a colleague at Tunxis say this a long time ago, and I’ve used it many times since: the Latin root of the word “assess” means to sit beside. So assessment really means to sit beside the student. We can’t physically have the students sit beside us. But through their ePortfolios, they are sitting beside us, because their voice is now in the process. And they are more active in the process through the story they’re telling us in their ePortfolios. So to me, it’s the truest form of assessment, where students are sitting beside us.
Joseph Ugoretz:
Reflection includes a kind of self-assessment. Putting that power in the hands of the students is really valuable.
Bret Eynon:

Pedagogy should inform assessment and vice versa. They have to dovetail. Integrative reflection is key to both. Reflection is the site of connection and crossing boundaries. It opens all sorts of possibilities for learning and change. We’re asking students to reflect and learn about themselves as learners, to think about how they learn best, how they can get stronger as learners. We’re asking faculty to reflect and understand what’s happening in the learning process, what students bring into the classroom, and how they’re changing. And we’re asking the institution to reflect and learn on a broad scale: who are our students? Are we serving them well? Are we meeting their needs? Are we really doing what we say we’re doing? And when we ask those questions, we must constantly bring it back to action: “What does this tell us about what we should do?” So assessment is not an end unto itself. It’s really about thoughtful change. How should we change what we’re doing to make it better? That’s essential if we’re going to be successful as 21st century educational institutions. We have to do that. We’re not given the choice to just stay as we’ve been. We must adapt to very new situations. The new learning ecosystem is filled with powerful new players. If we’re not figuring out how we can best help our students–what we have to offer them and how we can do it better–we’ll be left behind. To my mind, it’s imperative that we in higher education figure out how to do that. There are lots of facets to that work, but it’s now increasingly clear that ePortfolio can play a crucial role in helping higher education rise to this moment.

Notes

[1] A pioneering, teaching-with-technology faculty development program coordinated by CUNY’s American Social History Project from 1996-2002, the NEH-funded New Media Classroom served educators from schools, colleges/universities, and cultural institutions nationwide. Under Eynon’s leadership, NMC helped educators develop strategies for using new digital resources in history and culture classrooms.

[2] Co-led by Eynon and Georgetown University’s Randy Bass, coordinated by Georgetown’s Center for New Designs in Learning and Scholarship, and funded by Atlantic Philanthropies from 2000-2005, the Visible Knowledge Project engaged 70 faculty from 22 campuses nationwide in Scholarship of Teaching and Learning projects focused on the use of new digital resources to support adaptive, embodied, and situated learning. For more info, see: https://blogs.commons.georgetown.edu/vkp/

About the Participants

Bret Eynon is a historian and Associate Provost at LaGuardia Community College (CUNY), where he is responsible for strategic planning and oversight of collegewide educational change initiatives related to learning, teaching, curriculum, technology, advisement, and assessment. The founder of LaGuardia’s Center for Teaching and Learning and its internationally known ePortfolio project, Eynon also co-directed (with Georgetown’s Randy Bass) the national Visible Knowledge Project and directed the FIPSE-funded Connect to Learning project, which from 2010 to 2014 worked with twenty-four diverse campuses nationwide and produced a unique international resource site, Catalyst for Learning: ePortfolio Resources & Research (http://c2l.mcnrc.org).

Eynon’s many articles and books include The Difference that Inquiry Makes (with Randy Bass); Freedom’s Unfinished Revolution: An Inquiry into the Civil War and Reconstruction; and 1968: An International Student Generation in Revolt; as well as Who Built America?, an award-winning series of textbooks, films, and CD-ROMs created with CUNY’s American Social History Project. His most recent book, co-authored with Randy Bass, Open and Integrative: Designing Liberal Education for the New Digital Ecosystem, was published in June 2016. High Impact ePortfolio Practice: Catalyst for Student, Faculty, and Institutional Learning, co-authored with Laura M. Gambino, will be released in January 2017. A national faculty member for the Association of American Colleges and Universities since 2006, Eynon has been honored for his work by the American Association for Higher Education, the American Council on Education, the Community College Futures Association, and the Carnegie Foundation for the Advancement of Teaching. The national Community College Humanities Association has recognized him as a Distinguished Humanities Educator.

Laura M. Gambino is Associate Dean for Assessment and Technology and Professor of Information Technology at Guttman Community College (CUNY). In her role as Associate Dean, Dr. Gambino oversees the College’s institution-wide ePortfolio program and the Integrated Planning and Advising for Student Success (iPASS) initiative. She serves as Principal Investigator for Guttman’s EDUCAUSE/Achieving the Dream iPASS, GradNYC College Completion Innovation Fund and Title V grants. She also leads the assessment of Guttman’s institutional student learning outcomes; her work in this area focuses on the intersection of assessment, pedagogy, and assignment design. Gambino, a leading ePortfolio and assessment practitioner and researcher, serves as a DQP/Tuning Coach for the National Institute for Learning Outcomes Assessment (NILOA). She is co-author with Bret Eynon of High Impact ePortfolio Practice: A Catalyst for Student, Faculty, and Institutional Learning.

Joseph Ugoretz is currently Senior Associate Dean for Teaching and Learning (and Interim Chief Academic Officer) at Macaulay Honors College of the City University of New York (CUNY). He is also an adjunct faculty member of the CUNY Graduate Center’s Certificate Program in Interactive Technology and Pedagogy. Dr. Ugoretz has taught high school English, served as a professor of English at a large urban community college and has led initiatives across the liberal arts, particularly in the STEAM (Science, Technology, Engineering, Arts and Mathematics) disciplines. He has taught fully online courses, first-year honors seminars, graduate courses, and high school English, as well as faculty development programs from workshops to retreats to unconferences. He currently serves on the board of the Association for Authentic, Experiential and Evidence-Based Learning (AAEEBL) and is a leader in open flexible eportfolios and the scholarship of teaching and learning.

At Macaulay, Dr. Ugoretz supervises the Instructional Technology Fellows, the faculty and curriculum of all the Macaulay seminars, and frequently teaches the Arts in New York Seminar as well as upper level seminars in Science Fiction and the Future of Education. Events and activities like the Night at the Museum, Snapshot NYC, BioBlitz and the Science Forward video series are examples of what he and his team provide to Macaulay students.

Aside from the scholarship of teaching and learning, Dr. Ugoretz’ research interests include Urban Legends and Internet Lore, Science Fiction, and Oral Performance Art (the subject of his fieldwork with pitchmen at county fairs and carnivals, and of his essay, “Quacks, Yokels, and Light-Fingered Folk: Oral Performance Art at the Fair” in the 2006 collection Americana: Readings in American Popular Culture).
He blogs at https://prestidigitation.commons.gc.cuny.edu.

Dominique Zino is Assistant Professor of English at LaGuardia Community College (CUNY), where she is a member of the college’s ePortfolio Leadership Team. She has published on her classroom pedagogy in JAEPL and has work forthcoming in Textual Cultures, the journal of the Society for Textual Scholarship, and Digital Pedagogy in the Humanities (MLA Books). She is currently working on a longitudinal study of the culture of writing at LaGuardia. @DZ0222
