Chapter 1: Out of Industry, Into the Classroom: UX as Proactive Academic Practice
The more significant conclusion that I took from this usability study was that the usability test was not generalizable beyond the specific context of this particular syllabus evaluation. What I mean by this is that once I revised the syllabi to be more usable, or a new population of students was tested, or I moved to a new institution with different policies and procedures, or new technologies were better suited to deliver syllabi, the test conducted in 2014 would be insignificant beyond a historical perspective of usability using two texts and technologies in 2014. (Crane 4)
Usability, therefore, should only be used iteratively to understand how a design works for users at a given time and in a given environment. Second to this conclusion was my understanding that course documents are (or should be) student-user centered and that it is an instructor's responsibility as an information designer to understand the student-user's experience of using these documents. (Crane 4)
Although by no means exhaustive in its discussion of design and research methods, this chapter attempts to show the hierarchy of UX and its relationship to design and research methods. At the same time, using illustrative examples from my own syllabus research, I discuss the various opportunities and challenges of UX work. (Crane 5)
System-centered design focused on the system's need to function as its designers intended. The problem with this approach is that systems, even well-built ones, are not always usable for the people they were designed for. (Crane 5)
Four of Nielsen's and Quesenbery's components for usability are very similar. For instance, users need to be able to complete tasks efficiently, learn a system in a reasonable amount of time, and recover from errors when they are made. (Crane 8)
not only should researchers be concerned with how well users can complete tasks, but they should also not assume that their (or a designer's) preconceived ideas about users' work fairly represent the complexity of that work beyond a usability lab (or any testing situation). (Crane 8)
After analyzing my syllabus usability test's results, I learned that usability testing alone could not answer the questions I posed for my study. The syllabus and the students who use it are part of a complex academic system with multiple factors, stakeholders, tasks, environments, and functions. Thus, looking at usability alone, though a good starting point, led to more questions than it could answer. (Crane 10)
This is one example of why TPC instructors and program designers need to understand how UX functions within an interconnected web of design processes (user-centered design [UCD], human-centered design [HCD], participatory design, and design thinking) and research methods (observation, self-reporting, affinity diagramming, usability testing, etc.). (Crane 10)
It is a theory or philosophy, supported by design processes that put the human user at the center of design, whether these processes are labeled user-centered design (UCD), human-centered design (HCD), participatory design, or design thinking; these design processes are then enacted through four iterative stages: 1. collecting information about human users (those most likely to use products upon design completion), 2. designing prototypes that these human users can work with so that additional data about their use can be collected, 3. redesigning products in response to the first two methods, and 4. testing and retesting products during and after distribution. (Crane 10-11)
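To make Crane's four-stage cycle concrete, here is a minimal sketch of the stages as an iterative loop; the function names and data are invented for illustration and are not from the chapter:

```python
# A minimal sketch of the four iterative stages as a loop; the function
# names and data are hypothetical illustrations, not Crane's implementation.

def collect_user_data(users, product):
    """Stages 1-2: gather information from likely users, here via a prototype."""
    return [f"{user} feedback on version {product['version']}" for user in users]

def redesign(product, findings):
    """Stage 3: revise the product in response to what was learned."""
    return {"version": product["version"] + 1, "informed_by": findings}

def test(product, users):
    """Stage 4: test and retest, during and after distribution."""
    return bool(users)  # stand-in for a real usability measure

users = ["student A", "student B"]
product = {"version": 1, "informed_by": []}

for _ in range(3):  # the cycle repeats rather than terminating
    findings = collect_user_data(users, product)
    product = redesign(product, findings)
    assert test(product, users)
```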
One of the main differences between UCD and HCD is the shift in nomenclature from "users" to "humans." This is not to minimize either process; rather, it acknowledges that some UX scholars and designers feel that the term "users" is not the best way to refer to people. (Crane 14)
Affinity diagramming provides all participants the opportunity to make their values and attitudes known without succumbing to groupthink. This practice can also lead to low-fidelity co-designed prototypes in which users construct their own student-as-user-centered syllabus, in this case drawing on the values discovered through affinity diagramming together with program and university syllabus requirements. (Crane 19)
Chapter 2: Beyond Lore: UX as Data-Driven Practice
We begin this chapter with a nod to North (1987) because the collection's focus on user experience (UX) has deep roots in both the concept of practice and the concept of inquiry: UX might be defined as a practice that improves human experiences through situated inquiry within a highly contextualized space. (Cook and Crane 26)
While some have chosen to use terms directly from UX, e.g., referring to students as "users" and curricula as "products," others have chosen to reference students with terms ranging from "student users" to "co-creators." (Cook and Crane 29)
The goal of the project, its scope, and the context must shape the UX process. Not only is this necessary to ensure teacher-practitioners have developed a product or system that considers student experiences, but it is also necessary to create goals and develop a UX plan that matches the scope and context of the project. (Cook and Crane 31-32)
Methods Journey Map Infographic: Understanding – Surveying, Journey Mapping, Journals, Card Sorting, User Profiles and Personas; Looking – Affinity Clustering, Observations, Interviews, Focus Groups, Usability; Making – Rapid Ideation, Prototyping, Operative Imaging. Under each method, the infographic notes which authors in the collection employed that method for UX testing. (Cook and Crane 33)
Chapter 3: User Profiles as Pedagogical Tools in the Technical and Professional Communication Classroom
I had previously taught the same course at the same institution but perceived a disconnect between the course material and student understanding of the material, the learning management system, and course expectations. In response, I reframed my role from a TPC instructor facing student confusion to a designer facing a design problem for users. (Martin 43)
I focused on two key activities to explore how TPC instructors might leverage student-user profiles to guide course and lesson design decisions: 1. developing and iterating a student-user profile before, during, and after a course; 2. understanding how information from a student-user profile can inform course and lesson design decisions. (Martin 44)
A student-user profile will ideally end up as a robust, detailed tool to help you make informed pedagogical decisions. You may have information about how your students conceptualize TPC, interpret assignments, and even navigate an LMS. But starting out, all you need is a piece of paper and some general student information from your registered student list. (Martin 54)
Think about what else you might know about your students to start building your student-user profile. Do you have any international students? Do you have students from different parts of the country? These distinctions may or may not be relevant based on what you subsequently learn about your users, but they offer simple starting points to consider as you brainstorm student perspectives until you can refine them with observational and self-reporting data. (Martin 54)
Specifically, your design inquiry, or what you are trying to learn about users to improve their experience (e.g., how do TPC students use the LMS?), will determine how much a profile must be altered or discarded. In short, your design inquiry will guide your student-user profile development activities. (Martin 56)
Importantly, the student-user profile was based on triangulation of user "see-say-do" information (Still & Crane, 2016) rather than on self-report data alone, such as course surveys or student evaluations. While those tools can support a student-user profile, on their own they cannot substitute for the robust approach of creating a user profile grounded in UCD methods. (Martin 57)
Chapter 4: User Experience and Transliteracies in Technical and Professional Communication
This fluidity among cultures and digital platforms is at the core of what we want to teach our students in UX—to develop methods for understanding culture not as a fixed entity, but as fluid, constantly emerging, and iterative. Transliteracy thus provides students with an entry point into broader conversations in UX regarding user research and ethical technology design. (Gonzales and Walwema 68)
As an innovative framework, UX can be deployed to tackle social issues that are constantly shifting and that resist single solutions. Although many programs and courses have argued for the value of UX training, particularly within technical communication curricula, the notion of technology design and UX research more broadly can be intimidating to students who do not have experience in this area, especially given the overwhelming whiteness of UX as a field and industry. (Gonzales and Walwema 69)
The transliteracy model helps UX designers determine what the target culture's communication patterns might be. By gaining a snapshot of the communication environment in a particular culture to discuss implications for intercultural UX, technical communicators can interpret what they have listened to, generate new ideas, and incorporate those ideas to create UX that emerges from the users' sociocultural contexts. (Gonzales and Walwema 70)
As the student narratives demonstrate, it is not enough for UX to consider diverse users; it has to take the next step of understanding users' sense of who they are in order to address their needs in a more targeted way. The narratives show that UX through a transliteracies framework encourages UX researchers to look more closely at the inequities that manifest in products and services. (Gonzales and Walwema 80)
UX-inspired assignments such as journey mapping, the Notebook of Relations, and affinity diagramming activities allowed students to speak back to what they were reading while applying these readings to their own interests, experiences, and research. As we continue developing courses that thread UX and transliteracy, we hope to continue embracing this iterative course design while also maintaining an emphasis on interdisciplinarity and intercultural communication. (Gonzales and Walwema 81)
A transliteracies framework in UX also ensures that advocacy for users is done by both scholars and users. Rather than limit user responses to select quotes, a transliteracy framework values all user media, including audio or video stories, as legitimate sources of knowledge that together paint a panoramic picture of communities and change minds and attitudes. (Gonzales and Walwema 82)
Although we realize that the examples, narratives, and experiences that we share in this chapter are very localized to a specific course and context, we believe that the pairing of UX and transliteracy, as well as the attention to students' backgrounds and interests in designing UX curricula, can be incorporated into other programs and contexts seeking to introduce UX. The clear takeaway for UX and TPC is that combining transliteracy with iterative course design practices drawn from UX can bring empathy, efficiency, and emotional engagement by intentionally co-creating experiences with students and becoming better immersed in students' everyday lives. (Gonzales and Walwema 83)
Chapter 5: Using Student-Experience Mapping in Academic Programs: Two Case Studies
Walker explains that "A user experience map shows the users' needs, expectations, wants, and potential route to reach a particular goal. It's like a behavioral blueprint that defines how your customer may interact with your product or service" (n.p., emphasis added). (Howard 89)
One of my pedagogical goals was to impress on the students that UX maps come at the end of a long, rigorous research process. Both my industry clients and my students want to jump right in and start creating maps, so I wanted them to recognize that maps are the result of scaffolding; i.e., maps can't be created without first creating personas, and personas can't be created without data resulting from triangulated empirical inquiries. (Howard 92)
Even though they represent only approximately 67 percent of the users, data like those detailed above can be correlated with the admissions data we received from the Architecture School's administrative assistant in order to help the students make informed decisions about details to include in their personas. [The data listed above are those that Google Analytics provides: demographics overview, age, gender, interests, affinity categories, in-market segments, and other categories]. (Howard 96)
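A hedged sketch of the kind of correlation Howard describes, assuming an analytics export and admissions records that share a comparable age-bracket field; all column names and figures here are invented:

```python
import pandas as pd

# Hypothetical exports: Google Analytics demographics and admissions records.
# Column names and numbers are invented for illustration.
analytics = pd.DataFrame({
    "age_bracket": ["18-24", "25-34", "35-44"],
    "share_of_site_users": [0.52, 0.31, 0.17],  # within the ~67% GA can profile
})
admissions = pd.DataFrame({
    "age_bracket": ["18-24", "25-34", "35-44"],
    "applicants": [310, 145, 40],
})

# Join the two sources on the shared bracket so persona details can be
# grounded in both site traffic and actual applicant counts.
merged = analytics.merge(admissions, on="age_bracket")
merged["applicant_share"] = merged["applicants"] / merged["applicants"].sum()
print(merged)
```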
Indeed, in our early meetings with our clients, they told us that they were attempting to decide whether they needed to redesign the website so that one whole section of the site was dedicated to the two-year track and another to the three-year track (the current site combined information for both programs on the same pages). (Howard 98)
Taken together, the five personas from all three teams combined with three user experience maps (one for domestic undergraduates, one for domestic graduate students, and one for international graduate students) collectively gave our clients a clear and thorough understanding of the needs that required attention in the redesign of the School of Architecture's website. (Howard 100)
Without students having completed the work as a client-based project for a course, few of us who direct TPC programs could assemble the resources needed to conduct such a study. (Howard 101)
We began by conducting what, today, we would call a "content audit" and surveyed and compiled all of the advising handbooks, webpages, and materials available for both students and faculty. Not surprisingly, we found that the information was "all there" and available; however, it was scattered across a variety of sources and not compiled in a user-friendly format. (Howard 101-102)
Taking a single class, such as the Usability Testing and UX Design seminar I described in the first case, didn't really allow students to demonstrate "expertise" in the area. They needed more coursework. However, until the faculty engaged in this mapping exercise, we didn't realize that students were often unable to take three courses in a cognate area because of the demands of the five core courses: two required thesis research courses and at least one course required for students to obtain graduate teaching assistantships. (Howard 103)
In other words, it took the mapping exercise to convince the faculty to make the painful decision to drop core TPC courses in favor of cognate courses. The mapping exercise turned faculty who had been advocates for their own privileged core course topics into student advocates. (Howard 104)
Both formal and informal forms of user experience mapping improve students' academic experiences through the inclusion of students' voices in the design of websites and curricula for academic programs. (Howard 104)
Chapter 6: "A Nice Change of Pace": Involving Students-as-Course-Users Early and Often
In this chapter, I demonstrate how thinking of the ENGL 2312 class as a "user experience" inspired two early class activities focused on the syllabus's design and the course Blackboard site. (Pihlaja 110)
While user-centered design, usability, and user experience stand as distinct, discrete objects of study and methodological approaches to design and inquiry, they share a common concern with the user. In wrestling with how to think with my students about "culture" and how texts and technologies are used in any given context, it became obvious that the way forward was to begin with the first two "commandments" of the user-centered design process: "thou must involve users early and often" (Still & Crane, 2016). (Pihlaja 112).
While cultural usability is a complex topic, historically, it is concerned with the design of products for usability "cross-culturally," requiring critical analysis of the wider global context for any given local users (Sun, 2012). (Pihlaja 113)
As instructors gain more experience semester-to-semester and year-to-year, student "personas" (students as actual users) are iteratively re-imagined based on those who have taken the course before, succeeding or failing in various ways each year. (Pihlaja 114)
Ceding expert status to students may feel like conceding instructors' role and status—one's whole reason for being a teacher. Significantly, students may also feel this way and be suspicious of instructors who do not perform competence and confidence in the learning environment or class-as-product in ways they have been enculturated to expect. (Pihlaja 115)
Furthermore, the process of consulting, testing, and reflecting on course elements with students has the potential to aid the pedagogical goals of the course, using students' agency as "expert end users" of the course-as-learning-product to engage course content itself more critically and deeply. (Pihlaja 115)
To enable students' participation in (re-)designing the syllabus, at the beginning of an early class period, I placed students in groups of three to four and assigned each group a subsection of the syllabus to review. One group focused on the course description, objectives, and materials section; another, the assessment criteria for grading; another, the course policies; and finally, another, the course calendar. (Pihlaja 117)
Paraphrasing: First, students restate the policies listed in their section of the syllabus in their own words, akin to a syllabus quiz; then they identify two questions about the content; and finally they note what they like about the syllabus and what makes it easier to use. (Pihlaja 117)
Engaging with students about, say, whether the explanatory preface for each course goal area was really necessary in this document for what they would use it for (it wasn't) led me to revise that section in particular to make later reference to it easier (Figure 6.3). (Pihlaja 118)
Without realizing my intention, students asserted (quite forcefully and in one instance with a hint of disgust if not horror) that red was an "angry" or anxiety-producing color—especially when I used it to highlight assignment due dates. (Pihlaja 120)
But I do wonder whether we could take this approach every semester, regardless of the status of the class. Indeed, to ask these questions every time is to accept that students' needs and user practices are not all the same and that the culture of the class changes from semester to semester, if not more frequently. (Pihlaja 123)
This particular example also dovetailed nicely with our course content discussions around possible cultural differences that show up even in mundane, everyday ways (e.g., how we represent dates and time). While it is customary in the United States to represent months and days in that order, many other nations represent them in the reverse: day then month. This added a cultural competence dimension to the discussion. (Pihlaja 126)
Indeed, we returned to the day-month example several times throughout the semester. In explaining what a "redesign for cross-cultural connection" of some existing text or technology might look like for their final projects, I called back to this example. (Pihlaja 127)
Where the student is thought of as a user and brought into the process of designing courses, the prospects for student engagement, learning, persistence, and success are substantial. (Pihlaja 128)
From an assessment perspective, thinking of students as users of course content and tools was an effective way to test their prior knowledge while disclosing (to both the instructor and students themselves) their tacit understanding of the course topic and tracking learning over the course of a semester. It has the ability to help clarify why a student might not be succeeding. (Pihlaja 129)
Inviting student input on course element design no doubt renders one vulnerable. To show up on day one of a course expecting to be able to teach the class only after you've had substantial input on how students will or will not be able to "use" its organization and environment may feel like risking one's identity as a teacher. (Pihlaja 130)
Of course, there's no guarantee that UCD approaches themselves will be able to move beyond the more apolitical, individualist thinking regarding student engagement that leads Collin Bjork (2018) to propose we supplement usability-type approaches with insights from digital rhetoric, identifying the inherently rhetorical dynamics at work in any user interface, such as audience, persuasion, and credibility. (Pihlaja 131)
Chapter 7: Learning from the Learners: Incorporating User Experience into the Development of an Oral Communication Lab
In Fall 2018, the college administration expressed support for the business communication faculty to develop new initiatives that foster students' soft skills (teamwork, leadership, ethics, and communication) and, in particular, oral communication. (Clark and Austin 138)
What impact did incorporating user experience throughout the development process have on the overall success of our Oral Communication Lab? (Clark and Austin 142)
As we engaged in a cycle of implementation, reflection, adjustment, and re-implementation, we realized the importance of including students in the development process. As such, our new approach echoed the approach to usability testing modeled by Shivers-McNair et al. (2018), which they define as "an empathetic, flexible, ongoing engagement with our audiences and users" (p. 39). (Clark and Austin 142)
At the outset of the semester, we planned to use the following assessment instruments: observations recorded in an electronic journal, the Personal Report of Communication Apprehension (PRCA-24) as a pre-test and post-test, the Shannon-Cooper Technology Profile, the Instructional Video Usability Survey, the Speech Anxiety Thoughts Inventory (SATI), the Lab Technology Usability Survey, and the final Logistics Survey. (Clark and Austin 142)
The findings also showed us the importance of incorporating usability and user experience feedback during the development of initiatives like the lab. As a result of the inclusion of user experience assessments, we were able to make adjustments during the development process that aligned more with the needs of our current users. (Clark and Austin 146)
Not only did we assume student attitudes toward dress, but we also assumed they would be proficient in the technologies we planned to use in the lab. The results from the Shannon-Cooper Technology Profile (Appendix B) indicated that students self-reported high proficiency in social media, basic computing programs, and the Blackboard LMS platform. (Clark and Austin 147)
In contrast, Kaltura, our integrated video recording platform, scored an average of 0.59/10, with 22 of the 28 students giving it a score of zero. Because the Shannon-Cooper Technology Profile showed that students were not familiar with Kaltura, we felt it was important to meet each student in the lab during that student's first visit in order to lead them through, click by click, the process of recording and uploading their videos. (Clark and Austin 147)
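As a quick arithmetic check on these reported figures (the six non-zero scores themselves are not given, so this only recovers their implied total and mean):

```python
n_students = 28
n_zero = 22
mean_score = 0.59  # reported class average on a 0-10 scale

total_points = mean_score * n_students        # ~16.5 points across the class
nonzero_mean = total_points / (n_students - n_zero)
print(round(total_points, 2), round(nonzero_mean, 2))  # 16.52 2.75
```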
These findings also supported our perception that students were learning the technology quickly and intuitively. Our observations in the lab provided another example of this technological intuitiveness on the part of students. Once we showed the students where to open the My Media tab on Blackboard (where Kaltura is housed), many students actually started to lead us; they would find and click on the proper buttons before we pointed them out. (Clark and Austin 148)
While we recognized early on that our assumptions of our users' needs and wants were not always correct, we were so focused on scaffolding skills that we did not create an opportunity for gathering feedback on basic scheduling and process logistics. (Clark and Austin 149)
Though this project had a relatively small sample size, i.e., 28 students who constitute one section of a multi-section course at our university, the research findings emphasize the importance of including our students in the developmental process of initiatives aimed at supporting their professional development. (Clark and Austin 151)
Had we not incorporated user feedback checkpoints or kept our eyes open during informal interactions with students, the lab and its activities would have had a much lower chance of success. First of all, we would have created more work for ourselves as teachers (and likely for the students as well) by using unsuccessful, ineffective instructional strategies. Secondly, we would have missed the innovative and insightful comments, ideas, and actions expressed by students as they navigated, learned from, and contributed to the lab. (Clark and Austin 152)
Chapter 8: Ideating a New Program: Implementing Design Thinking Approaches to Develop Program Student Learning Outcomes
Still, some other guides do spend more time describing PSLO (Program Student Learning Outcomes) design processes. For example, a guide developed by the University of Nebraska-Lincoln described six strategies for creating PSLOs, including holding conversations with department faculty, examining existing instructional materials, and reviewing similar units or programs (Jonson, 2006). Yet these varied strategies still emphasized a closed, faculty-centric approach rather than a UX design methodology. (Thominet 162)
To meet these UX goals, this chapter describes how a design thinking process can support active and collaborative methods that integrate the knowledges and experiences of numerous stakeholders. In this way, adopting design-thinking practices can help to move us away from a faculty-centered committee model and toward a participatory approach to PSLO design that focuses on students' experiences, needs, and goals. Ideally, this process will result in more responsive, representative, and inclusive program definitions. (Thominet 163)
While I adopt the d.school structure in this chapter due to its ability to open space for critical reflections on my PSLO design project, it is important to note that all these formulations of managerial design thinking share the same core practices. First, designers observe and interview stakeholders to better understand their needs. Based on this information, designers seek to clearly define the design problem. Next, large multidisciplinary teams use active, collaborative, and visual design exercises to imagine many potential solutions to the design problem. Then the teams prototype and test select ideas with potential users. Through several iterations, the prototypes are narrowed and refined until one design is finalized and implemented as a product or service. Two further points should be made about these phases. First, each phase is treated as cyclical and recursive, so further user research can occur after the product implementation, which can lead to further ideation and prototyping, etc. Second, the phases are often conceptualized as cycles of divergence and convergence: designers intentionally open up to a multiplicity of ideas and then move toward defining or narrowing solutions. For example, divergent thinking is often the focus of the ideation stage, while convergence to a singular design solution is a goal of the testing and iteration phase. (Thominet 164)
The subsections that follow will be framed specifically according to the d.school process of design thinking, which includes specific stages for empathizing, defining, ideating, prototyping, and implementing. I am using this structure here primarily because it offers a means to organize the discussion and to reflect on areas of revision in future iterations of this work. (Thominet 168)
For the empathize phase of the curriculum design project, I interviewed faculty and students about their experiences in the program. First, I recruited faculty who had designed and taught at least one upper-division writing and rhetoric course. Student participants were then recruited directly by those faculty. (Thominet 169)
Since the participants were not a representative sample and because we wanted to get students actively involved in our design process, we did not use the interview results as generalizable data to support specific programmatic changes. Instead, we used them to understand the situation more clearly and as an inspiration for our subsequent work. In that way, the interviews played a significant role in the next phase of problem definition, which, in turn, informed the ideation methods that followed. (Thominet 169-170)
For the PSLO project, three elements of the interviews contributed to the problem-setting phase: the broad program definitions by faculty, the emphasis on practical application by students, and the lack of a shared vision among the participants. While these elements suggested some marketing strategies (e.g., tying classes to specific jobs or highlighting student testimonials), they also demonstrated the need for a clear and specific vision for the program. (Thominet 170)
To foster divergent thinking, ideation typically takes place in multidisciplinary teams or workshops where participants use active collaboration techniques to conceptualize and prioritize potential solutions to a given problem. The exact methods vary, but organizational guides and popular press books have offered numerous ideation exercises (Gray et al., 2010; IDEO, 2015; Mattimore, 2012). (Thominet 171)
The ideation phase for the PSLO project consisted of two identical workshops that lasted two hours each. Initially, the phase was planned as a single workshop, but conflicting schedules made it impossible to locate a single time that would work for all participants, so two smaller, identical workshops were used instead. (Thominet 171)
The workshops had six stages: 1) introduction, 2) warmup, 3) ideation, 4) categorization and prioritization, 5) prototyping, and 6) reflection. I will discuss each of these stages below. (Thominet 172)
Since participants had varied experience and knowledge, I presented prompts in sets, which included questions customized for students, faculty, and practitioners. Each participant was still free to respond to any version of the prompt. (Thominet 173)
In the next workshop stage, participants categorized ideas using an affinity diagramming method where they first grouped sticky notes together without discussing their reasons, then named the groups, and finally, voted for the most important groups (Spool, 2004). In our workshops, participants initially created many different idea clusters, but they were subsequently asked to consolidate them whenever possible. (Thominet 174)
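A small sketch of the bookkeeping behind this affinity method, from silent grouping through naming to voting; the group names, ideas, and votes are invented for illustration:

```python
from collections import Counter

# Sticky-note ideas after silent grouping and naming (all invented examples).
clusters = {
    "research skills": ["conduct usability tests", "interview stakeholders"],
    "genre knowledge": ["write reports", "draft proposals"],
    "collaboration": ["work on teams", "give peer feedback"],
}

# Each participant then votes for the groups they find most important.
votes = ["research skills", "collaboration", "research skills",
         "genre knowledge", "research skills", "collaboration"]

tally = Counter(votes)
for group, count in tally.most_common():  # prioritized list of groups
    print(f"{group}: {count} votes, {len(clusters[group])} ideas")
```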
The workshops concluded with a collective debrief. Participants discussed the ideas that were most surprising in the workshop, the exercises that worked best, and the exercises they would change. Some people were surprised at the potential outcomes that received relatively limited attention and prioritization, including teamwork and reading. Others commented on how the workshop was a positive experience, saying that they felt it valued everyone's voice and gave everyone a chance to speak. (Thominet 175)
Since the outcome drafts made during the workshops were incomplete, I collaborated with another faculty member to condense the ideas from the workshops into cohesive PSLOs and to test those PSLOs with other faculty members. (Thominet 176)
At the end of this process, the initial 247 ideas were narrowed into 90 outcome statements. At that point, we moved the outcomes into one (very long) list, which can be found in Appendix C. (Thominet 176)
While an implementation phase is not always included in design thinking models, it is sometimes appended as a sixth step at the end of the d.school model. During the implementation phase, designers "put [their] vision into effect. [They] ensure that [their] solution is materialized and touches the lives of end users" (Gibbons, 2016). (Thominet 177)
The process of implementation for the PSLOs primarily involved moving the work from ad hoc workshops and collaborations back into official program committees. The first step was to re-form the defunct major track committee. For the new instantiation, the committee membership was kept small. (Thominet 177)
The committee's next major task is developing an assessment plan for the outcomes. With 16 PSLOs, assessment will not be easy. However, since the major track is a sub-degree level program (i.e., it is a track within the pre-existing English major rather than a new, standalone major), we are not subject to institutional oversight on assessment, which gives us more flexibility in our plans. Currently, we plan to assess outcome categories one at a time and to collaborate with other program committees (e.g., the technical writing committee) on assessment. (Thominet 178)
Both design thinking and UX are inherently built on an iterative approach that emphasizes direct feedback from major stakeholders. For that reason, the committee is also planning on using some indirect assessment practices, including exploratory exit interviews with graduating students, to supplement our more traditional assessment methods. (Thominet 178-179)
We adopted a traditional approach to creating PSLOs during the prototyping and testing phase. While the outcomes were based on the ideas and input of a broader group of stakeholders, the actual work of prototyping them still occurred in a closed faculty collaboration. While it was necessary to tame the vast amount of data from the workshops, we still might have undertaken this work in more open and participatory ways. (Thominet 181)
In building a heuristic model, I also simplified the process into four activities: listening, problem setting, ideating, and iterating, as shown in Figure 8.1. In this model, the implementation phase is incorporated into the process of iteration as a recognition that programmatic design projects do not have a clear start or end point. (Thominet 182)
Ultimately, when faculty and administrators can make space for intentional problem setting, we can focus our efforts on the real problems that students (and other stakeholders) encounter in academic programs. (Thominet 183)
Finally, design thinking is, fundamentally, a process of iteration. It is a process that works best when solutions are modeled, tested, and changed over time. To accomplish this activity, faculty and administrators can experiment with physical and visual prototypes of the curriculum to encourage non-faculty stakeholders to actively engage in the design process. (Thominet 183)
Chapter 9: Using UX Methods to Gauge Degree Efficacy
This study addresses student silence by centering on student experience while completing a degree: it directly engages students in curricular development and assessment. (Cargile Cook 199)
This study employs user experience research methods to gather the perspectives of these majors over time and to use that data to design a viable assessment plan, develop curriculum, and generate recruiting and marketing materials for the DMPC. Using Patricia Sullivan's (1989) definition of longitudinal field studies as a guide, this research project is designed to "employ qualitative methods to study a group or a number of individuals over a period of time" (p. 13). In her discussion of such studies, Sullivan cautions researchers who choose to use this method: longitudinal field studies are resource-, time-, and labor-intensive. (Cargile Cook 199-200)
This chapter focuses on the study's design and its initial findings. It details the five user experience methods/activities in the study's design, provides a rationale for their use, and maps these methods into a four-year timeframe. It then provides results from initial data collected in order to present a student-user profile. Finally, it discusses the value of including UX methods as assessment tools for degrees in professional and technical communication. (Cargile Cook 200)
In addition to participating in annual surveys and focus groups, samples of DMPC majors will engage with program administrators using three other user experience methods: user profiles, personas, and journey mapping. (Cargile Cook 201)
Phase 1 of this research relies on annual surveys to collect both quantitative and qualitative data about DMPC majors' demographics and attitudes. These data will be aggregated to develop user profiles and personas. (Cargile Cook 202)
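A minimal sketch of that aggregation step, assuming survey rows with a few demographic and attitude fields; the field names and values are hypothetical, not the DMPC instrument's:

```python
from collections import Counter
from statistics import mean

# Hypothetical survey rows; the fields are illustrative, not the DMPC survey.
responses = [
    {"year": "junior", "track": "technical communication", "satisfaction": 4},
    {"year": "senior", "track": "digital media", "satisfaction": 5},
    {"year": "junior", "track": "digital media", "satisfaction": 3},
]

# Aggregate into a user profile: modal demographics plus mean attitude scores.
profile = {
    "typical_year": Counter(r["year"] for r in responses).most_common(1)[0][0],
    "typical_track": Counter(r["track"] for r in responses).most_common(1)[0][0],
    "mean_satisfaction": round(mean(r["satisfaction"] for r in responses), 2),
}
print(profile)  # aggregate seed data for the persona development workshops
```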
While surveys are the first interaction students will have with this research, focus groups will be their last…. Each focus group session will last approximately two hours and be held in a designated focus group room with audio and video recording capabilities. The focus group team will include the moderator and at least one additional researcher to take notes. The focus group will provide a concluding snapshot of student experiences with DMPC courses, degree plans, advisors, and administrators. It will also ask majors for their ideas on degree revisions, innovations, and marketing and recruiting materials. (Cargile Cook 202)
In the spring semester of 2020, program administrators will invite a random sample of DMPC majors to meet for the first time in a persona development workshop. This workshop begins Phase 2 of the study. After completing the required Institutional Review Board (IRB) informed consent procedures, administrators will report the aggregate survey results—the user profile—to participants, explaining how user profiles inform user experience research and how they lead to the development of personas. They will then explain how to construct personas of DMPC majors from key demographics, interests, and opinions. (Cargile Cook 203)
Phase 3 of the study requires participants to create two kinds of journey maps, one for their fictional personas and a second for themselves. A journey map is a "visual depiction of what users need and what steps they take to fulfill those needs as they interact with a product" (Still & Crane, 2016, p. 95) from first interaction to last. Journey maps generated in this study focus on how personas (and eventually participants) begin their journey with the declaration of the DMPC major and end with their leaving the major or graduation. (Cargile Cook 204)
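One way to picture the structure such a map captures, from declaring the major to graduating; the stages, needs, and touchpoints below are invented placeholders:

```python
# A journey map as an ordered list of stages, each pairing an action with
# needs and touchpoints; every entry here is an invented placeholder.
journey = [
    {"stage": "declare major", "action": "meet advisor, file paperwork",
     "needs": ["clear degree plan"], "touchpoints": ["advising office"]},
    {"stage": "core courses", "action": "complete required DMPC courses",
     "needs": ["course availability"], "touchpoints": ["LMS", "catalog"]},
    {"stage": "graduation", "action": "apply to graduate",
     "needs": ["degree audit"], "touchpoints": ["registrar"]},
]

for step in journey:  # from first interaction to last
    print(f'{step["stage"]}: {step["action"]} (needs: {", ".join(step["needs"])})')
```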
Participants will have to puzzle through degree plan requirements and catalog course descriptions to successfully map their persona's journey from matriculation to graduation. Debriefings will follow at the end of the session, in which participants describe their maps and discuss the different paths and rationales they used. After the debriefing, future-state maps will be used for analysis. (Cargile Cook 206)
Because survey results provide useful information about DMPC majors, the results are reported in this section in spite of the low response rates. While such low rates may be criticized for their lack of generalizability, ignoring the results of those majors who did respond, from administrators' perspectives, would be indefensible. In other words, administrators realized that although the response rate was low, even a low response rate was user data that offered important insights about programmatic efficacy. (Cargile Cook 209-210)
For now, the results of this study are inconclusive and provide only first impressions of DMPC majors. Through iterative studies and multiple methods, DMPC administrators recognize that program assessment is an inexact art: Some methods deployed work better than others. Some results provide better data than others. Failures are part of any UX process and cannot be avoided, but UX processes also produce successes. Furthermore, innovation is not a linear process, and continuous improvement requires longitudinal study whatever methods are used to collect and report data. (Cargile Cook 217)
Employing user experience methods offers a methodological rationale for including student voices and experiences in program assessment that other means of assessment simply do not. (Cargile Cook 217)
Chapter 10: Real-World User Experience: Engaging Students and Industry Professionals Through a Mentor Program
John Gould and Clayton Lewis (1985) coined the phrase "user-centered design" and defined it as having three central characteristics: (1) early focus on users, (2) systematic data collection, and (3) iterative design. Using this model, we wanted to investigate the "joint enterprise" that results from strategic interaction of students and industry professionals (TCAB members and program alumni) through a mentor program. (Katsman Breuch et al 220)
Throughout the pilot, we were chiefly concerned with this question: How might user experience in a mentor program address the academic-industry gap? Sub-questions included the following: What is the "user experience" of participating in a mentor program? And how can we make improvements to a mentor program based on user/participant feedback? Our goal was to integrate user feedback with instructional design to find ways to better bridge industry and academia and to engage students and industry practitioners. This approach is indeed innovative and useful as we actively practice student-practitioner engagement as a method for cultivating real-world user experience through such joint enterprise activity. (Katsman Breuch et al 221)
Our TCAB is an intergenerational group of business leaders whose purpose is to provide exemplary networking and experiential learning opportunities for students and to enrich the curriculum and visibility of our programs, students, faculty, and staff. Three of our academic programs––a B.S. in Technical Writing and Communication, a Graduate Certificate in Technical Communication, and an M.S. in Scientific and Technical Communication––have opportunities to interact with TCAB members. (Katsman Breuch et al 223)
We provided time for the pairs to meet and asked them to articulate goals for their mentorship pairing, and we also asked them to plan for two additional points of contact in the remaining 15-week period. (See Appendix A for launch meeting worksheet.) We then asked pairs to come back to a large group discussion in which we fielded any additional questions about the program. The mentor-mentee pairs were then on their own to conduct their plans. (Katsman Breuch et al 224)
We use community of practice theory as a framework for our study of this mentor program, drawing on Wenger's (1998) three dimensions for establishing a community of practice: joint enterprise, mutual engagement, and shared repertoire. (Katsman Breuch et al 224)
A community of practice framework for the study of our mentor program also aligns well with user experience and user-centered design theory and practice. By integrating "user experience" in our mentor program, we mean understanding not just performance or preference of a specific task but rather the entire user experience before, during, and after their "use" or participation in the mentor program (see Getto & Beecher, 2016; Potts & Salvo, 2018; Rose et al., 2017; Still & Crane, 2017). (Katsman Breuch et al 225)
Specifically, we asked users—in this case, students and mentors—to inform us of ways they believed the mentor program did or did not address the gap between academia and industry and of recommendations they would have to improve the program. In gathering this input, we approach the mentor program through a collaboratively constructed user-centered design perspective that relies on participant research and takes into account participant contributions that will be addressed as the program continues to improve. (Katsman Breuch et al 225-226)
This initial meeting included an introduction to the mentor program, including an overview of participation and suggested structure for the mentor pairs. We asked mentors and students to articulate goals for participating in the program and outline three contact meetings that would occur during the program. (Katsman Breuch et al 226)
Near the end of the 15-week period, we distributed a questionnaire to all participants that asked questions about the goals of their mentor pair, their meeting choices, their hopes for the program, and whether or not hopes were met. The questionnaire also asked participants for reflections about how the program addressed the academic-industry gap and any recommendations. (Katsman Breuch et al 226-227)
The last item on the survey asked if participants would be willing to participate in a brief interview about their experience. Of the survey participants, 23 agreed to be interviewed. We scheduled brief 15-minute interviews with these participants using whatever method worked best, whether in-person, video conference, or phone. One interview was conducted with two participants at the same time; all others were conducted one-to-one. (Katsman Breuch et al 228-229)
In order to create the mentor pairings, we began by reviewing these survey responses for each participant. We also took into consideration a brief one to two paragraph statement written by each student, which expressed their specific interests and reasons for wanting a mentor through this program. Based on the student paragraphs and survey data from students and mentors, we conducted an informal coding process that looked for similar themes, interests, and goals between the students and mentors. When an ideal match surfaced in the themes, the student and mentor were paired together. (Katsman Breuch et al 231)
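The authors describe this matching as informal; a more mechanical analogue (purely illustrative, not their procedure) would score each student-mentor pair by the overlap of coded themes and pair greedily from the best match down:

```python
def jaccard(a: set, b: set) -> float:
    """Theme overlap: intersection over union of two theme sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

# Invented coded themes drawn from student paragraphs and mentor surveys.
students = {"Ana": {"ux", "healthcare writing"}, "Ben": {"editing", "publishing"}}
mentors = {"M1": {"ux", "software docs"}, "M2": {"publishing", "editing"}}

# Greedy pairing: take the highest-scoring available match first.
scored = sorted(((jaccard(s_themes, m_themes), s, m)
                 for s, s_themes in students.items()
                 for m, m_themes in mentors.items()), reverse=True)
pairs, matched_s, matched_m = [], set(), set()
for score, s, m in scored:
    if s not in matched_s and m not in matched_m:
        pairs.append((s, m, round(score, 2)))
        matched_s.add(s)
        matched_m.add(m)
print(pairs)  # e.g., [('Ben', 'M2', 1.0), ('Ana', 'M1', 0.33)]
```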
In our post-participation survey, we asked users what their hopes were for the program as it continued and how well those hopes were being met. (Katsman Breuch et al 233)
From surveys and interviews, we identified the need to revisit these goals throughout the program and to add more specificity to them; e.g., what exactly does it mean to "bridge the gap" as a student meets with a technical communication professional for the first time? While academics may use PLN visualizations to indicate resources, tools, and contexts within which they work and learn, such visualizations are not commonly used in either academia or industry. Therefore, we should articulate mentor-mentee strategies that more clearly relate to making connections that build understanding about technical communication industries and how best to develop skills for securing a position and being successful in this profession. (Katsman Breuch et al 242)
User feedback allowed us to better understand the mentor program user experience, and in this case, we learned that the student experience needs to broaden outside the classroom. We see such a user experience perspective as bridging industry and academia, as integrating design and instructional design, as engaging students and industry practitioners. (Katsman Breuch et al 244)
Chapter 11: User Experience Design and Double Binds in Course Design
I have a strong inclination toward pedagogical practices that prioritize what works best for students in the classroom. Elsewhere (Zachry & Spyridakis, 2016), I have described this commitment and how it helped shape program and curricular decisions broadly in my home department. In this chapter, however, I will explore some of the inherent challenges in following this approach at a more granular level—that of an individual class. In particular, I will explore the experience of attempting to place student needs and desires as a central concern in the design of a class. (Zachry 251-252)
Effective instruction emerges from the artful design of learning experiences that should be meaningfully informed by attention to the people (students) we will engage in that design. (Zachry 252)
Experienced instructors know that the insights students offer are often uneven, perhaps reflecting a singular perspective or not accounting for the overall learning context the instructor is working within. Some insights, nevertheless, are relatively easy to address and require negligible effort to implement. Addressing some other needs and desires, though, requires more substantive changes. (Zachry 252)
In this chapter, then, the phenomenon that I am particularly interested in exploring is one in which attempting to use feedback from students can lead to double binds for instructors who are attempting to design the best possible learning environments. To facilitate this exploration, I will draw on examples from a class that I routinely teach at my institution. As I present each of the three examples, my focus will be on my attempt to foster a classroom design that is responsive to the experiences of students. I will then expand on the theory of double binds in responding to the needs and desires of students when designing a class-based learning experience. (Zachry 253)
implementing a course design feature suggested by students from a previous class immediately surfaced new concerns that countered the suggested feature in an unanticipated way. Clearly, within the broad student population, people held competing—perhaps irreconcilable—thoughts about how course evaluation should be designed. (Zachry 255)
Drawn from my teaching experiences over three years, each of these examples illustrates a variation on dilemmas that I have faced as I have attempted to integrate the experience of learners in these classes into their design. To think productively about these instances and how they might have implications for using a UX approach to class design, I see value in thinking about double binds in UX design. (Zachry 258)
In the context of class design following a UX approach, a double bind is a situation in which the designer faces a dilemma due to competing demands. On one hand, the instructor-designer seeks to hear from students about their needs and desires as learners and to incorporate what is discovered into the design of the course. On the other hand, the instructor-designer is positioned within an institutional context that places its own demands (including educational policies and conventions), affecting what may or may not be possible or wise to do in the classroom. (Zachry 259)
When acting as designers and following UX priorities, these same instructors will periodically hear from students that the standards for measurement and evaluation should be altered. In my example 1, this recommendation came in the unexpected form of making the standards more demanding. In this instance, upon analyzing the costs and benefits of making such a design change, I decided to follow the institutional process to make the course graded (rather than credit/no-credit). The choice, however, was not clearly or necessarily the right one. (Zachry 260)
The details of these three examples are specific to my institutional context, but the types of double binds they represent are almost certainly recognizable to most readers. I could readily point to instances of such double binds in other courses I have taught over recent years, as I anticipate nearly any instructor-designer could. (Zachry 261)
This framing clearly has a relationship to notions of constraints and competing interests in design, but it is more specific. In particular, this framing places an emphasis on the conflicted, felt experience of instructor-designers. That is, double binds are experienced personally as tensions in our identity as we occupy our professional/institutional roles and also seek to empathize with the experiences of our students and empower them to contribute to the design of their education. (Zachry 262)
We should expect double binds to be part of the essence of our work, not something that can be resolved for all time with a single, clever design decision. My purpose here is not to solve these three forms of double binds (or the many others that we face). Instead, I want to provide a framework that facilitates naming and discussing a phenomenon that we experience as instructor-designers who want to embrace the values of UX and attend to the needs and desires of learners. (Zachry 262)
Chapter 12: User Experience in the Professional and Technical Writing Major: Pedagogical Approaches and Student Perspectives
This chapter explores a central research question for educators of professional and technical writing majors: How can a program best prepare students for future career opportunities and the skills needed to succeed in those careers? We argue for user experience as a pedagogical approach for educating students about one university's professional writing major. (Bay et al 266)
We argue that user experience (UX) can serve as a more robust framework for understanding how a programmatic experience can facilitate student engagement with/in a field of study. User experience, as a concept, attempts to capture all of the aspects embedded in one's experience with an outside entity or situation. (Bay et al 266)
We present a case study of an undergraduate research methods class that asked students to assess user experiences in the professional and technical writing major at Purdue. In teams, we surveyed, interviewed, and visually mapped our large network of alumni, with particular attention to location and job position, as well as surveying current students in the major. We framed much of this work around data visualization methods (Wolfe, 2015), especially in mapping our program's alumni, in order to contextualize the ways in which user experience can also function as "big data" (McNely et al., 2015). (Bay et al 267)
A key takeaway for readers is learning about a flexible pedagogical approach to user experience that combines program assessment, introduction of students to the major, development and donor relations, as well as critical reflection on students as users. Perhaps most importantly, this chapter is co-written by undergraduates in the professional and technical writing major, demonstrating their roles as users and as user experience researchers. (Bay et al 267)
When we develop a major or concentration, we are creating an experience for students. We want students to proceed through a program and not only learn concepts, theories, and approaches, but also to develop a sense of themselves as future professionals entering a community of practice. These students will also be "products" of a program and its approaches, much like we see doctoral students as products of a particular program, with particular strengths and ways of seeing the world. (Bay et al 268)
One approach we might take is thinking about specific sites or courses as micro-testing grounds to gauge the experiences of a program's students and/or alumni. In a sense, this approach relies on what we might term "programmatic UX," or taking the temperature of users at a specific moment and in a particular context. Programmatic UX could be one way to iteratively research, test, and refine particular aspects of a program's user experience. (Bay et al 269)
Unlike programs such as engineering or computer science, the professional and technical writing program at our university does not have dedicated staff to collect and maintain data on our alumni (in fact, as a humorous aside, when Jenny requested data on our professional writing alumni from the university development office, she received a list of alumni from the creative writing major instead). She realized that without understanding the prior experiences of alumni, it would be difficult to design a better experience for current students. (Bay et al 270)
One programmatic illumination from this project was that the LinkedIn alumni group is a self-selected group, meaning that members were not necessarily alumni of our specific professional writing program. Almost all of the members were alumni from Purdue, but they may have earned different degrees and were working as professional or technical writers. Thus, some of these users were not necessarily users of the program but were users in the field, which provided a rich set of perspectives. (Bay et al 272)
What the diversity of group members showed was that members identified themselves in terms of their careers first and their majors/education second. They saw professional and technical writing less as a field of study and more as a career trajectory that was not necessarily connected to an academic program. In thinking of program assessment, then, career preparation as a category of assessment might need to be more nuanced. (Bay et al 272)
The research these students performed was also beneficial to them as students of the program, since the information they collected raised awareness among the students around them. They found it strange that there had been no prior research into the students in the professional writing program; they had assumed the program would know everything about its students and alumni. The team decided to send out a survey to obtain answers to their questions, a process with which the group had little previous experience. (Bay et al 275)
To understand how to create an effective survey, the team did background research and spent time planning how to gather as much data as possible. They needed a survey that was unbiased yet still asked specific questions to collect the desired feedback. The main difficulty was avoiding leading questions: the team did not want the framing of the prompts to influence the responses they received. Part of this difficulty may have arisen because the students themselves belonged to the population being studied. (Bay et al 275)
The method used to collect this data unfolded as the team worked on the project, since it was difficult to collect data from scratch. Data on alumni of the professional writing program was collected through multiple outlets and stored in a Google Sheets file. First, data was gathered from the Purdue Professional Writing Group on LinkedIn by sorting through the group members for graduates of the program; many members of the group were not professional writing alumni and thus were not added to the database. The team also looked at connections to program faculty, as well as to other alumni, for more names to add. A final data source was PW-Talk, a listserv for current students and alumni of the program. In all, students collected data for over 300 program alumni. (Bay et al 277)
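As a rough illustration of the merge-and-filter work this paragraph describes, the sketch below combines hypothetical exports from the three outlets into one de-duplicated table. The file names, column names, and matching keys are all assumptions made for illustration, not details from the chapter.

```python
import pandas as pd

# Hypothetical exports from the three outlets named above; real column
# names would depend on what LinkedIn and the listserv actually provide.
linkedin = pd.read_csv("linkedin_group_members.csv")
faculty = pd.read_csv("faculty_connections.csv")
listserv = pd.read_csv("pw_talk_subscribers.csv")

# Tag each record with its source so provenance questions stay answerable.
for frame, source in [(linkedin, "linkedin"), (faculty, "faculty"),
                      (listserv, "listserv")]:
    frame["source"] = source

# Stack the sources, then drop records that appear in more than one outlet.
combined = pd.concat([linkedin, faculty, listserv], ignore_index=True)
combined = combined.drop_duplicates(subset=["name", "grad_year"])

# Keep only confirmed program graduates, mirroring the team's manual filtering.
alumni = combined[combined["program"] == "professional writing"]
alumni.to_csv("alumni_database.csv", index=False)
```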
The resulting data set was used to create visuals, including a word cloud of job titles (see Figure 12.3) as well as pie charts and graphs, all displayed on a poster for the final presentation to stakeholders. The most valuable aspect of this project for Margaret was creating something that would be used for purposes beyond turning it in for a grade. Beyond presenting the poster of results to members of the English department, the team was able to share the database with the administrators of the professional writing program for their own use. For Margaret, this type of "service learning" is the most beneficial because it combines the learning process with applications outside the classroom. (Bay et al 277)
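A word cloud like the one in Figure 12.3 can be produced in a few lines. This sketch assumes the third-party wordcloud package and a job_title column in the hypothetical alumni table from the previous sketch; the chapter does not specify the team's actual tooling.

```python
import matplotlib.pyplot as plt
import pandas as pd
from wordcloud import WordCloud  # third-party: pip install wordcloud

# Hypothetical file from the previous sketch; a job_title column is assumed.
alumni = pd.read_csv("alumni_database.csv")

# Join all job titles into one string; WordCloud sizes words by frequency.
titles = " ".join(alumni["job_title"].dropna())

cloud = WordCloud(width=800, height=400, background_color="white").generate(titles)

plt.imshow(cloud, interpolation="bilinear")
plt.axis("off")
plt.savefig("job_title_cloud.png", dpi=200, bbox_inches="tight")
```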
Several common threads of the user experience emerge from these project reflections, threads we did not recognize until writing this chapter. The first is that user experience in the professional writing major includes more than academic or career preparation; it also includes life preparation. As Ashlie notes, the UX approach of the class led her to become more aware and understanding of the other human beings with whom she interacts. Likewise, there was a consensus that the subject matter experts reinforced the idea that everyone is human: we all make mistakes and are still learning while on the job. (Bay et al 277-278)
The UX approach of the class meant compiling information not only for student projects and grades, but for the program as a whole. Students believed that the program administrators could use the information gleaned from the survey to help shape the program and its curriculum into something that students could be proud of by the time they graduate. (Bay et al 278)
These conclusions led us to see how programmatic assessment does not necessarily need to occur from the outside looking in; rather, perhaps students can be the most lucid assessors of our programs. Students, as users, can provide rich reflections on the value of a program and where that program can be strengthened. (Bay et al 279)
Jenny plans to keep engaging current students and classes in this evaluation of user experience, so that understanding the program's user experiences becomes a reciprocal and iterative process, and to continue teaching students how to research and respond to user experience as a methodological approach. (Bay et al 279)
Chapter 13: Program as Product: UX and Writing Program Design in Technical and Professional Communication
As TPC administrators consider the range of available approaches to building and improving programs, we argue that user experience (UX) methods can provide an innovative approach to program redevelopment. In this chapter, we explore how UX approaches to program redesign differ from existing approaches, and we forward the idea of program as "product" and students as "users" to theoretically ground this shift to UX-based research methods. (Masters-Wheeler and Fillenwarth 286)
Through this research, we demonstrate the value that UX-grounded research brings to program redesign, and we offer suggestions for initial and extended programmatic research based on the idea of students as users of programs. (Masters-Wheeler and Fillenwarth 286)
UX has the potential to illuminate the invisible or overlooked experiences of the users of an organization's product or service. Many programs include alumni when gathering feedback on program design, yet relying only on alumni may cause programs to miss current or prospective students' perspectives. (Masters-Wheeler and Fillenwarth 287)
From these examples, we can see how nonprofit organizations of all kinds can use UX to improve stakeholder experiences of their products, whether those products are informational materials, goods, or services. In higher education, our products are the programs that we offer to students. (Masters-Wheeler and Fillenwarth 287)
Programs are products associated with university brands, and they are marketed to prospective students who have many choices about where to enroll. It may be a question of semantics, as Eric Stoller (2014) argues, yet "calling ourselves anything but a business seems unfair and untrue. Students pay a great deal for the product that is higher education" (n.p.). Almost all students and/or their families contribute at least some of their own funds towards their higher education. Students pay for opportunities to take classes and earn degrees, and they should understand that paying for an educational experience guarantees neither that they will pass nor that they will find a job. (Masters-Wheeler and Fillenwarth 288)
Similarly, Bridget Burns (2016) has called for more institutions to adopt the practice of "process mapping" to improve student experiences. She makes the point that "[a]s consumers, we expect that retailers or service providers have designed the experience around the customer. We become frustrated when things are counterintuitive, bureaucratic, slow, difficult or painful. So why should we tolerate flawed processes that frustrate our students?" (n.p.). She gives examples of process mapping initiatives conducted by Georgia State University and Michigan State University that assisted students, especially first-generation students who lacked external support, with navigating university processes such as those surrounding admissions and financial aid. (Masters-Wheeler and Fillenwarth 289)
Viewing student experience as user experience forces us to view programs from a new angle. Adding UX to our continuous improvement practices can challenge underlying assumptions about what education means—in a beneficial way. Framing students as users can be a disruptive and innovative program design practice (Johnson et al., 2017). UX in program design positions students as active learners who already possess valuable knowledge sets, even as they seek more skills and knowledge from an educational program. (Masters-Wheeler and Fillenwarth 290)
Students are the users of our products—the educational experiences facilitated by our programs. Yet, our unspoken assumptions may resemble the reverse scenario—we may tend to regard students as the products of our programs. (Masters-Wheeler and Fillenwarth 290)
The continuous improvement and UX processes that we apply in program design can increase the quality of these educational experiences, but students ultimately determine how they interact with the product and how they use it in their lives and careers. (Masters-Wheeler and Fillenwarth 290)
As a first step in applying UX principles to program redesign, we now turn to our study of the various ways that students interact with a program's representatives, spaces, activities, and artifacts. These sites of interaction may be viewed as interfaces. Identifying these allows us to create a map of all the sites where students encounter the idea of an individual TPC program. These interfaces may fall into some of the "programmatic landscapes" as defined by Schreiber and Melonçon (2019); however, the focus for a UX methodology will be on how students experience these areas, which will be completely different from how a program administrator experiences them. (Masters-Wheeler and Fillenwarth 291)
While Guo's approach is geared towards business products, these four elements of UX may also be applied to the design of any system. Guo simplifies the purpose behind each element with a question: Value - Is it useful? Usability - Is it easy to use? Adoptability - Is it easy to start using? Desirability - Is it fun and engaging? (Masters-Wheeler and Fillenwarth 291-292)
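One way to operationalize Guo's four elements when drafting a survey instrument is to tag each item with the element it probes. In the sketch below, the element names and guiding questions are Guo's as quoted above, while the sample survey items are invented for illustration and are not the authors' actual questions.

```python
# Guo's four UX elements and their guiding questions, as quoted above.
UX_ELEMENTS = {
    "value": "Is it useful?",
    "usability": "Is it easy to use?",
    "adoptability": "Is it easy to start using?",
    "desirability": "Is it fun and engaging?",
}

# Hypothetical survey items tagged by element; real items would be program-specific.
survey_items = [
    {"element": "value", "item": "The program is preparing me for my career goals."},
    {"element": "usability", "item": "Required courses are offered when I need them."},
    {"element": "adoptability", "item": "Declaring the major/minor was straightforward."},
    {"element": "desirability", "item": "I would recommend the program to a friend."},
]

# Group items by element so each UX dimension can be reported separately.
for element, question in UX_ELEMENTS.items():
    print(f"{element.upper()} ({question})")
    for item in survey_items:
        if item["element"] == element:
            print("  -", item["item"])
```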
To explore Guo's four UX elements in our programs' interfaces, we developed and administered a survey to current students and alumni of our respective programs, which share some similarities and many differences. Gracemarie's program, at an R2 university (a regional comprehensive university only three years prior), has recently developed a minor, a certificate (for non-writing arts majors), and a concentration (for writing arts majors) in technical and professional writing. (Masters-Wheeler and Fillenwarth 292)
Through this survey, we seek to identify the multimodality of student interactions. Some interfaces are concrete artifacts, while others are immaterial: they involve exchanging ideas about the program by talking to people and participating (voluntarily or involuntarily) in experiences. We must also keep in mind that the interfaces through which students encounter our programs involve varying degrees of programmatic involvement (and therefore control). (Masters-Wheeler and Fillenwarth 293)
The first set of survey questions asks respondents four questions about their institution, their major or minor area of study, and how far they have progressed in their program. In our analysis of survey results, we evaluate significant differences between the answers of current students and alumni across both institutions. Only responses from current students and alumni of each institution's technical/professional writing program are considered valid. (Masters-Wheeler and Fillenwarth 293-294)
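For the kind of group comparison the authors describe, a chi-square test of independence is one conventional option for categorical or Likert-style responses. The sketch below is illustrative only; the data file and column names are assumptions, not the authors' instrument or analysis.

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical survey export; columns assumed: respondent_type in
# {"current", "alumni"} and ease_rating on a 1-5 Likert scale.
responses = pd.read_csv("survey_responses.csv")

# Cross-tabulate respondent type against ratings, then test whether
# rating distributions differ between current students and alumni.
table = pd.crosstab(responses["respondent_type"], responses["ease_rating"])
chi2, p, dof, expected = chi2_contingency(table)

print(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f} (dof = {dof})")
```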
This UX survey gauges general attitudes about the program's worth or usefulness; its primary focus is not the evaluation of program content. Survey questions about perceived value from students' perspectives should complement, not replace, alumni and employer surveys that help determine which TPC curriculum areas are valuable in professional settings. (Masters-Wheeler and Fillenwarth 294)
We adopt Guo's "usability" category because we want to measure how easy it is for students to "use" our programs without encountering practical or logistical problems. This category is separated from value and desirability, although somewhat arbitrarily, since user perceptions of value and desirability cannot be completely divided from the more practical aspects of use. (Masters-Wheeler and Fillenwarth 294)
UX methods that focus on "usability," understood as students' "use" of a program, can help us to identify and remove any barriers that may hinder students' progress through these requirements. Our survey asks five questions concerning usability that focus on how easily students progress through the program. In contrast to the adoptability section, which deals with how easy it is for students to learn about and enter the program, this section focuses on students' progression as program users. (Masters-Wheeler and Fillenwarth 295)
This section asks five questions that help to establish how students perceive what we have called the program interfaces—the sites where the idea of the program surfaces for students. Becoming aware of the program and what it entails allows students to evaluate whether it suits their needs. This section of the survey also measures how easy students thought it was to join the program by declaring it as their major/minor (or certificate, if applicable). (Masters-Wheeler and Fillenwarth 295)
The element of desirability involves students' satisfaction with the program. Education is not entertainment—it is not supposed to be "fun." Nonetheless, as we address in our discussion of UX methods, there may be ways to evaluate whether students are engaged and satisfied that go beyond data usually gathered through traditional course evaluations, which come with their own controversies about gender bias, racial bias, and general ineffectiveness in evaluating personnel. (Masters-Wheeler and Fillenwarth 295)
One of the consistent findings of our survey was the importance of people (particularly professors and advisors) and artifacts as program interfaces. Professors were mentioned, often by name, in responses to questions about program value, and advisors were cited as an essential component of ease of use. Professors also played a large role in adoptability by introducing students to available programs, and advisors contributed by helping students complete the steps of formally adopting their program. (Masters-Wheeler and Fillenwarth 302)
Artifacts also played a large role in respondents' experiences of their programs, particularly in the areas of adoptability and ease of use. Though students' first exposure to the program was typically through a person, artifacts more commonly provided information throughout a student's time in the program. Because artifacts also came up, in the ease-of-use questions, as a highly requested way to clarify program requirements, we need to consider seriously the role of documents in helping students understand and navigate our programs. (Masters-Wheeler and Fillenwarth 302)
Questions regarding content, design, and access to these artifacts would all be relevant. Participatory design projects, as described by Salvo and Ren (2007), could follow, perhaps assigned as course projects where students would develop engaging and useful program artifacts. (Masters-Wheeler and Fillenwarth 302)
Implementing UX tools such as interviews, focus groups, or observations would be a starting point for additional research to help us learn more about student-faculty/advisor interactions. Methods such as think-aloud testing could also be implemented with faculty and advisors to help improve the usability of the documents from which they draw their knowledge. (Masters-Wheeler and Fillenwarth 303)
As a next phase of research, task analysis, which examines the actions users take as they work toward completing a task, would be a particularly helpful tool to implement. For our programs in particular, task analysis of advising and course selection would provide helpful insights into the ways that various people, artifacts, and experiences come into play as students navigate the course registration process. (Masters-Wheeler and Fillenwarth 303)
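To suggest what a task analysis of course registration might record, here is a small hypothetical sketch in which each step notes who acts and which interfaces (people, artifacts, systems) come into play. The step list is invented for illustration and is not drawn from the authors' programs.

```python
from dataclasses import dataclass, field

@dataclass
class TaskStep:
    """One step in a user task, with the interfaces it touches."""
    action: str
    actors: list[str]
    interfaces: list[str] = field(default_factory=list)

# Hypothetical decomposition of "register for next semester's courses".
registration = [
    TaskStep("Check remaining requirements", ["student"],
             ["program checksheet", "degree audit"]),
    TaskStep("Meet with advisor", ["student", "advisor"],
             ["advising appointment", "plan of study"]),
    TaskStep("Find open sections", ["student"],
             ["course catalog", "registration portal"]),
    TaskStep("Enroll and resolve holds", ["student", "registrar"],
             ["registration portal"]),
]

# Print the task as an ordered list of steps, actors, and interfaces.
for number, step in enumerate(registration, start=1):
    print(f"{number}. {step.action} | actors: {', '.join(step.actors)} | "
          f"interfaces: {', '.join(step.interfaces)}")
```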
While we could again speculate about students' reasoning for preferring one variation over the other, additional research could more productively illuminate students' perceptions regarding this distinction. Interviews or focus groups could be particularly helpful for learning more about major and course preferences. (Masters-Wheeler and Fillenwarth 304)
As our students, institutions, and worlds change, so too will student needs and experiences with our programs. For example, Christine's focus group finding about the value of physical artifacts reflects a pre-COVID-19 world. In a world in the midst of a pandemic (as of this writing), when students may not be physically on campus, such physical artifacts will obviously shift in importance in students' experiences. (Masters-Wheeler and Fillenwarth 306)
None of this is meant to discourage user research in the present moment or to demand incessant research that never allows us to make changes. It is simply to encourage faculty and administrators using UX-based approaches to programs to adopt an attitude of continual curiosity toward user experience, as advocated by Schreiber and Melonçon, and to be attentive to context and time when planning and analyzing data. (Masters-Wheeler and Fillenwarth 306)