My goals in Tech Fellows were to increase student engagement in class and to help students learn more about ways technology can be useful as they pursue careers as mental health professionals.
My classes had several old-school, paper-based activities in them. These activities are designed to encourage students to add them to their clinical toolbox; in other words, they are therapeutic interventions or tools that students would be able to use with clients if they had access to them. In addition, the transition from paper to technology would improve sustainability: the activities I use, such as card sorts, become fairly ratty after a few uses, necessitating replacement.
Additionally, by providing students with the technology to use these activities, students can revisit the activities for themselves, and the activities become much more customizable, fitting clients’ and students’ needs better.
When considering Bloom’s Digital Taxonomy, the purpose of most of these activities falls into two categories – Applying and Evaluating. First, students are expected to be able to apply the activities; then they are also expected to critically analyze them – which clients are the activities appropriate for? What modifications might make an activity applicable to different groups or individuals? From SAMR’s perspective, this is augmentation rather than modification. However, I’d like to think that with the increased use of tech in a classroom, some students will be able to engage in Redefinition – using tech to do previously unimaginable things!
I use a values card sort in my Ethics class. For a mental health clinician, self-awareness is of utmost importance: clinicians must be aware of their own values and the ways those values can intentionally or unintentionally influence the counseling process. By using a values card sort, students can clarify their own values, then reflect (through discussion and a paper) on ways their values might influence their perceptions of clients.
With much assistance from Lauren and Becky, I translated my old paper values card sort into Padlet, which allowed students to sort several values into one of four columns:
My plan B was to have a few copies of the paper versions, along with a paper worksheet, at the ready. Padlet provided several ways to share this template with the students, including an option that allows students to remake the template on their own devices. I could also email the link to students, embed it in Moodle, or even share it on Facebook (I chose not to do that…for obvious reasons).
I assessed the project informally. First, most students were able to use Padlet; only two of my students were not able to access Padlet via their computers. I’m still not sure why, but those students used the old-school paper cards to sort their values (yay for plan B). The rest of the students were able to use the Padlet version on their computers. Most students said they liked it. Two students said they would have preferred paper because there was something engaging about actually holding (“weighing,” as one student said) the values in their hands. The research on online vs. paper reading helps me make sense of this – there is a bit of evidence that pen-and-paper methods may be associated with better retention, but it is far from conclusive at this point. Personal preference has also been cited in the research: age and nationality tend to influence preferences (younger students tend to prefer screens).
To me, the project gave all of us (students and me alike) the opportunity to consider personal preferences and to reflect on how our clients will have preferences for paper or technology as well. Students came to the conclusion, on their own, that client preferences should be honored and that the Padlet version of the card sort is a little more pleasing to the eye and more customizable.
Similarly, I am also using Padlet to coherently organize all of our field placement listings for our master’s students. In the past, I simply sent out an email with attachments describing the various internship opportunities in the community. Lauren brilliantly suggested Padlet instead. Feedback has been overwhelmingly positive, as the listings are more organized, easier to access, and more aesthetically pleasing:
Project Reflections and Next Steps
I think the projects worked well, and I really like Padlet! The downside is that the free account allows only three Padlets, and now I want to use it for everything! However, there are so many other fantastic tech tools available that I am excited to explore other options.
For my next project (coming up in my class in about a week), I plan to use Google Maps to have students create community asset maps. Another option is ThingLink, but that will have to be for another time.
What I learned:
- I really like using tech and learning about tech to improve teaching!
- For the card sort project and the job announcement bulletin board, I think I have a good thing going. However, getting more formal student feedback is important and will very likely help me make the tools even more user-friendly for students.
More generally, I learned that I have to carve out specific time for learning and implementing tech in my teaching. Implementing the ideas can take time and sometimes involves a steep learning curve, but Lauren and Becky have been fantastic in providing refreshers as needed and moral support/encouragement (and coffee…thanks!).
My project incorporated two technologies, Zoom and Padlet, into my Critical Thinking in Psychology (PSY217W) course in the fall of 2018. The purpose of this course is to teach students the skills to engage in critical reading, writing, and thinking as they consume information within the discipline of psychology. The course also teaches students how to engage in scientific writing and adhere to proper APA formatting. In previous semesters, I would often use class time for students to discuss the psychological content they were consuming and to work on their APA formatting. However, I observed that students were quite reluctant to initiate or join in on the discussions. I also observed that students would often struggle outside of the classroom when working on assignments that required scientific writing and adherence to proper APA formatting. Based on these observations, I concluded that there must be a way not only to increase their participation but also to improve their performance on writing assignments. After being introduced to Zoom and Padlet throughout our Tech Fellows meetings, I soon realized how these tools could encourage students to be more engaged with the content and discussions and, as a result, improve their performance on APA writing assignments. Therefore, my project used Zoom to capture a screen recording of my computer as I completed a references section from start to finish while adhering to proper APA formatting. The second technology was Padlet, an online bulletin board that allows students to post in real time.
For both technologies, student accessibility was the factor I initially considered. The Zoom recording was uploaded to the course Moodle page. Padlet is a free app that can be downloaded on smartphones, which every student in the course had. A second consideration was to learn and become familiar with the technologies myself. Having used Zoom before to video conference, becoming familiar with the recording feature of Zoom did not take long. Although I had no prior experience with Padlet, I was able to play around in Padlet during our Tech Fellows meetings. Learning how to use Padlet was a relatively quick and easy process. A third consideration was how to get students to “buy in” and learn how to use the app in class.
A major course learning objective was the development of skills to properly write and format a references section. Creating a video tutorial for how to properly format a references section seemed like an appropriate and salient project. Other major course learning objectives were the development of critical thinking and communication skills. The use of Padlet in class afforded students low-stakes, low-pressure opportunities to practice and hone those skills, since responses in Padlet can be made anonymously.
While planning the Zoom video tutorial, I integrated two of the twelve principles of multimedia learning from Richard E. Mayer’s book Multimedia Learning (Cambridge University Press, 2001). The first was principle #5, the temporal contiguity principle, which states, “People learn better when corresponding words and pictures are presented simultaneously rather than successively.” The second was principle #11, the voice principle, which states, “People learn better when the narration in multimedia lessons is spoken in a friendly human voice rather than a machine voice.” A second model I integrated into this project was Bloom’s Taxonomy. As students first watch the video tutorial, they are understanding and making sense of the skills necessary for formatting references sections. The end goal of the Zoom video tutorial is to provide students with the skills necessary for creating accurate references sections in the future.
While planning the Padlet project, I integrated several components of the Substitution, Augmentation, Modification, Redefinition (SAMR) model. One component I incorporated was modification: using Padlet redesigned the task of communication from the traditional format of a student raising their hand to speak to one in which each student has the opportunity to communicate in an anonymous, digital format. Further, Padlet allows students to communicate in ways other than spoken words (e.g., pictures, GIFs). Allowing students to communicate in these alternative ways incorporates the redefinition component of the model; in previous courses, it would have been inconceivable to communicate through such mediums as emojis and GIFs.
For the Zoom video tutorial project, I recorded a practice run of myself going through the tutorial. I then watched this practice recording to pick up on any errors or mistakes I may have made, to make sure I was speaking loudly enough, that I was not hitting the keys too loudly, etc. After a few more practice runs, I recorded the tutorial that would eventually be uploaded to the Moodle page for the course. After I uploaded the video, I went to the Moodle page and switched my role to ‘Student’ so that I could see what students would see and ensure the video was properly uploaded and could be viewed without issue. Had there been any issues preventing the tutorial from being recorded or uploaded to Moodle, plan B was to conduct the tutorial in class with a video recorder capturing it, so that a recording of the tutorial would still exist.
For the Padlet project, I first had to go into the app and create a blank bulletin board. The template I chose was a stream, so that responses to the question could be presented in an easy-to-read, top-to-bottom format. After choosing the stream template, I edited the title to reflect a question: “What are your strengths and weaknesses as a psychology student?” Next, under settings, I activated the profanity filter, which replaces bad words with nice emojis.
Next, I sent out an e-mail to students a few days before the class in which we would be using Padlet. In the e-mail I instructed students to bring their smartphones to class and to download the free version of the Padlet app prior to class. In this e-mail I sent out links to download the Padlet app from either the Google Play store or the App store, depending on what kind of phone each student had. Lastly, I embedded the link to the stream template I created in the e-mail. That way, students could easily access the Padlet I created.
In case students experienced difficulties with the app on their phones, plan B was to access the Padlet using their laptops. Fortunately, all students were able to respond and post to the Padlet using their smartphones.
For the Zoom video tutorial project, I used a formal assessment: grading the references sections that students submitted throughout the semester. I would argue the assessment is both formative and summative. It is formative in the sense that, prior to submitting their first references section, students have been introduced to and learned about proper APA formatting; thus, when they submit their first references section, I am assessing how they are learning the material. It is also summative in that students submit references sections numerous times throughout the semester, with the expectation that their writing and formatting improve. Thus, when students submit their final references section, my assessment is summative in that I am evaluating how much they have learned about APA formatting over the course of the semester.
For the Padlet project, I used informal assessment. A main objective of this project was to increase student participation during class discussions. To assess this objective, I compared the number of responses to the same question (i.e., “What are your strengths and weaknesses as a psychology student?”) across semesters with and without Padlet.
In previous semesters without Padlet, this prompt would elicit an average of three to four responses. With using Padlet, the question now averaged 30 responses in a class of 20 students.
In previous courses, some students have expressed that they learn better through visual means. By accommodating these students, the Zoom video tutorial provided them with a useful resource for learning.
The Padlet project was valuable in that the app provided students an opportunity to not only be exposed to the thoughts and feelings of their classmates, but to be able to express their thoughts and feelings to their peers as well. These opportunities do not always readily present themselves in more traditional communication formats.
Project Reflections and Next Steps
One thing that did not work for the Zoom video tutorial project was being able to view how many times students accessed or watched the video. This was due to the way I uploaded the video file onto Moodle. Next time, I will create a Panopto activity in the course Moodle page, which will direct students to the video tutorial. That way, I can generate activity reports on how many students have watched the video.
Another thing I would change is how I assess the effectiveness of the video tutorial. At the beginning of the semester, prior to loading the video to Moodle, I can assess students on their skills and knowledge on APA formatting. Then, I can load the video to Moodle and re-assess the students on their APA formatting knowledge and skills a few weeks later. By incorporating the activity reports of the video tutorial into these analyses, I can assess how much of an impact watching the video tutorial had on their skills and performance.
A potential future modification of this project is to incorporate the video tutorial into an EdPuzzle. Not only does this provide an opportunity for a formative assessment of their skills, but also makes the tutorial interactive. A second potential modification is to break up the tutorial into segments. With a total running time of nearly 30 minutes, it might benefit students to view this tutorial over several videos that are shorter in duration.
For the Padlet project, one thing that did work was the number of responses received from students. Despite the frequency of responses, what did not work was how little students elaborated on their responses. In the future, I will encourage students to elaborate on their responses. One way to modify this project in the future is to change the layout of the Padlet to perhaps a canvas or grid, depending on the nature and context of the question.
One big lesson I learned from these projects is that encouraging students goes a long way toward getting them to buy into and utilize the technologies. For the tutorial video, I made somewhat of a fuss about it, making a point to let students know about this particular resource compared to the numerous other resources they have access to on the course Moodle page. Had I not acknowledged this video and drawn their attention to it, I am not sure how many students would have used it or perceived any value in it. The same logic of getting students to buy into the technology applied to Padlet. Prior to using Padlet in the course, most (if not all) students had no experience with this technology. Encouraging students to work through the (relatively quick) learning pains of using the app paid off, as the class quickly got the hang of it. Within minutes, students were responding with GIFs, emojis, and pictures.
Introduction forums are a common practice in online courses. Students present themselves personally and professionally through the written word, which can lack the intonation, emotion, and personality that are key elements of a live introduction. I explored multiple methods of audio/video presentation and tried VoiceThread a few times; too often the VoiceThread technology was challenging for students and became the focus of the task. I sought a simple method that required minimal technology experience but provided a video that could be used in lieu of a written introduction in the course. Since most students are comfortable using their smart devices for video recording, FaceTime, and other video-related applications, it seemed feasible to seek a tool that could be used from their phone or another device.
Flipgrid is a video discussion platform that is simple to use and can be accessed through any smartphone, tablet, or computer. It gives students a voice and gives educators a means of creating a communicative learning environment. Students can respond to videos posted by the instructor or by peers by reacting, responding, and sharing their own videos. Flipgrid has many features with which students are familiar, thus leveraging the elements of social media to engage students in the classroom and promote communication.
This video method of introduction should be more effective in humanizing the online class. Students can connect on a more personal level which is usually lacking in the written introduction. This personal introduction could enhance their interactions in weekly activities like discussion forums and encourage richer dialogue.
For my project, I incorporated Flipgrid into online RN-Bachelor of Science in Nursing (RN-BSN) and Master of Science in Nursing (MSN) courses I taught in the Fall 2018 and Spring 2019 semesters. I also used the Flipgrid introduction approach with first-year undergraduate Pathways to Nursing (PTN) students in Summer 2018 prior to the students meeting in person at the orientation.
The initial step in planning this process was a literature search to determine current best practices. Using Bloom’s Digital Integration Model, I also determined the focus should be at the highest level—creating. Written introductions engage a lower order of thinking, while creating and viewing introductory videos should be a more effective modality because it utilizes higher-level thinking. After this initial investigation, I sought a technology tool that would be most effective.
With support from the Instructional Technology team, I narrowed my search and focused on technology applications which were simple, functional, and compatible with Moodle. Flipgrid seemed to be the best choice. I embraced the tool and sought to master it prior to implementation. Microsoft recently purchased Flipgrid and improved the already successful tool. Flipgrid has excellent learning materials for both students and teachers. I took advantage of all that was offered and became certified as a Flipgrid instructor. Through this process I mastered the use of the tool and planned to incorporate it into the classes I taught last fall.
Retrieved from https://flipgrid.com/
The implementation process was bifurcated. I produced a Flipgrid introductory forum for two online courses in the RN-BSN and MSN programs. Then I created a Flipgrid forum for incoming first-year on-ground students.
First, I created an introductory Flipgrid forum for my online students. Students in the RN-BSN and MSN courses had experience with written introductory forums. I prompted them to introduce themselves both personally and professionally and discuss their intended outcomes for the class. The students created videos that were personable and demonstrated their true personality. Their peers were able to add comments or videos in response to the introduction. As the instructor, I found the introduction more personal and emotional. It provided the opportunity to meet them virtually, respond, and appreciate their intended outcomes for the class.
Second, I sent a Flipgrid link to all incoming first-year students to create a video and meet each other virtually. The first-year nursing student orientation is limited to two hours, and the personal introduction of each of the 32 students would have left little time for the presentation of pertinent information. PTN is an undergraduate program where nursing students spend their first year on the Chatham Shadyside campus. The second and third years they attend UPMC Shadyside School of Nursing (SSON), and they complete their degree with the online RN-BSN program their senior year. Success in nursing school requires collaboration with peers through mentoring and study groups. It is imperative that students know who their peers are and develop friendships with an academic partnership as soon as possible.
Several weeks prior to starting at Chatham, the PTN students were sent a link and asked to create an introductory Flipgrid video. They were prompted to introduce themselves personally and share why they want to become a nurse. Most students embraced this opportunity and videos were created all over the United States. One was created in Finland where the student was visiting prior to starting college.
Example of a PTN Flipgrid introduction provided with permission from the student.
Summative assessments were conducted using Qualtrics surveys a few months after students engaged in the Flipgrid introductory activity. Both the online RN-BSN and MSN students and the undergraduate PTN students found value in the activity. As indicated formally through the Qualtrics survey and informally through personal discussion, the project provided the students a means to actively engage in introductory forums that were personal and enhanced communication.
RN-BSN and MSN Application
For the online RN-BSN and MSN courses, students were surveyed and asked to compare their Flipgrid introduction experience with the usual introductory forum using the written word. Qualtrics software was used to survey the students who participated in the Flipgrid introductions. Since some of the students might not be technologically savvy, they were asked if Flipgrid was easy to use and helped personalize the course.
The students found the videos more personal than the written introductions. They indicated the technology tool was simple to use and found it more impactful than the written introductions. They were able to know their peers on a more personal level, which many found an asset to class activities like discussion forums.
Question: Did you find Flipgrid easy to use?
Question: Do you feel a video introduction personalizes this course?
Pathways to Nursing (PTN) Application
Flipgrid was employed with the first-year nursing students who had never met before. They created the videos prior to the live orientation the first week on campus. The students posted videos that truly demonstrated their personality. They were prompted to discuss why they chose nursing as a career and the responses were passionate and varied. When the students entered the orientation room, they recognized each other and remembered key elements of each other’s videos. In many cases, it was as if they already knew each other. This made the orientation flow more easily and the presented information was better received.
Students were surveyed a few months after the Flipgrid introductory activity using Qualtrics. Two questions addressed the ease of use and the effect on increasing their comfort in starting college. The students found Flipgrid easy to use and indicated it helped reduce the anxiety with starting their college career. They found things in common with their peers and several of the students continued friendships and academic relationships. These relationships were the start of activities like study partners and groups and peer mentoring.
Question: Did you find Flipgrid easy to use?
Question: Did Flipgrid videos increase your comfort level for starting your college career?
Project Reflections and Next Steps
I plan to continue using Flipgrid in my course introductory forums. The personal introductions promoted a sense of community amongst online students who never meet in person and on ground students prior to meeting in person. Students found the tool easy to use and effective in personalizing peer interactions.
Moving forward, I want to use Flipgrid as a type of formative assessment. Formative assessment is a common application for Flipgrid that has been used by teachers at various levels of education. This pedagogical approach allows educators to assess student learning in a unique and personal manner. Additionally, the tool is commonly used to gauge how students are feeling about the class and where they want to progress. They can connect class content with their own experiences. Since written words are often misconstrued, video assessment offers an opportunity to give feedback that is genuine and carries the positive attributes of the spoken word. There are many uses for Flipgrid in the assessment process, both formative and summative. Educators have access to a website, webinars, and personal support that afford them the opportunity to use Flipgrid in their classrooms in a multitude of ways. Ultimately, I hope to inspire other educators to use Flipgrid or another form of video technology to create a sense of personalization and community in their courses.
Dr. Pierette Appasamy’s article Fostering Student Engagement With Digital Microscopic Images Using ThingLink, an Image Annotation Program is published in The Journal of College Science Teaching.
This publication is a result of her tech fellows work with ThingLink and other technologies.
The Hechinger Report published a column on feedback called Has video killed the red grading pen? Both Monica Riordan and Meigan Robb were interviewed and quoted in the article. Faculty feedback is a topic that both faculty addressed in their Tech Fellows projects.
We are proud of their work and thrilled we get to work with such great faculty!
My project involved incorporating the Mission: Zhobia simulation-game into POL 302 Ethnic Conflict. Mission: Zhobia is a web-based simulation that places the student in the position of a non-governmental organization (NGO) worker tasked with developing a plan to reconstruct the justice system in the fictional conflict-ridden country of Zhobia. Students must perform background research into the conflict, interview a variety of political and social stakeholders, and ultimately provide recommendations regarding the location of the court, the training structure for legal staff, and the underlying legal framework for the new justice system. Upon submitting this report, the students are then told what the after-effects of their plan were.
As the Mission: Zhobia simulation is a free, pre-packaged website, there was little planning involved in its implementation from a design standpoint. The simulation came at the end of the introductory unit of the course, where we focus on different institutional design issues in managing ethnic conflict situations. The lectures leading up to the simulation consistently stress the trade-offs one must consider in designing post-conflict political institutions and the notion that no single institutional structure will solve all the problems.
I scheduled the simulation for the week following class readings and discussion of political institutions in conflict situations. We devoted the first one-hour-and-fifteen-minute class meeting of the week to students playing the simulation, and the second class meeting to a discussion and debriefing session. Students were asked to bring their own laptops to class that day, and everyone was able to do so. The only logistical concern was whether students would be able to complete the exercise within the hour-and-fifteen-minute period. All but two students were able to, and those students completed the exercise immediately after class.
I assessed the exercise with a combination of formal and informal tools. Formally, the students were asked to complete a questionnaire asking them to share specific decisions they made along the way and to reflect on their experience. Informally, we devoted the class session following the simulation to a discussion of the students’ evaluations and observations.
For the reflective portion of the questionnaire, the students were asked 1) what they felt were the three most important items critical to success in the simulation, 2) what they would do differently along their own path of completion, and 3) what they learned through the process that they had not realized beforehand.
In response to question 1, students consistently stated that gaining the trust of regional stakeholders was the most important element to consider. They observed that more decision options were presented to the player as they engaged with local leaders. The second most cited consideration was adequate background preparation. A number of students seemed to jump quickly into engaging the local leaders without doing their homework on the conflict in Zhobia. When leaders asked questions pertaining to the conflict and a student answered incorrectly, that student would lose trust, limiting their options down the line.
Question 2 yielded results consistent with the prior observation. Students said they would have spent more time on the background materials and preparing for the stakeholder interviews. They also indicated that they would have avoided decisions early on that restricted their options later in the simulation.
Finally, in response to question 3 the key lesson the majority of students took away was the difficulty in finding solutions to the conflict that all sides would find acceptable. In our class discussion, numerous students indicated that they had chosen options they felt were sound and yet found that the situation descended back into violence after their report was submitted.
Reflections and Next Steps
I feel the simulation was a resounding success. I was concerned that the game setting would appear frivolous in the context of a course with such a grim topic, and yet watching the students’ reactions while they were participating indicated that they were actively engaged. The assessment results indicated that the primary lessons were in fact taken to heart by the students.
It would be possible to run this simulation outside of class time and devote the same class time to post-simulation assessment activities; this is something I would consider next time. On the other hand, being available to address questions and potential technical glitches was also important. The in-class time restriction may have rushed some students, yet I found that one hour and fifteen minutes was adequate for the background research and questioning, and most students completed the simulation within that time.
In the final analysis, I found the Mission: Zhobia! simulation to be a very useful classroom exercise. The simulation showed students through experience what I could only stress in lecture: the importance of preparation and openness of dialogue when engaging in peace-building measures. Numerous students indicated that they went back and played the simulation multiple times to try to achieve a perfect score, further evidence that they themselves found it an engaging experience. I look forward to running this exercise again the next time I offer POL 302 Ethnic Conflict and comparing that student experience with this one.
For my project I used EdPuzzle in the NUR411 Geriatric Nursing class to give students a different perspective on loss, how difficult loss can be, and to help them experience some of the feelings their patients or patients' families experience. The essential idea is that, by incorporating EdPuzzle and changing how learning activities are presented, students will engage more in learning activities and increase their course learning. The provocative question is: "Will presenting course activities in a different format result in increased student engagement and learning?" With this activity, students should be able to display increased knowledge of the topics being covered.
When considering a project for Tech Fellows, I was in the midst of revising the NUR411 course and wanted to continue incorporating death and dying in the course. The course was becoming heavy in topics, with a lot of reading and question-answering involved. I wanted to change that and give students an activity that did not involve reading articles or websites, while still being interesting and resulting in learning about the topic. I had an idea for an activity to incorporate into the class, and EdPuzzle was the best way to do it. EdPuzzle allowed me to present a scenario that students could immerse themselves in and experience the feelings that accompany a loss at the end of life. I could not find a video for this specific type of activity, so I wrote a script and recorded a video myself, building in question/answer segments to help guide students through the activity.
While planning for this project I considered the type of activity to incorporate in the class, how long the activity should take, the potential outcomes of the activity, and any follow-up that might be needed from the activity.
This activity was not incorporated to meet any specific course or learning objectives. The activity was designed to help students engage in a class activity and gain knowledge from the activity.
Once the video was made and the EdPuzzle created, with questions incorporated as appropriate, directions were written and everything was inserted into the course shell. Several colleagues were asked to trial the EdPuzzle, using the directions as written in the course, to ensure the quality of both the EdPuzzle and the activity students were required to complete. Once all was in place, the activity was opened to students at the appropriate point in the class.
A Plan B was not in place, as all trials indicated that the EdPuzzle would work in the class. If the technology had not worked, I was planning to offer students journal articles or other readings on the topic. If students did not like the activity after completing it, another method of approaching this course material would need to be decided on.
After students completed the EdPuzzle activity in the course they completed the weekly discussion forum. One of the questions in the discussion forum focused on the topic of the activity: “Each of you were required to complete the Loss Exercise using EDpuzzle. I know this was a difficult exercise to complete and if you need/want to debrief contact me so we can talk. What are your thoughts after completing this exercise? How does this exercise change your thoughts about elders and the challenges they face? Is there anything else you’d like to share?” The responses were varied. Some students shared how they had never considered loss before or how they would feel facing this type of loss. Others shared that this activity gave them a fresh perspective that they can use to empathize with their patients, the patient’s family, or their own family. Overall responses were positive and displayed engagement and knowledge gain.
To assess the EdPuzzle itself, the following questions were asked: "Feedback questions – your responses would be appreciated: What are your thoughts about EDpuzzle (not the specific focus on end-of-life and the dying process but the mode of education)? Is this an effective learning tool? Would you recommend this tool be used more frequently in the RN-BSN Program?" The vast majority of students responded to the feedback questions and stated that they enjoyed the activity. They felt it was an effective learning technique and a refreshing departure from the typical pattern of reading journal articles or websites and responding to essay questions. Students enjoyed the immersion in the activity and the personal feelings that came from completing this type of activity. Some students did struggle with the directions, so the directions were reexamined and edited for future classes.
Overall I am extremely pleased with how this activity turned out. Students engaged in the activity and displayed new levels of knowledge and learning. Feedback was positive and encouraging for further technology uses. Some students requested further technology incorporation into the courses to move away from typical course set up.
Reflections and Next Steps
Overall everything worked. The activity has now run twice and both trials have been successful. I anticipate continuing to use the activity, as is, in the course for the foreseeable future.
I learned that the majority of students are eager for a different type of learning activity and they will engage in one if offered. This met my goal of increasing engagement and having students obtain a higher level of knowledge. My next steps are to look at other topics in the course and determine if there is a type of technology that can be incorporated to increase engagement and student interest there, as well.
This project utilizes the Feedback Survey feature in Moodle for students to assess their levels of confidence in various aspects of the Evidence-Based Practice/Process (EBP), including objectives contained within the Accreditation Council for Occupational Therapy Education (ACOTE) Standards. An accompanying One Sheet assignment challenges students to address the results of the self-assessment. Through a pre-/post-test design, students' level of confidence is assessed before and after completion of the One Sheet assignment.
I selected this project because of the timeframe in which the Tech Fellows presentations were scheduled to occur. Though I have other courses that would also benefit from additional use of instructional technology methods, a project that was focused on the EBP courses was more feasible, most notably as it related to implementation.
Because a large portion of the EBP series involves student group projects, assessment of individual learning, particularly in the application of EBP processes, is challenging. Although the group projects address clinically relevant content that requires an exploration of available evidence, the skills needed to complete the EBP process vary in complexity. Identifying individual strengths within the group at times results in students performing only the tasks aligned with their areas of strength. This may facilitate a more successful group process and product, but it is not optimal for individual skill building and learning. This project utilizes each of the course objectives from EBP I and applicable objectives from EBP II, which are based upon the ACOTE Standards. For the purposes of this project, these objectives serve as the primary items for each student to individually self-assess their level of confidence. Some of the objectives are knowledge-based, while others require active demonstration of skill (for example, the ability to articulate an EBP concept).
Because the Master of Occupational Therapy (MOT) program curriculum sequence is based upon the concepts of skill acquisition, development, and active learning, those concepts also serve as the foundational underpinning of this project, both for the assessment components and for the process of creating the One Sheet.
This project was developed and implemented as follows:
- Consultation with Lauren Panton regarding project concept/idea occurred.
- Confirmation of concept from Lauren Panton was received during this consultation.
- An individual exploration of the Feedback feature in Moodle was undertaken to determine applicability to this project.
- OTH612/EBP I, applicable OTH628/EBP II, and ACOTE Standards were collected.
- Each objective was formulated into a question beginning with "How confident are you in …" (actual examples will be provided).
- A total of 26 questions were loaded into the Feedback feature in my Sandbox.
- Consultation with Becky Borello occurred in order to trial the administration of the Feedback survey and to determine the reporting features associated with Feedback.
- A workaround was identified in order to achieve numerical data in Excel for purposes of analysis (utilization of the Find and Replace function).
- Students completed the Feedback survey, having been instructed to complete it outside of class, individually, and to base their responses on their own reflections and on any feedback they had received from any professor since their admission to the MOT program.
- All data was converted to a numerical score in an Excel file.
- Student averages were calculated.
- Individual scores were examined for the associated content area.
- Once all students had completed the Feedback survey, the students were randomly paired with a peer for the One Sheet assignment.
- Students received instruction on the One Sheet assignment while in class.
- A rubric was provided (Actual example will be provided) to facilitate the content level of the One Sheet.
- Students were asked to identify their lowest scored (based upon their self-assessment in the Feedback survey) items/content areas.
- Students received their responses via individualized email from the instructor.
- Students prepared the One Sheet with their peer which contained two items/content areas.
- Students had one week to create and submit their One Sheet.
- Submission deadline was set for an hour prior to the scheduled class.
- Students completed the Feedback post-survey while in class.
- All data was converted to a numerical score in an Excel file.
- Pre/post data was analyzed for the presence of change in scores and for changes in the individual average score (pre to post).
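The Find-and-Replace workaround in the steps above converts Likert labels to numbers by hand in Excel; the same mapping can also be scripted. The sketch below is purely illustrative and makes several assumptions not drawn from the project itself: the label wording, the 5-point mapping, and the CSV layout of the survey export are all hypothetical.

```python
# Illustrative sketch only: map hypothetical Likert labels from a survey
# export to numeric scores, then compare each student's pre/post average.
import csv
import io

# Assumed 5-point mapping; the actual survey's labels may differ.
LIKERT = {"Not at all confident": 1, "Slightly confident": 2,
          "Moderately confident": 3, "Very confident": 4,
          "Extremely confident": 5}

def student_averages(csv_text):
    """Return {student: average numeric score} from one survey export."""
    averages = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        scores = [LIKERT[v] for v in row.values() if v in LIKERT]
        averages[row["student"]] = sum(scores) / len(scores)
    return averages

# Toy pre/post exports for one student with two items (hypothetical data).
pre = student_averages("student,q1,q2\nA,Slightly confident,Moderately confident\n")
post = student_averages("student,q1,q2\nA,Very confident,Extremely confident\n")
print(f"{post['A'] - pre['A']:+.2f}")  # change in average confidence, prints +2.00
```

Scoring in a script rather than by Find and Replace also makes it easy to recompute averages by item or content area when the survey is rerun.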
This project contained formal assessment. Because of the pre/post design and the use of a Likert scale to assess self-confidence, scores could be analyzed by individual item or by content area, and could further be assessed in terms of change over time.
This project provides both formative and summative assessments specifically related to students’ level of confidence in specific evidence-based practice/process skills. Students are encouraged to reflect upon their knowledge-base and EBP skills and are also able to compare their change in levels of confidence.
This project also gives the instructor valuable information that may impact the information that is presented within the first course within the EBP series, how learning is assessed and what additional assignments might be used to facilitate active, self-directed learning.
Students also benefited from this project, in that 35 out of 39 students demonstrated an overall increase in their level of confidence upon completion of the pre/post Feedback Survey and the One Sheet assignment.
Reflections and Next Steps
The concept of using a survey to assess students’ level of confidence worked well. The ability to compare pre/post scores by individual item, by content area, and overall average contributes to the usefulness of this project. Utilization of the One Sheet, as well as the pairing of students, also contributed to the usefulness of this project.
Moving forward, it would likely be beneficial to utilize this survey at the very beginning of EBP I to get a baseline on students’ level of confidence. This might also result in changing the emphasis on various content areas/objectives.
Pairing the students on an assignment that requires selection and discussion of content areas, creativity, examination of additional resources, and development of individual, measurable goals encourages self-reflection, self-directed learning, and accountability.
I would likely utilize a different avenue within Moodle to administer the survey (such as Questionnaire) because of the limitations of the Feedback option on students’ ability to retrieve their submitted ratings. Otherwise, I believe that the overall project was a success and would definitely utilize the various components of this project again.
An additional point I've learned, again, is that there are many ways to utilize technology to enhance teaching/learning opportunities. Personally, I believe that my own growth (and level of confidence!) related to the use of technology has been noteworthy. The anxiety that initially presented itself in contemplating the use of various forms of technology has been remediated by a growing knowledge base, an expanding skill set, and the knowledge that persistence and patience (primarily with myself) will lead to great opportunities as an instructor and as a life-long learner.
Several additional notes: I want to express my thanks to Chatham University for providing such a wonderfully generous program. Without the time and the actual technology ‘equipment’ provided, it would be quite challenging to make the gains that I’ve experienced. The ability to interact with colleagues, my fellow Tech Fellows, and my Tech Fellow buddy regarding specific application of technology to the classroom has been very helpful as well.
Beyond that, a point I have learned numerous times over since I joined the Chatham community: Lauren and Becky are amazing teachers, guides, and facilitators, completely willing to problem-solve alongside you, provide direction, and offer numerous tips, strategies, and presentations, all to facilitate the successful use of technology for instructional design and implementation. Their calm, kind approach, combined with their sense of humor, has really encouraged me to keep taking my next steps in exploring technology. This has been a great triumph for me professionally, but also a wonderful personal achievement. My deep gratitude to Lauren and Becky.
My primary goal was to increase student learning and engagement by integrating innovative ways to communicate and disseminate information through exploring technology tie-ins to instruction, assessment, and presentations. I wanted my courses to provide an opportunity for students to explore culturally responsive educational practices and professional development learning experiences through social media platforms that are best for teachers to communicate with their future students, families, and the community.
Learning is no longer only defined by time and place. A wide variety of digital networks, platforms, and content resources are being created to personalize learning to ensure that the interests and values of the whole child are being met. I approach my work to transform teaching by equipping students to work in diverse learning ecosystems with children and their families. To date, I have used Sway, Twitter, video and audio equipment, and Reality Works Real Care Baby Simulator to increase student engagement through active learning and reflection.
During the Summer of 2017, the Technology Fellows summer professional development series provided exposure to instructional technologies and time to reflect on how we would redesign our courses with instructional supports and technology enhancements. We learned about a myriad of technology platforms, applications, and software tools to increase student learning and engagement. As I began to explore and try out the new technologies through trial runs, I could not settle on just one technology tie-in to integrate into my courses. Therefore, I incorporated a multifaceted, interactive technological approach into my courses.
During Fall 2017, I taught a new course online, EDU 606/607: Child Development and Adolescent Learning Theory. This course focuses on child development in the context of social, cultural, and instructional settings and how these factors play into learning theory. Students apply knowledge of developmental stages to create authentic classroom learning environments that are healthy, respectful, supportive, and challenging by utilizing culturally responsive instructional techniques. I used the interactive platform Sway to make the information and text more engaging. The purpose of Sway is to convey concepts quickly, easily, and clearly. Unlike PowerPoint, it is primarily for presenting ideas onscreen rather than to an audience. Sway allowed me to integrate text, articles, and video into one interactive document so students did not have to click on multiple documents or folders to access the weekly course content.
Below are examples of Sway documents that I created for class.
(Click on the pictures below to view them larger)
This Spring 2018, the two courses that I decided to focus on were EDU 105: Child Development Birth through Grade 4 and EDU 505 Issues of Poverty and Race in Education. I partnered with Saturday Light Brigade to provide the audio recording equipment for my students to record their learning experiences. SLB Radio Productions, Inc. (SLB) uses radio and audio to encourage, amplify, share and archive the ideas, stories, and feelings of children, youth and families. They provide an innovative method to promote increased creative expression, critical thinking, and technical curiosity.
EDU 105 is a required course for PreK-4 education undergraduate students; however, it is open to other undergraduate majors as well. This course focuses on child development in the context of social, cultural, and instructional settings and how these factors play into learning theory. Students apply knowledge of developmental stages to create authentic classroom learning environments that are healthy, respectful, supportive, and challenging by utilizing culturally responsive instructional techniques. I integrated the Reality Works Real Care Baby Simulator project to provide a simulation of how a real baby acts. The students had to care for the baby as if it were real over a 48-hour period: they had to change, rock, feed, and burp the baby.
Below are pictures of our project (Click on the pictures below to view them larger).
EDU 505 is a course taken by secondary education graduate students that focuses on the characteristics and effects of poverty and race in education through examining types of poverty and racial biases in schools, as well as the impact of poverty on cognitive and physical development.
I wanted to build upon the social media engagement that I incorporated into this course the previous semester. In this course, the students had to write weekly reflections on Twitter after each class session. We also held a culminating course Twitter Chat, which will be held again this semester.
The students share strategies they learned during the semester to equip educators, parents and the community in ways to address issues of poverty and race in education. You can search #GOODChat505 on Twitter to see our class reflections.
Below are examples of our Twitter posts (Click on the pictures below to view them larger).
EDU 606/607: Child and Adolescent Development and Learning Theories
The students completed an open-ended questionnaire at the end of the semester to gather feedback on using SWAY for their weekly class information and assignments. The questions focused on the effectiveness and ease of the interactive document and whether or not it increased their engagement in the online course. Most of the students thought the Sway format was simple to navigate, appreciated the flow of the assignments and graphics, and indicated that they would consider using the tool in their own classroom teaching.
Please describe your thoughts about using the interactive SWAY’s that contained the weekly module focused content.
• Was the tool effective?
• Was it easy/hard to follow?
• Did it increase your engagement in the course?
• Did you enjoy learning this way? (Using One interactive Word Document)
• Should any content have been added to or deleted from the SWAYs? If so, please explain.
• Would you consider using a SWAY document in your classes as a classroom teacher?
EDU 105: Child Development: Birth through Grade 4
While engaged in the baby simulation, the students had to document their experiences through pictures, journal entries, and Vlogs. I was able to monitor their progress through the Reality Works Real Care Baby Simulator software. The students had to discuss and present their experience in class according to the learning outcomes.
EDU 505: Issues of Poverty and Race in Education
The students were encouraged to tag the authors of our course readings and other educational leaders in their Twitter postings, to share their learning or to ask questions, whether for clarification or rhetorical. Although some students were reluctant at the beginning of the course, many ended up embracing this reflective learning exercise. They became excited and more engaged when their Tweets were "Liked" or retweeted by educational scholars, legislators, or community leaders. Our class Twitter Chat for this semester is scheduled for April 2nd.
Reflections and Next Steps
In reflecting on the interactive technology tools I integrated into my three courses over this past academic year, I believe there was greater student engagement because of the hands-on software and digital educational resources that were used. I also appreciated learning ways to keep myself engaged and excited in teaching the courses. The Sway document was not only a success with the students but also helped me organize my course materials more efficiently. The baby simulation project provided a project-based learning experience that is essential for education majors on their journey to developing competencies in teaching. The use of Twitter allowed me to connect my love for the social media platform with increased student engagement as students shared their learning publicly, and I was able to instantly gauge students' comfort and knowledge after each class session. Sharing their thoughts and reflections via Twitter allowed the students to become part of an online community of learners focused on equity in education. I plan to continue using the Reality Works Real Care Baby Simulator and to expand the integration of Twitter and SWAY into my other courses.
This project involved a redefinition of existing assignments in an online MSN course (Fall 2, 2017 – NUR/HCI 503) and in an online RN to BSN course the subsequent semester (Spring 1, 2018 – NUR 412). Both assignments promote content learning. However, in order to better emulate the teamwork required for clinical practice, both projects were redefined to afford learners a transformational opportunity: a paired project within a collaborative learning space.
Specifically, both projects were redefined to give learners a more realistic experience of the collaboration and communication elements of teamwork required in actual clinical practice.
The original MSN assignment was intended to create a draft of an automated Electronic Report of aggregate dashboard data that could be used to assess patient outcomes or staff compliance within the clinical arena. Learners in this course were from both the HealthCare Informatics and MSN graduate programs.
The original RN to BSN assignment was designed to have students use a Quality Improvement tool (FMEA) to analyze a clinical quality or safety issue related to the process of care delivery. All online learners in this course were or had been practicing as direct bedside caregivers.
In redesigning the project, some elements that underwent consideration include:
- How will this type of assignment (project) be completed in the actual clinical situation?
- How should/could the students be paired for the assignment? (self-select; random; specific assignment methodology)
- Is it a reasonable expectation for students to complete a paired project, given the lifestyles of online and various level learners?
- How can student embrace of, rather than resistance to, the assignment best be encouraged?
- How can peer-to-peer communication best be cued?
- How can student choice be offered with a selection of collaborative learning space?
- In what format should the assignment be submitted (word doc vs link to actual space)?
- How will grading be accomplished? Individual vs. grouped?
- Is peer review needed to assess overall collegial workings (communication and collaboration)? How should that feedback be incorporated? What percentage of the project grade should it carry?
Both assignments previously existed in the courses to meet specific learning outcomes.
In the MSN course related to Informatics Foundation and Health Care Technology, several learning objectives were enhanced by project implementation. They include:
- Analyze current and emerging technology utilized at point of care within a structured health setting and within a virtual community environment that supports safe practice.
- Discuss opportunities and strategies for extracting data from various systems to assess patient outcomes supporting a higher level of evidence-based care.
- Identify how technology and electronic data sources can be used to evaluate the effectiveness of clinical prevention interventions that affect individual and population-based health outcomes.
In the RN to BSN course related to Nursing Communication & Quality Improvement, specific learning objectives were targeted and enriched. They include:
- Demonstrate leadership and communication skills to effectively implement patient safety and quality improvement initiatives within the context of the interprofessional team.
- Participate in quality and patient safety initiatives, recognizing that these are complex system issues.
- Apply concepts of quality and safety using structure, process, and outcome measures to effectively implement and monitor patient safety initiatives.
- Demonstrate knowledge of interprofessional roles communication and effective teamwork.
Related to the personal goals of the author, collaborative learning spaces had not been previously utilized, and offering this project created a forced opportunity to learn about them.
The Substitution, Augmentation, Modification and Redefinition (SAMR) Model was used in planning both assignment revisions. Instilling technology in the design of the teaching-learning process enhanced communication and collaboration skills.
Steps to completion (ins & outs):
- Understand the value of collaborative learning spaces.
- Revise assignment guidelines and rubrics.
- Create peer review form.
- Secure resources from Hoonuit for collaborative learning space and place in courses with clear direction.
- Craft detailed directions for pairings – who will do what, by when, and why.
- Reflect on and design a plan on when to offer repeated announcements and encouragement for project completion.
Plan B was to revert to individual assignment as per previously used in both courses.
Both projects were summatively assessed in formal fashions.
The Peer Review form, required for project submission, solicited feedback on five quantitative behaviors for evaluation and on qualitative questions specifically related to teamwork effectiveness, teamwork behaviors, and overall team functioning.
The value of the group assignment was overwhelmingly reinforced in both courses on these peer review forms. Interestingly, students did not always rate each other 5/5 on the quantitative behaviors. Qualitative comments included those related to: working well together; the need for flexibility, as there is more than one way to get things done; technology being invaluable; the need to utilize the strengths of team members; learning about constructive criticism; communication being key, with attention needed to the style used, as it is at the center of teamwork; the need to establish good rapport; the importance of trust and respect; the opportunity to practice leadership skills; enjoyment of working with someone from another profession; and the importance of clearly understanding others' expectations.
Course evaluations for the 503 course included qualitative comments indicating that learning to work in a group was one of the most important things learned in the course, and that students enjoyed working with others on a project that at first seemed overwhelming. 412 course evaluations are not available at the time of this writing.
Anecdotally, the quality of assignment submissions was elevated overall in the MSN/HCI course; likely not as significantly in the BSN course for this challenging QI assignment.
Project Reflections and Next Steps
What worked and what didn’t work?
Worked – Clear guidelines, expectations, and rubrics for assignments; communications related to the rationale for pairings; responding promptly and patiently to student inquiries; affording/encouraging students to hold one another accountable.
Worked, not so well – Not adjusting the peer review point percentages when moving from the MSN to the BSN course (though the math could still be calculated).
Continue to strive toward collaborative rather than merely group learning, where learners are mutually dependent on one another yet held individually accountable. Promote increased group effort versus a divide-and-conquer mentality, and encourage an emphasis on process in addition to the final product. With collaborative projects, learners co-create knowledge and meaningful learning.
What would you change for next time?
In MSN course – leave as is if appropriate mix of HCI and MSN students.
In RN to BSN course – provide weekly reminders about the upcoming project and specifically direct to communicate one on one with their partner early in the course.
How would you modify the project?
Consider student submitting a link to the collaborative learning space instead of submitting in Word; afford the opportunity to self-select partner; tweak guidelines to further promote collaborative project outcomes.
What did you learn?
Technology can be slowly integrated into assignments to afford learning by students and faculty alike; not all technology changes have to be flashy; the need to stay focused on learning outcomes and goals was reinforced; and not all digital-native students may be as technologically advanced and savvy as others.
My primary goal was to find a way to enhance student observation skills and the clinical decision making that stems from those observations. My target courses were our Foundations of Movement Science series (two courses – one in the spring of the 1st year and the second in the fall of the 2nd year of the program). Knowing that students in the first year of our program do not yet have extensive knowledge about specific pathology or disease processes, I created a set of sub-goals related to observation and to the anatomy and physiology knowledge they already possess:
- Improve student ability to identify and describe (orally or in writing) abnormal postures or movement patterns
- Improve student ability to hypothesize which body systems might be contributing to abnormal posture/movements
- Utilize evidence in decision-making
During our summer workshop, I worked with Lauren to look at various apps that would allow video analysis. I was looking in particular for an app that allowed slow-motion playback of recorded video and annotation of the video (voice recording over the video, drawing or highlighting portions of the video etc.). This was to enable students to watch movement at slower than normal speeds, with the hope that with practice throughout our program, they could start to see movement abnormalities in real time. I also wanted them to be familiar with an app that they might continue to use while treating patients.
Additionally, I was looking for ways to collect evidence in one central location so students could create a repository of evidence-based information that can be accessed later. My goal was to find something that allowed access after graduation, so I felt that Moodle was not the best option for this information.
After looking at several apps during our summer workshop, I decided to try Hudl Technique, a free app for video analysis. I particularly liked the ability to play 2 videos side by side (to compare movement at one time to movement at another time), the variable speed for slow motion, and the annotation features. Additionally, I decided to use Google Sites for compiling information, as students would have access to this after graduation; in addition, many students have personal Gmail accounts and already use Google Drive for class projects.
Although I wanted to start this project with the first year class, in the fall I only teach with second-year students. So I started project implementation in Foundations of Movement Science II because it occurs in the Fall semester each year. It was my “trial run” for the major implementation in Foundations of Movement Science I, which occurs in the spring. I first became familiar with the Hudl Technique app by recording and annotating video myself as practice.
I then decided which activity would use the app and redesigned the lab/activity to incorporate the new technology. The lab I chose is used to practice movement observation with students watching each other perform certain movements. I added an activity where students watched videos on the app and compared the person moving at one point in time to another point in time. While this person had a pathology (CVA) that the students had not studied yet, I had them focus solely on describing the differences they saw in the movements over time. During the first part of the lab, I took small groups of students into another room, connected my phone to the screen, and walked them through use of the app. I also scheduled assistance from Instructional Support on this day in case there were any problems, but for the most part, this went smoothly. Students were able to complete the activity in class and made some nice observations about movement by using the app.
In the Spring of 2018, I used the app again, but this time for movement analysis in Foundations of Movement Science I. This was the primary target for my Tech Fellow learning as we had been discussing redesigning some of the content already. My course redesign goal was to increase the amount of instruction about therapeutic exercise and also to increase evidence use when prescribing therapeutic exercise for strengthening. My tech fellow goal was to improve observation skills, which is essential for prescribing, teaching, and refining exercise with our patients.
Students were instructed to download the app and a short class session was placed on the schedule for learning about the app. I also created a demonstration video explaining ways that students might use the app for their assignment. The assignment was given at the beginning of the semester, which gave students approximately 6 weeks to complete it. Students were placed into groups to research a muscle group and find the EVIDENCE that says which exercises activate the muscle the best. They were also required to:
- create a video on Hudl describing the top 3 exercises for their muscle group
- discuss start and end positions for the exercises on the video
- discuss what they thought best posture and form were for the exercises
- use the slow motion and annotation features to enhance their teaching of the exercises
In class on the day the assignment was due, groups taught the exercises to the class from their videos. At the same time, the group and instructors circulated in class to watch and correct other students in real time.
Additionally, students had to create tables extracting the data about the exercises found in the literature. These tables were uploaded to the class website so that students have access to them when not at Chatham. A screenshot of the landing page of the student Google site is seen here:
I assessed student impressions of the Hudl app by using the Feedback feature in Moodle. 35 out of 38 students completed the survey (they were given a few minutes at the end of the next class to do so). The results are presented below for the second-year students who used the app during Foundations of Movement Science II.
I also asked students to reflect in a free text box on what they found most and least useful about the app. There were consistent comments that students liked being able to slow the video down, to rewatch as many times as needed without making a person repeat the movement over and over, and to watch close up on a personal device rather than on the big screen in class. The least favorite parts of the app were that some people had a hard time getting it to work, due to low phone memory or because it did not function on an Android device exactly as demonstrated on an iPhone, and that we only used it once in class.
Anecdotally, several students reported seeing this app used in the clinic for running analysis with clients.
I asked the same questions in my second semester of use to the first-year students in Foundations of Movement Science I. First-year students were more ambivalent than the second-year group about whether this app should be used more in class, which is interesting. Both groups thought the app was useful and helped learning. This group of students was also asked whether they liked the website where they could deposit information about exercise. Overwhelmingly, they liked this concept.
The narratives from the first-year students were similar to the second years’: they liked being able to slow down the video, having a video reference for exercises, and being able to voice over a video to explain what was happening. Their least favorite things were also similar and dealt with limitations of the technology (videos cannot be rotated, and portions cannot be cut out) and not being able to share as easily as they would have liked. Most students could not suggest a better website platform, although one person recommended YouTube as a better place for video storage.
Reflections and Next Steps
Hudl Technique Reflections:
- There is a time lag between sharing a video and having it show up on the app for students to see. Share video the night before it needs to be used!
- The free app does not allow more than 1 “team”… this will create a problem with each successive year because sharing a video will share it with students no longer in class. I could delete students from my team, but then they will not have access to the videos they used in class.
- The free Hudl app does not allow downloading of video that is shared (which would be a way to solve the challenge presented above). I wanted to upload video to the class’ Google site as well and cannot do that without downloading videos from the app. I could “film the film” as it is playing on my computer – but this is a time-consuming step and one I would rather avoid! There are subscription versions of the Hudl application, but they come at a cost.
Google Site Reflections:
- Was very easy to set up
- But – you need Google Drive for any pictures or documents that you are going to upload. This may lead to an eventual storage problem on my Google Drive if I continue this year after year.
- Determine what to do about the inability to download from Hudl Technique – either explore funding for a more advanced version, look at other apps, or have students create on Hudl and then download to their computers and share with me via email/Google Drive/YouTube.
- Look for other ways to use movement analysis app more than once during class (need to discuss with co-instructors and instructors of other courses in our program).
- See if students as they transition to Foundations of Movement Science II like the app even more because they are familiar with it.
- Look for other ways to utilize the app in class so that the learning occurs for more than one activity.
My goal has been to design the first digital humanities course for the History program, and I am teaching that course this semester. The course is titled HIS 309 Digital Local History, and in it, students learn about an aspect of local history, study some of the primary opportunities and challenges of using digital media to analyze and interpret histories, and then use available primary and secondary sources to create an online local history exhibit.
Digital Humanities has recently become an important subfield in multiple disciplines, including history. It encompasses using digital technologies in research as well as presentation of findings. In the field of history, scholars are increasingly relying on digitized texts and images in their research. More and more archivists are using optical character recognition software to translate typewritten documents of the past into searchable text for current researchers. And finally, historians and curators are creating online exhibits with the goal of stretching beyond the written word or the museum wall to online media that not only make their work more accessible to a broader audience but also incorporate new ways to visualize information and allow more user interaction.
All of this means that it is important for Chatham history students to learn about these developments, to learn some of the techniques of digital humanities, and to use new skills on projects of their own. Furthermore, this is another opportunity for students to make the transition from being consumers of information to historians in their own right. Finally, this course incorporates the project-based learning that is typical in many museums, archives, and historical societies doing the work of digital humanities.
In the summer and fall of 2016 as part of the Tech Fellows program, I researched digital technologies that students might use to create an online exhibit. Lauren Panton recommended Timeline JS by Knight Lab of Northwestern University as the umbrella tool for bringing various elements of the project together. Timeline can display photographs, images, infographics, and maps as well as play audio clips. Becky Borrello recommended a variety of platforms for the website including WordPress, Weebly.com, and Wix.com as well as a storyboarding technique for web design.
I chose an online textbook by Daniel Cohen and Roy Rosenzweig, titled Digital History: A Guide to Gathering, Preserving, and Presenting the Past on the Web, for the students to read to learn about the challenges and opportunities of digital history, the basics of planning an online exhibit, and the questions historians must ask themselves: http://chnm.gmu.edu/digitalhistory/
I also am having students read articles on local African American history as well as selections from David Kyvig’s Nearby History: Exploring the Past Around You. These additional readings should help to ground students in the secondary literature and give them ideas for finding primary sources.
Finally, I modeled the course schedule on a similar class being taught by one of my colleagues at Shawnee State University. Dr. Andrew Feight has been teaching a digital history course where students add to a growing digital archive of local records and photographs as well as a smartphone app that helps people explore local history. Feight uses project-based learning to encourage students to identify goals and learn the skills needed to meet the goal. The students also identify roles for themselves within the group and recognize the need for different people to have different and complementary strengths.
The course started in January 2016 and the students spent the early weeks reading local history, learning about potential primary sources in nearby archives (including our own), and discussing their own project. This project will focus on the history of Westinghouse High School and will incorporate the school’s “Wall of Fame”—itself an effort to preserve the school’s history—as well as oral history interviews that past Chatham students have collected.
The students only recently began the process of designing the website and gathering materials. They identified roles for themselves. To start, they all decided to explore individual topics: music, sports, women, civil rights, WHS in Homewood history, and education.
At the beginning of the process, I asked students to identify values for the group from then on. They identified values such as respect for participants, respect for the past, positive stories, and commitment to the project.
I also asked the students to imagine a process to hold one another accountable and to be the basis of grades. One student recommended progress reports, and I suggested they be biweekly. Another student suggested the final project be graded on three C’s: creativity, content, and citations.
During the project gathering phase, the only grade they receive is on their biweekly reports, but they get feedback from the group on their contributions and informal presentations.
Sam Houston State University’s Center for Project Based Learning recently identified common elements of all project based learning:
- There must be the presence of a driving question or central concept.
- Learning must occur through investigation of defined goals and should be constructive and knowledge-building.
- Projects are student-centered with teacher facilitation or guidance.
- Projects are real-world and have significance to the student.
- There is a task, a process, a product and a reflection.
Digital Local History uses all five of these elements.
In Digital Local History, there are three assessments of the project based learning.
The first is the feedback and grades I give on the biweekly progress reports. So far, I have based these grades on the level of effort and introspection in the reports. Students who have spent time crafting the reports, detailing significant efforts, and contemplating their results in the context of the larger project have received A’s. Students whose reports show evidence of sloppiness, superficial thought, and a lack of significant effort to gather materials have, so far, received C’s and encouragement to rediscover their passion for their topic and to fall back on skills they have read about in class.
The second feedback will be from community partners. This is a common practice in PBL, and we are scheduled to present a nearly finished product to community partners near the end of the semester. This will be an opportunity for them to comment on the project’s accuracy, creativity, and its spirit—does it capture the history of Westinghouse High School as the community understands it?
Finally, I will give the project an overall grade based on criteria suggested by a student and agreed to by the others: creativity, content, and citations.
Successes and Challenges
One of the successes has been getting the students out of the classroom and into the local archives and bringing community partners to the classroom. This has made the project all the more real for the students. Students have seemed to value their interactions with people who experienced the history they are discovering. And getting in the van to take a short trip has injected some feeling of going into “the real world” to explore history.
One of the challenges has been that this particular group of students is not particularly talkative, especially not the students who are most prepared for class. This has led to stilted conversations instead of exciting brainstorming sessions.
Furthermore, one of the essential elements of PBL is to have students develop their own goals and then learn skills along the way to achieving those goals. It has been hard to get students to visualize a “desired outcome” that encourages them to learn new skills. Instead, students want me as the instructor to tell them what to do, and they want me to show them templates for them to fill in. This undermines one of the elements of PBL, but given that the students are unaccustomed to PBL and are afraid to fail, this is one of the concessions I am making.
Reflections and Next Steps
Ultimately one of the biggest challenges for me is relying on the students to deliver a finished product for community partners to see and evaluate. Like most instructors, when I am in control of the content of the course and structure the class to ensure certain outcomes, I am in my comfort zone. This course has forced me to leave the comfort zone and entrust the students with more control and has forced me to have faith that they will deliver.
Over the next month, the students will bring together their text, images, and audio, assemble them into timelines and webpages, present them to community members, and make some revisions based on community feedback.
All students are required to understand and adhere to programmatic processes related to clinical education/clinical experience. Additionally, they are required to follow all policies and procedures associated with each assigned clinical site, including attainment and documentation of health requirements and clearances. Lastly, students benefit from tailored coaching and mentoring as they prepare to enter into their clinical experiences, be it their first experience or their final one. Delivering this information by way of class lecture can be challenging. There is a perceived benefit to having online audio-visual recordings that allow students to asynchronously access and further consider various elements of clinical experience expectations and preparation during self-selected time periods.
I first considered and prioritized key topic areas that could be reinforced with audio-visual recordings. I then organized them chronologically, matching the sequential order in which students would need to gain understanding of the content. From there I selected the best audio-visual format for content delivery.
- Castle Branch – Web ex recording (PowerPoint and program assistant presentation)
- CE overview – Panopto (PowerPoint and audio recording)
- APTA CPIWeb use and training – Panopto (PowerPoint and audio recording)
- Making the Cold Call – Panopto (PowerPoint and audio recording)
- CE I pep talk – Panopto (PowerPoint and audio recording)
- Continuum of Care – Sway
In all cases, each topic and its content is delivered via traditional classroom lecture. This takes place during didactic work when students are often preoccupied with current study requirements and lack full readiness to prepare for clinical education. Students will be provided with the traditional lecture component and then will be guided to these videos for viewing at a self-selected later date.
All Panopto audio-visual recordings are housed and viewed from Moodle. The number of times and the timing that each recording is viewed will be tracked. This data will be useful in determining students’ preferred timing for topic review as well as perceived investment in clinical experience preparation.
Additionally, a brief summative survey will be provided in Moodle and paired with the corresponding recording. These feedback forms will be optional for students to complete. The survey will solicit students’ perceptions of content clarity, resolution of questions or confusion, etc., as well as open-ended feedback for improvement suggestions.
Reflections and Next Steps
With each Panopto recording I create, I find ways I could improve it in a re-recording. Finding contentment in a given recording seems to be my greatest challenge. However, I have determined several key strategies that apply to every recording in the series: 1) keeping details generically applicable on a year-to-year basis is key for recycling videos, allowing them to serve multiple cohorts and persist through several curricular cycles; 2) keeping my intonation captivating yet absent of emotion is also important. I often tailor my classroom lecture style to the current climate, e.g., whether students are gearing up for exams/practicals or have just completed a testing cycle. The audio recordings should be relatable regardless of current didactic activities.
This project involved integrating SWAY technology into NUR 407, a course in the RN-BSN program. NUR 407 introduces registered nurses to the research process. The formal research process is foreign to most students in the program. Thus, a main goal was to bring specific research concepts and principles down the ladder of abstraction by integrating connections to clinical practice. These types of connections can assist students in seeing how the research process can be applied to daily professional practice.
NUR 407 is an online course offered over a seven-week session. In this particular course, students’ professional experience as registered nurses ranged between one and two years. Additionally, for some students this was one of the first courses taken in the program; for others it was a course taken toward the end of the program. Therefore, I considered which type of technology could meet the primary goal without overwhelming the student.
A variety of technologies to augment online learning were assessed for appropriateness and user friendliness. It was determined that SWAY met these criteria. The SWAY interface revolves around a storyline (selected by the user), into which users add a series of cards. Each of these cards is filled with content applicable to the storyline. Various cards are available for different types of content and can be grouped together into sections. Content intended to be the user’s narrative can be easily rearranged, deleted, or adapted. This flexibility allows stories based upon educational content to be created in SWAY. This approach is readily adapted, efficient, and free-flowing, and it can serve as an alternative to PowerPoint presentations. Unlike PowerPoint, there is no option for creating content in SWAY itself. Content must be uploaded into the software as it is intended to be used, or it can be pulled directly from different sources from within SWAY. Current sources include YouTube, Facebook, and Flickr.
To integrate SWAY into the course, I reviewed course content that was less familiar/more abstract in order to integrate SWAY accordingly. I then identified relevant topics, searched the literature, and developed a SWAY. Additionally, a Bubbl.us (concept mapping) exercise and Survey Monkey (assessment purposes) were built into the SWAY. I did a brief introduction of each technology piece and encouraged students to reach out with questions. In order to use Bubbl.us, students had to create an account, which had no associated costs.
Assessment was conducted by integrating a brief Survey Monkey questionnaire within SWAY. The majority of students rated SWAY and associated tools (e.g., Bubbl.us) as positive learning tools. Likewise, the majority of students had never been exposed to these types of technology.
Students expressed an appreciation for doing something “visual” that assisted in clarifying the meaning of abstract content. Additionally, they were able to make connections between abstract research concepts and daily clinical practice. During the latter part of the course, students revisited SWAY content and commented on its effectiveness in meeting course objectives and supporting overall learning.
Reflections and Next Steps
I plan to more fully integrate SWAY into graduate nursing courses, as applicable. The wide array of topics, visuals, and technologies that can be integrated into SWAY make it a viable technology for both undergraduate and graduate level courses. The user friendliness, ability to engage students and overall creative approaches make it a highly useful tool. Additionally, its usefulness allows for creatively highlighting specific educational content. I will continue to integrate both Bubbl.us and Survey Monkey into future SWAY presentations.
One of my tech fellow goals was to use technology to better facilitate the multiple learning styles of students taking my Advanced Data Analysis class. Research statistics can engender a good deal of anxiety for students. Math anxieties are common, and can lead to patterns of procrastination and avoidance when learning research design and statistics. As a psychologist, part of my teaching process includes finding ways to “demystify” complex processes and to address students’ level of anxiety at each step of the learning process. By breaking learning into small pieces, using repetition across a variety of learning methods, and assigning lots of weekly homework assignments, the course provides many “graded exposure” experiences, where students can, over time, master their anxiety of statistics and (at least for some!) start to take pleasure in the process.
For the current project, I wanted to try to incorporate brief SPSS tutorials into my Moodle shell, using both Atomic Learning (24/7 Technology Training) modules and some of my own brief tutorials developed using Panopto with direct screen capture of me conducting (and narrating) specific SPSS analyses. This technology would allow students, whose anxiety can interfere with memory consolidation during class, to go back to review how to implement basic SPSS procedures and analyses on their own time. In addition to my own lectures that incorporate conceptual discussion and marked-up static screen captures of SPSS outcomes, I thought that these tutorials would be a helpful addition for students who have difficulty following (or remembering) how analyses were conducted during classes.
As part of my planning process, I conducted a brief survey of students at the outset of my spring Advanced Data Analysis course. Survey questions included information about the students’ previous training in both research design and statistics, and their feelings about these previous courses. These data suggested that, in general, they had found their previous learning to be on a “pretty superficial level” and their previous experience of learning to be, on average, “tolerable, but not terribly interesting.” When asked to rate their current level of anxiety and enthusiasm about learning about research design and statistics, they generally rated themselves as more anxious than enthused.
In addition, I queried the students regarding their preferred methods of learning when it comes to research methods and statistics (see below). On average, the group showed a preference for working through problems as part of a larger group discussion and feedback session, with the lowest ratings for straight lectures about research concepts. They also expressed a mild preference for watching the instructor run/interpret analyses live in class (as opposed to watching videos of the instructor), and both of these methods were preferred above watching other internet tutorials. When discussed in class, students clarified that they would find brief instructor-led video tutorials to be helpful, but only as an addition to live classroom demonstrations, rather than in place of live sessions.
I have started implementation this semester. At the beginning of the semester, a number of brief Atomic Learning tutorials were posted to Moodle, most with the goal of aiding those students with little to no previous SPSS experience to become familiar with basic data manipulation and cleaning procedures. In addition, I have played with developing my own tutorials in Panopto, posting the first one on how to run moderation analyses in SPSS. More of these videos will be posted in the next section of the class covering ANOVA analyses, with formal assessments from students to be conducted at the end of the class.
Despite the “mean level” preferences reported in the above graph regarding student learning preferences, I was struck by the stark differences reported by different students with respect to their preferred modes of learning. This makes me think that continuing to incorporate multiple media approaches to learning will be the way to go, in order to enhance learning experiences across a range of different learners. It will also be interesting to see if the students’ preferred learning strategies and/or levels of anxiety or enthusiasm for the material changes at all following completion of the class.
Formal assessments from students will be collected at the end of this spring term class. Informally, I’ve found that not all of the Atomic Learning videos are great, so it does take some time to find the tutorials that will be most helpful for student learning. I’ve also learned the importance of maintaining consistency when selecting and/or developing online tutorials of basic procedures that we cover in class. While you can often find a number of ways to run the same basic procedures in SPSS, early learners of the software get confused if you introduce too many options too early in the learning process. I’ve also learned that it is unlikely that I will do away with running most analyses together in class (in a format where students can ask questions and talk through the process, and in which we can interpret outputs together as a group). Indeed, when I floated the idea of only posting some taped tutorials (which do supplement both book procedures and my lecture slides), there was mild panic from the class.
Reflections and Next Steps
This semester, I will work to develop a few more basic tutorials based on the basic ANOVA analyses we will be running in class. I also plan to go through additional Atomic Learning modules covering basic Regression and ANOVA procedures. I will retest the group at the end of the course, and will use those data to inform further modifications to the class for next year.
I incorporated the use of ThingLink, an interactive multimedia platform, into my Histology (BIO458/558) course in the fall of 2015. This course teaches students the skills needed to identify and characterize the various parts of the human body at the microscopic level. In previous years of teaching this course, I have had each student give a presentation on a specific type of tissue or part of the body as part of the course requirements. The students would project digital images that they would describe to the class, and they would include a question-and-answer session in which they asked the other students to identify specific parts of an image. However, I found that these presentations were using up increasing amounts of precious classroom time, and I also wanted the other students to have an opportunity to evaluate the images on their own time as they prepared for examinations. When ThingLink was introduced in the Tech Fellows meetings over the summer, I realized that this could be a useful tool to allow each student to give a presentation outside of the classroom, using digital images that they could annotate and attach other media to, and that would be available to all the students to review whenever they wished.
The first thing that I had to do when planning the project was learn how to use ThingLink. I practiced using some digital images of histology slides that were available to me, and annotated them using the tools available in ThingLink. It was also necessary to set up an account that the students could log into and then be able to use all the functionality of ThingLink. Chatham purchased several accounts for this purpose, although I needed only one.
A major course objective was the development of skills to correctly identify and characterize different parts of the body using microscopic images, and this project fit well with that objective.
This technology allowed for the substitution of an in-class project with an outside project that was available to students via the classroom Moodle site.
The use of ThingLink for this project engaged multiple categories of Bloom’s taxonomy: recalling basic concepts (Remember), explaining concepts (Understand), using information in new situations (Apply), since students had to identify the various parts of each section using what they had previously learned, and drawing connections (Analyze). The final product was uniquely each student’s own, and therefore new work (Create).
Each student was assigned a password and given an access code to my ThingLink “classroom”. I first had all the students learn how to use ThingLink by annotating a single image, and they received a grade for that assignment.
Once I was comfortable that they were proficient at using ThingLink, each student was assigned a specific part of the body or a tissue to present using ThingLink. These were spaced out through the semester, and each ThingLink was completed just before an exam, so that the other students could use the ThingLink as a self-testing tool to help prepare for the exam. I posted a link to each ThingLink presentation on Moodle, so that it was easy for students to access the presentations.
A couple of ThingLink presentations completed by my students can be found below:
Fortunately, it was not necessary to have a plan B.
I used both formal and informal assessments. Informally, I would ask the students from time to time how they liked ThingLink. The most common complaint was that some students had trouble arranging a set of annotated images in the order they wanted, since all ThingLink presentations could be viewed like a slide show.
The formal assessment was a questionnaire that each student completed. Based on the questionnaire results, the students found ThingLink relatively easy to use, and most students viewed other students’ projects.
Surprisingly, a relatively large number of students felt that viewing the ThingLink presentations of other students was of little value. In contrast, slightly more students felt that it was of significant value when they were preparing their own presentation. One student suggested that I find some way of requiring students to view others’ presentations, possibly by awarding bonus points for doing so.
92% of the students agreed that ThingLink should be used in next year’s Histology course.
Results of ThingLink questionnaire, given at the end of the fall semester:
1. On a scale of 1-5, with 1 being the least difficult, and 5 being the most difficult, rate how difficult you felt the process of learning ThingLink was and applying it to the histology unit to which you were assigned:

| Response option | 1 | 2 | 3 | 4 | 5 | Total |
|---|---|---|---|---|---|---|
| not difficult at all | 9 (75%) | 1 (8%) | 1 (8%) | 1 (8%) | 0 | 12 |
| a little difficult | 5 (42%) | 4 (33%) | 3 (25%) | 0 | 0 | 12 |
| moderately difficult | 7 (58%) | 2 (17%) | 2 (17%) | 0 | 1 (8%) | 12 |
| difficult | 8 (67%) | 3 (25%) | 0 | 1 (8%) | 0 | 12 |
| excessively difficult | 10 (83%) | 1 (8%) | 1 (8%) | 0 | 0 | 12 |
2. Rate how often you viewed other students’ ThingLink sessions:

| Response option | 1 | 2 | 3 | 4 | 5 | Total |
|---|---|---|---|---|---|---|
| never | 11 (92%) | 1 (8%) | 0 | 0 | 0 | 12 |
| one or two | 8 (67%) | 3 (25%) | 1 (8%) | 0 | 0 | 12 |
| three or four | 8 (67%) | 1 (8%) | 2 (17%) | 0 | 1 (8%) | 12 |
| most (more than 4) | 10 (83%) | 1 (8%) | 0 | 0 | 1 (8%) | 12 |
| all | 7 (58%) | 1 (8%) | 0 | 3 (25%) | 1 (8%) | 12 |
3. Rate the value, to you, of viewing OTHER STUDENTS’ ThingLink workshops, in terms of how it helped reinforce the histology concepts for that section.

| Response option | 1 | 2 | 3 | 4 | 5 | Total |
|---|---|---|---|---|---|---|
| no value | 11 (92%) | 1 (8%) | 0 | 0 | 0 | 12 |
| a little value | 11 (92%) | 1 (8%) | 0 | 0 | 0 | 12 |
| some value | 7 (58%) | 2 (17%) | 2 (17%) | 0 | 1 (8%) | 12 |
| moderate value | 8 (67%) | 1 (8%) | 1 (8%) | 2 (17%) | 0 | 12 |
| considerable value | 8 (67%) | 1 (8%) | 1 (8%) | 0 | 2 (17%) | 12 |
4. Rate the value, to you, of preparing your ThingLink, in terms of how it helped reinforce the histology concepts for that section.

| Response option | 1 | 2 | 3 | 4 | 5 | Total |
|---|---|---|---|---|---|---|
| no value | 10 (83%) | 0 | 0 | 1 (8%) | 1 (8%) | 12 |
| a little value | 8 (67%) | 2 (17%) | 0 | 2 (17%) | 0 | 12 |
| some value | 9 (75%) | 1 (8%) | 1 (8%) | 1 (8%) | 0 | 12 |
| valuable | 6 (50%) | 1 (8%) | 0 | 4 (33%) | 1 (8%) | 12 |
| extremely valuable | 6 (50%) | 0 | 0 | 1 (8%) | 5 (42%) | 12 |
5. Do you feel that ThingLink workshops should be used for next year’s histology class?
6. If you said NO, ThingLink workshops should not be used next year, please describe why you said no.
- “It should be used again next year.”
- “I said yes to the ThingLink being available for next year’s histology class.”
- “I did not say no.”
- “I would rather spend more time looking at slides than making a ThingLink, it was more helpful talking about discussing the slides in class.”
- “just some minor tweaks and I feel it could be used again”
- “I said yes.”
7. Briefly describe the things that you liked about making or viewing ThingLink workshops:
- “it’s easier. I review them after I study to quiz myself.”
- “I think it was a good study tool”
- “I looked at and interpreted a lot of digital slides while preparing for my ThingLink assignment. Great study tool.”
- “I liked seeing images and diagrams from other ThingLink workshops that I did not find yet. I definitely think that the ThingLink helped provide additional images and questions for studying. Personally creating a ThingLink did a really good job at reinforcing the materials for the assigned section.”
- “Uploading images from online is easy and helps facilitate the projects. Identifying the different parts of the cells helped with memorization of the units we were studying.”
- “You can reinforce learned topics that were discussed in class at your leisure.”
- “Making up questions within the ThingLink was a great study tool because it made you think about the concepts that may be seen on the test. Also, searching for different histology slides to put in the ThingLink helped with remembering what to look for in the glass slides. Finally, the way you could see an overall picture and then have a zoomed in view under the microscope put the information of the material in a better perspective.”
- “I barely viewed others ThingLinks however, making them helps reinforce what we learned in class that day.”
- “Creating my own ThingLink helped me learn the material the most… I just didn’t really look at others’ ThingLinks, so I’m not sure how to make it possible every project to help each student. Maybe make some of the questions on the ThingLinks bonus so everyone will look at all of the projects?”
- “It was helpful getting to know everything about the topic assigned, and I had a full knowledge of the workshop I posted, but not much helpful when it came to other’s and their workshop.”
- “I think making the ThingLinks was helpful because if forced you to have a reinforcement of the materal. Maybe if it doesn’t remain a part of the coursework for future classes, something along the lines of a pre-test that would make the students have to think about the material in terms of how it would be asked on a test.”
- “How it was up to you to make the audience engaged”
8. Briefly describe the things that you disliked about ThingLink workshops:
- “At times it was difficult to find different images from .edu websites”
- “The interface was difficult to learn, but once I got the hang of it, the program was easy to use.”
- “The only problem with the ThingLink workshops are some of the labeled images provided wrong answers or mislabeling. This was the only hinderance to the workshops since it made me second guess myself a few times on the material.”
- “The icons were not very specific. Having a more specific icon would allow for smaller identification of details.”
- “It was a little confusing at first because I didn’t know how to follow individuals in order to view their channels but sending the links to the professor and having her upload the link helped fix the problem.”
- “It was just a pain to find photos that were able to upload in the web url portion.”
- “Since my project was closer to the end of the semester, I felt that I couldn’t dedicate enough time to the project with all of the other assignments that I was also working on.”
- “Cannot make corrections or move slides around so some presentations were a little out of order, which confused me.”
- “Sometimes student would have two dots: on for a question and one for an answer. I would sometimes scroll over the answer first and then the question was kind of wasted. If there was a way to hid the answers it would have been helpful.”
- “That depending on the subject it took long to find pictures and to add certain details”
9. Please provide any additional information or comments about ThingLink workshops not covered by the previous questions.
- “it worth it”
- “Good supplemental study tool to test yourself”
- “I think the workshops were a good job at providing the class with additional digital images for outside of class. Some of the images selected by classmates were very similar to ones on the exams, so I felt very prepared from studying from the workshops.”
- “The separation of units seemed fair and it was appreciated that not every unit only had one thinglink”
- “Overall, it was good to use other classmates ThingLinks as an extra study tool.”
- “I definitely think that ThingLink is a useful resource for Histology.”
- “It was user friendly, it just took time to figure how to work it.”
- “Maybe find some other way to engage the class and also to help study, in addition to ThingLink?”
Reflections and Next Steps
For the most part, the entire process worked well. Some students put more effort into their presentations than others, but this was a graded assignment, so greater effort resulted in a higher grade. Next year, I would like to modify the project by having the students take pictures of microscope slides instead of using digital images acquired from the internet. This would require a higher level of skill, and was what I originally intended to have them do, but I realized that the digital camera setup I intended to use wasn’t quite ready for them.
My goal for year one of the Faculty Technology Fellowship involved expanding the ways that I have promoted visual communication with my online students. Some specific activities that I wanted to implement included using video messages to provide feedback and information to students rather than inserting a text box or forum message. Other ideas to overall enhance the visual communication with students involved the use of images/photos in the Moodle shell, the use of a standard Panopto Welcome message to be placed in all courses being taught by me, and uploading a personal profile photo in Outlook.
Since the curriculum is set for all programs in Nursing and the assignments are also prescribed, my goal was to focus on the student experience and likability of the courses. I found that some of the tools already available to us as part of Moodle can be cumbersome and are not always quick and easy to use. I wanted to find a solution that could be implemented with little notice, planning, or effort. I also wanted to be able to grab my iPhone or iPad whenever I thought of something I wanted to tell my students. By recording a video message in this way, I was able to substitute it for a written message. This provided a different means of quickly communicating with students. Ease of accessibility and convenience were most important when searching for a solution.
With the help of Lauren and Becky, I identified “Capture” as an application that could be downloaded for free to my handheld device. Using my handheld device (mostly my iPhone), I was able to record and store a video message wherever I was at the time. After the Capture app was downloaded, all I had to do was open the app and search for the video I had created for my students using my iPhone’s video recording function. I selected “upload,” and after a few minutes, a URL was created for private use. For step-by-step instructions on using Capture, please read these directions. Finally, I emailed this URL to my Chatham email, then copied and pasted it into the applicable block (week) of a Moodle course shell. By posting the URL as a Label in Moodle, the video box displayed in the course rather than just a link.
The assessment was informal and based on student and faculty feedback. Students seemed to like the informality of the message and the ability to see and hear me. A faculty member who first saw the video message in my course said she was surprised to see the video and thought it was a friendly way to communicate with the students. Some faculty didn’t like the “close up” effect of the video that results from holding your own handheld device in your hand while recording (kind of like a selfie). A way around that would be to rest the device someplace at a suitable distance for recording. Some faculty thought this seemed much more convenient than Panopto, whereas other faculty found Panopto to be just as convenient.
Reflections and Next Steps
Having been an online educator for the past 9 years, I had already developed my courses/programs to adequately capture formative and summative assessments. All content and assessment methods in each course were designed well to capture the accreditation requirements for the various nursing degrees. My goal as a tech fellow is to find easy, convenient solutions that use technology as a tool for student and faculty workflow, promote student satisfaction, and foster faculty and student relationships. A future project will involve the use of technology to help further explain content or projects that students find more difficult to grasp. By using technology to aid student learning of a particular component of the curriculum, student workflow and satisfaction should improve.
As part of year 1 of my Technology Fellowship (2015-2016), I wanted to focus on enhancing feedback to online doctoral students on their capstone projects, and improving the peer review process already in place within the occupational therapy doctorate (OTD) capstone courses. As a result, I explored the use of Turnitin’s GradeMark and PeerMark in detail, and piloted use of these tools in several courses. As part of this process, I also undertook the task of revising the analytic rubrics for each of the 6 capstone chapters.
In planning this project, I had to consider both the course learning objectives as well as my personal goals for the project. In the OTD program, students take a series of evidence-based practice courses designed to guide them through the development, implementation, and evaluation of their doctoral capstone projects. This process includes the writing of 6 capstone chapters with peer review integrated throughout the courses. Goals of peer review include helping the students to increase the quality of their work and to emulate the peer review process inherent in pursuing publication, since this is also an objective of the courses/program.
Previously, the peer review process involved instructor pairing of peers, exchange of papers among peers, and general provision of feedback to each other using the assignment rubric as a guide. In the past, both instructor feedback and feedback from peers were delivered via the Track Changes feature in Microsoft Word. This entailed downloading each student’s file, pasting in the rubric, saving the file locally, adding comments, completing the rubric, resaving, and then uploading the feedback file to Moodle. This process is cumbersome and time consuming, so my personal goal was to streamline it and be able to provide each student with richer feedback in a timely manner.
Goals for the project included:
- Improve the quality of feedback/grading provided on student assignments, to increase quality of student work and student satisfaction and decrease instructor time commitment. (Technology used to augment, modify)
- Improve peer review process to improve quality of student writing/publication. (Technology used to modify)
The first step in the project was to redesign the analytic rubrics for the capstone courses. The prior rubrics were analytic in the sense that they listed the assignment criteria, with each criterion having 4 possible scores: outstanding, meets criteria, approaching criteria, and below expectations. Since these courses are taught by several full-time faculty, as well as adjunct faculty, it became apparent that the scoring needed to be more objective. A variety of resources on Bloom’s taxonomy and rubrics were consulted in development of these rubrics. Weighting was also applied to assignment criteria to emphasize categories according to course objectives.
Next, to improve the quality of feedback that students both give and receive in the peer review process, structured peer review questions were developed for each capstone chapter via modification of questions within the PeerMark library. For example, these are the peer review questions for chapter 1:
1. Scan this paper for errors in formatting of in-text citations, direct quotes, and the reference list. Give several examples of these errors, if they exist. (Question type: Free Response; minimum answer length: 5)
2. Does the writer use sufficient evidence/references to support the existence of and the need to address the identified problem? If yes, explain your rationale for this answer. If no, explain where support is lacking and how this section of the paper could be strengthened. (Question type: Free Response; minimum answer length: 100)
3. How effective was the writer’s use of language related to readability and clarity of the subject matter? Very effective would be similar to the language used in professional journals. (Question type: Scale; highest: very effective, lowest: very ineffective)
4. Does the writer give a clear and concise description of the setting (omitting all extraneous details and leaving no unanswered questions)? Please provide the rationale for your answer as well as suggestions to improve this section if necessary. (Question type: Free Response; minimum answer length: 100)
5. Does the writer acknowledge all applicable supports and barriers in the setting? Provide suggestions of additional supports and barriers to be considered if applicable. (Question type: Free Response; minimum answer length: 1)
Next, the revised rubrics and peer review questions had to be entered into Turnitin within Moodle, and I had to test/pilot these features to be sure that I understood the functionality and settings available. An additional benefit of using Turnitin is the availability of the originality report, since these capstone assignments involve increased use of external resources, quoting, and citations.
I also had to consider that this would likely be NEW technology for most of the students, so tutorials on how to navigate it would be necessary. As a result, Instructional Technology created 4 videos, posted within the courses, demonstrating how to upload a paper to Turnitin, how to retrieve instructor feedback, how to complete a peer review, and how to access peer review comments.
I assessed the project both formally, through a survey created within SurveyMonkey, and informally via dialogue with students during synchronous classes, an onsite visit, and phone conversations. Some info about the project was also gleaned from Chatham course evaluations as several students commented on this process in those evaluations. These formative assessment methods revealed the following:
- Some students struggled with navigation of the technology, but not all students took advantage of the how-to videos posted within the course. An extra synchronous online class was held to answer students’ questions specifically about Turnitin & PeerMark.
- 56% of students who responded to the survey said they preferred feedback via Turnitin (as opposed to the Track Changes files within Microsoft Word) or liked both methods equally.
- Features that students liked best about Turnitin: the originality reports, audio feedback from the instructor, ease of use and retrieval of feedback, variety of options to mark papers with ease.
- Students struggled with the use of PeerMark to complete the peer review process. Issues included: difficulty with technology, not viewing how-to videos, mismatched pairs for review resulting in some students getting multiple reviews of their papers and other students getting none.
- Despite these glitches, the average of all student responses to the question “How valuable do you feel the peer review process is to the capstone process on a scale of 1 to 10? (1 = not valuable at all; 10 = extremely valuable)” was 7.5.
- 88% of students reported utilizing outside sources to verify information when completing their reviews of peers’ papers, and said that reviewing others’ work helped them to better understand course content and strengthen their own work.
- As an instructor, I also felt the comments students made on their peer reviews were more appropriately directed toward the content, and of higher quality, than in previous years.
Reflections and Next Steps
I consider the use of Turnitin’s GradeMark a success. Students had little issue with submission and retrieval of feedback via this system and I found it easier to give detailed feedback. I particularly valued the ability to record an audio comment with each assignment and to save custom QuickMarks for use in future papers.
The use of PeerMark for the peer review was definitely a challenge on many levels. Going through the process helped me to hone the questions that students answered about their peers’ papers, and to realize that the students do understand the purpose and value of the activity. As a result of the issues encountered with this process, I’ve moved the peer review process to an online forum within Moodle, but continue to have students answer the more detailed questions. I’d consider piloting the use of PeerMark again in another course, but would likely opt to hold a live synchronous class to review the process, in addition to posting how-to videos in the course.
My goals for year 2 include:
I wanted to find technology that would support students’ writing and perhaps improve two assignments in the Psychometrics course that I teach in the first semester of our PsyD program. The assignments are an accurate paraphrasing activity, which had not worked very well in the previous semester, and a final paper.
The technology I found was a program called NoodleTools. NoodleTools is a program intended to help students take notes, create outlines, and create correct bibliographies in several accepted formats. NoodleTools allows students to share their materials with instructors throughout the note taking and outlining process.
I wanted to find technology that would support students’ writing. Specifically, the Psychometrics (PsyD) course includes a project where students are asked to identify an assessment (interview, test, survey) that they are interested in, research the measurement properties (reliability, validity, sensitivity, norms, etc.) of that assessment, and write a final paper summarizing those measurement properties and expressing a professional opinion about where that assessment can and should ethically be used. In past semesters, I have noticed that students struggle with various aspects of this project including accurate paraphrasing and avoiding unintentional plagiarism, organizing information logically, thinking critically about the research they read, and writing in general. I’ve responded to these difficulties by creating an accurate paraphrasing activity, providing a general outline for the paper, and occasionally requiring a draft of the paper. However, this is still a challenging project in the first semester of doctoral study and the accurate paraphrasing assignment was frustrating for students in the previous year, so it needed to be improved. I hoped that technology would provide a novel way to modify the assignment and also provide another avenue to support student research and writing.
The final paper in the course is supported by the accurate paraphrasing activity I designed and is intended to demonstrate several student learning outcomes from the course:
- find and describe the psychometric properties of tests and measures
- apply the concept of reliability to the evaluation of tests and measures
- apply the concept of validity to the evaluation of tests and measures
- articulate the ethical dilemmas faced when selecting tests and measures
- demonstrate ethical decision-making by identifying choices consistent with the ethical guidelines related to assessment
I created a ‘Project’ for the accurate paraphrasing assignment, built an example for students to look at (Millon Behavioral Medicine Diagnostic), and set up a Dropbox folder (Psychometrics) connected to NoodleTools for students to share their projects with me.
NoodleTools focuses on how students take notes (using notecards) and on linking the notes to sources. Students create notecards which can be sorted, stacked, tagged, etc. and are displayed visually.
With the accurate paraphrasing assignment, students were asked to start with a direct quote from a source and then paraphrase it. There is also space to add questions or critical ideas.
When they shared their project with me, I commented on the paraphrasing and answered questions directly on their note card.
I was definitely learning NoodleTools as I went along.
My plan B was to simply modify the Turnitin assignment I had created the previous year.
I assessed my project using a brief anonymous survey of students at the beginning of the semester, before I had completed a presentation on unintentional plagiarism.* Students were asked to rate their agreement or disagreement with a handful of statements (that I made up) about aspects of writing papers. I repeated this survey at the end of the semester.
At the beginning of the semester, most students used direct quotes in their notes, felt that they could paraphrase accurately, but did not consider unintentional plagiarism.
At the end of the semester, there was more variability in how much they started with direct quotes, they were more confident in their accurate paraphrasing skills, and they were all considering unintentional plagiarism (a win!).
They also generally thought that NoodleTools was easy to use (yay!) and helped them paraphrase accurately (yay!), which is most likely a function of the assignment they had to complete in NoodleTools.
But they did not find that NoodleTools helped them think critically about the research they were reading, organize their ideas, or write their paper.
Overall, they didn’t think NoodleTools was worth the effort.
Reflections and Next Steps
The focal assignment for this NoodleTools project was modifying the accurate paraphrasing activity I had created previously, and NoodleTools worked for that. The things it didn’t do (help students organize ideas, think more critically about the research they were reading, create an outline, write their papers) were things I hadn’t set the assignment up to support. I would like to learn more about NoodleTools, particularly the use of tags, sorting and organizing notecards, and creating outlines, so that NoodleTools begins to support those aspects of the final paper. For next year, I will not provide students with a general outline for the final paper and will instead ask them to create their own outlines in NoodleTools. I also plan to add more of a focus on the “My Ideas” section of the notecard so that I can increase the focus on critical thinking about the research students read.