Wednesday, October 28, 2015

Inquiry Project: Psychometric Testing

The inquiry project was a tough one for me.  I had difficulty coming up with a topic because I'm at a stage of life right now where it would be cool to learn something completely new (like quilting, for example... I would like to learn to quilt), but if I am going to dedicate 6 hours to a project, right now it had better contribute to my learning on a project I must complete -- because there are many of them on the go right now.  There were several home improvement projects that fell on that list, but my significant other is fussy.  He loves doing that kind of thing so much that he wants to be the one who does it all. Hence our conversation by text on the first night of class about what I could build for this project.  Sigh...   I love him.  (P.S. I'm a feminist, really, I am.)

But in doing my reading for a research project I am currently working on, it struck me that, with some self-study, I could not only replicate some of the tests done on the questionnaire described in the dissertation I was reading, but also use them for my project.  And so the psychometrics project was born. 

Learning the statistical procedures turned out to be not that difficult, but it did take me most of a work day to put it all together.  The presentation had its own challenges because it was hard to describe what was going on in each slide in such a way that it would take only 15 seconds to read.  I feel I made a few leaps of understanding in my descriptions, but I did the best I could to keep it brief while still making it understandable and interesting. 

Interestingly enough, I have the opportunity to use this format again at the Manitoba Cycling AGM, as we were invited to do a presentation on women and cycling but were given only 5 minutes.  No problem!

Below is the script to the video that is embedded above.  Please enjoy. 

1. Psychometrics are the statistical tests performed on a questionnaire to assess whether that questionnaire performs consistently and measures what it claims to measure.

2. Usually when I do research, I hire a statistician. Here is my statistician, Tom Harrigan. Statisticians don’t have time to think about petty things like office cleaning when they are performing mathematics on data, so here is Tom on a day that he cleaned his office and we were all amazed.

3. The questionnaire I developed was an instrument used to measure Writing Self-Efficacy.  It uses a 4-point Likert scale measuring agreement to disagreement on 10 items related to confidence in writing ability.  The lowest possible score on this questionnaire is 10 and the highest is 40.
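The scoring described above is simple enough to sketch in code. This is my own illustration, not part of the original project (which used Excel); the response coding of 1 through 4 is the standard convention for a 4-point agreement scale:

```python
# Sketch of scoring a 10-item, 4-point Likert questionnaire.
# Responses are coded 1 = strongly disagree ... 4 = strongly agree,
# so total scores range from 10 (all 1s) to 40 (all 4s).

def total_score(responses):
    """Sum a participant's 10 item responses into a total score."""
    if len(responses) != 10:
        raise ValueError("expected responses to all 10 items")
    if any(r not in (1, 2, 3, 4) for r in responses):
        raise ValueError("responses must be coded 1-4")
    return sum(responses)

print(total_score([1] * 10))   # lowest possible total: 10
print(total_score([4] * 10))   # highest possible total: 40
```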

4. The inspiration for this project came when I was reading a dissertation describing the psychometrics of a similar scale designed for Adult Basic Education Students.  I felt that I could replicate many of the statistical procedures explained in this dissertation.

5. One of the key psychometric testing procedures is Factor Analysis. I took Biostatistics in 2002 when I was in the Master of Nursing program.  I found my old textbook and notes hoping there would be information on Factor Analysis... but there wasn’t.

6. So I asked statistician Tom what he thought about my project and he told me I needed more than 6 hours to learn Factor Analysis. And given the complexity of the formulas for factor analysis, I could see what he meant.  So I performed some of the other tests described in the dissertation instead.

7. A program like SPSS is much more efficient for statistical analysis, but for this project I had to go with what I had, which was Excel. By using YouTube videos, I learned how to draw graphs, run correlations, and create binned tables.  

8. So, I pooled the data collected from 231 participants over two past studies. I reordered the data so that the total scores from the questionnaire were ranked from lowest total writing self-efficacy to the highest. The lowest score in the sample was 15 and the highest was 40.

9. Many of the statistical tests I performed required analyzing each question by comparing the top 25% to the bottom 25% of the sample. Strong questions would show low writing self-efficacy students disagreeing, and high self-efficacy students agreeing, with the statements on the questionnaire.
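That top-versus-bottom split can be sketched in a few lines. The scores below are invented examples, not the study data:

```python
# Sketch: split a sample, ranked by total questionnaire score,
# into its bottom-25% and top-25% groups. Scores here are invented.

def extreme_groups(scores, fraction=0.25):
    """Return (low_group, high_group): the bottom and top fraction of scores."""
    ranked = sorted(scores)
    n = int(len(ranked) * fraction)
    return ranked[:n], ranked[-n:]

scores = [15, 18, 22, 25, 27, 29, 31, 33, 36, 38, 39, 40]
low, high = extreme_groups(scores)
print(low)    # the three lowest totals
print(high)   # the three highest totals
```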

10. For example: Question 1: “I feel I have the skills to write a scholarly paper” can be considered a successful question, as can be seen in this plot mapping the number of strongly disagree, disagree, agree or strongly agree responses provided by both the low and high writing self-efficacy participants.
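The counts behind a plot like that are just a binned table: for each group, tally how many participants chose each response option. A minimal stdlib sketch, with invented responses rather than the study data:

```python
from collections import Counter

# Sketch: tally one item's responses for the low and high self-efficacy
# groups. Coding: 1 = strongly disagree ... 4 = strongly agree.
low_group_q1  = [1, 2, 2, 1, 2, 3, 2, 1]   # invented responses
high_group_q1 = [3, 4, 4, 3, 4, 4, 3, 4]   # invented responses

labels = {1: "strongly disagree", 2: "disagree", 3: "agree", 4: "strongly agree"}
for name, group in [("low", low_group_q1), ("high", high_group_q1)]:
    counts = Counter(group)
    row = ", ".join(f"{labels[k]}: {counts.get(k, 0)}" for k in (1, 2, 3, 4))
    print(f"{name}: {row}")
```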

11. I also experimented with using an online stats calculator from a QuickCalcs website to calculate means, standard deviations, and t-tests, both on each question (comparing the highest- and lowest-scoring students) and on the total sample of 231.

12. Using question 10 as an example: the independent-groups t-test showed that every individual question had statistically different means between the low and high writing self-efficacy groups, which suggests the questionnaire can correctly distinguish these opposing groups.
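An independent-groups t statistic can be sketched with the standard library. This uses Welch's version of the test; the dissertation's exact variant and the response data below are my assumptions, not taken from the project:

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's independent-groups t statistic for two samples."""
    va, vb = variance(a), variance(b)          # sample variances
    se = math.sqrt(va / len(a) + vb / len(b))  # standard error of the difference
    return (mean(a) - mean(b)) / se

# Invented responses to one item for the low and high groups.
low  = [1, 2, 1, 2, 2, 1, 2, 2]
high = [3, 4, 4, 3, 4, 3, 4, 4]
t = welch_t(high, low)
print(round(t, 2))   # a large |t| suggests the groups answer the item differently
```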

13. The dissertation I followed suggested that each individual question should have a mean score between 2 and 3 and a standard deviation between 0.5 and 1, when these tests are performed on the total sample of 231.

14. You’ll see by the arrows that question 3 met the criterion for the mean but did not meet the criterion for standard deviation (SD), where it scored below 0.5.  Question 4, however, met the criteria for both mean and standard deviation.
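Those screening bands are easy to check mechanically. A sketch, where the 2–3 mean and 0.5–1 SD ranges come from the dissertation as described above but the item responses are invented:

```python
from statistics import mean, stdev

def item_ok(responses, mean_range=(2, 3), sd_range=(0.5, 1)):
    """Check an item's whole-sample mean and SD against the screening bands."""
    m, sd = mean(responses), stdev(responses)
    return (mean_range[0] <= m <= mean_range[1],
            sd_range[0] <= sd <= sd_range[1])

q4 = [2, 3, 2, 3, 1, 3, 2, 4, 3, 2]   # invented whole-sample responses
q3 = [3, 3, 3, 3, 2, 3, 3, 3, 3, 3]   # low spread: its SD falls below 0.5

print(item_ok(q4))   # both criteria met
print(item_ok(q3))   # mean within band, SD too low
```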

15. Now for the hard part….the gratuitous selfie!  What does this all mean?  And who cares? It required a lot of thinking on my part.

16. Most questions fared like question 8: the mean fell between 2 and 3, the standard deviation fell between 0.5 and 1, and high self-efficacy students (in blue) were more likely to agree or strongly agree with the question, while low self-efficacy students (in red) were more likely to disagree or strongly disagree.

17. Compare that to question 3 which, as already indicated, had a low standard deviation of 0.45.  The graph shows that low self-efficacy students were just as likely to agree with the statement as high self-efficacy students, suggesting there was not enough variability in the data.

18. A similar observation can be made with question 9, where the mean score was greater than 3. The graph shows that low and high self-efficacy students were both likely to agree with the question presented.  Subjective analysis is then important to suggest why these questions were not as strong as the others.

19. Question 3 may be measuring a general ability to overcome difficulties while question 9 may be measuring a general ability to be on time, rather than measuring these behaviours as specific to writing. Likely both these items need editing or removal from the questionnaire.

20. In conclusion, I gathered some important information about my questionnaire, and it has made me more determined to eventually learn Factor Analysis.  This image shows me brainstorming possible factor categories for my questionnaire. Thank you.

Thursday, October 1, 2015

The WebQuest

I created a WebQuest a couple of years ago for another CAE course. It is one of a small handful of assignments I have done in this program that I ultimately put to use in my teaching.  The webquest was used to help students understand the CARS checklist for evaluating the quality of website material for academic use.  Unfortunately, because it was part of an online course, I have never received feedback from students on whether they found it useful in understanding the website evaluation process.  I chose this topic for my webquest assignment because I found students had difficulty identifying the different components of websites for an assignment they had in my scholarly writing course.  We called the assignment a "website evaluation," but what we really wanted them to focus on was not an entire website but a particular article on the web that was giving health information. We wanted them to choose an article with content that was not a news article, was not from an academic journal, was not an entire webpage, and was preferably not a PDF publication.  I hoped the webquest would help them wade through the different types of content on the web. 

Now my teaching focus has changed and, among other teaching duties as assigned, I primarily teach a course called Research and Scholarship in Nursing.  I am finding students have a similar struggle with the kinds of research-related documents they can access about topics.  They have difficulty distinguishing between qualitative and quantitative research, systematic reviews (meta-analysis and synthesis), and articles written for discussion, opinion, or theory, or to provide a summary of information about a topic. 

In a search for research-related webquests, I found very little.  One webquest I found focuses on psychological experiments and how to write up a lab report; another looks specifically at the difference between qualitative and quantitative research.

The webquest I would create to help students recognize the different types of peer-reviewed research sources would involve providing them with information about the difference between qualitative and quantitative research.  Knowing the different components of each will help with the recognition process. They also need a good understanding of the type of peer review required for academic articles.  I would then ensure they had an understanding of the different formats of systematic review. This table from Duke University gives an excellent summary. Students would also need to know how to use the RRC library's EBSCOhost to search for relevant research materials. 

The main portion of the webquest would present students with actual peer-reviewed articles and ask them to identify each as a qualitative primary study, a quantitative primary study, a type of systematic review (and which type), or an article written to be informative.  I could also challenge them by throwing in the occasional non-peer-reviewed source. 

The Flipped Classroom

Until I started taking this course, I had never heard the term "flipped classroom".  So the obvious place to start was to read the first article on the list that defined it.  Upon finishing that article, I found myself wanting to learn more, because I realized I had attempted to teach this way in the past but never felt I did it well, and the second part of the article promised to tell me more about how to do that.  So I ended up clicking the link to the second part to find out how to better prepare myself to teach that way. 

I initially thought a flipped classroom must mean having the students teach the material to the class.  I think that CAN be a version of the flipped classroom, but the key component is creating a lesson that requires students to be prepared up front.  I have been a student in this method many times, as that is how graduate program seminars, where there might be 10 students in a course, work.  Unfortunately, I am stuck in a scenario where I teach large classes of 50+ students.  The larger the classroom, the more likely it is they can hide and not be prepared when they walk in the room. And they operate on that assumption, and they don't prepare.  So I lecture. A lot. And I use question and answer. A lot.  I spoon-feed.  A lot.

But nearly 5 years ago I floated in, last minute, to be a sessional at the University (the big one) and teach a course in Women and Health. In fact, I wrote a blog on my trepidation related to this employment opportunity. (I do apologize ahead of time for the image(s) that will invade your senses if you click on that link... if you are a heterosexual male you may enjoy it though... made you look.)  The class size was about 30, but the course was set up in such a way that there were no exams, so the trick was to motivate the students to show up, because they could do the readings, complete the assignments, never attend class, and still pass. (I felt participation marks just for showing up were not very adult-centred, nor were they women-centred, and this was a feminist course, so I refused to make that an evaluation component. Really, the students have the right to decide how to structure their lives. Come if you want. Don't come if you don't want.) So I used readings and the discussion of those pre-assigned readings as the focus of class time. 

For one topic -- I think it was the medicalization of pregnancy and birth, but it doesn't really matter because it would have worked for nearly any of the topics in the course -- I set up a debate format.  I randomly divided the class in half and had one half take the stance that the technology and testing rampant in pregnancy and birth these days (and for the last 20 years or more) saved lives and benefited mothers and babies.  The other half took the stance that it was unnecessary and took away from person-centred holistic care.  Women giving birth are not numbers on a monitor printout.  I gave them class time to prepare for their debate, and then the two sides had a discussion -- I didn't demand debate rules.  But they had to pre-read to have that discussion, so it was a flipped classroom before I'd ever heard the term.

It was an interesting exercise because some people ended up on the side they strongly agreed with, while others disagreed with their assigned side of the argument and had to stretch their thinking. Never a bad thing.  As I recall, the debate went well.  I didn't have to do much to keep the discussion going (I just had to redirect and keep it respectful and peaceful).  There was some prep up front, but it wasn't as intensive as preparing a lecture.  There are some topics that just shouldn't be lectured on, and I am hoping to recreate this class at RRC, so I hope to use the flipped classroom a lot at some point in the near future.

Portfolios and Student Centered Learning

It's been a couple of years since I posted in this blog, but a need has arisen to resurrect it temporarily in the form of a class I am taking. So if any old followers come across this... bear with me; no, I am not planning to come back to write more lengthy reflections on bikes, life, love and the pursuit of happiness. 

(P.S. I also have no idea how to stay within word requirements, and 300 words is not enough. I tend not to be very good at following rules if they don't make sense to my personal reality.)

One of my roles in the Nursing Department is quality control of the scholarly writing assignments that instructors create and ask students to complete.  I was drawn to the article about portfolio creation related to student-centred learning because I see a huge need to implement something like this in our department.  It is a discussion our curriculum team is having, and my personal bias is for a component of this possible future nursing portfolio to be an amalgamation of every writing assignment students completed in their nursing program.  The portfolio type, as described in the reading, would be a growth portfolio, meaning students, in addition to including their writing assignments, would also reflect on how they've grown as academic writers throughout the program and how that growth has helped them become better nursing practitioners.  But first, the academic writing process in our faculty needs some serious revision in terms of quality of assignments, progression of complexity and demands from first year to third year, stronger evaluation in the form of rubrics, and buy-in from students and staff.  I am very fortunate to be working with a very supportive Chair and faculty who consult me on these issues. 

I have made it a mission in my role on faculty to promote writing across our curriculum. I gave a presentation to our faculty last June on how I believed writing should be threaded in assignment format from first year to third year. My plan involves, much to the delight of every CAE instructor I've worked with, the incorporation of principles of Bloom's taxonomy, writing scaffolding, and self-efficacy theory.  I had a conversation with a college staff member who asked me a very interesting question, because it is his job to ask interesting questions:  "What is the evidence that shows the need for more writing in nursing education?  Do employers want this?"  It's beyond the scope of this brief blog for me to discuss my full answer, but in short: there is some, mostly anecdotal, evidence, and, yes, employers want it, maybe -- at least one former manager told me without hesitation that nursing employers require good written communicators.  The bottom line is we are a Baccalaureate program, and we don't just prepare students for the work world (as most of the rest of RRC's programs do); we also prepare them for graduate school. The role of writing in academic programs is to teach students how to communicate evidence in the language of their profession.  I have a hypothesis that writing makes students stronger nursing practitioners. 

Writing is a hard sell to students.  It's hard work, and it isn't "mickey mouse," as they seem to wish of everything we ask them to do. In nursing, students can't see the connection between writing a paper and starting an IV. And they don't see the connection between scholarly writing and charting, which some students feel is the only writing they should ever have to do.  If they can't see the value in academic writing, it is only because we are not doing a good enough job of teaching that value.  Faculty also struggle with writing assignments, for overlapping reasons. Many faculty have low self-efficacy with their own writing, which makes them doubt their ability to evaluate students. And there is no arguing that assigning a paper in a course doubles an instructor's workload on many levels.  If students grow in their writing, develop self-efficacy, are able to recognize and articulate that growth, and can then connect it with strong nursing critical thinking -- as a portfolio should help them do -- we've built a great nurse. 

We don't choose to teach in a higher education environment unless we feel we were somehow superior students ourselves and have something to share with the younger generations about our experience.  I have every paper I ever wrote in every academic program I have ever participated in (two undergraduate programs and one master's program) coil-bound, with the feedback from various instructors saved (because I am a geek like that).  Every few years something motivates me (usually a teaching experience or a faculty discussion I've had) to go back and flip through this portfolio of sorts and remind myself how far I've come as an academic writer.  It allows me to empathize with students' often valid frustrations with the way their writing is graded.  It also allows me to empathize with some of the ridiculous things we insert into assignments that are frustrating to address.

We also have to complete a portfolio as part of completing the CAE program (although, as an aside, I'm not sure I have to do one because I began the program before that was a stated requirement, and I've heard I may not have to), and I am sure that if I do complete the process I will find it valuable.  I have created many projects in this program, some of which I have used and many of which I have not because they don't fit my teaching environment, and the work has prompted many reflections.