digital dissonance

This applied research proposal was written in Spring 2014 for CCE 588 (Graduation Portfolio and Applied Research Proposal), and is based on an abridged version of the Human Subjects Research Exemption Form from the WWU Office of Research and Sponsored Programs.

Digital Dissonance? An Exploration of Student Experience with Web Technology and Student Attitudes Towards LMS Features

7.  What is your research question, or the specific hypothesis?

Is there a relationship between freshman university student attitudes about online classwork conducted with LMS technology and prior student experience with web technology outside the LMS environment?

Murphy (2012) observes that influences on student attitudes about LMS environments are complex (pp. 829-830), and suggests that students are experiencing “digital dissonance” (pp. 827, 833). This pilot study seeks to measure “digital dissonance” by exploring whether a relationship exists between student attitudes towards LMS features and prior student experience with web technology outside the LMS (Gay et al., 2009, pp. 150-152).

8.  What are the potential benefits of the proposed research to the field?

The goal of this study is to explore potential directions for future research (Gay et al., 2009, p. 200, discussing purposes of relationship studies), including whether first-year undergraduate student satisfaction with online learning experiences may improve if LMS technology becomes more student-centered by adapting to reflect how students use web technology outside of the LMS. This study may help articulate themes related to information control and information management in both LMS and non-LMS environments (e.g., Smith et al., 2011, pp. 1-2, 5-6).

According to the ECAR Study of Undergraduate Students and Information Technology, 2012, “listening to what students say about new or different technologies to integrate into the learning environment could be a wise investment” (Dahlstrom, 2012, p. 7). Wong (2012) notes that “Buzzetto-More (2008) and Sanders and Morrison-Shetlar (2002) report that student attitudes toward technology are influential in determining the educational benefits of online learning resources and experiences” (p. 196). According to Dziuban et al. (2013), “the student voice in higher education is becoming more evident and influential” and “understanding that voice is critical to building an effective learning climate” (p. 7; see also Dahlstrom, 2012, p. 6).

9.  What are the potential benefits, if any, of the proposed research to the subjects?

Students receive no direct benefits, but they will have an opportunity to be heard about their needs and preferences in online learning environments outside of the usual class evaluation process. An opportunity to reflect on LMS technology and non-LMS web tools may also help freshman participants with their orientation to the LMS technology currently in use at their school.

10.  Answer a), then answer either b) or c) as appropriate.

a)  Describe the population your research is designed to study, including the number of subjects.

This research is designed to study a subject population of 50 freshman undergraduate students (Gay et al., 2009, p. 196; Adler, 2014), age 18 or older, who have completed at least one class with LMS components during their first two quarters of enrollment. To control for the variables of course content and instructional methods, student subjects will be drawn from the population of one course with LMS components.

If this study is conducted at Western Washington University (WWU), the freshman population in Fall 2012 is reported as 2,688 enrolled students (Office of the Provost & Office of Institutional Research, n.d.). WWU reports that in Fall 2012, 23% of new freshmen and 21.7% of new transfers “self-reported their racial or ethnic identity as Black/African American, Hispanic/Latino, Asian, American Indian, Alaska Native, Native Hawaiian, or other Pacific Islander,” and the total undergraduate gender distribution for Fall 2012 is reported as 55.5% female and 44.5% male (Western Washington University, 2014).

b)  Describe how you will recruit subjects from your population of interest.

A simple random sampling method will be used to identify subjects (Gay et al., 2009, pp. 124-127). Students will be contacted through the official communication method used by the school (e.g., email with the school logo) and informed that their feedback is valuable to the institution and that responses to the survey will be anonymized (Gay et al., 2009, p. 181).
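As a minimal sketch of this sampling step only (the roster, its size, and the student labels are placeholders invented for illustration, not part of the actual procedure), simple random selection without replacement of the 50 subjects could look like the following:

    import random

    # Hypothetical roster of students enrolled in the selected course (placeholder labels).
    roster = [f"student_{i:03d}" for i in range(1, 301)]  # e.g., 300 enrolled students

    SAMPLE_SIZE = 50  # target subject population described in Question 10(a)

    random.seed(2014)  # fixed seed so the draw can be documented and repeated
    invitees = random.sample(roster, SAMPLE_SIZE)  # simple random sample without replacement
    print(f"{len(invitees)} students selected to receive the recruitment email.")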

The cover letter will inform students that questions about the authenticity of the survey and requests for reasonable accommodations (e.g., audio questions, additional languages) can be directed to the appropriate school administrative offices listed in the letter.

The use of the official institutional communication method and the cover letter is designed to encourage participation by reassuring students about the authenticity of the request to participate, the purpose of the survey, the security of the systems used to record and store responses, and the anonymous nature of their participation.

Students will be required to log into the survey with their university ID number and password. After logging in and reviewing the description of the survey and the anonymity of responses, students will be required to acknowledge their informed consent by clicking an “I agree” button before being permitted to participate in the survey.

Confusion about individual survey items will be addressed by survey language encouraging respondents to make their best guess and to use the comment sections to explain any confusion or other issues with the questions.

11.  Briefly describe the research methodology.

The online survey will consist of three sections. Part 1 asks students to rate their attitudes about various aspects of their online learning experience in response to Likert-scale questions, with an opportunity to add comments about their ratings. Example questions follow this pattern:

Rate your overall satisfaction with discussion boards on [LMS name]  1 2 3 4 5

Please explain [text box]

In Part 2, students are asked to identify the types of web technology that they use, the estimated frequency of use, and the purposes for which they use the technology. An example question is a 3-column table with:

1) A column listing popular web tools and LMS features, with “check all that apply” boxes to indicate which tools the respondent uses. This column includes text boxes so the respondent can add rows for “other” web technologies not listed in the table.

2) A column with a list of “check all that apply” options to describe the purpose of each tool listed in Column 1. The list of purposes will include research, discussion, and writing, with a text box to add “other” purposes not listed.

3) A column with “fill in the blank” questions about estimated frequency of use, via a drop-down menu to select ranges of frequency (1-5, 6-10, etc.) and increments (per day/week/month/year) for each purpose.

Part 3 asks students to report demographic data, including age, socioeconomic background, prior education, and prior LMS experience.

This basic correlational research design will produce results expressed as a correlation coefficient to indicate the degree of relationship between variables (Gay et al., 2009, p. 197).

Quantitative data will be analyzed with the Spearman rho, which is appropriate for small samples and the ranked ordinal data collected from Likert-scale questions (Gay et al., 2009, pp. 146, 201-202, 318). An example of the ranked data analysis for the LMS feature “discussion board” is to rank students by reported frequency of engagement in non-LMS online discussion, and then correlate the ranked data with Likert-scale data about attitudes towards LMS discussion boards.
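A minimal sketch of this ranked-data analysis, using invented values and assuming the SciPy library is available (the numbers below are placeholders, not collected data):

    from scipy.stats import spearmanr

    # Hypothetical responses for ten students (placeholders only):
    # estimated non-LMS online discussion use (times per week, from Part 2)
    non_lms_discussion_freq = [0, 2, 5, 1, 8, 3, 0, 6, 4, 7]
    # Likert-scale satisfaction with the LMS discussion board (1 = low, 5 = high, from Part 1)
    lms_discussion_attitude = [2, 3, 4, 2, 5, 3, 1, 4, 4, 5]

    # spearmanr ranks each variable and correlates the ranks, which suits
    # small samples and ordinal Likert data.
    rho, p_value = spearmanr(non_lms_discussion_freq, lms_discussion_attitude)
    print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")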

Qualitative data will be coded for themes that emerge from written comments and analyzed to help determine the degree of validity of the pilot test instrument, that is, whether it “measures what it is supposed to measure and, consequently, permits appropriate interpretation of scores” (Gay et al., 2009, p. 154; see also p. 158, e.g., unclear test directions, confusing or ambiguous terms). In addition, qualitative data may help assess reliability, or “whether the data would be collected consistently if the same techniques were utilized over time” (Gay et al., 2009, pp. 162, 378), because written comments may articulate situational contexts influencing the quantitative responses (p. 463).
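As an illustrative sketch only (the theme labels and comments below are invented for this example), coded comment themes could be tallied to see which instrument problems recur:

    from collections import Counter

    # Hypothetical coded comments: each written comment is assigned one or more
    # theme codes during qualitative coding (labels are placeholders).
    coded_comments = [
        {"comment": "Not sure what counts as a 'web tool' here.", "themes": ["ambiguous term"]},
        {"comment": "The directions for Part 2 were confusing.", "themes": ["unclear directions"]},
        {"comment": "I only use the boards because they are graded.", "themes": ["situational context"]},
        {"comment": "What does 'discussion' include?", "themes": ["ambiguous term"]},
    ]

    theme_counts = Counter(theme for entry in coded_comments for theme in entry["themes"])
    for theme, count in theme_counts.most_common():
        print(f"{theme}: {count}")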

12.  Give specific examples (with literature citations) for the use of your test instruments/ questionnaires, or similar ones, in previous similar studies in your field.

Dziuban et al. (2013) emphasize the importance of assessing ambivalence about online learning, noting that students may express “simultaneous positive and negative feelings toward their online experiences” (pp. 2-3). The Likert-scale questions and opportunity to provide comments in this proposed study incorporate the reasoning by Dziuban et al. (2013, p. 3) to record a range of attitudes and permit comments that validate the ambivalent nature of student responses.

The design of this proposed study also reflects discussion by Dziuban et al. (2013, p. 6) about their survey results, which are described as confirming “Craig and Martinez’s (2005) contention that attitudes indexed by such things as student satisfaction with online learning are much more complex than commonly accepted.” This study seeks to assess a range of student attitudes that includes ambivalence towards online learning experiences in the LMS environment, and goes beyond the factors measured by Dziuban et al. (2013) by including student experience with web technology outside the LMS environment.

Wong (2012) conducted “a student survey on attitudes towards e-learning which rates the quality and usefulness of the online teaching materials to support student learning” (p. 197), with a 4-point rating scale to assess satisfaction with various course components (p. 204). As noted above, the 5-point Likert scale used by Dziuban et al. (2013) appears more appropriate to assess possible ambivalence in student attitudes, but similar to Wong (2012, p. 204), this proposed study will assess categories of LMS features (e.g., discussion boards, chat, wikis, etc.).

This proposed study will expand on the category approach by Wong (2012) to also seek data related to web experience outside of an LMS, which may be a relevant dimension of student attitudes about various LMS features. Wong (2012) used an open-ended question to solicit student suggestions about additional features “to be included on the [course] website to help with your learning,” and the data include references to tools outside of the LMS (e.g., Skype, Facebook, Twitter) (p. 205); see also Murphy (2012, p. 830, discussing Google products).

13.  Describe how your study design is appropriate to examine your question or specific hypothesis. Include a description of controls used, if any.

This cross-sectional survey collects data at one point in time (Gay et al., 2009, p. 176) and seeks to examine a potential relationship between student attitudes towards LMS technology and prior student experience with web technology outside the LMS environment (p. 200). To control for the variables of course content and instructional methods, student subjects will be drawn from the population of one course with LMS components.

Due to the complex and potentially ambivalent nature of attitudes, this mixed methods study will collect both quantitative and qualitative data, and does not seek to address causation (Gay et al., 2009, p. 200). Demographic data, such as prior education and LMS experience (e.g., Naveh et al., 2010), will help assess the validity of this “snapshot of the current behaviors, attitudes, and beliefs in a population” (Gay et al., 2009, p. 176).

According to Gay et al. (2009), self-report instruments, including attitude scales, have “notable limits” (p. 153). This study collects data after the usual post-course evaluation conducted by the school to minimize possible “response sets” carried over from class evaluations, and this survey will also offer anonymity to encourage honest responses (Gay et al., 2009, p. 153).

According to Gay et al. (2009), “a minimally acceptable sample size is generally 30 participants” in a correlational study (p. 196). Because this is a non-validated survey, this proposed study is a pilot test intended to provide “information about deficiencies and suggestions for improvement” (Gay et al., 2009, pp. 181, 196; see also Smith et al., 2011, p. 10).

This study will include questions about student demographics because it “has been acknowledged that age, sex, socio-economic background and ethnicity contribute to and shape students’ expectations of university, their adjustment to being university students, and ultimately their overall teaching and learning experience (McInnes, James, & Mc Naught, 1995)” (Wong, 2012, p. 197). As noted by Gay et al. (2009), “possible explanations for certain attitudes and behaviors can be explored by identifying factors that seem to be related to certain responses […] only if demographic information about the respondents is collected on the questionnaire” (p. 186). The collection of demographic data will help assess the validity of the instrument (Gay et al., 2009, pp. 290, 539) and whether a representative sample of the school population has been obtained (pp. 124-125). However, this study does not seek to become a “correlational treasure hunt” (Gay et al., 2009, p. 196), and will focus on variables related to the frequency and nature of student web technology use and student attitudes about LMS technology (p. 201).

The generalizability of this study is limited by the rapid pace of development in web technology and its use by students; as noted by the Education Week Research Center in 2011, “the kinds of studies that produce meaningful data often take several years to complete—a timeline that lags far behind the fast pace of emerging and evolving technologies” (Education Week Research Center, 2011).

While a snapshot of a single point in time “often does not provide a broad enough perspective to inform decisions about changes in processes and systems” in a reliable manner (Gay et al., 2009, p. 176), the goal of this study is to help develop directions for future research (pp. 200, 463).

14.  Give specific examples (with literature citations) for the use of your study design, or similar ones, in previous similar studies in your field.

Smith et al. (2011) note that “learners may have preexisting attitudes about the value of instructional multimedia based on their past learning experiences and knowledge of their personal learning preferences” (p. 2), but do not specifically address prior experience with web technology in the Likert-scale and open-ended questions used to assess student attitudes about instructional multimedia (pp. 5, 7-8, 10-11).

In the ECAR National Study of Undergraduate Students and Information Technology, 2011, Dahlstrom et al. report on survey research that found “the average student spends at least some time engaging in about 21 different kinds of software applications and activities out of 40 they were asked about” (2011, p. 13). This proposed study will also offer respondents an opportunity to describe suggested applications in higher education, similar to Dahlstrom et al. (2011), who found:

“many students go out of their way in making these optional comments to say that they are drawn to unconventional uses of technology in higher education, citing the potential for learning with online multi-user games, educational games and simulations, social studying sites, e-portfolios, geotagging, and music, among other technologies. Quantitatively, while each of these technologies currently attracts only a minority of interest in the survey—perhaps because instructors rarely model them in an academic context—collectively they indicate a realm of potential future expansion of educational value.” (p. 29)

Additional details related to the study design are described in the response to Question 12.

15.  Describe how you will address privacy and/or confidentiality.

Student responses will be anonymized and encrypted before delivery to the researchers (Gay et al., 2009, p. 20). Students will verify their identity with their student ID number and password to access the survey, but identifying information will be stripped from the encrypted data sent to researchers.

Students will review the consent form on the survey launch page and select “I have reviewed the consent form above and I agree to participate in this survey” before being permitted to log in. Responses will be stored on an encrypted and password-protected server that only the research team is authorized to access.
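A minimal sketch of the anonymization step (the field names and token scheme are assumptions for illustration, not a description of the survey platform actually in use):

    import secrets

    def anonymize(response: dict) -> dict:
        """Return a copy of a survey response with identifying fields removed."""
        identifying_fields = {"student_id", "name", "email"}  # assumed identifiers
        cleaned = {k: v for k, v in response.items() if k not in identifying_fields}
        # Replace the identity with a random token so the stored record cannot be
        # linked back to an individual student.
        cleaned["respondent_token"] = secrets.token_hex(16)
        return cleaned

    raw = {"student_id": "W01234567", "q1_satisfaction": 4, "q1_comment": "Easy to use."}
    print(anonymize(raw))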

References:

Adler, J. (2014, April 28). Does the Science of Human Behavior Only Show Us What We Want to See? Pacific Standard. Retrieved from http://www.psmag.com/navigation/health-and-behavior/can-social-scientists-save-themselves-human-behavior-78858/

Dahlstrom, E., de Boor, T., Grunwald, P., & Vockley, M. (2011). The ECAR National Study of Undergraduate Students and Information Technology, 2011 (Research Report). Boulder, CO: EDUCAUSE Center for Applied Research. Retrieved from http://www.educause.edu/ecar

Dahlstrom, E. (2012). ECAR Study of Undergraduate Students and Information Technology, 2012 (Research Report). Louisville, CO: EDUCAUSE Center for Applied Research. Retrieved from http://www.educause.edu/ecar

Dziuban, C., Moskal, P., Kramer, L., & Thompson, J. (2013). Student satisfaction with online learning in the presence of ambivalence: Looking for the will-o’-the-wisp. The Internet and Higher Education, 17, 1–8. doi:10.1016/j.iheduc.2012.08.001

Education Week Research Center. (2011, September 1). Technology in Education. Education Week. Retrieved from http://www.edweek.org/ew/issues/technology-in-education/

Gay, L. R., Mills, G. E., & Airasian, P. (2009). Educational Research: Competencies for Analysis and Applications (9th ed.). New Jersey: Pearson Education.

Lehman, R. M., & Conceição, S. C. (2013). Motivating and Retaining Online Students: Research-based Strategies that Work. John Wiley & Sons. (Citing Dahlstrom et al., 2011, at p. 10.)

Murphy, J. (2012). LMS Teaching Versus Community Learning: A Call For the Latter. Asia Pacific Journal of Marketing and Logistics, 24(5), 826–841. doi:10.1108/13555851211278529

Naveh, G., Tubin, D., & Pliskin, N. (2010). Student LMS use and satisfaction in academic institutions: The organizational perspective. The Internet and Higher Education, 13(3), 127–133. doi:10.1016/j.iheduc.2010.02.004

Office of the Provost, & Office of Institutional Research. (n.d.). Key Performance Indicators. Western Washington University. Retrieved May 11, 2014, from https://jweb-admcs.wwu.edu/KPI/

Smith, A. R., Cavanaugh, C., & Moore, W. A. (2011). Instructional multimedia: An investigation of student and instructor attitudes and student study behavior. BMC Medical Education, 11(1), 38. doi:10.1186/1472-6920-11-38

Western Washington University. (2014, February 27). Diversity: Diversity Statistics. Retrieved May 11, 2014, from http://www.wwu.edu/diversity/stats.shtml

Wong, L. (2012). Student Attitudes towards E-Learning: The First Year Accounting Experience. Issues in Informing Science & Information Technology, 9.
