Challenges of online non-probability surveys
Published in Uwe Engel, Anabel Quan-Haase, Sunny Xun Liu, Lars Lyberg, Handbook of Computational Social Science, Volume 2, 2021
And then came the internet. Around 1995 the first online surveys emerged. These surveys are almost always self-administered: respondents visit the survey website and complete the questionnaire by answering the questions. Online surveys rapidly became popular. This is not surprising, as online data collection has some attractive properties:

- Many people are connected to the internet, so an online survey is a simple means of getting access to a large group of potential respondents.
- Online surveys are cheap compared to other modes of data collection. No interviewers are required, and there are no travel, printing, or mailing costs.
- A survey can be set up and carried out very quickly. It is just a matter of designing a questionnaire and putting it on the internet.

So online surveys seem to be an easy, cheap, and fast means of collecting large amounts of data. Moreover, one does not have to be a survey expert to conduct an online survey. Everyone can do it. There are many software tools on the internet, some of them free, for setting up and carrying out an online survey. A popular example is SurveyMonkey.
Research Users
Published in Brian Still, Kate Crane, Fundamentals of User-Centered Design, 2017
Locating your users’ networks can help you reach representative users. Your survey can have the highest possible return rate, but if you are not getting data from representative users, you are pretty much wasting your time. However, if you find professional associations, discussion boards, or wherever the users are, you can target those networks for your survey sample. Luckily, you have several options for distributing surveys. If you want to send surveys to a large organization, perhaps through a listserv or newsletter, you can embed a link to an online survey. Survey formats offered by SurveyMonkey, Qualtrics, Google Forms, and Question Pro (just to name a few) are either free or affordable options that accommodate most researchers’ needs. In cases in which your users may not be as computer literate, you can still print and distribute surveys at the users’ location and ask for them to be returned by mail or to a collection box where the surveys remain anonymous. Still another option is to mail out surveys and ask participants to return them by mail. The print option is more expensive, and there is no guarantee that surveys will be returned. However, if you know the participants are more likely to respond in print than to an online survey, this may be the best option.
Undertaking a research project
Published in Perry R. Hinton, Isabella McMurray, Presenting Your Data with SPSS Explained, 2017
Many surveys are carried out using online survey computer programs. The Applied Studies Student Survey can easily be run online. However, a key consideration is the convenience and ease of completion for the participants. While running a survey online has its advantages, such as the data being automatically stored for each participant, the researchers need to make sure that the participants have access to an input device with which to complete the survey (such as a smartphone, tablet or computer). This is not always convenient for all participants, so a paper-and-pen version of a survey, which might appear old-fashioned in this digital age, can have distinct advantages over online presentation. In the case of the Applied Studies Student Survey, the researchers know that the students have a final class at the end of the academic year, providing them with information about their second-year studies. Allocating 10 minutes at the end of this session to fill in the survey means that most students who wish to complete it will be available. Handing out copies of the survey to be completed at the end of the session means that the researchers are available to answer any questions about the survey immediately, and the response rate is likely to be significantly higher than if the students were simply sent an online link to complete the survey.
Exploring Volunteer Motivation, Identity and Meaning-Making in Digital Science-Based Research Volunteering
Published in International Journal of Human–Computer Interaction, 2022
Khushnood Z. Naqshbandi, Yun-Hee Jeon, Naseem Ahmadpour
Our research was approved by the ethics committee at the University of Sydney (reference number 2018/680). This research was conducted at a time when StepUp volunteers were not yet assigned to any dementia research. We chose to use an online survey as it is a fast and efficient way of collecting information from many participants. REDCap, an online survey tool, was used to disseminate the survey. The participant information sheet (PIS) was integrated into the survey and shown to participants before they started the survey. Participant consent was obtained via submission of their responses. The survey was piloted by the research team and a few pilot participants to ensure the clarity of the questions. The survey was then advertised on the StepUp for Dementia platform, after which all registered volunteers were notified about the availability of this study.
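For readers unfamiliar with REDCap, it also exposes a REST API for retrieving the collected responses. The sketch below is not part of the original study; it is a minimal illustration of exporting survey records with Python's requests library, assuming a hypothetical endpoint URL and a placeholder API token that a real project would replace with its own institution's values.

```python
# Hedged sketch: exporting survey responses from a REDCap project via
# its REST API. The endpoint and token are placeholders, not the
# study's actual project details.
import requests

REDCAP_URL = "https://redcap.example.edu/api/"  # hypothetical endpoint
API_TOKEN = "YOUR_API_TOKEN"                    # placeholder token

payload = {
    "token": API_TOKEN,
    "content": "record",  # export stored records (survey responses)
    "format": "json",     # JSON output; csv and xml are also supported
    "type": "flat",       # one row per record
}

resp = requests.post(REDCAP_URL, data=payload, timeout=30)
resp.raise_for_status()
records = resp.json()
print(f"Exported {len(records)} survey responses")
```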
Exploring the relationship between student-perceived faculty encouragement, self-efficacy, and intent to persist in engineering programs
Published in European Journal of Engineering Education, 2021
Hsien-Yuan Hsu, Yanfen Li, Suzanne Dugger, James Jones
A few limitations of the current exploratory study point to future research needs. First, the response rate of the survey was 28.24%. Declining response rates for college survey participation have received extensive attention (Sarraf 2019). Although low response rates may or may not result in non-response bias (Fosnacht et al. 2017), the findings of this study should be verified by future replication studies. In addition, empirically based strategies could be introduced to increase the response rate of online surveys. For example, Brown et al. (2016) found that a post-paid incentive can significantly increase the response rate to an online survey and that participants preferred a cash incentive over an e-certificate. Second, we focused on major cognitive factors (self-efficacy and outcome expectations) rather than measuring all the variables of the SCCT choice model, such as students’ interests (e.g. interest in solving complicated technical problems) or contextual factors. Future studies could comprehensively collect data on other variables of the SCCT choice model and extend the examination of the direct or indirect relation between students’ perception of faculty encouragement and other SCCT variables.
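For reference, a response-rate figure such as the 28.24% quoted above is simply the ratio of completed surveys to invitations sent. A toy Python illustration with hypothetical counts (the study does not report its invitation total here):

```python
# Toy illustration of a survey response-rate calculation; the counts
# are hypothetical, not the study's actual sample sizes.
invitations_sent = 1000
completed_surveys = 282

response_rate = completed_surveys / invitations_sent
print(f"Response rate: {response_rate:.2%}")  # Response rate: 28.20%
```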
Factors That Influence Employees’ Security Policy Compliance: An Awareness-Motivation-Capability Perspective
Published in Journal of Computer Information Systems, 2018
Xiaofeng Chen, Liqiang Chen, Dazhong Wu
The participants of this study are employees of a mid-size regional university in the northwest of the USA. The office of the CIO of the university approved our request to conduct an online survey regarding employee ISP compliance intentions. We used Qualtrics.com to host the online survey. The survey ran for four weeks, with two e-mail invitations sent to the employees of the university: the first invitation was sent to all university employees on the first day of the first week of the quarter, and a follow-up invitation was sent at the end of the second week. The survey was closed at the end of the fourth week. A total of 406 employees completed the survey. Of these, 175 surveys were missing some measurement values and were excluded from the analysis. The characteristics of the 231 remaining participants are shown in Table 1.
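The screening step described above, excluding responses with missing measurement values before analysis, amounts to listwise deletion. A minimal pandas sketch of that step follows; the file name and data layout are hypothetical, not the authors' actual code or data. Listwise deletion is the simplest choice here, at the cost of sample size; imputation would be an alternative when missingness is heavy.

```python
# Hedged sketch of listwise deletion: exclude any completed survey
# that is missing one or more measurement values. The file name is
# hypothetical; the counts in the comments mirror the text above.
import pandas as pd

raw = pd.read_csv("isp_survey_responses.csv")  # e.g. 406 completed surveys
complete = raw.dropna()                        # drop rows with any missing value
excluded = len(raw) - len(complete)            # e.g. 175 excluded

print(f"Excluded {excluded} incomplete responses; "
      f"{len(complete)} retained for analysis")
```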