Toolkit for Assessing and Monitoring Leadership and Safety Culture
Published in Cindy L. Caldwell, Safety Culture and High-Risk Environments, 2017
The results of a questionnaire are not valid unless they represent the surveyed population, and response rate is a commonly accepted indicator of representativeness. Questionnaire response rates are best addressed during the design and data collection phases of the assessment: by pre-testing the survey, extending the data collection period, and sending reminders while data collection is under way. While the survey is being conducted, it is advisable to monitor response rates. Survey research expert Babbie (2007, p. 262) asserts that “a response rate of at least 50 percent is considered adequate for analysis and reporting. A response of 60 percent is good; a response rate of 70 percent is very good.” Many experts agree that when the response rate falls below 50%, the data should be evaluated for non-response bias (Babbie, 2007). Non-response bias is the bias that results when respondents differ in meaningful ways from non-respondents. Many factors can drive non-response: for example, non-respondents may be reluctant to participate, too busy to respond, or hold negative beliefs about how the organization handles survey data. Substantial differences between respondents and non-respondents make it difficult to assume representativeness across the entire population (Dillman, 1999). One method to check for non-response bias is to compare response rates across key subgroups of the target population (Groves, 2006). This may point to subgroups that could be underrepresented or support the representativeness of the responses across the surveyed population.
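The subgroup comparison described here (Groves, 2006) can be sketched in a few lines of code. The department names, invitation counts, and the 50% flagging threshold below are illustrative assumptions, not data from the chapter:

```python
# Hypothetical example: checking for non-response bias by comparing
# response rates across key subgroups of the target population.
# All names and counts are invented for illustration.

def response_rate(responded, invited):
    """Response rate as a fraction of those invited to take the survey."""
    return responded / invited

# Invited vs. responded counts per subgroup (illustrative data).
subgroups = {
    "Operations":  {"invited": 120, "responded": 84},
    "Maintenance": {"invited": 80,  "responded": 36},
    "Management":  {"invited": 40,  "responded": 30},
}

overall_invited = sum(g["invited"] for g in subgroups.values())
overall_responded = sum(g["responded"] for g in subgroups.values())
print(f"Overall response rate: {response_rate(overall_responded, overall_invited):.0%}")

for name, g in subgroups.items():
    rate = response_rate(g["responded"], g["invited"])
    # Flag subgroups below the commonly cited 50% adequacy threshold.
    flag = "  <- possible under-representation" if rate < 0.5 else ""
    print(f"{name}: {rate:.0%}{flag}")
```

In this invented example the overall rate looks adequate, but the breakdown flags one subgroup well below 50%, which is exactly the pattern a subgroup comparison is meant to surface.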
Classical Statistics and Modern Machine Learning
Published in Mark Chang, Artificial Intelligence for Drug Development, Precision Medicine, and Healthcare, 2020
Response bias refers to a wide range of tendencies for participants to respond inaccurately or falsely to questions. Response bias can be induced or caused by numerous factors, all relating to the idea that human subjects do not respond passively to stimuli, but rather actively integrate multiple sources of information to generate a response in a given situation. Sources of response bias include the phrasing of survey questions, the demeanor of the researcher, the way the experiment is conducted, and the participant's desire to be a good experimental subject and to provide socially desirable responses. Any of these can distort the responses obtained.
An intelligent socially assistive robot-wearable sensors system for personalized user dressing assistance
Published in Advanced Robotics, 2023
Fraser Robinson, Zinan Cen, Hani Naguib, Goldie Nejat
Participant pool bias may be present in the feedback received from relevant stakeholders, given that the conferences focused on various types of technology. However, stakeholder familiarity with different types of technology reduces the risk of new information bias, as it has been shown that respondents unfamiliar with new technology can actually rate novel systems more positively [59]. Completion of our questionnaire required individuals to approach the demonstration. This may introduce response bias, in which respondents who actively seek to provide feedback give more extreme responses, both positive and negative [60]. However, even though we observed some extremely positive and negative responses in our questionnaire results, the overall median ratings showed that the majority of responses were moderate, suggesting response bias did not have a significant effect on our results.
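The check the authors describe relies on the median being robust to a handful of extreme values. A minimal sketch with invented ratings (not the study's data) shows the idea: a moderate median despite extreme tails suggests self-selected extreme responses did not dominate:

```python
# Illustrative check with hypothetical 5-point ratings, not the study's
# actual data: a few extreme responses at both ends pull the mean but
# leave the median at the moderate center.
from statistics import mean, median

ratings = [1, 1, 3, 3, 3, 4, 3, 4, 3, 5, 5, 3, 4, 3]

print(f"mean = {mean(ratings):.2f}, median = {median(ratings)}")
```

A median sitting at the midpoint of the scale, as here, is consistent with a mostly moderate response distribution even when individual ratings reach both extremes.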
The Effects of Organizational Structure on MBSE Adoption in Industry: Insights from Practitioners
Published in Engineering Management Journal, 2023
Kaitlin Henderson, Alejandro Salado
Non-response bias occurs when the people from the sample population who responded have different characteristics from those who did not respond, calling into question whether the sample results can be generalized to reflect the true population (Rogelberg & Stanton, 2007). Nonresponse is a common concern with survey distribution, and appears to be getting worse over time (Rogelberg & Stanton, 2007). A low response rate does not automatically mean the results are biased, but there are several ways that non-response can affect results. One way this can occur is when the survey topic elicits strong opinion-based responses (e.g., gun control) (Wells et al., 2012). In other words, people with extreme opinions in either direction may be overrepresented because they feel more compelled to respond to the survey. In the example from Wells et al. (2012), the respondents were general college students who tended to be strongly pro- or anti-gun control, and the underrepresented portion of the population was people with mid-level opinions. The topic of this survey, MBSE, is also tied to the sample population: the respondents had to have used MBSE in an organization for at least one year. Since the questions in this survey are largely not opinion-based, this type of non-response bias should not be an issue.
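The mechanism described here (cf. Wells et al., 2012) can be illustrated with a small simulation. The opinion scale and response probabilities below are invented assumptions, not figures from the paper; the point is only that when extreme opinions respond more readily, they become overrepresented among respondents:

```python
# Hypothetical sketch of opinion-driven non-response bias: people with
# extreme opinions (1 or 5 on a 5-point scale) are assumed to respond
# far more often than moderates. All numbers are invented.
import random

random.seed(0)

# True population: opinions on a 1-5 scale, mostly moderate.
population = [random.choice([1, 2, 3, 3, 3, 4, 5]) for _ in range(10_000)]

def responds(opinion):
    """Assumed response probabilities: 80% for extremes, 30% for moderates."""
    return random.random() < (0.8 if opinion in (1, 5) else 0.3)

def extreme_share(opinions):
    """Fraction of opinions at the extremes of the scale."""
    return sum(o in (1, 5) for o in opinions) / len(opinions)

respondents = [o for o in population if responds(o)]

print(f"Extreme share in population:  {extreme_share(population):.0%}")
print(f"Extreme share in respondents: {extreme_share(respondents):.0%}")
```

Under these assumed probabilities, roughly 2/7 of the population holds an extreme opinion, but the respondent pool's extreme share is markedly higher, which is the overrepresentation the authors argue does not apply to their largely non-opinion-based questions.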
Attitudes towards gamification advertising in Vietnam: a social commerce context
Published in Behaviour & Information Technology, 2023
Hai Ho Nguyen, Bang Nguyen-Viet, Yen Thi Hoang Nguyen
The participants were recruited from two sources: a primary sample of university students and a secondary sample of respondents at tea/coffee shops in Ho Chi Minh City, the largest city in Vietnam. College students were considered appropriate participants for this study because they not only interact with gamification advertising on s-commerce platforms but are also major consumers of several popular social applications in Vietnam. The respondents were informed of the objectives of the study and the confidentiality of the data to counteract non-response bias. Prior to the final survey, a preliminary test was conducted with 50 respondents to ensure the structure and validity of the survey instrument. The survey instrument was developed using Google Forms, and a link to the questionnaire was generated.