In the process of designing a survey for my research on increasing participation in Year Lead and Unit tutorials, I faced the challenge of creating questions that could capture the nuanced experiences and barriers faced by students. Drawing on both survey design best practices and insights from relevant literature, I aimed to develop questions that are clear, inclusive, and relevant to my objectives of participation, social justice, and engagement. Here, I’ll discuss how I approached creating the survey questions and the rationale behind their design.
Step 1: Establishing Clear Objectives and Aligning with Research Aims
To begin, I revisited the core objectives of my research: understanding and addressing the barriers students encounter in tutorial participation, with a specific focus on social, cultural, and racial aspects. This focus aligned with Decolonising the University by Bhambra, Gebrial and Nişancıoğlu (2018), which emphasises the importance of representing diverse experiences in educational settings. My aim was to design questions that would allow students to communicate their unique experiences, ensuring the survey itself was inclusive and mindful of potential cultural barriers.
The literature also highlighted that the survey should not only address barriers to participation but also uncover students’ comfort levels and engagement methods, which can provide insights into potential interventions. This is inspired by Braun and Clarke’s (2022) thematic analysis approach, which encourages a balance between structured data and rich, nuanced responses.
Step 2: Crafting Clear and Standardised Questions
Drawing from Converse and Presser’s (2011) Survey Questions: Handcrafting the Standardized Questionnaire, I prioritised clarity and simplicity in wording to avoid misunderstandings. This involved removing jargon, refining vague terms, and ensuring questions could be easily interpreted across diverse language backgrounds. For example, rather than asking broadly about “barriers to participation,” I specified options such as “language differences” and “disability impacts,” alongside an open-ended option where respondents could describe factors in their own words, such as previous educational experiences, allowing them to select the answers most relevant to their situations.
Step 3: Structuring Questions for Depth and Relevance
In developing the questions, I combined closed-ended and open-ended formats. Closed-ended questions allowed me to gather standardised data across key themes, such as:
- “How often do you attend Unit Tutorials?” (scaled response)
- “Do you feel that any of the following factors affect your ability to participate in tutorials?” (select all that apply, with an optional follow-up)
However, inspired by Feigenbaum and Alamalhodaei’s The Data Storytelling Workbook (2020), I also incorporated open-ended prompts, such as “What do you think could improve the effectiveness of Year Lead and Unit tutorials?” This approach allows students to share their stories, helping me capture richer insights that may not fit neatly within predetermined categories. These narrative responses will also support deeper thematic analysis later in the research process.
Step 4: Emphasising Inclusivity and Cultural Sensitivity
Bhambra et al. (2018) emphasise the need to consider cultural perspectives when designing research questions, particularly in a diverse cohort. Therefore, I included questions that explicitly acknowledge different cultural experiences, such as:
- “Do you feel that your background and experiences are reflected in the content and discussions during tutorials?”
- “What suggestions do you have for making tutorials more inclusive and supportive of all students?”
These questions help address the research’s social justice focus by recognising and respecting each student’s background and potential challenges.
Step 5: Testing and Revising for Accessibility and Ethical Considerations
With guidance from the British Educational Research Association’s (2024) Ethical Guidelines, I ensured that participation in the survey was voluntary, that each question was respectful, and that the design protected confidentiality. For example, I avoided questions that could be too personal or intrusive, focusing instead on general experiences rather than specifics that could inadvertently disclose private information.
In addition, I reviewed the questions to ensure they adhered to accessibility standards, making the survey clear and approachable for students of various backgrounds and abilities. Based on Kember and Ginns’ (2011) Evaluating Teaching and Learning, accessible survey design is essential to achieving valid and reliable responses, especially in diverse educational settings.
Step 6: Refining Based on Literature and Expert Insights
The literature review process provided further insights into refining the survey. By integrating techniques from the Pew Research Center (2021) and Bell and Waters’ Doing Your Research Project (2014), I adjusted questions to capture both behavioural and emotional aspects of engagement. For instance, instead of asking broadly about participation, I included questions probing both attendance and students’ comfort in expressing opinions during tutorials.
Final Reflections on the Survey Design Process
The process of crafting this survey highlighted the importance of detailed planning, ethical considerations, and alignment with my research objectives. By grounding my questions in survey design theory and literature, I aimed to create a tool that not only captures relevant data but also respects the experiences of each participant. This approach ensures that the survey can yield actionable insights into improving tutorial participation, addressing the very barriers that might otherwise prevent students from fully engaging in their learning experience.
References
- Bell, J. and Waters, S. (2014) Doing Your Research Project: A Guide for First-Time Researchers. 6th edn. Maidenhead: Open University Press.
- Bhambra, G. K., Gebrial, D. and Nişancıoğlu, K. (eds.) (2018) Decolonising the University. London: Pluto Press.
- Braun, V. and Clarke, V. (2022) Thematic Analysis: A Practical Guide to Understanding and Doing. Thousand Oaks: Sage. Available at: https://bit.ly/3ICHBEr (Accessed: 20 October 2024).
- British Educational Research Association (2024) Ethical Guidelines for Educational Research. 5th edn. Available at: https://www.bera.ac.uk/publication/ethical-guidelines-for-educational-research-fifth-edition-2024 (Accessed: 20 October 2024).
- Converse, J. M. and Presser, S. (2011) Survey Questions: Handcrafting the Standardized Questionnaire. Thousand Oaks: Sage. Available at: https://methods-sagepub-com.arts.idm.oclc.org/book/survey-questions/n3.xml (Accessed: 20 October 2024).
- Feigenbaum, A. and Alamalhodaei, A. (2020) The Data Storytelling Workbook. London: Routledge.
- Kember, D. and Ginns, P. (2011) Evaluating Teaching and Learning: A Practical Handbook for Colleges, Universities and the Scholarship of Teaching. London: Routledge. Available at: https://doi.org/10.4324/9780203817575 (Accessed: 20 October 2024).
- Pew Research Center (2021) Writing Survey Questions: Guidance for Effective Survey Design. Available at: https://www.pewresearch.org (Accessed: 15 October 2024).