Data Collection: Exploring Participation and Engagement in Tutorials

Introduction

As part of my Action Research Project (ARP), I set out to explore a key question: How can participation methods and dialogue-building techniques from large cohort courses such as the BA (Hons) in Fashion Buying and Merchandising be adapted to increase equitable engagement in Year Lead and Unit tutorials, addressing social, cultural, and racial barriers?

Through a combination of student and staff surveys, I aimed to identify the barriers to tutorial engagement, discover effective strategies, and ultimately develop actionable recommendations. This blog post details my journey, the methods used, and the analytical process leading to a unified thematic analysis. It also showcases how I overcame challenges in data collection and analysis.

Data Collection Tools

To gather data from students and staff, I designed two surveys comprising both closed- and open-ended questions; screenshots of the questions are in the attached Word document. The surveys captured quantitative data on attendance patterns and participation levels, alongside qualitative insights from open-text responses.

Student Survey Design

  • Questions Covered: Attendance patterns, barriers to participation, comfort levels, perceptions of inclusivity, and preferred tutorial formats.
  • Key Actions Taken:
    • Initially distributed via a PowerPoint slide shown at the end of lectures.
    • Followed up with weekly email reminders through the “Week Ahead” student communications.
    • Shifted to a face-to-face survey collection at the start of a lecture session, which significantly boosted the response rate from 13 to 39 students (81.25% engagement).
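
A quick arithmetic check on that figure: if the 81.25% is calculated against the students present at that session, it implies an attendance of 48, since 39 ÷ 48 = 81.25%.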

Staff Survey Design

  • Questions Covered: Experience levels, perceived student barriers, effective participation methods, use of dialogue-building techniques, and inclusivity challenges.
  • Key Actions Taken:
    • Distributed during a team meeting and followed up via email.
    • Achieved a 50% response rate (9 out of 18 staff members).

Visual Example: I have included screenshots of the PowerPoint slide used to introduce the student survey, the survey links, and snippets from the email communications and reminders.


PowerPoint slide:


Student Survey:

https://forms.office.com/Pages/DesignPageV2.aspx?subpage=design&FormId=xClkjH8We0e4y3fugnWNETGUkxytLPVFlXk5JEXOoU5UMlZXN1BHS0NKMDI4UlREQUdFRUpVMU1BNS4u&Token=f82886868ae5434fa73b80b9c1c9fefd

Staff Survey:

https://forms.office.com/Pages/DesignPageV2.aspx?subpage=design&FormId=xClkjH8We0e4y3fugnWNETGUkxytLPVFlXk5JEXOoU5UQkxWQUtKQTZYMTMxN1JQSFA5SVE3WVRNUi4u&Token=0eeecab46b12406b84e8f9907fc863ff

“Week Ahead” student communication:

Staff survey email communication:

Interventions and Observations

One key intervention was the shift to in-person survey collection, which demonstrated the importance of face-to-face interactions in encouraging participation. This was done during a teaching session with Year 2 students on 19/11/2025. Students expressed curiosity and a willingness to contribute when the purpose of the research was explained directly.

Expanding this Strategy for Tutorials:

This success highlights the potential of face-to-face engagement as a broader strategy for building participation in tutorials. Similar approaches, such as introducing tutorials with an explicit explanation of their value and goals, could create a space where students feel directly connected to the purpose of their engagement and therefore more invested in attending and participating. In practice, this could include beginning each tutorial with a five-minute discussion of its objectives and relevance, or hosting brief, informal drop-in sessions before formal tutorials to build student confidence and rapport.

Examples of Data

The student survey collected responses on attendance patterns, barriers to participation, and tutorial preferences, and I used Excel to process most of the data. Here are some highlights (a short programmatic sketch of how such tallies can be produced follows the two lists below):

From the student survey:

  • Attendance Patterns: Only 28% of students attended Year Lead tutorials regularly, while 41% attended Unit tutorials regularly.
  • Barriers to Participation: Timing conflicts (51%) and social anxiety (33%) were the most common barriers.
  • Preferred Formats: A mix of online and in-person tutorials was the most popular choice (51%).

From the staff survey:

  • Participation Levels: 44% rated student participation as “low”, citing a lack of preparation and a limited understanding of the tutorials’ value as the primary barriers.
  • Effective Methods: One-on-one engagement (78%) was the most-used method, but interactive activities were underutilised (22%).
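
To show how tallies like these can be reproduced, here is a minimal sketch in Python, assuming the Microsoft Forms responses were exported to CSV. I used Excel for the actual processing, so the file name and column names below are hypothetical placeholders rather than the real survey fields.

```python
# Minimal sketch, not the actual workflow (the project used Excel).
# Assumes the Microsoft Forms responses were exported to CSV; the file
# name and column names below are hypothetical placeholders.
import pandas as pd

responses = pd.read_csv("student_survey_responses.csv")

# Share of students reporting regular attendance at Year Lead tutorials.
regular_year_lead = (responses["attendance_year_lead"] == "Regularly").mean()
print(f"Year Lead regular attendance: {regular_year_lead:.0%}")

# Multi-select answers typically export as one semicolon-separated cell,
# so split and flatten them before counting individual barriers.
barriers = responses["barriers"].str.split(";").explode().str.strip()
barrier_shares = (barriers.value_counts() / len(responses)).round(2)
print(barrier_shares)
```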

Visual Example: Below are screenshots of the Excel tables used to organise and code the data:

Data Analysis and Coding

Using Braun & Clarke’s (2022) six-step thematic analysis framework, I analysed the data from both surveys.

Steps in Thematic Analysis

  1. Familiarisation: I reviewed responses multiple times to note patterns and recurring ideas.
  2. Coding: Each response was assigned a meaningful code. For example, responses citing scheduling clashes were coded as “timing conflicts”, which later fed into the “Attendance Barriers” theme.
  3. Searching for Themes: Codes were grouped into broader categories (see the sketch after this list). For example:
    • Student Code: “Social Anxiety”
    • Staff Code: “Fear of Judgment”
    • Combined Theme: Barriers to Participation
  4. Reviewing Themes: Themes were cross-referenced across student and staff data for consistency.
  5. Defining Themes: Each theme was given a clear name and linked to actionable insights.
  6. Reporting: Themes were synthesised into a unified analysis so they could be summarised more easily.
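
To make steps 2 and 3 more concrete, the short sketch below shows how coded responses can be rolled up into combined themes. It is purely illustrative: the real coding was done manually in Excel, and the sample codes and mapping are hypothetical examples echoing those above.

```python
# Illustrative sketch of steps 2 and 3 only; the real coding was done by
# hand in Excel. The sample codes echo the examples above, and the mapping
# dictionary is a hypothetical way of recording the code-to-theme grouping.
from collections import Counter

# Step 2 output: one code per response fragment (hypothetical sample).
coded_responses = [
    "timing conflicts", "social anxiety", "timing conflicts",
    "fear of judgment", "unclear expectations",
]

# Step 3: group student and staff codes under combined themes.
code_to_theme = {
    "timing conflicts": "Attendance Barriers",
    "social anxiety": "Barriers to Participation",
    "fear of judgment": "Barriers to Participation",
    "unclear expectations": "Communication and Clarity",
}

theme_counts = Counter(code_to_theme[code] for code in coded_responses)
print(theme_counts.most_common())
```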

Visual Example: Below are examples of the tables mapping individual survey questions to themes (e.g., “Timing Barriers”, “Preferred Techniques”), along with screenshots of the thematic coding.

Unified Thematic Analysis

The final unified analysis combined insights from both surveys under five key themes:

  1. Attendance Patterns: Sporadic attendance highlighted timing conflicts and perceived lack of tutorial value.
  2. Barriers to Participation: Both students and staff identified psychological (e.g., fear of judgment) and structural barriers (e.g., unclear expectations).
  3. Inclusivity and Representation: Students desired more culturally relevant content, while staff acknowledged diversity challenges.
  4. Preferred Formats and Techniques: Students favoured hybrid and interactive formats, while staff relied on traditional methods.
  5. Communication and Clarity: Both groups emphasised the importance of clear objectives, agendas, and follow-up resources.

Visual Example: Below is a table showing an example of how individual survey themes (student and staff) were linked to these unified themes.

I will expand on the unified themes and provide a full summary in the next blog post; this post is intended purely to demonstrate examples of the data collection and the tools used in the thematic process.

Conclusion

This blog post captures the process, from initial data collection challenges to the development of a unified thematic analysis. By synthesising student and staff perspectives, I aim to create more inclusive, engaging tutorials that address systemic barriers and support diverse student cohorts.

However, having two surveys with a large volume of non-identical questions made the data analysis time-consuming and complex. This highlights a consideration for future research: the data collection tools could be simplified, for example by using interviews, or surveys with identical questions for both students and staff. These approaches could streamline analysis and allow more direct comparisons between the two groups.


References

Braun, V. and Clarke, V. (2022) Thematic Analysis: A Practical Guide. Thousand Oaks: Sage. Available at: bit.ly/3ICHBEr (Accessed: 11 October 2024).
