ARP Presentation Slides

Attached are the slides used for the presentation on Monday 13th January 2025

Posted in Uncategorised | Leave a comment

Reflective Review of the Action Research Project (ARP)

To evaluate the entire process of my Action Research Project (ARP), I will use Gibbs’ Reflective Cycle (1988) to explore my experiences, evaluate decisions, and consider future actions. The project aimed to address barriers to student participation in Year Lead and Unit tutorials within the BA Fashion Buying and Merchandising program. This reflection includes insights into the literature review, the decision to use a thematic analysis approach, the survey design, and the overall research journey.

(Source: Glow Scotland, 2015)

1. Description:

The ARP sought to understand and address participation barriers in tutorials by engaging both students and staff through surveys. I collected 39 student responses and 9 staff responses, achieving response rates of 50% from staff and approximately 81% from Year 2 students. The process involved multiple iterations to refine the methodology and data collection approach:

  • Literature Review: I grounded the research in studies on participation, inclusivity, and dialogue-building, referencing works by Freire (2000), hooks (1994, 2003), and Braun & Clarke (2006).
  • Survey Design: Using insights from Cohen, Manion, and Morrison (2018), I designed surveys with a mix of closed and open-ended questions to allow for both quantitative and qualitative analysis.
  • Data Collection: Initial responses to the student survey were minimal when distributed via a PowerPoint slide at the end of a lecture and added to the “Week Ahead” email. Responses improved significantly when I introduced the survey during a face-to-face session, emphasising its importance and giving students time to complete it in class.
  • Analysis: I used Braun & Clarke’s (2022) thematic analysis approach to analyse open-ended responses and quantitative methods for closed-ended data. Findings were triangulated to identify barriers and actionable recommendations.

2. Feelings:

The process brought many emotions, combining frustration, learning, and eventual satisfaction. Initially, I felt disheartened by the low student engagement from passive distribution methods like email and PowerPoint slides, particularly as it made me question whether my research had relevance or impact. However, the shift to face-to-face engagement was transformative—it not only increased response rates but also reaffirmed the value of personal, real-time communication in educational research. Explaining the research context directly to students created a sense of shared purpose, and their curiosity and willingness to contribute were both encouraging and validating.

When engaging staff, I initially felt optimistic because team meetings and direct emails provided a professional context to explain the project. However, the 50% response rate dampened my enthusiasm, revealing complexities I hadn’t fully anticipated, such as staff workloads, conflicting schedules, and identifying which staff members actively led student-facing tutorials. It highlighted the challenges of balancing the theoretical potential of collaboration with the realities of competing priorities in a higher education setting.

As the thematic analysis process unfolded, I was struck by its complexity and found moments of deep reflection when trying to synthesise seemingly disparate responses into cohesive themes. It forced me to confront the limitations of my initial approach while offering opportunities to rethink my strategies for future projects. I feel it pushed me to become a more nuanced and thoughtful researcher.

3. Evaluation:

Positive Aspects:

  • Rich Theoretical Foundation: The literature review anchored my project in critical educational theories, particularly around social justice, inclusivity, and participatory practices. Drawing from Freire (2000) and hooks (1994), I felt confident that my work had both academic rigour and practical relevance.
  • Survey Reach: Surveys allowed me to efficiently collect diverse perspectives from both students and staff, enabling me to capture a broad range of barriers and strategies in a short timeframe.
  • Adaptability in Data Collection: Shifting to in-class survey completion demonstrated my ability to adapt in response to low engagement. This practical adjustment significantly improved participation, highlighting the importance of responsiveness in research design.

Negative Aspects:

  • Passive Distribution Shortcomings: The reliance on passive communication methods (email, PowerPoint slides) in the early stages hindered student engagement. This approach underestimated the competing demands on students’ attention and the need for immediate context.
  • Staff Participation Challenges: While staff engagement was productive, the 50% response rate underscored the difficulty of securing broad participation, particularly from those with minimal tutorial responsibilities. A more targeted approach, such as focus groups, could have enabled deeper insights and a more collaborative exploration of barriers.
  • Complexity of Thematic Analysis: The sheer volume of data and the need to synthesise diverse perspectives into unified themes proved time-consuming and intellectually challenging. While this process was rewarding, it also highlighted areas where my preparation for data analysis could have been more robust.

4. Analysis:

The low student response rate to passive distribution methods illuminated the disconnect between digital communication and student priorities. This experience reinforced Freire’s (2000) emphasis on the importance of dialogue and active engagement in developing meaningful participation. By presenting the survey face-to-face and linking it to the students’ lived experiences, I created an immediate sense of relevance that dramatically increased their willingness to engage. This outcome suggests that research participation can benefit from the same dialogic principles as effective teaching practices.

For staff, the limited response rate likely stemmed from competing institutional priorities and time constraints, as highlighted by Cohen, Manion, and Morrison (2018). The process also revealed a need for more nuanced research tools to address specific barriers—such as focus groups or interviews, which could have allowed for deeper exploration of themes like inclusivity and tutorial value. Reflecting on the challenges of thematic analysis, I realised that more preparation and familiarity with the coding process might have streamlined the synthesis of data. The task of uniting student and staff insights into cohesive themes required me to constantly reassess my assumptions and adjust my approach, developing critical self-awareness as a researcher.

Future Considerations:
Moving forward, I would prioritise methods that emphasise immediacy and depth—such as combining surveys with interviews or focus groups. I also see the potential to implement strategies like pre-intervention discussions with staff to secure greater buy-in and participation. Finally, the insights gained from the complexity of the thematic analysis process will inform how I structure future data collection tools, ensuring they better align with my analytical capacity and project goals.

5. Conclusion:

The survey provided valuable insights into barriers to participation and actionable recommendations, but additional methods could have improved the data and strengthened the project’s overall impact. Below, I reflect on critical aspects of the ARP and its broader significance:

Critical Awareness of the Research Context

The ARP effectively demonstrated a critical understanding of barriers to participation in the BA Fashion Buying and Merchandising program. By grounding the project in literature (e.g., Freire, 2000; hooks, 1994) and practical observations, I identified key challenges such as timing conflicts, social anxiety, and unclear expectations. These findings contextualised participation barriers within the systemic issues of inclusivity and equity in higher education, offering clear, targeted interventions to address these concerns.

However, additional methodologies could have deepened this critical awareness. For example, observational studies might have enriched the findings by capturing real-time participation dynamics in tutorials (Cohen et al., 2018). Observations would have allowed for cross-validation of survey results and provided richer insights into nuanced behaviours, such as non-verbal cues or interactions that influence engagement.

Actionable Change within UAL

The research has already prompted institutional change, with mandatory tutorials being introduced from February. This intervention directly addresses low attendance and perceived tutorial value, reflecting the project’s actionable contribution to UAL’s equity and inclusivity goals. Moving forward, combining this institutional step with additional strategies, such as hybrid tutorial formats or anonymous feedback mechanisms, could further enhance participation.

Future iterations could also incorporate focus groups or reflective journaling to build on the survey data. For instance, focus groups with students and staff could develop collaborative problem-solving by exploring different experiences in more depth. These discussions could refine shared strategies for inclusivity and engagement while creating a dialogue that reinforces UAL’s commitment to equitable learning environments.

Broader Significance for Similar Programs

The findings of this project extend beyond the immediate context of the BA Fashion Buying and Merchandising program. The strategies outlined—such as hybrid tutorial formats, cultural competency training, and enhanced communication practices—are scalable to other large-cohort, multicultural programs at UAL and similar institutions. Implementing these solutions across programs could help address systemic barriers and develop inclusive environments for diverse student populations.

Additionally, incorporating reflective journals into tutorials, as recommended by Braun and Clarke (2022), could offer longitudinal insights into participation dynamics. These insights could inspire similar practices in other programs, helping educators track engagement over time and identify evolving barriers.

Mixed-Methods Approach for Future Research

Future research should adopt a mixed-methods approach, combining surveys with complementary methods like focus groups or interviews. As Bryman (2016) suggests, integrating qualitative and quantitative approaches provides deeper exploration of key themes and a more holistic understanding of participation barriers. For instance, follow-up interviews could probe the underlying reasons behind survey responses, while focus groups could encourage collaborative exploration of solutions.

By expanding the methodological scope, future projects could generate richer data and stronger evidence to support recommendations, ultimately leading to more impactful and sustainable change.

6. Action Plan:

Engagement Strategies:

  • For future research, I will prioritise face-to-face or synchronous digital engagement for surveys and include incentives, such as sharing anonymised results or offering participation prizes. Additionally, incorporating focus groups or interviews as complementary data collection methods could address challenges related to response rates and data alignment. These methods would allow for deeper exploration of participation barriers and preferences while ensuring consistency in questioning across both students and staff. A mixed-methods approach combining surveys and discussions could provide richer insights, creating a more comprehensive understanding of tutorial participation challenges.

Alternative Methods:

  • Incorporate focus groups and observational studies to complement survey data and provide richer insights.

Continuous Improvement in Teaching Practice:

  • Use findings to implement structural changes in tutorials, such as offering hybrid formats, diverse content, and culturally responsive teaching practices.
  • Establish regular feedback loops to monitor the effectiveness of changes and iterate based on student and staff input.

Advocacy for Institutional Support:

  • Advocate for further targeted training in cultural competency and inclusivity for all staff, as emphasised by hooks (2003), to ensure that tutorials provide safe and welcoming spaces for all students.

Final Reflection

The ARP process has been transformative, reinforcing the importance of adaptability, dialogue, and inclusivity in educational research and practice. It has deepened my understanding of barriers to participation and equipped me with actionable strategies to address them. By grounding my approach in the literature and reflecting on challenges and successes, I have developed a clearer vision for developing equitable and engaging learning environments. This project not only contributes to my teaching practice but also aligns with broader institutional goals of equity and social justice in education.

References

  • Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77-101.
  • Braun, V., & Clarke, V. (2022). Thematic analysis: A practical guide. Sage.
  • Bryman, A. (2016). Social research methods (5th ed.). Oxford University Press.
  • Cohen, L., Manion, L., & Morrison, K. (2018). Research methods in education (8th ed.). Routledge.
  • Freire, P. (2000). Pedagogy of the oppressed. Bloomsbury.
  • Gibbs, G. (1988). Learning by doing: A guide to teaching and learning methods. Further Education Unit, Oxford Polytechnic.
  • hooks, b. (1994). Teaching to transgress: Education as the practice of freedom. Routledge.
  • hooks, b. (2003). Teaching community: A pedagogy of hope. Routledge.
  • Kember, D., & Ginns, P. (2011). Evaluating teaching and learning: A practical handbook for colleges, universities and the scholarship of teaching. Routledge.

Action Plan: Unified Thematic Analysis of Student and Staff Surveys

The following summary integrates thematic insights drawn from both the student and staff survey data, combining open- and closed-ended responses to present a cohesive analysis. Further notes from the individual student and staff thematic analyses that led to this summary are in a previous blog post.

The findings below emphasise barriers, preferences, and strategies for enhancing participation in Year Lead and Unit tutorials while also aligning with broader goals of equity, inclusivity, and social justice. I have chosen the following five key themes to structure this combined analysis:

Key Themes from Combined Analysis

1. Attendance Patterns

  • Students: Sporadic attendance for Year Lead tutorials (28% attend regularly, 21% never attend); higher regular attendance for Unit tutorials (41%). Timing conflicts are the most significant barrier (51%).
  • Staff: Majority rate student participation as “low” (44%) or “very low” (22%). They cite lack of understanding of tutorial value (89%) as a primary issue.

Action: Communicate the value of tutorials more effectively and explore flexible scheduling to improve accessibility for students balancing commitments.

2. Barriers to Participation

  • Students: Common barriers include social anxiety (33%), language challenges (21%), and unclear expectations (15%).
  • Staff: Identify similar barriers, such as fear of judgment (78%) and cultural/language challenges (67%).

Action: Train tutors in cultural competency, provide clear expectations, and use anonymous participation tools to mitigate social anxiety.

3. Inclusivity and Representation

  • Students: 62% feel their backgrounds are “somewhat” reflected in tutorials; 59% rate tutorials as inclusive but note room for improvement.
  • Staff: 33% acknowledge diversity and inclusion challenges, citing language and cultural norms as key issues.

Action: Incorporate culturally diverse content into tutorials and involve students in co-creating examples and case studies.

4. Preferred Formats and Techniques

  • Students: Favour smaller discussion groups (41%), hybrid formats, and interactive methods like breakout rooms.
  • Staff: Rely on one-on-one engagement (78%) and pre-tutorial assignments (33%) but underuse interactive activities (22%).

Action: Shift toward interactive techniques (e.g., peer discussions) and pilot hybrid formats to accommodate diverse learning preferences.

5. Communication and Clarity

  • Students: Emphasise the need for clear objectives, structured agendas, and post-session resources.
  • Staff: Stress the importance of well-organised tutorials and pre-session materials to prepare students.

Action: Provide clear tutorial agendas, share resources in advance, and summarise key takeaways after each session.

Key Insights and Literature Integration

Timing and Scheduling:

Aligns with findings by Cohen, Manion, and Morrison (2018) that flexibility in tutorial timing supports accessibility, particularly for international and working students.

Inclusivity:

hooks (1994) highlights the importance of creating inclusive environments where diverse student voices are valued, which students and staff identified as critical.

Interactive Methods:

Freire’s (2000) emphasis on dialogue resonates with students’ preference for interactive formats and peer-to-peer learning opportunities.

Unified Action Plan

1. Attendance Patterns: Flexible Scheduling

  • Offer varied tutorial times throughout the day, giving students multiple timing choices (e.g. morning, midday, and afternoon slots).
  • Pilot tutorials directly after related sessions to improve continuity and increase engagement.

2. Barriers to Participation: Supporting Students’ Voices

  • Incorporate anonymous feedback mechanisms (e.g., online polls).
  • Involve students in co-designing tutorial content (clear student and tutor agendas).

3. Inclusivity and Representation: Enhancing Inclusivity

  • Provide further training for teaching staff in cultural competency and unconscious bias, making them aware of potential barriers to engaging with tutorials and what they can do to reduce them.
  • Use diverse examples and culturally relevant content tailored to specific year cohorts, making examples more inclusive.

4. Preferred Formats and Techniques: Interactive and Structured Formats

  • Introduce interactive tools (e.g., breakout rooms, Q&A, anonymous surveys).
  • Provide clear agendas and structured activities for tutorials; the data suggest students want a clear plan for each session.

5. Communication and Clarity: Resource Accessibility

  • Share tutorial recordings and supplementary materials online for future student reference.
  • Ensure resources are accessible to all students, including those with disabilities.

Practical Implementation: Bringing Recommendations to Life

1. Hybrid Tutorial Formats (January 2025)

  • Pilot Program: Begin with a single Unit or Year Lead tutorial, offering students the choice between in-person and online participation. Face-to-face attendance is currently encouraged, so this would give students a genuine choice. Measure participation rates and engagement levels over one term to assess its success.
  • Implementation Timeline:
    • Term 1 (Pilot): Identify one or two tutorials to trial the hybrid format. Gather pre-implementation feedback from students.
    • Term 2 (Rollout): Adjust based on pilot feedback and expand hybrid formats across other Units or tutorials.

2. Cultural Competency Training for Staff (Date TBC)

  • Training Modules: Introduce short workshops for the B&M teaching team focused on unconscious bias and inclusive teaching practices. Use real-life examples from tutorials to develop the training.
  • Implementation Timeline:
    • Pre-Term: Schedule a mandatory training day for all tutorial staff.
    • Ongoing: Follow up with monthly reflective sessions to discuss challenges and successes in creating inclusive environments.

3. Mandatory Tutorials (February Rollout)

  • Rationale: To ensure equitable engagement, tutorials will become mandatory across the BA Fashion Buying and Merchandising program from February. This aligns with feedback from both students and staff highlighting the need for increased accountability in attendance.
  • Implementation Plan:
    • Communication: Share the transition plan with students via email and in-class announcements by late January, explaining the benefits and support measures (e.g., flexible scheduling).
    • Support Measures: Address barriers identified in the research (e.g., timing conflicts, anxiety) by offering hybrid options and anonymous feedback tools to increase comfort and accessibility.
    • Monitoring: Track attendance patterns and engagement levels from February to assess the impact of this change.

4. Anonymous Participation Tools (January 2025)

  • Implementation Plan: Use digital tools like Padlet or Mentimeter to allow students to submit questions or comments anonymously during tutorials.
  • Pilot: Start by using these tools in tutorials focused on difficult or abstract topics to gauge how anonymity impacts participation.
  • Monitoring: Review usage levels and the types of questions submitted through the digital tools.

Next Steps
By implementing these changes, including the transition to mandatory tutorials, the program aims to develop greater accountability, inclusivity, and engagement among students. Regular review points (e.g., after the February rollout and at the end of Term 2) will allow the B&M team to make iterative improvements and adapt strategies as needed.

Conclusion

By synthesising student and staff insights, this thematic analysis reveals actionable strategies to enhance tutorial participation and inclusivity. These recommendations align with the goals of the Action Research Project Unit by addressing systemic barriers, developing equitable learning environments, and promoting social justice in the BA in Fashion Buying and Merchandising program. Through the use of participation methods such as hybrid tutorial formats, anonymous digital tools, and structured agendas, as well as dialogue-building techniques like interactive discussions and inclusive case studies, the project demonstrates how these approaches can be adapted to meet the needs of large and diverse cohorts. Furthermore, the findings explicitly address social, cultural, and racial barriers by advocating for culturally responsive teaching, training in cultural competency, and the inclusion of diverse student voices. Together, these strategies create a roadmap for increasing equitable engagement in Year Lead and Unit tutorials.

References

Cohen, L., Manion, L., & Morrison, K. (2018). Research methods in education (8th ed.). Routledge.

hooks, b. (1994). Teaching to transgress: Education as the practice of freedom. Routledge.

Freire, P. (2000). Pedagogy of the oppressed. Bloomsbury.


Data Collection: Exploring Participation and Engagement in Tutorials

Introduction

As part of my Action Research Project (ARP), I set out to explore a key question: How can participation methods and dialogue-building techniques from large cohort courses such as the BA (Hons) in Fashion Buying and Merchandising be adapted to increase equitable engagement in Year Lead and Unit tutorials, addressing social, cultural, and racial barriers?

Through a combination of student and staff surveys, I aimed to identify the barriers to tutorial engagement, discover effective strategies, and ultimately develop actionable recommendations. This blog post details my journey, the methods used, and the analytical process leading to a unified thematic analysis. It also showcases how I overcame challenges in data collection and analysis.

Data Collection Tools

To gather data from students and staff, I designed two surveys comprising both closed- and open-ended questions. Screenshots of these questions are in the attached Word document. These surveys allowed for quantitative insights into attendance patterns and participation levels, while also providing qualitative insights through open-text responses.

Student Survey Design

  • Questions Covered: Attendance patterns, barriers to participation, comfort levels, perceptions of inclusivity, and preferred tutorial formats.
  • Key Actions Taken:
    • Initially distributed via a PowerPoint slide shown at the end of lectures.
    • Followed up with weekly email reminders through the “Week Ahead” student communications.
    • Shifted to a face-to-face survey collection at the start of a lecture session, which significantly boosted the response rate from 13 to 39 students (81.25% engagement).

Staff Survey Design

  • Questions Covered: Experience levels, perceived student barriers, effective participation methods, use of dialogue-building techniques, and inclusivity challenges.
  • Key Actions Taken:
    • Distributed during a team meeting and followed up via email.
    • Achieved a 50% response rate (9 out of 18 staff members).
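The response-rate figures above can be reproduced with a short calculation. This is a minimal sketch in Python; the Year 2 cohort size of 48 is not stated directly in the post but is implied by 39 responses corresponding to an 81.25% rate:

```python
def response_rate(responses: int, cohort: int) -> float:
    """Percentage of the cohort that completed the survey."""
    return round(responses / cohort * 100, 2)

# Student survey: 39 responses from an implied Year 2 cohort of 48.
student_rate = response_rate(39, 48)  # 81.25

# Staff survey: 9 responses from 18 staff members.
staff_rate = response_rate(9, 18)     # 50.0
```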

Visual Example: I have included screenshots of the PowerPoint slide used to introduce the student survey and the survey links, as well as snippets from the email communications and reminders.


PowerPoint slide:


Student Survey:

https://forms.office.com/Pages/DesignPageV2.aspx?subpage=design&FormId=xClkjH8We0e4y3fugnWNETGUkxytLPVFlXk5JEXOoU5UMlZXN1BHS0NKMDI4UlREQUdFRUpVMU1BNS4u&Token=f82886868ae5434fa73b80b9c1c9fefd

Staff Survey:

https://forms.office.com/Pages/DesignPageV2.aspx?subpage=design&FormId=xClkjH8We0e4y3fugnWNETGUkxytLPVFlXk5JEXOoU5UQkxWQUtKQTZYMTMxN1JQSFA5SVE3WVRNUi4u&Token=0eeecab46b12406b84e8f9907fc863ff

Week Ahead Student communication:

Staff survey Email Communication:

Interventions and Observations

One key intervention was the shift to in-person survey collection, which demonstrated the importance of face-to-face interactions in encouraging participation. This was done during a teaching session with Year 2 students on 19/11/2024. Students expressed curiosity and a willingness to contribute when the purpose of the research was explained directly.

Expanding this Strategy for Tutorials:

This success highlights the potential of face-to-face engagement as a broader strategy for developing participation in tutorials. By creating a space where students feel directly connected to the purpose of their engagement, similar approaches—such as introducing tutorials with an explicit explanation of their value and goals—could help students feel more invested in attending and participating. In practice, this could include beginning each tutorial with a five-minute discussion of its objectives and relevance or hosting brief, informal drop-in sessions before formal tutorials to build student confidence and rapport.

Examples of Data

The student survey collected responses on attendance patterns, barriers to participation, and tutorial preferences. I mostly used Excel to process the data. Here are some highlights:

From the student survey:

  • Attendance Patterns: Only 28% of students attended Year Lead tutorials regularly, while 41% attended Unit tutorials regularly.
  • Barriers to Participation: Timing conflicts (51%) and social anxiety (33%) were the most common barriers.
  • Preferred Formats: A mix of online and in-person tutorials was the most popular choice (51%).

From the staff survey:

  • Participation Levels: 44% rated student participation as “low,” with lack of preparation and understanding of tutorial value as the primary barriers.
  • Effective Methods: One-on-one engagement (78%) was the most-used method, but interactive activities were underutilised (22%).
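As a sketch of how the closed-ended percentages above were derived from raw responses in Excel, the same tallying can be expressed in a few lines of Python. The answer counts below are illustrative only, chosen to match the reported student percentages, not the actual response data:

```python
from collections import Counter

def percentage_breakdown(answers):
    """Convert a list of closed-ended answers into a rounded percentage breakdown."""
    counts = Counter(answers)
    total = len(answers)
    return {option: round(count / total * 100) for option, count in counts.items()}

# Hypothetical distribution across the 39 student responses:
sample = ["Timing conflicts"] * 20 + ["Social anxiety"] * 13 + ["Other"] * 6
breakdown = percentage_breakdown(sample)
# e.g. {"Timing conflicts": 51, "Social anxiety": 33, "Other": 15}
```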

Visual Example: Screenshots of the Excel tables used to organise and code the data follow below.

Data Analysis and Coding

Using Braun & Clarke’s (2022) six-step thematic analysis framework, I analysed the data from both surveys.

Steps in Thematic Analysis

  1. Familiarisation: I reviewed responses multiple times to note patterns and recurring ideas.
  2. Coding: Each response was assigned a meaningful code. For example, “timing conflicts” became part of the “Attendance Barriers” theme.
  3. Searching for Themes: Codes were grouped into broader categories. For example:
    • Student Code: “Social Anxiety”
    • Staff Code: “Fear of Judgment”
    • Combined Theme: Barriers to Participation
  4. Reviewing Themes: Themes were cross-referenced across student and staff data for consistency.
  5. Defining Themes: Each theme was given a clear name and linked to actionable insights.
  6. Reporting: Themes were synthesised into the unified analysis so it could be more easily summarised.
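The coding and theme-grouping steps (2 and 3 above) can be sketched as a simple keyword lookup. The codebook below is illustrative, using only a handful of the codes mentioned in this post rather than the full set developed during analysis:

```python
# Illustrative codebook mapping codes to the broader themes they were grouped under.
CODE_TO_THEME = {
    "social anxiety": "Barriers to Participation",
    "fear of judgment": "Barriers to Participation",
    "timing conflicts": "Attendance Patterns",
    "unclear expectations": "Communication and Clarity",
}

def code_response(response: str) -> list:
    """Return the themes whose codes appear in a free-text survey response."""
    text = response.lower()
    return sorted({theme for code, theme in CODE_TO_THEME.items() if code in text})

themes = code_response("I skip tutorials because of timing conflicts and social anxiety.")
# -> ["Attendance Patterns", "Barriers to Participation"]
```

In practice the coding was done manually in Excel; a lookup like this only illustrates the logic of moving from codes to combined themes.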

Visual Example: Below are examples of the tables mapping individual survey questions to themes (e.g., “Timing Barriers”, “Preferred Techniques”) and screenshots of the thematic coding.

Unified Thematic Analysis

The final unified analysis combined insights from both surveys under five key themes:

  1. Attendance Patterns: Sporadic attendance highlighted timing conflicts and perceived lack of tutorial value.
  2. Barriers to Participation: Both students and staff identified psychological (e.g., fear of judgment) and structural barriers (e.g., unclear expectations).
  3. Inclusivity and Representation: Students desired more culturally relevant content, while staff acknowledged diversity challenges.
  4. Preferred Formats and Techniques: Students favoured hybrid and interactive formats; staff relied on traditional methods.
  5. Communication and Clarity: Both groups emphasised the importance of clear objectives, agendas, and follow-up resources.

Visual Example: Below is a table showing how the individual survey themes (student and staff) link to these unified themes.

I will expand on and give a full summary of the unified themes in the next blog post; this post purely demonstrates examples of the data collection and the tools used for the thematic process.

Conclusion

This blog post captures the process, from initial data collection challenges to the development of a unified thematic analysis. By synthesising student and staff perspectives, I aim to create more inclusive, engaging tutorials that address systemic barriers and support diverse student cohorts.

However, having two surveys with a large volume of non-identical questions made the data analysis process time-consuming and complex. This highlights a consideration for future research: data collection tools could be simplified, for example by using interviews or surveys with identical questions for both students and staff. These approaches could streamline analysis and allow for more direct comparisons between the two groups.


References

Braun, V. and Clarke, V. (2022) Thematic analysis: A practical guide. Thousand Oaks: Sage. Available at: bit.ly/3ICHBEr (Accessed: 11 October 2024).


Literature Review (references): Exploring Participation and Engagement in Fashion Buying and Merchandising Education

In investigating how to increase participation in Year Lead and Unit tutorials, particularly within diverse cohorts in Fashion Buying and Merchandising programs, this literature review explores research and techniques relevant to survey methods, ethical considerations, and thematic analysis. The project specifically addresses barriers to participation, including social, cultural, and racial challenges, aiming to create a supportive learning environment that increases engagement. This review draws from selected academic works and methodological resources to establish a foundation for data collection, analysis, and the development of actionable solutions to improve tutorial engagement.

Understanding Participation Barriers and Engaging Diverse Cohorts

Bhambra, Gebrial, and Nişancıoğlu’s (2018) Decolonising the University presents essential perspectives on structural inequities in education that can marginalise students. Their work emphasises that educational spaces must acknowledge and address the diverse cultural experiences that students bring with them, encouraging approaches that consider racial, cultural, and social dynamics in academic participation. In line with this, Lenette’s (2022) Participatory Action Research: Ethics and Decolonization stresses the ethical considerations necessary when working with diverse cohorts, highlighting the importance of incorporating decolonial perspectives to better support underrepresented groups in higher education. Both resources underline that the first step in understanding participation barriers is recognising the unique challenges faced by students from varied backgrounds.

Survey Design and Question Crafting for Effective Data Collection

Summarising from my previous methodology blog: Converse and Presser (2011), in Survey Questions: Handcrafting the Standardized Questionnaire, emphasise the need for well-constructed questions that are clear, unbiased, and relevant to the research focus. Their guidance on survey question design aligns with the objective of uncovering nuanced data on social, cultural, and racial barriers that affect tutorial engagement. Additionally, the Pew Research Center (2021) guide on writing survey questions provides a contemporary perspective on structuring questions to avoid leading respondents, which is crucial for gathering honest and representative responses in educational research contexts.

Survey design must align with the ethical principles outlined by the British Educational Research Association (2024) in their Ethical Guidelines for Educational Research. These guidelines reinforce that surveys should be voluntary, anonymous, and conducted with a sensitivity toward the respondent’s potential discomfort when addressing sensitive issues, such as social or racial barriers. By adhering to these ethical considerations, this project seeks to responsibly handle data that reflect the diverse experiences of students within the program.

Analysing Survey Data Through Thematic Analysis

As a significant portion of the project involves analysing qualitative survey responses, Braun and Clarke’s (2022) Thematic Analysis: A Practical Guide provides a practical framework for analysing textual data. Their six-step method facilitates identifying, organising, and interpreting themes, particularly in complex data sets related to social issues and educational engagement. This method will allow for a systematic exploration of the underlying themes that affect student participation, offering a structured approach to understanding how barriers manifest in specific patterns across the cohort.

Furthermore, Bradbury’s (2015) The SAGE Handbook of Action Research emphasises the utility of action research in generating meaningful change within educational institutions. By grounding analysis within an action research framework, this project can contribute insights that inform ongoing adjustments to teaching and engagement methods at the London College of Fashion, aligning with the broader goals of participatory and inclusive education at UAL.

Reviewing Engagement Methods and Evaluating Interventions

To contextualise the findings and relate them to actionable strategies, Kember and Ginns’ (2011) Evaluating Teaching and Learning provides an insightful guide for assessing educational interventions. They discuss practical evaluation techniques that are effective in both formative and summative contexts, enabling a clear measurement of the impact that modified tutorial formats may have on student engagement. This resource will be instrumental when evaluating the effectiveness of proposed changes based on the survey findings and themes identified during analysis.

The Data Storytelling Workbook by Feigenbaum and Alamalhodaei (2020) also offers strategies for presenting data in a compelling manner that resonates with an academic audience. Through storytelling techniques, this project can frame the findings in a way that not only demonstrates existing barriers but also highlights potential pathways for improving participation. By presenting data as a narrative, it is possible to contextualise the numbers within the lived experiences of students, making the case for targeted changes that foster inclusivity.

Conclusion

This literature review establishes a foundation for exploring participation and engagement in tutorials within the BA (Hons) Fashion Buying and Merchandising cohorts. Through resources that address survey design, ethical considerations, thematic analysis, and action research, this project is positioned to investigate and mitigate barriers faced by diverse student cohorts. By following these methodologies and drawing on the latest literature, the study will provide a robust analysis aimed at developing a more inclusive and participatory educational environment.

References

  • Bhambra, G., Gebrial, D. and Nişancıoğlu, K. (eds.) (2018) Decolonising the University. London: Pluto Press.
  • Bradbury, H. (ed.) (2015) The SAGE Handbook of Action Research. Los Angeles: SAGE. Available at: SAGE Handbook of Action Research (Accessed: October 2024).
  • Braun, V. and Clarke, V. (2022) Thematic Analysis: A Practical Guide. Thousand Oaks: Sage. Available at: bit.ly/3ICHBEr (Accessed: 11 October 2024).
  • British Educational Research Association (2024) Ethical Guidelines for Educational Research. 5th edn. Available at: BERA Ethical Guidelines (Accessed: 11 October 2024).
  • Converse, J. M. and Presser, S. (2011) Survey Questions: Handcrafting the Standardized Questionnaire. Thousand Oaks: Sage. Available at: SAGE Survey Questions (Accessed: 10 October 2024).
  • Feigenbaum, A. and Alamalhodaei, A. (2020) The Data Storytelling Workbook. London: Routledge.
  • Kember, D. and Ginns, P. (2011) Evaluating Teaching and Learning: A Practical Handbook for Colleges, Universities and the Scholarship of Teaching. London: Routledge.
  • Lenette, C. (2022) Participatory Action Research: Ethics and Decolonization. Oxford: Oxford University Press.
  • Pew Research Center (2021) Writing Survey Questions. Available at: Pew Research Survey Writing (Accessed: 17 October 2024).

Research Methods: Developing and Writing Effective Survey Questions for Research on Tutorial Participation

In the process of designing a survey for my research on increasing participation in Year Lead and Unit tutorials, I faced the challenge of creating questions that could capture the nuanced experiences and barriers faced by students. Drawing on both survey design best practices and insights from relevant literature, I aimed to develop questions that are clear, inclusive, and relevant to the objectives of participation, justice, and engagement. Here, I’ll discuss how I approached creating the survey questions and the rationale behind their design.

Step 1: Establishing Clear Objectives and Aligning with Research Aims

To begin, I revisited the core objectives of my research: understanding and addressing the barriers students encounter in tutorial participation, with a specific focus on social, cultural, and racial aspects. This focus aligned with Decolonising the University by Bhambra, Gebrial, and Nişancıoğlu (2018), which emphasises the importance of representing diverse experiences in educational settings. My aim was to design questions that would allow students to communicate their unique experiences, ensuring the survey itself was inclusive and mindful of potential cultural barriers.

The literature also highlighted that the survey should not only address barriers to participation but also uncover students’ comfort levels and engagement methods, which can provide insights into potential interventions. This is inspired by Braun and Clarke’s (2022) thematic analysis approach, which encourages a balance between structured data and rich, nuanced responses.

Step 2: Crafting Clear and Standardised Questions

Drawing from Converse and Presser’s (2011) Survey Questions: Handcrafting the Standardized Questionnaire, I prioritised clarity and simplicity in wording to avoid misunderstandings. This involved removing jargon, refining vague terms, and ensuring questions could be easily interpreted across diverse language backgrounds. For example, rather than asking broadly about “barriers to participation,” I specified options like “language differences” and “disability impacts,” alongside open-ended options so respondents could describe their own “previous educational experiences,” allowing them to select the answers most relevant to their situations.

Step 3: Structuring Questions for Depth and Relevance

In developing the questions, I combined closed-ended and open-ended formats. Closed-ended questions allowed me to gather standardised data across key themes, such as:

  • “How often do you attend Unit Tutorials?” (scaled response)
  • “Do you feel that any of the following factors affect your ability to participate in tutorials?” (Select all that apply, with an optional follow-up)

However, inspired by Feigenbaum and Alamalhodaei’s The Data Storytelling Workbook (2020), I also incorporated open-ended prompts, such as “What do you think could improve the effectiveness of Year Lead and Unit tutorials?” This approach allows students to share their stories, helping me capture richer insights that may not fit neatly within predetermined categories. These narrative responses will also support deeper thematic analysis later in the research process.

Step 4: Emphasising Inclusivity and Cultural Sensitivity

Bhambra et al. (2018) emphasise the need to consider cultural perspectives when designing research questions, particularly in a diverse cohort. Therefore, I included questions that explicitly acknowledge different cultural experiences, such as:

  • “Do you feel that your background and experiences are reflected in the content and discussions during tutorials?”
  • “What suggestions do you have for making tutorials more inclusive and supportive of all students?”

These questions help address the research’s social justice focus by recognising and respecting each student’s background and potential challenges.

Step 5: Testing and Revising for Accessibility and Ethical Considerations

With guidance from the British Educational Research Association’s (2024) Ethical Guidelines, I ensured the survey was voluntary, that each question was respectful, and that the survey was designed with confidentiality in mind. For example, I avoided questions that could be too personal or intrusive, focusing instead on general experiences rather than specifics that could inadvertently disclose private information.

In addition, I reviewed the questions to ensure they adhered to accessibility standards, making the survey clear and approachable for students of various backgrounds and abilities. Based on Kember and Ginns’ (2011) Evaluating Teaching and Learning, accessible survey design is essential to achieving valid and reliable responses, especially in diverse educational settings.

Step 6: Refining Based on Literature and Expert Insights

The literature review process provided further insights into refining the survey. By integrating techniques from Pew Research Center (2021) and Bell and Waters’ Doing Your Research Project (2014), I adjusted questions to capture both behavioural and emotional aspects of engagement. For instance, instead of asking broadly about participation, I included questions probing both attendance and the comfort level of expressing opinions in tutorials.

Final Reflections on the Survey Design Process

The process of crafting this survey highlighted the importance of detailed planning, ethical considerations, and alignment with my research objectives. By grounding my questions in survey design theory and literature, I aimed to create a tool that not only captures relevant data but also respects the experiences of each participant. This approach ensures that the survey can yield actionable insights into improving tutorial participation, addressing the very barriers that might otherwise prevent students from fully engaging in their learning experience.

References

  • Bell, J. and Waters, S. (2014) Doing Your Research Project: A Guide for First-Time Researchers. 6th edn. Maidenhead: Open University Press.
  • Bhambra, G., Gebrial, D. and Nişancıoğlu, K. (eds.) (2018) Decolonising the University. London: Pluto Press.
  • Braun, V. and Clarke, V. (2022) Thematic Analysis: A Practical Guide. Thousand Oaks: Sage. Available at: https://bit.ly/3ICHBEr (Accessed: 20 October 2024).
  • British Educational Research Association (2024) Ethical Guidelines for Educational Research. 5th edn. Available at: https://www.bera.ac.uk/publication/ethical-guidelines-for-educational-research-fifth-edition-2024 (Accessed: 20 October 2024).
  • Converse, J. M. and Presser, S. (2011) Survey Questions: Handcrafting the Standardized Questionnaire. Thousand Oaks: Sage. Available at: https://methods-sagepub-com.arts.idm.oclc.org/book/survey-questions/n3.xml (Accessed: 20 October 2024).
  • Feigenbaum, A. and Alamalhodaei, A. (2020) The Data Storytelling Workbook. London: Routledge.
  • Kember, D. and Ginns, P. (2011) Evaluating Teaching and Learning: A Practical Handbook for Colleges, Universities and the Scholarship of Teaching. London: Routledge. Available at: https://doi.org/10.4324/9780203817575 (Accessed: 20 October 2024).
  • Pew Research Center (2021) Writing Survey Questions: Guidance for Effective Survey Design. Available at: https://www.pewresearch.org (Accessed: 15 October 2024).

Rationale Blog Post: Why I Chose a Survey for Investigating Participation in Tutorials

For my primary research, I chose a survey through Microsoft Forms as the method to investigate student participation in Year Lead and Unit tutorials within the Fashion Buying and Merchandising program. This decision reflects both my personal motivation as a Lecturer and Year Lead to create a more inclusive learning environment and my responsibility to support a diverse student body effectively. Given the range of nationalities, cultures, and educational backgrounds among my students, surveys offer an anonymous, accessible way to reach a wide audience, allowing for broad representation of student perspectives.

Personal Motivation and Context

I was motivated to explore this topic based on my experiences with low participation in tutorials and the barriers that students face in engaging actively. I have observed that students may not always feel comfortable sharing openly in tutorials, particularly when language or cultural differences are at play. This challenge has driven my interest in understanding what specific barriers exist and how they might be mitigated. As Freire (2000) argues, creating a space that enables students to participate without fear or inhibition is essential for developing true educational dialogue and growth. This approach aims to ensure that every student’s voice can be heard and valued. Studying for this PGCert has also pushed me to develop my own knowledge of what equitable and socially just education looks like, so I hope this research and new knowledge can benefit my students first-hand.

Relevance to My Role and Department

In my role as an educator, I am responsible for developing an inclusive and engaging learning environment that meets the needs of all students. Addressing barriers to participation is essential to promoting a supportive educational experience, particularly within a global program like Fashion Buying and Merchandising, where students’ diverse backgrounds enrich discussions and learning outcomes. By investigating participation barriers, this research will enable me to better understand and address specific needs within my tutorials. This research also contributes to department goals around student engagement and retention, helping to create a more equitable and supportive academic community. I hope to be able to share my findings with my colleagues in other large cohort courses to further drive social justice and equity throughout UAL.

Context within UAL and higher education

This research is not only relevant to my immediate teaching context but also resonates with broader institutional goals around equity, diversity, and inclusion. Universities are increasingly committed to creating learning environments that recognise and embrace diversity, as echoed by bell hooks (1994) and her advocacy for classrooms that are democratic and inclusive. By addressing issues related to social justice and accessibility, this research aligns with institutional commitments to provide equitable learning opportunities. Furthermore, in the field of fashion education, where global perspectives are increasingly valued, enhancing participation across diverse cohorts is essential for producing well-rounded graduates who are prepared to operate in an international industry.

Supporting Literature and Methodology

In selecting a survey for this project, I also drew from established research practices that emphasise inclusivity and representativeness. According to Cohen, Manion, and Morrison (2018), surveys are particularly effective for gathering data from large and diverse cohorts, making them ideal for the varied demographics within my program. Bryman (2016) also notes that surveys provide anonymity, which can reduce social desirability bias and lead to more honest responses, especially on sensitive issues like race, class, and culture. This aligns with bell hooks’ (2003) call for educational practices that empower all students to share their perspectives, especially those who may feel marginalised in traditional classroom settings.

Furthermore, Thomas and Harden (2008) argue that written surveys allow respondents to process questions at their own pace, encouraging reflection and more in-depth responses. This supports the broader objective of creating learning environments that, as hooks (1994) suggests, “privilege multiple forms of expression.” My survey aims to capture the voices of students who might feel hesitant to participate in live discussions, ensuring their insights are incorporated into the development of more effective participation methods.

Conclusion

Ultimately, this research serves as a step toward building a socially just and responsive teaching practice. By using surveys, I aim to understand and address the barriers to tutorial participation, creating an environment where all students feel valued and engaged. Informed by both practical considerations and an ethical commitment to inclusivity, the survey method will enable me to uncover the unique challenges faced by different groups and adapt my teaching strategies accordingly.

References
– Bryman, A. (2016). Social research methods (5th ed.). Oxford University Press.
– Cohen, L., Manion, L., & Morrison, K. (2018). Research methods in education (8th ed.). Routledge.
– Freire, P. (2000). Pedagogy of the oppressed. Bloomsbury.
– hooks, b. (1994). Teaching to transgress: Education as the practice of freedom. Routledge.
– hooks, b. (2003). Teaching community: A pedagogy of hope. Routledge.
– Mertens, D. M. (2015). Research and evaluation in education and psychology: Integrating diversity with quantitative, qualitative, and mixed methods (4th ed.). SAGE Publications.
– Thomas, J., & Harden, A. (2008). Methods for the thematic synthesis of qualitative research in systematic reviews. BMC Medical Research Methodology, 8(1), 45.


ARP WEEKLY ACTION PLAN

For my Action Research Project I am working to a tight deadline to gather and evaluate the research. To do this I have created a working document mapping out my rough plan of action by week until our Summative Assessment on 13th January 2025. I will update and amend this plan as necessary:

This plan has been invaluable for keeping my research on track alongside my other personal and professional commitments.


Ethics in Action: Reflecting on Feedback and Responses for My Action Research Project

Attached is my submitted and approved Ethics form for reference.

The ethics of any research project are foundational, especially when working within the context of a diverse and global institution like UAL. My project, which explores how participation methods and dialogue-building techniques can be adapted to increase equitable engagement in tutorials, posed both practical and ethical considerations. Here, I reflect on the valuable feedback provided by my tutor regarding my ethics form and the steps I took to address these points.

1. Tackling a Common Challenge

My tutor emphasised the relevance of my project in the context of UAL’s global recruitment strategy during uncertain financial times. This feedback reinforced the importance of my research in developing inclusive teaching practices that support a diverse student body.

Response:
I aligned my research objectives more closely with UAL’s institutional goals, ensuring that my study not only addresses tutorial participation but also contributes to wider conversations on inclusivity and equity in education for the wider university. I hope some of my actions can be considered at UAL.

2. Thorough Reading Around Methods

The recommendation to explore academic practice journals and focus on methods was a good reminder, emphasising that method-based literature would support my decision-making, problem-solving, and evaluation processes.

Response:
I expanded my reading to include seminal works on action research and thematic analysis (e.g., Braun & Clarke, 2006; Cohen et al., 2018) and ensured my methodology was evidence-based. This strengthened the justification for my survey approach and analysis techniques.

3. Narrowing the Focus

I was encouraged to avoid trying to test or include too much at every stage, suggesting that I narrow my focus to specific interventions or areas of impact.

Response:
I refined my project scope, focusing on student surveys as the primary data collection method and designing questions targeted at identifying actionable barriers and preferences. I realised that my planned focus group research wouldn’t give a fair response and would be too much to achieve in this unit alongside the surveys. This narrower scope allowed for more depth in my analysis and clearer outcomes.

4. Consent and Accessibility

The feedback highlighted the need for thoughtful survey design, particularly regarding accessibility, understanding, and comfort in sharing. Concerns were raised about whether students might hesitate to share candid feedback with a tutor in charge.

Response:
I integrated a consent statement at the start of the survey, ensuring students were fully informed about anonymity and their right to withdraw. I was also able to deliver this verbally in lectures. Additionally, I considered language and cultural barriers, simplifying the survey language and testing it with colleagues for clarity. Reflecting on potential low response rates, I introduced multiple modes of engagement (e.g., in-class explanation, inclusion in the “Week Ahead” email) to reach a broader student audience.

5. Emotional Dynamics

My tutor appreciated my awareness of the emotional dynamics in the research process, particularly the need to ensure students felt safe. Suggestions included providing emotional support information at multiple stages.

Response:
I ensured students were reminded verbally and in writing about their anonymity and the voluntary nature of their participation, both when introducing the survey and at the conclusion of their participation. I also shared information about student support services to address any concerns arising from the research process.

6. Data Storage and Protection

Specificity about where and for how long the data would be stored was recommended.

Response:
I committed to storing data securely on UAL’s institutional systems for the duration of the project; survey responses are anonymous and therefore not traceable to individual students. This detail was explicitly communicated to participants.

7. Defining Informed Consent

My tutor prompted me to consider what “informed consent” meant in this context and how I would ensure students truly understood their participation.

Response:
I provided a clear, accessible explanation of the survey’s purpose, potential risks, and participants’ rights (e.g., withdrawing at any time). Consent was built into the survey itself, with a checkbox ensuring participants actively agreed to proceed. The anonymity that surveys afford also informed their selection as the key research tool.

8. Feedback Integration: Adapting and Evolving the ARP Process

Throughout the ARP process, feedback from tutors and peers played a pivotal role in shaping my approach. For example, suggestions to narrow the project’s focus and avoid attempting to address too many objectives at once allowed me to concentrate on actionable outcomes, such as refining the survey design and prioritising thematic analysis as the central methodology. Similarly, feedback emphasising ethical considerations prompted me to enhance participant consent protocols and make the surveys more accessible and inclusive.

The recommendation to engage students directly during lectures led to a significant improvement in survey response rates, highlighting the value of in-person communication—a strategy that could also be applied to improving tutorial participation. This responsiveness to feedback not only strengthened the validity and inclusivity of the research but also demonstrated adaptability, an essential quality in action research.

By reflecting on and incorporating this input, I was able to align my research process with the goals of equity and social justice, ensuring the ARP addressed both the practical challenges and the institutional context of UAL.

Final Reflections

The feedback from my tutor significantly shaped the ethical dimensions of my project. It not only guided me in refining my survey design but also deepened my understanding of the complexities involved in equitable research practices.

Moving forward, I plan to continue reflecting on the inclusivity of my research methods. For future projects, I would consider alternative approaches such as focus groups or interviews to engage students who may not participate in written surveys, thereby reaching a broader spectrum of voices.

By prioritising ethical considerations, this project contributes to a more just and inclusive academic environment, aligning with UAL’s values and developing meaningful student engagement as well as aligning with my own passion to provide the best pedagogical support for my students.


ARP Research Question:

How can participation methods and dialogue-building techniques from large cohort courses such as the BA (Hons) in Fashion Buying and Merchandising be adapted to increase equitable engagement in Year Lead and Unit tutorials, addressing social, cultural, and racial barriers?
