About the Survey

How Migrant Justice Institute conducted the
2024 National Migrant Work Survey

In 2024, Migrant Justice Institute conducted the National Temporary Migrant Work Survey – the first national empirical study of migrants’ experiences of work in Australia post-COVID. The survey was open to anyone 18 years and over who had worked in Australia while holding a temporary visa, or while having overstayed a visa. It yielded 9,963 valid responses.

The survey was designed to illuminate the experiences and behaviours of different migrant worker cohorts in Australia with respect to labour exploitation and unsafe work. This includes migrant workers who have engaged assistance and services, as well as those who have limited connections to organisations in Australia. We aimed to reflect the diversity of Australia’s migrant worker population in terms of nationalities, gender, visas, geographic locations (by state and regional/cities), industries, age, year of arrival and union membership.

Ethics approval for the study was obtained from the Human Ethics Research Committee at UNSW Sydney. Migrant Justice Institute received financial contributions towards the survey from the grants program of the Commonwealth Attorney-General’s Department National Action Plan to Combat Modern Slavery 2020-2025 and from the Migrant Workers Centre, and financial support for the reports and engagement from Minderoo Foundation’s Walk Free initiative.

How and when participants took the survey

This study involved an anonymous national online survey of adults who have worked in Australia on a temporary visa, or while undocumented. The survey was available on the Qualtrics platform between 8 July and 31 August 2024. Before commencing the survey, participants were provided with information about the survey and how their data would be used, to which they could consent through their participation in the survey. Participants who completed the survey were able to enter a draw for prizes, which included fifty $200 Mastercard vouchers. They were asked for their phone number for the purpose of advising winners of the prizes. There was no way to connect the phone number provided in the prize survey with answers provided to the temporary work survey, which remained strictly anonymous.

Thank you for asking me these questions. It makes me feel a bit better because neither my previous employers nor anyone else has asked me these before. Sometimes, I feel very lonely.
— Male international student from Turkey, 24, in NSW

The development of the survey instrument

Survey topics

The survey covered a range of problems that migrants encounter at work, ranging from underpayment and other non-compliance with workplace laws to modern slavery, and how migrants respond to these. It also canvassed migrants’ perceptions and knowledge that inform their responses and decision-making. Topics included:

  • Participants’ demographics, including visa, gender, nationality, age, location, year of arrival and, for current international students, their type of education provider and course of study;

  • The nature and structure of participants’ lowest paid job in 2023-24;

  • Wages and entitlements received in participants’ lowest paid job in 2023-24;

  • Employer coercion and modern slavery indicators experienced in any job;

  • Sexual harassment;

  • Safety and injury in the workplace (including access to medical care) in all jobs;

  • Problems in accommodation linked to a job or employer;

  • Whether participants sought assistance with wage recovery (where underpaid) or with other problems associated with experiences of forced labour indicators (where relevant), where they went, the outcome of help-seeking, and, if they did not seek help, the barriers that prevented them from doing so;

  • Perceptions of prevalence of problems and responses among their migrant worker peers; and

  • Participants’ knowledge of workplace rights, entitlements, and common misconceptions about their rights or culpability.

Survey drafting and testing

We began with a foundation set of survey questions based on similar or modified versions of questions asked in our previous national surveys. In selecting and developing these questions, we drew on substantive data collected and lessons learned in relation to question structure and formulation in those surveys. In some areas, this enables us to compare responses to similar questions over time (noting that our survey sample changes for each survey, and so our data is not longitudinal).

We conducted a literature review of other surveys conducted in similar topic areas, as well as academic articles, reports and submissions on substantive topics covered in the survey, published over the past 5 years.

We held consultations with a wide range of academic experts, community stakeholders, service providers, state and federal government agencies and departments, unions, research and public policy organisations, and a range of migrants with lived experience.  A list of all organisations that assisted with survey design and distribution is available below.

We developed a new question to obtain data on participants’ experiences of, and responses to, forced labour indicators. This question asks participants to indicate whether they had experienced any of a range of ‘problems’ in any job in Australia. Our selection of this list of indicators was informed by the ILO Indicators of Forced Labour and the ‘Indicators of Modern Slavery’ in the Attorney-General’s Department’s Commonwealth Modern Slavery Act 2018 Guidance for Reporting Entities (2023), as well as Walk Free’s Modern Slavery Index. We sought to include indicators that bring together elements of serious labour exploitation and experiences of coercion and deception which are the hallmarks of forced labour.

We consulted on survey drafts with a range of service providers, experts and migrant workers, iteratively incorporating feedback and refining the survey during the consultation process. We then uploaded the survey to the Qualtrics platform, and paid individual migrant workers in the target cohorts to pilot the survey and provide feedback on their experience, including the survey’s accessibility, clarity of instructions, and the formulation and sequencing of questions and multiple-choice responses. After incorporating feedback from user testing, we had the survey professionally translated into multiple languages, and paid individuals to test each translation of the survey and provide feedback on phrasing. We incorporated feedback into the final version of the survey in each language.

Key survey features

The survey contained 76 multiple choice questions in addition to a number of follow-up questions for subsets of participants who selected a particular response. A small number of questions allowed open answers, mostly where respondents selected ‘Other’ among multiple choice options.

There were 9 questions that provided an opportunity for discursive responses as well as an open text field at the end of the survey that asked participants if they wanted to add any further observations or information. 3,656 participants provided an open text response to this final question.

The survey used branching to ask follow-up questions based on a participant’s response to a particular question. It also displayed certain questions to particular cohorts for whom the question was relevant, based on responses to demographics questions or responses to other earlier questions in the survey.  

The survey contained a mix of questions that required a response in order to proceed to the next question, and questions that could be skipped.  Participants could move back to previous questions but could not go further back than any question that was the basis for proceeding down a particular branch of follow-up questions.

The English-language survey was translated into 5 languages: Chinese (simplified), Spanish, Nepali, Tamil and Arabic. We settled on these languages after consulting stakeholders about which languages were most needed to enable participation by a breadth of temporary visa holders with limited English, and for which we also had strong distribution networks to reach these language speakers.

Participants were free to stop the survey at any time. As some participants exited the survey at different points before the end, the number of respondents varied between questions. In addition, some follow-up questions were only shown to participants who selected particular responses.

We were interested in understanding correlations between features of labour noncompliance and other circumstances in a particular job, as well as experiences of problems in any job. Given the imperative to include the fewest possible number of questions and reduce the time required to complete the survey, we obtained this data via:

  1. A set of questions related to participants’ lowest paid job in Australia in the 18 months prior to the survey (from 1 January 2023 to July/August 2024). This provides a large dataset on current wage rates and underpayment in particular industries in specific locations for migrants on particular visas, taking into account whether the respondent was working as an employee or contractor (ABN), whether they were a casual employee, whether they were paid in cash, received payslips, received superannuation, had deductions taken from pay, and other features of the specific job.

  2. Several sets of questions on problems experienced in any job in Australia, including forced labour indicators, recruitment, workplace injury, and employer-mandated accommodation.

All respondents were provided with correct information in relation to the questions they were asked, in addition to referrals to relevant support services by location.

After the survey, respondents were offered the option to enter a prize draw via a separate Qualtrics survey.

Survey as unique migrant information and empowerment tool

As with MJI’s previous surveys, the 2024 survey was designed as a learning and empowerment tool for every migrant participant. At various points in the survey participants were asked about their knowledge of aspects of their workplace rights and Australian law, and after responding were given the correct answer with a simple explanation and where to find further information. At the end of the survey participants were directed to a referral portal with information on where to get help in their state for a range of different workplace issues, and were provided with a summary of key information on all the topics addressed in the survey, and short video explainers produced by Gabrielle Marchetti at JobWatch. These remain available on MJI’s website under the “For Migrants” tab.

Ethical considerations

All elements of the survey methodology were subject to scrutiny prior to approval by UNSW’s Human Ethics Research Committee, and ratified by the University of Technology Sydney Human Research Ethics Committee.

Before commencing the survey, participants were provided with information about the survey and how their data would be used, to which they could consent through their participation in the survey. The survey was confidential and anonymous. Participants were not asked their name. At the end of the survey, participants were invited to provide a phone number or email address if they were interested in participating in further research on topics covered by the survey, but this was not required. After completing the survey, participants were invited to enter a separate prize draw to win one of fifty $200 Mastercard vouchers. They were asked for their phone number or email for the purpose of advising winners of the prizes. There was no way to connect the phone number provided in the prize survey with answers provided to the temporary work survey, which remained strictly anonymous.

All data is kept on secure servers and is only accessible to members of the research team approved in the Ethics application. Where confidential findings are provided to individual organisations this only includes aggregated findings where it is not possible to identify any individual. No raw data will ever be shared. We will not publish any information that could identify any individual participant, education provider, employer or other person.

Participant recruitment

Our dissemination strategy was developed through engagement across migrant communities, the union movement, the community sector and the international student sector.

Participants were recruited in collaboration with many individuals and organisations who assisted us to reach their communities, members, clients and networks. Recruitment channels included emails, newsletters, social media, community media (radio, print), websites and flyers/posters. The content of promotional materials was developed in consultation with migrants and service providers.

Partners and distribution channels

The survey was promoted to migrants through outreach by many community service providers (including community legal centres), federal government channels, state government agencies, unions, consulates and embassies, settlement and migrant peak bodies, education providers, student groups, hostels, community organisations and individuals.

We communicated with these community organisations, service providers and unions and individuals via eDMs and LinkedIn at key stages with information about the survey and reminders to promote the survey. We gave briefings about the survey at organisations’ events and webinars for members or clients. We developed a wide range of survey promotional materials, including suggested email and social media text, social media tiles and images, flyers and posters. These were made publicly available on our website for our community partners (or anyone who wanted to promote the survey) to use, and included materials and language tailored to reaching the large international student audience. Promotional materials were available in the 5 languages into which the survey was translated, as well as 12 additional community languages.

Australian and foreign government channels: The Commonwealth Attorney-General’s Department assisted us by disseminating the survey in their networks. We also created a database of foreign embassies and consulates in Australia, and sent an eDM to each with information about the survey and promotional materials that they could use.

As international students constitute the majority of temporary visa holders who are working in Australia, we developed a comprehensive strategy for reaching international students through both official and grassroots channels. This included:

  • Securing the support of key organisations and peak bodies in the higher education sector, including Universities Australia, the Australian Universities Procurement Network, International Education Association of Australia, Austrade, English Australia, International Student Advisers Network of Australia, International Student Education Agents Association, StudyNSW, Study Queensland and StudyMelbourne.

  • Securing the formal support of 45 education providers, comprising 30 universities, 3 TAFEs, 5 university pathway colleges and 7 private colleges/institutes in disseminating the survey to their students. We offered a tailored report on the survey findings for their institution if they reached a certain threshold of responses from their students (this threshold being tailored to the provider’s size and the size of their international student cohort). Education providers used a diverse range of methods to promote the survey to their students, of which direct student emails emerged as the most effective. As a result of our own outreach and the efforts of these peak bodies, 27 education providers actively and successfully promoted the survey to their international students, with sufficiently high participation rates to obtain a tailored report on findings specific to students at their institution.

  • We established a Migrant Justice Institute Student Ambassador Program, comprising 40 international student representatives from StudyHubs across the country, to promote the survey to a wide range of student peers and migrant communities, including via social media and private messaging groups. We held regular briefings with this group while the survey was live to provide updates on the demographics of participants, identify underrepresented cohorts amongst students and migrants, and share effective strategies to target these groups and emerging gaps.

Direct promotion to migrants through social media and paid advertisements: We strategically used social media to reach migrants who are networked with each other but not to services/organisations, by posting on migrant-focused Facebook groups. We also enlisted the help of partners and student ambassadors to promote the survey in cohort-specific or private social media groups and platforms. We also ran paid Google and Facebook ad campaigns to target under-represented cohorts that emerged as the survey progressed. We are grateful to the Migrant Workers Centre for running paid campaigns on Facebook and Instagram to target under-represented cohorts.

Direct promotion through physical presence in strategic locations: We put up posters and flyers in key businesses including youth hostels and English language schools, to target under-represented cohorts such as backpackers and English language students.

News and media outlets: We arranged publication of news pieces about the survey in student-focused media outlets such as Insider Guides, Koala News and the PIE News, as well as community publications such as Sydney Today (for Chinese audiences).

Measures to responsively increase reach of survey

We used prizes to provide incentives to a highly diverse population of migrant workers to participate in the survey.

Our dissemination strategy was also monitored and responsively revised throughout the survey live period, to ensure we increased the number of participants from the target cohort. We monitored the demographics of survey participants, and continuously compared the emerging survey sample to the most up-to-date Department of Home Affairs data on temporary visa holders present in Australia, to identify gaps in our sample based on nationality, visa type, location, etc.
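The gap-identification step described above can be sketched in code. This is a hypothetical illustration only: the cohort names, counts and threshold below are invented for the example and are not the survey's actual figures or method.

```python
# Hypothetical sketch: compare the emerging survey sample against a
# population benchmark (eg Department of Home Affairs visa-holder data)
# to flag under-represented cohorts. All figures are illustrative.

def representation_gaps(sample_counts, population_shares, threshold=0.5):
    """Flag cohorts whose share of the sample falls below `threshold`
    times their share of the benchmark population."""
    total = sum(sample_counts.values())
    gaps = {}
    for cohort, pop_share in population_shares.items():
        sample_share = sample_counts.get(cohort, 0) / total
        if sample_share < threshold * pop_share:
            gaps[cohort] = {"sample_share": round(sample_share, 3),
                            "population_share": pop_share}
    return gaps

# Illustrative inputs only (not real survey data):
sample = {"Student": 600, "Working Holiday": 150, "Sponsored": 50}
benchmark = {"Student": 0.55, "Working Holiday": 0.20, "Sponsored": 0.15}
print(representation_gaps(sample, benchmark))
```

Run periodically against the live sample, a check like this surfaces the cohorts (here, the hypothetical sponsored-worker group) that promotional activities should then target.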

We regularly communicated these gaps to key partners and MJI Student Ambassadors in order to adjust ongoing promotional activities, and revised our own strategies, to target key cohorts that were less well represented in our sample. For example, we held four briefings with the Student Ambassadors while the survey was live where we provided running updates on gaps in student participants (eg students studying VET, TAFE or English language courses) and identified ways to reach them. In addition, as sponsored workers emerged as a gap, we relied on Home Affairs data to identify Indian and Filipino sponsored workers as two of the biggest groups of sponsored workers, and targeted resources to these cohorts (eg posting in relevant Facebook groups). We are grateful to Migrant Workers Centre for developing promotional materials in Hindi and Tagalog for paid Facebook and Instagram campaigns to reach these workers.

These adjustments allowed us to increase the number of participants from the target cohort.

Data cleaning

There were 16,727 responses entered in the online version of the survey. Of these, 4,644 were removed because they did not meet the eligibility criteria:

  • 2,235 indicated they did not undertake paid work while on a temporary visa or while undocumented, or provided duplicate email addresses for follow-up;

  • 7 were under 18 years old;

  • 54 did not select any temporary visa when asked what visas they had held in Australia;

  • 2,348 answered only demographics questions and did not reach the substantive questions in the survey.

A further 2,120 were excluded because we concluded they may have been bots. We did so using several detection methods. First, we removed clusters of surveys submitted within seconds of each other, and responses where the open text was written in Latin or was a long exposition in English that appeared to be AI-generated.

We then used IP reputation scoring software (which identifies IP addresses known to be associated with bots) and logic checks to create a sample for review. We devised 34 logical flags to test for inconsistencies (eg where age and date of arrival suggested arriving in Australia at an unreasonable age), unlikely responses (eg being a member of more than 3 unions) or impossible responses (eg nationality or age did not correspond to eligibility for the visa, or the sequence of visas held did not correspond with visa eligibility requirements). We also applied flags from the metadata, for example checking whether the language the survey was completed in (where other than English) matched the respondent’s nationality.
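Logic flags of this kind can be expressed as simple per-response checks. The sketch below is a hypothetical illustration: the field names, thresholds and the example response are invented, and the actual survey used 34 such flags rather than the three shown here.

```python
# Hypothetical sketch of per-response logic flags: each check tests for
# an inconsistency, an unlikely answer, or an impossible answer.
# Field names and thresholds are illustrative assumptions.

SURVEY_YEAR = 2024
MIN_ARRIVAL_AGE = 10   # assumed cut-off for a plausible age at arrival

def flag_response(r):
    """Return the names of the logic flags tripped by one response dict."""
    flags = []
    # Inconsistency: age vs year of arrival implies arriving implausibly young
    age_at_arrival = r["age"] - (SURVEY_YEAR - r["year_of_arrival"])
    if age_at_arrival < MIN_ARRIVAL_AGE:
        flags.append("implausible_arrival_age")
    # Unlikely: member of more than 3 unions
    if r.get("union_memberships", 0) > 3:
        flags.append("too_many_unions")
    # Metadata: survey language (other than English) should plausibly
    # match languages associated with the respondent's nationality
    lang = r.get("survey_language", "English")
    if lang != "English" and lang not in r.get("nationality_languages", []):
        flags.append("language_nationality_mismatch")
    return flags

# Illustrative response: a 22-year-old reporting arrival in 2010
suspect = {"age": 22, "year_of_arrival": 2010, "union_memberships": 5,
           "survey_language": "Nepali", "nationality_languages": ["Spanish"]}
print(flag_response(suspect))
```

As the text notes, tripping a single flag is not by itself treated as proof of a bot; flagged responses feed into a manual review of the response as a whole.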

Within this sample we then reviewed open responses to assess whether they were likely to be completed by humans. For example, we looked for specific references to their job and their experiences and assessed the tone and voice, among other elements. We also reviewed each response as a whole to look for other non-sensible answers. In each case, we considered whether these flags led us to suspect that it was a bot completing our survey and not isolated respondents tripping one of the flags for some other reason or through a single mistake.

As we completed this review we built a picture of how some bots behaved: while our normal sample skewed female, bots were generally 50-50 male-female; bots tended to report earlier arrival dates and to be younger, with a cluster around age 25; bots provided an email address with a capital letter in the middle; bots used emojis in their open text response; and bots tended not to provide a phone number. We applied these evolving insights in our review.

This resulted in 9,963 valid responses. A comprehensive discussion of survey cleaning strategy including bot removal and approach to analysis is available on request.
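The cleaning figures reported in this section reconcile as follows, using only the numbers stated above:

```python
# Reconciling the reported cleaning figures: 16,727 raw responses,
# minus 4,644 ineligible and 2,120 suspected bots, leaves 9,963 valid.
raw = 16_727
ineligible = 2_235 + 7 + 54 + 2_348   # the four exclusion categories above
suspected_bots = 2_120
valid = raw - ineligible - suspected_bots
print(ineligible, valid)
```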

Organisations assisting with survey design and distribution

The survey was distributed to migrant workers across Australia by a very wide range of individuals and organisations. These include:

Education sector peak bodies: International Education Association of Australia, Universities Australia, Austrade, English Australia, ISANA, International Student Education Agents Association, Australasian Universities Procurement Network, StudyNSW, Study Queensland and StudyMelbourne.

Education providers, comprising 30 universities, 3 TAFEs, 5 university pathway colleges and 7 private colleges/institutes: Alliance College, Australian Catholic University, Australian National University, Bond University, Central Queensland University, Charles Darwin University, Curtin University, Edith Cowan College, Edith Cowan University, Flinders University, Greenwich English College, Griffith University, ILSC Sydney, James Cook University, La Trobe College, La Trobe University, Mars Institute, Melbourne Polytechnic, Murdoch College, Murdoch University, Navitas English, Orange College, Polytechnic Institute Australia, Queensland University of Technology, Swinburne University of Technology, TAFE NSW, TAFE SA, Torrens University, University of Adelaide, University of Canberra, University of Melbourne, University of New England, University of Southern Queensland, USYD, University of Tasmania, University of the Sunshine Coast, University of Wollongong, UNSW, UNSW College, UTS, UTS College, Victoria Institute of Technology, Victoria University, Western Sydney University.

Community organisations, service-providers and unions: ACRATH, AMIEU Qld, Asylum Seekers Resource Centre, Australian Catholic Anti-Slavery Network, Australian Council of Trade Unions, Australian Human Rights Commission, Australian Red Cross, Australian Workers Union, Cleaning Accountability Framework, Employment Rights Legal Service, FAIR Hiring Initiative, Fairfutures, Federation of Ethnic Communities' Councils of Australia, Health Services Union, Human Rights Law Centre, Immigration Advice and Rights Centre, JobWatch, KO-WHY, Migrant Workers Centre, Migration Institute of Australia, Redfern Legal Centre, Scarlet Alliance, SETSCoP, Settlement Council of Australia, South-East Monash Legal Service, St Vincent's Hospital, Sydney Community Forum, Transport Workers Union, Tamil Refugee Council, Unions NSW, United Workers Union, Uniting Church in Australia - Synod of Victoria and Tasmania, Walk Free, Westjustice, Working Women’s Centre NSW.

Academic experts and consultants: Professor Justine Nolan, Charlotte Long, Samuel Pryde, Dr Yao-Tai Li, Associate Professor Amelia Thorpe, KPMG Modern Slavery Team.

Foreign consulates: Consulate General of Canada and Consulate General of France in Sydney.

Government agencies: Commonwealth Attorney-General's Department, Commonwealth Department of Education, Commonwealth Department of Employment and Workplace Relations, Migration Queensland, NSW Office of the Anti-Slavery Commissioner.