Workfare
Workfare is a governmental plan under which welfare recipients are required to accept public-service jobs or to participate in job training.[1] Many countries around the world have adopted workfare (sometimes implemented as "work-first" policies) to reduce poverty among able-bodied adults, though their approaches to implementation vary.[2] The United States and the United Kingdom both use workfare, albeit with different backgrounds.
Workfare was first introduced by civil rights leader James Charles Evers in 1968; however, it was popularized by Richard Nixon in a televised speech in August 1969.[3] An early model of workfare had been pioneered in 1961 by Joseph Mitchell in Newburgh, New York.[4] Traditional welfare benefits are usually awarded on certain conditions, such as searching for work, or on meeting criteria that establish the recipient as unavailable for or unable to take up employment. Under workfare, recipients have to meet certain participation requirements to continue to receive their welfare benefits. These requirements are often a combination of activities intended to improve the recipient's job prospects (such as training, rehabilitation, and work experience) and activities designated as contributing to society (such as unpaid or low-paid work). These programs, now common in Australia (known as "mutual obligation"), Canada, and the United Kingdom, have generated considerable debate and controversy. In the Netherlands, workfare is known as Work First, based on the Wisconsin Works program in the United States.
Workfare approaches to welfare are examples of active labor market policies (ALMPs), which differ by country, welfare state, and time period. Active labor market policies are used to counteract market failures that prevent full employment in a capitalist economy. Four types of active labor market policy are incentive reinforcement, employment assistance, occupation maintenance, and human (social) capital investment. Workfare and work-first approaches have been identified as more coercive forms of welfare-to-work regimes.[5] The US and the UK are both examples of liberal welfare regimes that prioritize the market's role in mitigating poverty, hence their adoption of workfare.[5]
There are two main types of workfare scheme: those that encourage direct employment to get individuals off the welfare roll and directly into the workforce, and those that are intended to increase human capital by providing training and education to those currently in the welfare system.[3]
In less developed countries, similar schemes are designed to alleviate rural poverty among day-labourers by providing state-subsidised temporary work during those periods of the year when little agricultural work is available. For example, the National Rural Employment Guarantee Act (NREGA) in India offers 100 days' paid employment per year to those eligible, rather than unemployment benefits on the Western model. However, a workfare model typically not only provides social protection through a wage-income transfer but also supports workers in moving into work.
The purported main goal of workfare is to generate a "net contribution" to society from welfare recipients. Most commonly, this means getting unemployed people into paid work, reducing or eliminating welfare payments to them, and creating an income that generates taxes. Workfare participants may retain certain employee rights throughout the process; however, workfare programs are often determined to be "outside employment relationships", and the rights of beneficiaries can therefore differ.[6]
Some workfare systems also aim to derive a contribution from welfare recipients by more direct means. Such systems obligate unemployed people to undertake work that is considered beneficial to their community.
The history of workfare in the United States dates back to before the American Revolution, when land grants and military pensions were distributed sub-nationally and on the basis of means-testing. The disbursement of these "first" social benefits set precedents for the development of the US welfare state.[7] In the early days of the United States, most Americans were deeply connected to Protestantism, which favored literacy and hard work; education was therefore promoted, while poor relief and cash assistance were discouraged as responses to poverty. In addition, the United States never had a history of feudalism to leave a residue of distinct social classes. Feudalism had discouraged education to preserve the social order; the United States instead embraced capitalism from the outset, an economic system supportive of public education. As such, the United States from its early beginnings placed greater importance on education as a means of decreasing poverty.[8]
This history gave rise to colonial poor relief methods that promoted work as a means of increasing self-reliance. Impoverished and destitute community members were forced into labor at poorhouses and workhouses so that they could provide for themselves while completing a task for the community. Workhouses were designed for the "unworthy" poor: those who were unemployed but able to work.[7] During this time, women were disproportionately found in workhouses, as they were unable to own property or run a household after a man had abandoned them or died. People of color were unable to receive any poor relief at all. This "deservingness" discrepancy affecting women and people of color set the stage for the disproportionate assistance that persists to date.[7] Poorhouses and workhouses remained a main method of poor relief through the 19th century, growing in popularity as immigration to the United States increased and reinforcing the narrative that poverty equates to laziness.[7]
Throughout the 20th century, narratives about laziness morphed into stereotypes such as the "welfare queen", which aimed to paint black single mothers as abusers of the welfare system. This stereotype claimed that black mothers refused to get jobs, had numerous children, and lived exclusively off of taxpayer dollars. While applying only to a small percentage of the population, such rhetoric laid the groundwork for welfare reform in the 1990s.[9]
In 1996, President Bill Clinton and the Republican Congress passed the Personal Responsibility and Work Opportunity Reconciliation Act (also known as welfare reform), which created Temporary Assistance for Needy Families (TANF), shortened welfare stays, and mandated intensive job training and work requirements for individuals in need of assistance. The Personal Responsibility and Work Opportunity Reconciliation Act mandated work requirements after two years of assistance, instituted a five-year limit, created state-controlled funding, rewarded work with performance bonuses, and required participation in paid or unpaid work.[10] Welfare reform made workfare the official social welfare ideology of the United States.[11] The effort to decrease the number of people on the welfare roll was successful, although some argue that this did not translate into a decrease in poverty.[12]
Criticism of workfare in the United States most notably concerns its tight restrictions and the limited opportunities available to low-skilled workers. Loïc Wacquant theorizes that the United States and other Western, liberal states have shifted towards more punitive governance under the guise of neoliberalism. Pointing to welfare reform and the 1994 Crime Bill, he argues that workfare has shrunk (via stricter restrictions) while prisonfare has expanded, ultimately locking the same vulnerable population in a vicious cycle in which low-wage work, decreased benefits, and low social mobility lead to increased crime and punishment. He also argues that the institutional racism inherent in the United States has led to the underdevelopment of public aid.[13]
In all welfare states, there is a constant need to address inclusion and exclusion (i.e., who is able to access policies and who is not). Racial discrimination has played a central role in this struggle, particularly in the United States as a diverse nation. People of color have typically struggled to enter the workforce due to narratives about high crime and low skill levels. This discrimination is a leading cause of the higher rates of poverty among people of color in the United States.[5] Jeff Manza argues that people of color, particularly African Americans, are more likely to utilize social benefits because they are more likely to be poor.[14] Since workfare decreases the emphasis on education and increases the emphasis on work, scholars like Manza assert that work-first policies trap people of color in a cycle of low-wage work and poverty.[14]
Gender inequality arises in workfare as well, particularly in relation to equal pay and dependent care work. Welfare states can adopt different models of the main breadwinner: the male-breadwinner model, the dual-breadwinner model, or the dual-earner-dual-carer model.[5] Workfare in the United States is focused on the financial self-reliance of families through work and tends to lean towards a male-breadwinner model.[12] A male-breadwinner model assumes that men participate in the labor market while women complete domestic and caregiving tasks unpaid.[15] Welfare policies designed and structured around the assumption and support of marriage significantly disadvantage single mothers. For example, in some states, work-first policies may not consider the childcare responsibilities of women receiving benefits when requiring them to participate in workfare.[16] Single mothers in the United States are 33% more likely than married parents to be in poverty, in part due to the stagnant minimum wage[12] and the gender pay gap.[17]
Canada ran the experimental Self-Sufficiency Project, a "generous"[18] income supplement for welfare recipients who found work. Research in 1998 found significant increases in employment rates and hours worked relative to the control group (who did not benefit from the project),[19] though later research suggested that the control group was on track to catch up with the recipients in the long run.[20][18][21]
A 2001 review of a range of welfare programs concluded that earnings supplements are an effective policy to increase employment and earnings.[21]
In the UK, critics point out that the type of work offered by workfare providers is generally unskilled and is comparable to community work carried out by criminal offenders being punished on community service schemes.[22]
Workfare has also been criticised as forced labour, since those placed on workfare who do not work for the organisation to which they are assigned risk benefit sanctions and, through the loss of benefits needed to survive, starvation.[23][24]
A 2008 report by the UK's Department for Work and Pensions (DWP) on US, Canadian, and Australian workfare schemes cast doubt on their effectiveness. It found little evidence that workfare significantly reduced the number of claimants or increased the likelihood of participants finding work; rather, it tended to push participants to stop claiming before finding work, leaving them with no income. However, the report noted that the pool of evidence was limited.[25]