Patricia J. Brooks, “Individual Differences in Statistical Learning: Implications for Language Development”. Commentary by ARC Student Fellow Ian Phillips

In her recent talk at ARC, “Individual Differences in Statistical Learning: Implications for Language Development,” Dr. Patricia J. Brooks presented ongoing research examining the relationship between individual differences in statistical learning ability and language development. In this talk, Brooks, Professor of Psychology at the College of Staten Island, City University of New York and ARC Distinguished Fellow, explored the relationship between statistical learning ability and both child first language acquisition and adult second language acquisition. The takeaway message is that statistical learning ability appears to influence the outcomes of both types of language acquisition, though this effect may be modulated by the quality and timing of feedback provided to the learner during language development.

Brooks started off by exploring how individual differences in statistical learning ability might underlie individual differences in linguistic skills. Statistical learning—also known as procedural or implicit learning—is an inductive process in which the learner becomes sensitive to probabilistic patterns in the input. As Brooks pointed out, statistical learning is operative in infants, and research suggests that it may facilitate acquisition of vocabulary, phonemic categories (language-specific sounds), and grammatical dependencies (e.g., subject-verb agreement). In further developing the relationship between statistical learning ability and language acquisition, Brooks presented results from a meta-analysis of recent studies showing that individuals with a language disorder known as Specific Language Impairment (SLI) also have a statistical learning deficit (Obeid, Brooks, Powers, Gillespie-Lynch, & Lum, 2015).

Brooks’s main argument is that immediate positive feedback is crucial to language development and that this kind of feedback interacts with the statistical learning mechanisms that underlie it. To illustrate how this interaction affects language development, Brooks wove together evidence from three of her recent studies, which utilize a variety of data collection and analysis techniques. Brooks first presented her research examining how environmental factors impact language outcomes in infants enrolled in the Early Head Start Research and Evaluation Project (EHSRE), showing that positive feedback from the caregiver is the biggest predictor of child language development (Poulakos, Brooks, & Jewkes, 2015). This study investigated how characteristics of the mother, child, home, and social interaction when the infants were 14 months old affected language outcomes at 14 and 36 months, with the goal of determining whether the quality of social interaction with the caretaker impacts language development. In this research, Brooks analyzed data for 791 infants from low-income families where English was the only language spoken at home and used a cumulative risk model to analyze the interactive effects that multiple factors—including maternal mental distress, negative interaction, maternal education, gestational age, child cognition, and child gender—had on language outcomes at 14 and 36 months. The analysis showed not only that a large percentage of the infants in the study were significantly delayed in language development at 36 months compared to all children in the EHSRE data, but also, importantly, that at both 14 and 36 months, joint attention—the factor indexing positive caretaker-child social interaction—was the biggest predictor of child language development.

After establishing the link between social interaction and language development, Brooks presented a second study in which she analyzed child-caretaker interactions in the CHILDES Clinical English Weismer SLI Corpus to determine which specific aspects of social interaction affect language development in late-talking children (children using no words at 18 months, and fewer than 50 words and no word combinations at 24 months) (Che, Alarcon, Yannaco, & Brooks, 2015). This analysis yielded two important findings: first, for late talkers at ages 30, 42, and 54 months, mothers and children show close to 20% overlap in each other’s speech, where overlap is defined as an imitation that may be either expanded or reduced; second, maternal overlap of child speech at 30 months is the single best predictor of a child’s mean length of utterance (MLU; a standard measure of language proficiency) at 54 months of age. This factor is more predictive of language development at 54 months than child overlap, child MLU at 30 months, or the amount of mother speech. Brooks noted that this overlap is the critical just-in-time feedback needed for language development and cited research suggesting that this type of feedback provided by caretakers during social interaction facilitates language pattern extraction or rule learning.
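For readers unfamiliar with the measure, MLU is simply the average number of morphemes per utterance in a transcript. The sketch below is an illustrative Python implementation, not code from the study; the function name and the use of words as a stand-in for morphemes are simplifying assumptions.

```python
def mean_length_of_utterance(utterances):
    """Mean length of utterance (MLU): total units divided by the
    number of utterances. Standard MLU counts morphemes; here each
    whitespace-separated word stands in for one morpheme, a common
    simplification when transcripts lack morphological coding."""
    if not utterances:
        return 0.0
    total_units = sum(len(u.split()) for u in utterances)
    return total_units / len(utterances)

# Three child utterances with 2, 3, and 1 words: MLU = 6 / 3 = 2.0
print(mean_length_of_utterance(["want cookie", "mommy go bye", "no"]))
```

In practice, researchers compute MLU over large samples of transcribed spontaneous speech, so a rising MLU between 30 and 54 months tracks growing grammatical complexity.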

After detailing how child language learning is affected by social interaction, Brooks shifted her focus to adult second language learning and presented a final experiment showing how individual differences in statistical learning ability predict language learning outcomes in adults (Brooks, Kwoka, & Kempe, submitted). In this experiment, English-speaking college students completed three two-hour language learning sessions over a 2-3 week period in which they were exposed to spoken Russian phrases in a question-answer dialog using a computer program. Importantly, during the training phase, participants were provided with the type of immediate feedback that Brooks argues is crucial to language learning: they heard the correct phrases repeated immediately after answering each question. The results show that performance on both language comprehension and production tasks during each of the three sessions is predicted by individual differences in statistical learning ability, as measured by two separate statistical learning tests.

There is much debate about the mechanisms that underlie both child first language acquisition and adult second language acquisition. In this talk, Brooks presented new evidence suggesting that general statistical learning ability plays an important role in language development in both children and adults. Brooks complemented research showing the importance of social interaction for child language learning with new evidence that suggests that what’s important about social interaction is the just-in-time positive feedback. While this work is important for language acquisition research, it also has implications for developing interventions to close reported gaps in child first language acquisition and applications for adult second language learning.

Patrik Svensson, “From Lab to Lounge: Liminal Spaces for Learning”. Commentary by ARC Student Fellow Hamadi Henderson

Access to information has changed drastically in the information age. Originally, knowledge and access to higher learning were limited to a select few, confined to ornate halls and ivory towers. Now, however, they are freely available to the public. Where, in the past, money and prestige were the only keys that granted access, today all that is needed is a low-end tablet and a Wi-Fi connection. This change in the accessibility of knowledge impacts not only who has access but also where there is access. Learning is typically viewed as occurring in designated spaces with formal structures and floor plans. However, this view ignores the learning that occurs in between these formal spaces. For example, in grade schools it is typical for teachers to display student work on boards outside of the classroom. While students may not actually read the displayed work, the intent behind the display highlights the concept of learning in liminal spaces. Students in transition from one formal learning environment to another have the opportunity to learn. But is this more effective? What does it mean for the learner to learn in between these formal environments?

One of the key aspects of liminal spaces is their diminished formality. The relationship between student and teacher in the classroom is very clear: the teacher is placed in a position of power, either standing tall among seated students or seated among the students in an authoritative position. However, this dynamic changes in liminal spaces like the halls between classes, study lounges, or even teachers’ offices. The question arises: what is the pedagogical value of this diminished formality? To answer it, we must first address what primarily drives learning in formal learning environments. Is it the instruction of the teacher or the motivation of the student? If the latter, liminal spaces offer students the opportunity to learn in accordance with their motivations. This would mean that liminal spaces are learner-focused, driven primarily by the learner’s educational needs. But if this conjecture is accepted as true, the subsequent question is whether this effect can be replicated in a non-liminal space. In academic settings there is a tendency to create physical spaces with fixed purposes. There is a computer lab and a biology lab. There is a classroom, a study room, and a thesis room. Would it work to create a “liminal room” where one does all the things one does in a liminal space? To dissect this question, we have to think about the physical elements of a liminal space. This is a difficult concept to discuss simply because the very existence of liminal spaces is predicated on the existence of formal learning environments. But this issue can be addressed, to some degree, by the configuration of the space. If there is a desired set of behaviors for a space, its configuration can elicit those behaviors without overtly specifying them. Additionally, it can simply be a matter of allowing formal spaces the flexibility to be used outside of their designated purposes.

In conclusion, the physical layout of a learning environment can greatly impact how learners engage with the space and learn within it. The questions that remain are which elements of a space are most conducive to learning, and how do we promote their presence in current learning environments? I find this to be a very interesting and important direction for educational research. Answering these questions could have great implications for the use of technology and for increasing equity in education.

Luisa Martín Rojo, “The Impact of the Native-Speaker Model in the Construction of Inequality”. Commentary by ARC Student Fellows Lauren Spradlin and Jennifer Hammano

ARC Distinguished Fellow Luisa Martín Rojo, Professor in Linguistics at the Universidad Autónoma in Madrid, presented her current research project, “The Impact of the Native-Speaker Model in the Construction of Inequality,” on November 12. Her presentation focused on what it means to be a ‘native’ speaker of a language and the privileges that come with it, and conversely, how inequality is structured, legitimized, and propagated through prejudices relating to notions of who is a ‘native speaker’ of a language.

Native speakers of a language are typically defined as those who have grown up hearing and speaking that language from birth, though scholars have begun to challenge both the validity of this definition and the concept of ‘native speaker’ altogether. As Martín Rojo expressed, native speakers of a given language are usually considered (by linguists and speech communities alike) to be the models and the authorities on that language. In line with this notion, there is an expectation that all speakers of a language should strive to sound like a local native speaker. Any language users who choose to use structures, words, or sounds that do not conform to the prestige variety as spoken by a native speaker are perceived as ‘non-authentic’ speakers of the language, and are policed by authentic speakers accordingly. Speakers whose language practices are non-nativelike are subjected to linguistic shame, which carries social and economic consequences.

In order to demonstrate how power is exerted through language, language policing, linguistic surveillance, and notions of nativeness, Martín Rojo conducted a study of university students in Spain who had migrated at a young age from Latin America. She interviewed the students about their experiences as subjects of linguistic surveillance, with specific reference to use of /θ/ vs. /s/. /θ/ (representing a th-sound) is not used in Latin American Spanish, but is used in the area of Spain where the subjects lived. This is an interesting twist in a study exploring the status of ‘native speaker’: the speakers who took part in the study were truly native speakers of Spanish, in that they had been speaking Spanish from birth. Yet their language practices were still ‘othered’ by speakers of the prestige variety of Spanish spoken in Northern Spain.

Patrick Simon, “Lighter Than Blood: Ethnic Enumeration in the Era of Equality Policies”. Commentary by ARC Student Fellows Siqi Tu and Erik Wallenberg

Patrick Simon’s presentation “Lighter than Blood: Ethnic Enumeration in the Era of Equality Politics” describes a research project looking at how states acknowledge and track racial and ethnic diversity. The title of the talk, “lighter than blood,” references Tukufu Zuberi’s famous book Thicker than Blood: How Racial Statistics Lie. Simon is looking at the globalization of racial and ethnic politics in the context of equality policies like affirmative action. His concerns include the paradox of the re-creation of racial categories in the practice of using racial categories to track racial disparities and in the production of statistics. Simon looks at how people react to racial classification in census surveys in a multi-country comparison (including the US, Canada, Brazil, Mexico, Colombia, and France, among others).

Historically, there have been many reasons for collecting data on racial, national, and ethnic origins: from the domination, subordination, and segregation of sections of populations, to the attempt to acknowledge diversity and create multiculturalist societies, to political action to right past injustices (for example, guaranteeing voting rights or affirmative action). Beyond these fraught uses of data on race and ethnicity, there are problems in its collection as well. This is not just the threat of the crude essentialization of scientific racism, but also the imposition of ethnic identity through the limited choices given in surveys, or the taking away of options for identifying oneself. International human rights and equality agencies like the OHCHR (Office of the High Commissioner for Human Rights) and CERD (Committee on the Elimination of Racial Discrimination) have been asking for more statistical data broken down by race and ethnicity in an era of post-mass migration.

Simon argues that collecting accurate statistics is essential for implementing affirmative action programs, which have been adopted in a growing number of countries. He argues that statistics make the invisible visible, showing where discrimination is occurring.

Simon suggests that scholars should move toward a constructivist approach, assuming that race and ethnicity are indeed subjective and socially constructed concepts. The constructivist turn in ethnic and racial statistics raises a series of epistemological and methodological issues behind ethnic categorization. Currently, states that do not directly collect data on race and ethnicity use other indicators of ethnic diversity (such as language spoken at home or parents’ countries of origin) as a proxy. Other methods of collecting ethnic data include self-declaration, third-party recognition, and group recognition. Each has its own limits given the fluidity of race and ethnicity. The more open-ended the question in the self-declaration method, the more assumptions statisticians must make in the later process of “re-coding” the race and ethnicity categories. As Simon mentioned at the beginning of the talk, statistics are not objective; they only represent a society’s conventional understandings. Moreover, since race and ethnicity are essentially social constructs, methodological issues like shifting identities, multiple identities, and misclassification will constantly emerge. Simon suggests that we do not have to be consistent in creating race and ethnicity categories for each nation, because racism is not consistent. For example, in the US census, “the Hispanic question” has been revised from an extra question outside of the “race” question into part of the “race” question; conflating the Hispanic and race questions avoids misclassification of “non-Hispanic whites.” The perceived purpose of collecting ethnic data has also changed slightly over the years, as the Hispanic population has begun to utilize this data to claim its rights. In Canada, the household survey asks about the interviewee’s ancestors and whether they belong to a loosely defined “visible minority” group. In the UK, the census asks for the interviewee’s ethnic group. In Brazil, the census directly asks the interviewee about his or her skin color. All these cases demonstrate that there is no standardization of ethnic enumeration; each is a pragmatic description of the current situation of the nation. In France, the census bureau is not allowed to collect race and ethnicity data because the French constitution supposedly treats its citizens “without distinction.” However, Simon argues that this is hypocritical, since discrimination toward minority groups does exist and being colorblind will not make the situation better.

Simon’s presentation has shown that the categorization process is a dialectic one involving constant negotiation around the epistemological understanding of race and ethnicity. We are looking forward to more findings from Simon’s new project on the globalization of racial and ethnic politics in the context of equality policies (POLRACE).

Amy Chazkel, “The Nocturnal Lives of a Nineteenth Century Brazilian City”. Commentary by ARC Student Fellow Emily B. Campbell

On November 5th, ARC Distinguished Fellow Amy Chazkel, Professor of History at The Graduate Center, CUNY, and Queens College, presented her current research, “The Nocturnal Lives of a Nineteenth Century Brazilian City”. Chazkel offered a detailed portrait of nighttime in Rio de Janeiro and the socio-legal construction of night during the city’s 53-year-long curfew, part of her forthcoming book, tentatively titled Urban Chiaroscuro: Rio de Janeiro and the History of Nightfall. Chazkel opened the talk by challenging the notion of the night as a time of innate danger, and asked the audience instead to see night as a sociolegal construct of control, policing, curfew, and states of siege or exception.

Chazkel emphasized Brazil and Rio de Janeiro as a particularly interesting site, where slavery was not abolished until 1888 and urban modernity and slavery overlapped in profound ways. Drawing on an impressively vast archive, Chazkel used travel letters, period paintings, maps, newspapers, public records on the theatre, arrest records, and police edicts, among other sources, to sketch a portrait of night in Rio at this time and to further discussions of modernity and social control. The control of public space through curfews, Chazkel argues, gave way to the novel articulation of the ‘right to the city’ and freedom of movement in public space post-emancipation. Chazkel’s project has literally been one of pulling history out of the shadows, as no explicit archives or materials on nightfall or curfews exist. Fascinatingly, a footnote explaining that “after dark a stick became a weapon” piqued her curiosity and led to this research.

Chazkel explained that Francisco Teixera de Aragão instituted the 53-year curfew in 1825, after the Constitution of 1824 upheld slavery. The curfew began at 10 PM (9 PM in winter months) and was signaled by the unceasing ringing of church bells for thirty minutes. After the curfew began, slaves found on the street were subject to arrest, corporal punishment (often public whipping), and detention. The curfew was instituted through police edicts, practices, and city ordinances. Curfew violations were classified as “troublesome activity” and a threat to “public tranquility”. The curfew did not apply to “well known persons of integrity” and free white persons, though Chazkel was careful to point out the regular, though arbitrary, enforcement of the curfew, as police decided a person’s social standing and race in the darkness of night. Slaves who carried written permission from their owners were not punished. At some points during the 53 years, curfew violations accounted for up to one-fourth of all arrests.

Restrictions on movement through the curfew did not impair the economy; rather, the curfew served as a means of labor discipline, of class differentiation in the use of public spaces, and of the post-colonial distinction between citizen and non-citizen. Most people did not have clocks of their own, though life was structured by time, with a balloon visible throughout the city released at noon and church bells marking the start of curfew.

Chazkel also profiled the burgeoning demand for public illumination and the growth of theatres, marking a shift towards a culture of night leisure and urban entertainment. Of special importance was the theatre Alcazar Lyrique, which was celebrated as having changed notions of taste and the culture of leisure, and which contributed to the growing acceptance of public drinking alongside the growth in theater attendance. By the 1870s the curfew had become more difficult to enforce, and it was dropped in 1888 with the end of slavery. The curfew can be seen as the beginning of modern policing, with its eventual end followed by the growth of vagrancy law.

Reflecting on Chazkel’s work, one is both enticed by the rendering of the past she evokes and compelled to reflect on her broader questions of the right to use public space, projects of social control, and times and states of exception. Who has a right to the city is as pertinent a question as ever in our contemporary American moment, looking to the neoliberal dilemma of private-public space exemplified during the occupation of lower Manhattan’s Zuccotti Park by Occupy Wall Street in 2011 and to public debates around racialized policing practices, most recently decried by the Black Lives Matter movement. The day/night distinction in the use, conception, and control of public space persists, and historical work such as Chazkel’s offers a burgeoning, necessary illumination.

Suren Pillay, “Equality Citizenship and Difference: Becoming Post-Apartheid.” Commentary by ARC Student Fellows Abigail Kolker, Parfait Kouacou, and Sarah Litvin.

Suren Pillay’s talk, “Equality Citizenship and Difference: Becoming Post-Apartheid,” is concerned with the legacies of apartheid, including justice and reparations claims, as well as inequalities and citizenship. In the post-apartheid period, South Africa has tried to transcend its history of violence and inequality by focusing on protecting equal citizenship and upholding human rights. Yet, in his lecture, Pillay underscored the continued legacy of ethnic tension in post-apartheid South Africa by exploring two recent instances of violence: the Marikana mine massacre and a spate of xenophobic violence against foreign nationals from other African countries.

While many scholars consider the “wrong” of apartheid in economic terms (as the securing of cheap labor for the gold and diamond mines central to the South African economy at the end of the 20th century), or in racial terms (as a systematized regime of racial discrimination that distributed social, political, and economic resources based on race), Pillay introduces a third, often less prominent avenue for thinking about apartheid discrimination: classification and distinctions based on ethnicity.

The 1950 Population Registry Act designated eight racial categories for South Africans. The indigenous, black South African population did not fall under any of these. Instead, this law categorized and identified them according to ethnicity. Under the Bantustan policy in apartheid South Africa, these people–80% of the population–became, in Pillay’s words, “foreigners” living in “nominally independent states.” These Bantustans or “homelands” were controlled by local chieftains. In 1994, when apartheid rule ended, the racial system of power was overthrown, but the ethnic distribution of power, in the form of zones adjudicated by chieftains following customary law, remained. Of South Africa’s population of about 50 million people, 16.5 million live in these zones today, subject to chieftains and customary laws that discriminate based on ethnicity. Two maps, one showing the Bantustans and the second showing the contemporary regions that operate under customary law, reveal how little these loci of authority have changed.

The effects of the country’s ethnicity-based power structure are significant. What’s at stake, Pillay says, is, “Who constitutes the community? Who speaks on behalf of community? and Who is excluded from community?” These issues are practical and philosophical, economic and political. The large platinum mines such as Marikana, elaborated upon below, have attracted major capital investments–and migrant workers–into areas that operate under customary law. This has led to disputes about the relationship between revenues, private mining houses, and chieftains’ authority, and has raised questions about citizenship and equality for migrant workers, who often live in “shack areas” near the mines and can live in these areas for generations without accumulating any rights. The country has seen political splits along ethnic lines as more and more money is at stake through mining operations in regions under customary law.

The two contemporary examples that Pillay uses to show the effects of this are the Marikana massacre and contemporary xenophobic violence. The Marikana massacre occurred in August 2012, when 34 striking mine workers were gunned down by a special unit of the South African police. This was the largest instance of police killing of civilians in South Africa since 1960, and it rightfully garnered much international attention. The event revealed the complexity of many contemporary issues in post-apartheid South Africa. It shows, for instance, that the Marikana massacre should not be thought of only through the lens of a labor dispute, but also as an example of how the ethnic distribution of power allows the state to abdicate its responsibility in instances of violence. The second example Pillay employs is the discrimination and violence that foreign nationals have recently faced in South Africa. Some of the violence manifests in the quotidian, such as dealings with immigration laws and negative treatment by the police, while other instances are more dramatic, including anti-immigrant riots and the looting of foreigners’ stores. Unfortunately, extreme xenophobic violence has been on the rise; especially notable is the recent spike in physical assaults on and murders of immigrants.

Most of the literature on the recent wave of xenophobia claims it all started in 1994, but violence against the outsider in fact has a long history in South Africa; the only difference is that the figure of the outsider was mobilized differently in the past. The figure that bridges historical and contemporary South Africa, Pillay argues, is the archetype of the migrant. In this particular context, ‘migrant’ can be defined in many ways. The “migrant” or “outsider” is a political subject, so it is not just a foreign national but also a foreigner from another province or another state. When migrant labor is recruited for deep mining projects, as is often the case, the workers are usually from another town or province. Today, the migrant can be represented by either the migrant mine worker or the foreign national residing in South Africa.

Pillay claims that South Africa has yet to reckon with its system of ethnicity-based inequality, a system that dates all the way back to British colonial rule in the 1890s and was folded into apartheid South Africa. While empowering local chieftains was initially a useful strategy for the British to consolidate control, today it poses a threat to the equality and human rights programs of the South African government. He argues that we must see “colonialism as something that has purchase in the 20th century, not just history,” and concludes that this research demands a reassessment of “how, whether, and if we are becoming post apartheid.”

Pillay’s presentation is the most recent in ARC’s series of lectures on inequality in the age of globalization. ARC fellows have explored this topic from a variety of historical, political, and economic perspectives. For example, Richard Drayton was interested in the origins of modern inequality and how inequality is reconstituted across temporal and geographic expanses. Naomi Murakawa also examined the reproduction of inequality, showing how reforms, such as U.S. police reforms, that intend to correct racial injustice can unintentionally intensify it. Pillay, by contrast, provides a case-study approach to highlight how one particular system of ethnic inequality, the South African case, prevents progress toward programs intended to promote equality. What is especially interesting about Pillay’s case is that it calls into question the particularities of how equality is achieved: often, the quest for justice and empowerment of one vulnerable group can come at the expense of another. As these two instances in recent South African history demonstrate, the righteous empowerment of indigenous populations through the official sanctioning of customary law can have negative effects on another vulnerable population, i.e., migrants. His talk added richness and depth to ARC’s continuing conversation about how to understand and confront inequality in the age of globalization.

David Howell, “Lousy Jobs in the Rich World: What happened to shared growth?” Commentary by ARC Student Fellows Sarah Kostecki and Orkideh Gharehgozil.

David Howell is a professor of Economics and Urban Policy at The New School. His recent research, “Lousy Jobs in the Rich World: What happened to shared growth?”, focuses on economic growth and how workers have benefited from it. His research is driven by the following puzzle: if economic growth and productivity have been increasing over the last three decades during the era of neoliberal reform, why haven’t the effects of this growth benefited the majority of workers? And why should maximizing growth be the priority?

From the orthodox economic point of view, inequality is explained by globalization and outsourcing. The belief is that Skill-Biased Technological Change (SBTC) contributes to unequal income shares (Howell, ARC talk). In this setup, however, there is no emphasis on institutions.

To challenge this orthodox economic point of view, Professor Howell’s research emphasizes the effects of institutions on the evolution of lousy jobs across the United States and four additional rich countries since the early 1980s, answering two interrelated questions. First, how have GDP growth, productivity growth, and decent jobs moved over the last few decades in each of the five countries? Second, what institutional story can be told? Howell hopes to show that the decline in bargaining power for employed workers and institutional factors (rules, laws, organizational structures, policies, and social norms) are to blame for the decline in decent jobs, rather than the traditional economic explanations.

To carry out this research, Professor Howell first created two new low-wage threshold measures and compared them to a more conventional low-wage measure commonly used in socio-economic research (two-thirds of the median wage of full-time workers). Howell’s new thresholds are defined as two-thirds of the mean wage of the bottom 90% of full-time workers, and two-thirds of the mean wage of the bottom 90% of full-time prime-aged earners (35-59 years of age). Howell then uses the two new low-wage measures to create the lousy-jobs measure, defined as workers earning low wages plus those working part-time involuntarily. He then showed descriptive results using these measures for the United States.
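The threshold arithmetic above can be sketched in a few lines of Python. This is a minimal illustration of the two formulas only, using made-up hourly wages rather than any of Howell’s actual data; the function names and sample figures are hypothetical.

```python
import statistics

def conventional_threshold(wages):
    # Conventional low-wage cutoff: two-thirds of the median wage
    # of full-time workers.
    return (2 / 3) * statistics.median(wages)

def bottom90_mean_threshold(wages):
    # Howell-style alternative: two-thirds of the mean wage of the
    # bottom 90% of workers (drop the top decile, then take the mean).
    ordered = sorted(wages)
    keep = len(ordered) - len(ordered) // 10  # number of workers in bottom 90%
    return (2 / 3) * statistics.mean(ordered[:keep])

# Hypothetical hourly wages, for illustration only.
sample = [9, 10, 11, 12, 14, 15, 18, 22, 30, 60]
print(round(conventional_threshold(sample), 2))
print(round(bottom90_mean_threshold(sample), 2))
```

With a right-skewed wage distribution like the sample above, the bottom-90% mean sits above the median even after the top earners are excluded, so the alternative cutoff comes out higher than the conventional one, which is the direction of Howell’s finding.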

Interestingly, the alternative low-wage measures show that the cutoffs for low-wage work over the past three decades should actually be much higher than those calculated using the more conventional measure employed by the OECD, IMF, and others. For 2014, for example, Howell’s new cutoff defined as two-thirds of the mean for the bottom 90% of full-time prime-aged earners shows that the low-wage threshold in the United States should be nearly 16 dollars an hour, versus the 12-dollar threshold obtained using the conventional measure. Overall, these findings show that a much higher share of working men and women have been earning low wages over the last three decades than previous studies have suggested.

Utilizing the lousy-jobs measure, Professor Howell highlights several interesting findings. The first is a tale of convergence. In the United States, women have typically held a higher share of lousy jobs since 1979, but men are catching up. Howell shows, for example, that around 50% of women held lousy jobs in 1979, compared to 45% in 2014, while around 20% of men held lousy jobs in 1979, compared to 35% in 2014. Moreover, the share of prime-aged men with lousy jobs has risen steadily over the last three decades to converge with that of prime-aged women, especially among those without a college degree. In 1979 around 15% of less-educated prime-aged men held lousy jobs, compared with 32% in 2014; nearly 50% of less-educated prime-aged women held lousy jobs in 1979 and, after declining slightly, returned to the same level in 2014.

Perhaps most strikingly, Professor Howell shows that in 2014, men and women in lousy jobs with a high education had median wages similar to those with a low education. Wages for men and women ages 18-34 with a high education were around 11 USD an hour, while wages for men with a low education were around 10 USD an hour, and a little more than 9 USD an hour for women. For prime-aged workers the numbers are even closer: median wages for men and women with a high education were around 11 USD an hour, while median wages for men with a low education were just under 11 USD, and around 10.50 USD for women.

Professor Howell’s innovative study should continue to shed light on job quality and work precarity. His findings provide empirical evidence that in an era of rising inequality and rising growth, the bounties of this growth did not translate into more high-paying jobs. His preliminary findings also show that no one is safe: men and women, and especially those with a low education, are not shielded from being stuck in a lousy job. He also shows that once in a lousy job, wages are similar for men and women with both high and low education, especially for prime-aged workers.

The next step will be looking at other facets of job quality, including flexible schedules (when workers do not know their schedule from one week to the next) and a lack of, or inadequate access to, social benefits not tied to employment. Targeting such issues will help lead researchers toward more concrete evidence that declining bargaining power and institutional changes are to blame for the rise in lousy jobs.

To conclude, it is our hope that in the case studies Professor Howell will look not just at institutional changes, but also at the politics surrounding the institutional changes that are correlated with the rise in lousy jobs, especially in the United States. If social science research is to impact policy and policy change, researchers analyzing the United States need to ask themselves what the most effective way to do so is in the hyper-partisan political climate we currently live in, where money rules and issues facing the general public are often ignored.

Richard Drayton, European Empires and the Origins of Modern Inequality. Commentary by ARC Student Fellow Gordon R. Barnes Jr.

The Longue Durée and the Origins of Social Inequality.

In his recent talk, entitled “European Empires and the Origins of Modern Inequality,” Richard Drayton situates contemporary socio-economic inequality as part and parcel of a world-historic process originating in Western Eurasia during the Neolithic era. He argues that by examining the advent of sedentary agricultural societies in what today is the Middle East, as well as portions of Africa, Asia, and, most importantly for Drayton’s research, Europe, we can trace the social origins of contemporary inequality. The rise of notions of private property, the increased regimentation of slave labor, and the concretization of the pater familias as part of socio-economic relations are all part of this lengthy process. The wide-ranging talk covered various world events and socio-economic processes from the Neolithic period up until the U.S.-led invasions of Afghanistan and Iraq. In moving away from, and indeed critiquing, the standardized approach to historical inquiry – that of area study or national narrative – Drayton’s research posits that understanding the global is necessary to understand the local, and vice versa.

Breaking the history of inequality into three overarching periods, beginning with the rise of the latifundium and culminating in the current era of collaborative management (neoliberalism being just a moment in the late stage of this third period), Drayton wants us to consider the substantial social continuities that have persisted over time and space. With this global and transnational framework as the point of departure for studying, analyzing, and understanding inequality (historical as well as contemporary manifestations), Drayton focuses on the rise of European imperial systems within the long post-nomadic period of social differentiation and economic segmentation. He rigorously critiques the variety of endogenous explanations for the rise of the West, whilst simultaneously delineating a collaborative network of European imperial structures. The most salient examples offered during the talk referenced Bordeaux merchants and slavers who held insurance with a British firm, and Iberian silver merchants backed, again, by British capital. All this interconnectivity at the more personal level of business relations persisted even through the strife of inter-imperial conflagration.

This seemingly ubiquitous set of inter-imperial relations is what concerns Drayton, as it represents what he terms the “reconstitution of inequality transnationally.” In addition to a historical understanding of inequality, Drayton offers us something to think about relative to recent interrogations of the current state of inequality in the world. Offering two frameworks for understanding contemporary inequality, the first a set of liberal welfare arguments (national) and the second derived from colonial experiences (international), Drayton clearly sides, analytically at least, with the latter. Taking issue with Simon Kuznets’ and, more recently, Thomas Piketty’s work (both of which utilize the first framework), Drayton effectively demonstrates the paucity of reliable data gleaned from uniquely national studies of inequality. Noting that Piketty’s work posits capital as a “thing” rather than as a social relation, Drayton further reinforces the problem of provincialized studies that do not reckon with global processes. Thus, for Drayton, not only as a materialist world historian but as someone critically engaged with the phenomenon of inequality, reckoning with the effects of the longue durée is necessary if we are to understand the multilayered and varied ways in which social and economic inequality has persisted over time, well into the modern epoch.

Naomi Murakawa, The Perils of Police Reform. Commentary by ARC Student Fellow Douaa Sheet

In this talk, part of her broader research on 21st-century policing and the politics of carceral expansion, Naomi Murakawa discusses contemporary developments in the ongoing process of police professionalization. She examines the history of police professionalization through a focus on some of its key discursive terms. The conceptual principle she investigates here is “procedural justice,” the cornerstone of these reforms. Murakawa is interested in how the concept is operationalized: what does “procedural justice” actually look like? She looks at incarceration rates as well as admission rates, with attention to enduring racial stratification. She traces continuities in Obama’s and Johnson’s speeches, in an attempt to identify antecedents of “procedural justice” as well as what is particular to the current reforms. She also examines training modules and policing textbooks, noting the language used, where “listening,” “understanding their perspective,” and “respect” are some of the approaches being pushed.

Murakawa’s argument is that procedural justice is an effort to secure poor people’s consent to their own dispossession and to defuse the potential for collective mobilization. What is being obscured, she argues, is the “substance” of policing: the scale (the number of officers employed and the number of people being arrested), the scope, and the routine baseline violence (as opposed to extreme violence). She asserts that this is not a story of good intentions with unintended consequences. Murakawa’s concern here is the reproduction of inequality through police reforms that, ironically, are often called for in the name of racial fairness. She asks: what political possibilities are enabled, and what critiques are constrained, when we use this kind of vocabulary?

Murakawa’s questions are very timely, particularly given ongoing manifestations of police brutality, most recently in Ferguson, which she also discusses in this talk. The question, however, is how the discourse she traces in books, police guidelines, and political statements translates on the ground. How are officers responding to these texts? What happens in the translation process? The textbooks certainly reveal the meta-discourse; she shows how the textbook language is consistently echoed in political speeches, such as those of the police chief and the president. Murakawa mentions in passing a police officer telling a fresh addition to the team: “forget everything you learned in these books.” Perhaps a look at what happens outside the textbooks, as officers deal with actual situations (conflict within their teams over the levels of brutality deemed necessary, being from the community where they work, members exposed to these new guidelines working alongside older generations, etc.), may help us understand even more comprehensively the ways in which these police reforms gain such force.