- Appendix A: Methods
- Chapter
- University of Michigan Press
- pp. 113-126
Appendix A
Methods
This book sets out to develop new ways of understanding how policy shapes participation in an increasingly devolved, delegated, service-based safety net. I use after-school programs as a case because they reflect all three trends and are an important policy context for low-income families.
Because we know very little about how these programs work on the ground, I use an interpretive qualitative study to develop theory and concepts for this important growing policy context. An interpretive approach seeks to “understand what a thing is by learning what it does, how particular people use it and in particular contexts.”1 I wanted to explore program design in these settings as parents and staff knew it using—as much as possible—their own words.2
This does not mean I approached fieldwork without policy feedback studies in mind. Using the interpretive approach, I could work on data collection and analysis with some “provisional inferences” or hunches informed by previous research, but with a few caveats.3 First, I could not “test” key policy feedback concepts (e.g., program typologies, interpretive and resource effects) with my data. As a small “n” case study with roughly 70 interviews, the data are too limited for hypothesis testing. Moreover, I could not “control” for unobservable factors that may have affected parents’ experiences with programs or their participation outcomes. But carefully controlling for variables was not the aim of this study. As an interpretive study of policy feedback in a new context, I aimed to produce insights grounded in what actually happens in specific settings. Interpretive research explores the tension between what is found in the field and concepts from existing research. To that end, study sites should be authentic and not “controlled” settings.4
For the interpretive approach, theory building happens when the researcher makes sense of the tension between what is discovered in the field and findings from previous studies.5 Throughout this study, I looked for plausible reasons behind puzzles that emerged from my observations and interviews and found I could draw on both existing theory and my empirical findings to search for these explanations.6
Before I describe my research design and how I collected the data, I want to clarify the kinds of “causal” claims I make throughout the book. I do not aim for causality in the quantitative positivist sense, in which I test alternative explanations to arrive at precise causal mechanisms.7 Instead I pursue what proponents of the interpretive approach call “constitutive” causality, whereby I try to understand how participants in a setting explain their own behaviors and events.8 Instead of imposing policy feedback concepts as explanations of parent and staff experiences, I prioritize study participants’ own understanding of program experiences and how they explain their own behaviors.
The Process: Why I Selected Chicago, the Neighborhoods, and Cases
Interpretive research involves theorizing about unexpected observations from the field. Research designs using this approach are flexible and tend to evolve.9 This was certainly the case for this project. When I started, I wanted to explore how nonprofits and public agencies differed in service delivery. I was a graduate student in Ann Arbor at the time and hoped to study organizations in Detroit, but the Great Recession brought about significant economic decline in the Detroit metropolitan area, and nonprofits across the city were closing their doors. I needed to go where there was a vibrant, growing nonprofit community, so I explored Chicago as an option.
Chicago was the ideal setting for this study for a number of reasons. First, nonprofits administered most services targeting families and children. Illinois contracted with large nonprofits to administer child welfare services; Women, Infants, and Children (WIC); the Child Care Subsidy program; and most publicly funded youth programs.10 Second, different types of nonprofits (e.g., large faith-based organizations, small community-based organizations, and national networks of nonprofit service providers) administered these services. I could explore how service delivery differed across different types of organizations. Finally, many of the professional associations supporting these organizations were headquartered in Chicago and held conferences and professional development workshops in the city. I could attend these meetings to recruit organizations into the study.
I compiled a list of multiservice agencies in Chicago that were supported by government funding and reached out to the directors of these agencies. In 2010, I attended conferences and met with a half dozen executive directors to explain the study and visit their organizations. Most were enthusiastic, but only one director gave me access: Anne Jenkins from Progress Development Corp said yes.
Before I ever conducted an interview or jotted a field note, I visited Anne’s organization several times and attended her annual fundraising benefit. At first, I was not sure where to begin or which programs to explore. Like many nonprofits, Progress offered a number of services. The organization ran two homeless shelters, a food pantry, a clothing closet, a small preschool, and an after-school program. I chose the after-school program because it was the organization’s largest and most established program. It was also the program community residents widely referenced—Progress was known for what it did “for the kids.”
After-school programs were also ideal because of their prevalence across nonprofits. I could examine fairly comparable after-school programs across organizations to explore how different types of organizations (faith-based, secular, or public agencies) administer these programs. And government funds sponsored most after-school programs through Title I, the Child Care and Development Block Grant, and the 21st Century Community Learning Center grant. These different grant opportunities introduced more variation to the study—I could compare program features by organization type and by funding source.
How I Accessed Organizations
I did not access all three organizations that participated in this study at once. Case selection and data collection were sequential and emergent in part because it takes time to develop trusting relationships with study participants (Feldman et al., 2004, 35–36). In addition, the interpretive approach requires adjustments in research design as new insights emerge. I learned from one case what to look for when selecting another case and the kinds of questions to ask staff and parents.
I developed relationships with Progress staff by volunteering extensively for eight months. I tutored seventh- and eighth-grade students twice a week and helped staff organize family and community events. I also spent my summer at Progress, chaperoning summer field trips, leading reading groups, and coordinating the program’s summer basketball league. After several months of volunteering, I was ready to start interviewing parents. My consistent presence over the course of the school term smoothed my transition from volunteer to researcher. I started conducting interviews at the beginning of the school term—in September 2012—but my responsibilities within the organization did not end. I started attending staff meetings and trainings. I taught the fifth- and sixth-grade class, taught music to students, and continued to coordinate the program’s basketball leagues.
Throughout my time at Progress, staff stressed the importance of understanding the “Progress” way. They wanted me to accurately capture the program’s aims and their commitment to families. After a while, “getting the story right” involved spending time with staff outside of program hours. Staff members invited me to their homes for dinner to share personal stories about how they ended up at Progress. They were proud of their work and eager to share their stories with a broader audience. This made our partnership relatively easy.
My time at Progress taught me what to look for in other cases. First, I got a glimpse of when, where, and how staff interacted with parents. Parents and staff conversed during parent pick-up, regular phone check-ins, parent involvement opportunities, and program events. And these interactions ranged from formal routine conversations about student behavior to more personal conversations about family life.
Knowing this focused my data collection strategy for the next setting. I refined my interview protocol to include questions about personal conversations with staff and instances when staff went above and beyond to support families. I also asked parents to share longer narratives about how they arrived at each program and how they viewed the program in relation to their families and their communities.
After spending a full school term at Progress, I was in search of another program supported by a different grant. I hoped to find an after-school program in Westfield that was funded primarily by Child Care Subsidy reimbursements. I found a program blocks from Progress, but a labor dispute at the center limited my access. I needed another program in a similar neighborhood. Through my connections with leaders in the nonprofit community, I found an after-school program on the Southside at South End Community Center.
A Note on Neighborhoods
At Progress, neighborhood contexts emerged as an important way staff and parents viewed programs. Three characteristics of neighborhoods seemed fundamental to how staff and parents understood programs. First, the racial makeup of the neighborhood was important to parents: 90 percent of Westfield’s residents are black. When asked to describe the organization, parents commented on how Progress provided opportunities for black children and met the needs of black families. Westfield’s high unemployment rate and deep poverty also shaped how parents and staff described programs. Parents noted the limited job prospects in Westfield and were grateful for the opportunities Progress gave their children—experiences they could not provide given their own hardships.
Staff similarly commented on Westfield’s economic decline and high unemployment, recounting stories of intergenerational poverty among families and parents’ efforts to make ends meet through Westfield’s informal underground economy of bartering, sharing, and favors. Finally, crime was especially salient to staff and parents. Parents expressed concerns about children’s safety and complained about a lack of safe places for children to play. Staff shared stories about the growing influence of gangs in the neighborhood and losing students to gun violence.
Given the importance of neighborhood conditions, I needed to select an after-school program that was not only comparable in programming but also located in a similar neighborhood to Progress’s. I chose the South End Community Center, which was funded primarily by Child Care Subsidy reimbursements, because the surrounding neighborhood—South End—shared some of Westfield’s characteristics. While slightly better off than Westfield in terms of poverty and crime, the neighborhood was about the same size as Westfield and was nearly identical in race and economic opportunities. Table A.1 shows the demographic characteristics of both neighborhoods.
Both neighborhoods are predominantly African American and face high poverty and unemployment rates relative to the rest of Chicago. In 2010, roughly 16 percent of Westfield and 17 percent of South End residents were unemployed relative to 11 percent of Chicagoans. Both neighborhoods experienced deep poverty; nearly 40 percent of Westfield’s residents lived below the poverty line and almost 30 percent of the South End community lived in poverty.
Table A.1. Neighborhood Characteristics

| | Westfield | South End | Chicago |
|---|---|---|---|
| Population | 20,000 | 24,000 | 2,695,598 |
| African American | 90% | 87% | 32.9% |
| Per Capita Income | $14,000 | $19,900 | $27,148 |
| Without High School Diploma | 26% | 18% | 20% |
| Unemployment Rate | 17% | 17% | 11.1% |
| Below Poverty Line | 40% | 28% | 18.7% |
| Homicide Rate | 38 | 31 | 18.6 |

*Source:* Data retrieved from the 2010 U.S. Census. *Note:* Estimates are approximate to de-identify Westfield and South End.
The differences between the two communities should not be glossed over. Westfield’s poverty rate is more than ten percentage points higher than South End’s, and Westfield has a significantly higher homicide rate. But at the time of the study, Westfield was considered one of the most impoverished and violent communities in Chicago. Not many neighborhoods closely matched its crime and poverty rates. And while South End and Westfield are not identical, they are similar along the key characteristics that seemed most salient to study participants: racial composition, unemployment, poverty, and crime.
I approached the South End Community Center in the same way I approached Progress—by volunteering. I volunteered for three months to get a sense of the program’s day-to-day activities. I played with children and assisted with check-in, bus rides, homework help, and chaperoning short trips to and from bathrooms, playgrounds, and the gym. I officially gained access to South End Community Center in January 2013 and conducted interviews and observations from January through December. I focused on staff observations and parent interviews during the school term and conducted most staff interviews over the summer months.
I still needed to add breadth to the study through a public case and one funded by Title I dollars. A public after-school case could provide descriptive insight on whether two nonprofits—as cases of delegated governance—offered distinct program experiences when compared to a public setting. A Title I program could offer variation in funding sources—I could compare how Title I funds influenced program design relative to the Child Care Subsidy program or the 21st Century grant.
Two years following my initial fieldwork, I went back to Westfield to recruit Jackson Elementary School staff into the study. I chose Jackson because it used Title I funds to provide an after-school program for struggling students. I also selected Jackson because I had already developed relationships with its staff and parents through my work with Progress. Progress’s basketball league hosted games in Jackson’s gym; as coordinator of the league, I knew the principal well and had met many of the parents and students. Because I had already cultivated these relationships, I easily gained access to the school’s after-school program.
At the time of the second round of data collection in the spring of 2015 and the spring of 2016, Jackson was relatively stable compared to other schools in Westfield and South End. Between 2013 and 2015, the city of Chicago consolidated several schools on the Southside and Westside, which led to an influx of students and teachers into remaining or “receiving” schools. The South End community was especially affected by two school closures, and the remaining schools received new students and teachers. My goal was not to explore how parents were grappling with school closures—although that did emerge as an important issue for parents in interviews. I wanted to talk to parents who had stronger long-term ties to an elementary school. While Jackson had faced the risk of closure in the past, the school was not affected by Chicago Public Schools’ closure and consolidation plan. I could interview staff who had long-term experiences with the school and parents who had years of experience with Jackson’s teachers and programs.
How I Presented Myself
When I began interviews, parents knew me as a volunteer or staff member who looked after, taught, and tutored their children. Once I informed them that I was a graduate student interested in their program experiences, they expressed interest in the study. Here I think my identity as a young black woman was an asset. I shared the same racial and gender identity of many study participants, and I sensed respondents were willing to share detailed stories about their lives and these programs because they viewed my success as their own.11 Parents were especially supportive of my pursuit of higher education. My recruitment efforts were usually met with reassuring grins and words of encouragement, such as “Oh this for school? You go girl!” and “Get that degree!”
Staff members expressed similar enthusiasm. They often asked questions about my college experience and shared their own aspirations for higher education. This helped my study immensely. Staff and parents were supportive and—more importantly—willing study participants.
Observations and Interviews
Instead of focusing on what students did in these settings, I observed what staff and parents did. I started my fieldwork by observing and documenting general activities to get a sense of the program’s pace, key transitions, and staff routines. The excerpts at the beginning of the book reflect the kinds of observations I conducted at each program. But as analysis progressed, I narrowed the focus of observations to staff-parent interactions at parent drop-off or pick-up, program events, and parent meetings or activities. I followed Emerson’s (1994) guidelines on field notes, starting first with jottings and then expanding these jottings later.
In the beginning, I had trouble balancing my time as a volunteer and my role as a researcher: supporting staff while documenting activity was daunting. To manage this, I designated days for observations and days when I would focus on my volunteer duties. I also shadowed specific staff members. While I was not given access to staff meetings at Jackson, I did take notes on most staff meetings that I attended at South End and Progress.
As my observations narrowed to parents and staff interactions, jotting notes became much easier. These interactions usually occurred at specific times—at the very beginning and end of programs and during family and community events. At all three programs, I would arrive earlier on-site to clean, summarize, and expand on jottings from the previous day.
Observations helped me decide whom to talk to and what to ask. I learned which parents had long-term relationships with staff at each program, and I could identify who was relatively new to the program. I also learned about parents’ interactions—I could distinguish parents who visited the program most frequently from those who were less involved. I could also distinguish parents who had more casual or personal interactions with staff from those who had only brief exchanges with staff as childcare professionals. Observing this behavior also helped refine my recruiting efforts—I sought out deeply invested program veterans and newer parents to capture a range of experiences.
I initially recruited parents through flyers that were sent home, but I found that personally inviting parents to participate was a more effective recruitment strategy.
At Progress, I approached parents at program dismissal and events hosted by the after-school program. At South End and Jackson, I took a similar approach. I started with a wide distribution of flyers. As I completed interviews with those who responded to the flyers, I moved toward more targeted invitations to recruit parents who had a range of experiences with the program. Table A.2 describes parent characteristics.
The parents I interviewed across these programs did not differ much in terms of age and race. All of the parents were low-income, African American parents in their early 30s. Single-parent households were common across all three groups but most prevalent among Jackson parents. Family size varied and the average age of children varied slightly across programs. On average, Progress parents had larger families and older children when compared to South End’s and Jackson’s parents. Parents’ education levels differed marginally between Progress and South End parents. Both groups of parents had at least a high school diploma and, in rare instances, some college experience. Jackson Elementary parents interviewed were the least educated, with an average level of education just shy of a high school diploma.
Parents across all three programs experienced poverty. With household sizes ranging from four to five, South End’s and Progress’s average household income neared the federal poverty line.12 At the time of the interviews, Jackson parents experienced deep poverty. Several parents reported long stints of unemployment in the past year and noted the difficulty of finding work. Nearly all of the parents received some sort of public assistance ranging from Medicaid to public housing.13
What about Selection?
In some ways, parent characteristics differed significantly; marriage was more prominent among the Progress and South End parents I interviewed and Jackson’s parents were the least educated and experienced the greatest economic hardship. One might be concerned that these differences influenced how parents selected these programs, which would weaken the central argument of this book. There could be something distinct about either set of parents that motivated their choice of programs, and these characteristics might explain program experiences and participation outcomes. For example, South End parents might be especially active or civic minded and opt to enroll their children into the program because of its parental involvement opportunities, while Jackson parents prefer a program with fewer parental commitments or involvement opportunities.
Table A.2. Parent Characteristics

| | Progress Youth Development (n = 15) | South End Community Center (n = 15) | Jackson Elementary (n = 17) |
|---|---|---|---|
| Mean Age | 33 | 30 | 33 |
| Race | 100% African American | 100% African American | 100% African American |
| Marital Statusᵃ | 60% Single | 71% Single | 100% Single |
| Number of Children (mean) | 4 | 3 | 3 |
| Age of Children (mean) | 12 | 9 | 9 |
| Education Level (mean years) | 13.8 | 14.5 | 11.3 |
| Income | $26,000 | $28,000 | $8,000 |
| Public Assistance | 86% | 86% | 80% |

ᵃ“Single” refers to both never married and divorced households.
Selection bias concerns—while important—take on a different meaning in an interpretive study. The objective is not to control differences but to understand how these differences matter. For example, if each group of parents differs significantly in the selection process, I can explore why one group of parents prefers one type of program more than another or identify the barriers to accessing more engaging participatory programs. These kinds of insights could have direct policy implications on program design and access. The common themes that emerge across these distinct groups of parents might suggest broader aspects of program experiences that can be applied to other similar settings. In both cases, these kinds of insights serve as the foundations for new concepts and hypotheses that can be tested in larger studies.
Even with these theory-building objectives in mind, I took some measures to address selection bias. First, I looked to the literature on how parents select childcare providers to guide parent interview questions and analysis. Studies suggest that parents select providers whom they deem trustworthy and often choose providers that are conveniently located and provide care during hours that best fit work schedules.14 I asked parents to share how and why they chose to enroll their child into these programs. Significant variation in these responses would suggest some selection bias.
But interviews suggest that parents used the same criteria to select a program. Parents chose programs by the kinds of academic support they provided students and whether programs were located near work or home. Forty-five of the 48 parents interviewed emphasized the quality of homework help and tutoring as factors they considered when selecting providers. Parents enrolled their children in these after-school programs with the expectation that these programs would improve their children’s academic performance.
Parents also heavily weighed the proximity of programs to home, school, and their workplaces. Sixty percent of parents reported the program’s convenient location as influencing their choice of providers. The importance of proximity emerged when I asked parents to describe other options for after-school care. Most responded that there weren’t any—even if other programs were located in the neighborhood. Parents defined convenient locations as programs that were within safe walking distance of schools and home. They distinguished safe blocks from dangerous ones, and—unless the program provided transportation home—parents chose programs that were within a safe walking distance.
But even if parents selected programs for the same reason, selection bias within programs might complicate the inferences I draw about program design and participation. It may be the case that those who serve on parent advisory boards or volunteer would have done so in another setting, with or without the program. Moreover, these parents would likely be civically and politically engaged without the help of these organizations. In the same way, parents who are passively tied to each organization could also have similar levels of inactivity in political realms. This study does not observe the counterfactual for either kind of parent. I do not observe what parents would do without the program. Instead I adopt the conventional perspective on political mobilization to understand how the design of these programs matters. Rosenstone and Hansen (1993) describe mobilization as the “process by which candidates, parties, activists, and groups induce other people to participate . . . one of these actors has mobilized somebody when it has done something to increase the likelihood of her participation” (26).
With that in mind, the narratives presented in this project can be interpreted as evidence of how these programs targeting low-income families can influence the likelihood of parents participating in civic and political activities. In this sense, this book is a story about how the features of services and the organizations that deliver them intervene in the mobilization process. These programs may not determine parents’ predispositions to political activity, but this study shows key features of policy that organizations can build upon.
Staff Participation
I recruited staff at all three programs by sharing information about the study at staff meetings and circulating flyers. Staff members who were interested in participating contacted me directly. I conducted the interviews in a place suggested by the staff member—which sometimes included a private office on-site or a coffee shop. Table A.3 describes staff characteristics at each program.
Of the three programs, Jackson Elementary’s after-school staff was the oldest, most educated, and most experienced group. On average, staff members were in their mid-thirties and most of the staff interviewed (71.4 percent) had graduate degrees—a master’s in education or a master’s in education leadership. Staff—on average—had more than nine years of professional experience either teaching or supporting the after-school program. At Jackson Elementary, all of the staff members interviewed worked in full-time positions for the school. Along with extensive education and professional experience, Jackson’s after-school staff members were also the most diverse of the three groups along gender and racial lines.
Distinct from Jackson’s veteran educators, the staff members at Progress were an eclectic group of college interns, recent college graduates, licensed teachers, and nonprofit professionals. While every staff member had earned a college degree, three held advanced degrees: two in nonprofit management and one in education. The staff was less racially diverse than Jackson’s after-school staff and had, on average, three years less experience.
The staff at South End was predominantly African American, and most were residents of the South End community. All but one were part-time employees and younger than age 25. With the exception of one part-time employee, every youth worker was once a participant in the youth program as a child and became a program employee during high school and college. All of the part-time youth workers had at least three years of program experience. Levels of education varied among the South End staff; most of the part-time employees had some college experience or were currently completing degrees.
Table A.3. Staff Characteristics

| | Jackson Elementary After-school Program (n = 7) | Progress Youth Development Corp After-school Program (n = 9) | South End Community Center After-school Program (n = 7) |
|---|---|---|---|
| Age | 35 | 31 | 27 |
| Race | | | |
| Black | 3 (42.8%) | 2 (22.2%) | 7 (100%) |
| White | 2 (28.6%) | 6 (66.6%) | |
| Hispanic | 1 (14.3%) | 1 (11.1%) | |
| Asian | 1 (14.3%) | | |
| Gender | | | |
| Male | 2 (28.6%) | 2 (22.2%) | 2 (28.6%) |
| Female | 5 (71.4%) | 7 (77.8%) | 5 (71.4%) |
| Education | | | |
| High School | | | 2 (28.6%) |
| Some College | 1 (14.3%) | | 3 (42.8%) |
| Bachelor’s | 1 (14.3%) | 6 (66.6%) | 1 (14.3%) |
| Graduate Degree | 5 (71.4%) | 3 (33.3%) | 1 (14.3%) |
| Tenure (years) | 9.42 | 6.5 | 6.6 |
| Full Time | 7 (100%) | 9 (100%) | 1 (14.3%) |
| Part Time | 0 | 0 | 6 (85.7%) |
The Interviews
I asked every parent and staff member the same set of questions, but I tailored these questions for each site based on my participant observations. In the case of Jackson’s after-school program, I noticed that parents sought out one particular staff member. When I interviewed this staff member, I asked her to describe how she interacted with parents, how her close relationships with parents differed from more distant connections, and how these relationships changed over time.
I took a similar approach with parent interviews. At South End, I frequently observed a group of parents who had more casual social interactions with the staff. When I interviewed these parents, I reframed general questions about staff interactions to include behavior I observed. I asked questions such as “I noticed you tend to stick around during parent pick-up to chat with this staff member. How often do you stick around to chat with staff? Who do you usually talk to? What kinds of things do you tend to talk about?” I assured parents and staff that their identities and our conversations were confidential.
Analysis
My efforts to understand the meaning of program experiences for both staff and parents involved a mix of deductive and inductive analyses. My approach was deductive because previous research on policy feedback, policy implementation, and nonprofits informed the analysis. It was inductive because it was directed by a close read of interview data that allowed for emergent concepts to inform the analytic story.15
To aid analysis, I entered all transcripts, memos, and field notes into a qualitative software package, NVivo 11. I initially categorized responses by interview question. I then created line-by-line codes for interview transcripts and field notes to identify crucial aspects of program experiences from staff and parents’ perspectives. I grouped these narrow, detailed codes into broader analytical categories by different levels of analysis (e.g., organization practices and policy).
I created these broader analytical categories through an “iterative comparison” process whereby I treated every respondent as a case and compared responses.16 For example, parents uniformly highlighted relationships with staff (whether professional or personal) as an important part of their experiences. To explore how these relationships emerged, I categorized parents who had close ties with staff and those who did not. I probed the data for differences and similarities between parents’ responses by asking questions such as “What kinds of interactions do parents with close ties to staff have compared to those who don’t? Are there differences in how frequently these parents engage staff? Do these parents look different demographically?” I also explored parents’ own explanations about how and why they interacted with staff. By comparing and contrasting parent responses, I could tease out parents’ motives and preferences for relationships with staff.
I captured interview responses in data matrices that connected staff and parent characteristics to emergent themes. These matrices helped me identify patterns in experiences across study participants (Ryan and Bernard 2000). As I collected and analyzed data, I wrote memos that described my impressions of the data and hunches about emerging theory. These memos were the basis of most of the chapters in this book.
1. Peregrine Schwartz-Shea and Dvora Yanow, Interpretive research design: Concepts and processes (New York: Routledge, 2013), 23.
5. Markus Haverland and Dvora Yanow, “A hitchhiker’s guide to the public administration research universe: Surviving conversations on methodologies and methods,” Public Administration Review 72, no. 3 (2012): 401–8.
6. Haverland and Yanow, “A hitchhiker’s guide to the public administration research.”
7. Schwartz-Shea and Yanow, Interpretive research design, 52.
10. Illinois contracts the delivery of the Child Care Subsidy to nonprofit organizations called childcare resource and referral agencies. The city of Chicago contracts WIC administration to Catholic Charities. Various child welfare services are delivered through a set of larger nonprofits.
11. Michael C. Dawson, Behind the mule: Race and class in African-American politics (Princeton: Princeton University Press, 1995).
12. The federal poverty line in 2012 was $23,050 for a family of four and $27,010 for a family of five. See Federal Register 77, no. 17 (January 26, 2012): 4034–35.
13. Public assistance programs included any means-tested program such as SNAP, Public Housing Section 8, WIC, TANF, Medicaid, and the Child Care Subsidy.
14. Heather Sandstrom and Ajay Chaudry, “‘You have to choose your child care to fit your work’: Child care decision-making among low-income working families,” Journal of Children and Poverty 18, no. 2 (2012): 89–119. See also Kim Jinseok and Maryah Stella Fram, “Profiles of choice: Parents’ patterns of priority in child care decision-making,” Early Childhood Research Quarterly 24, no. 1 (2009): 77–91.
15. Haverland and Yanow, “A hitchhiker’s guide.”
16. Barney G. Glaser and Anselm Strauss, The discovery of grounded theory: Strategies for qualitative research (Chicago: Aldine Publishing, 1967).