The second wave of the NSYR longitudinal telephone survey was designed as a re-interview of all Wave 1 youth survey respondents. Parents of the youth respondents were not re-interviewed. At the time of this second survey the respondents were between 16 and 21 years old. Like the Wave 1 survey, the Wave 2 survey was conducted by telephone using a Computer Assisted Telephone Interviewing (CATI) system. The survey was conducted from June 9, 2005 to November 24, 2005. For this second wave of the survey, we conducted interviews only in English. Four youth respondents did not participate in the Wave 2 interview because they were unable to understand or speak English. We did translate our pre-survey mailing into Spanish for respondents we knew to have Spanish-speaking parents or guardians. Additionally, a call center staff member was available to conduct the verbal parental consent in Spanish. The Wave 2 telephone survey questionnaire covers many of the same topics as the Wave 1 questionnaire. Many of the questions are identical so that change can be measured precisely. However, the Wave 2 questionnaire was re-designed to take into account changes in the lives of the respondents as they began to enter young adulthood. The Wave 2 survey includes new questions pertaining to behaviors occurring during the transition to adulthood, such as non-marital cohabitation, educational and career aspirations, pregnancy, and marriage.
Many variable names have been truncated to allow for downloading of the data set as an SPSS portable file. Original variable names are shown in parentheses at the beginning of each variable description.
- Data File
- Cases: 2,604
Weight Variable: RWEIGHT, NWEIGHT
- The longitudinal weights, RWEIGHT (rweight2_w2) and NWEIGHT (nweight2_w2), have been calculated for use when analyzing data from both waves of the NSYR survey (excluding data from the Jewish oversample). RWEIGHT (rweight2_w2) is the raw longitudinal weight; NWEIGHT (nweight2_w2) is the normalized version of RWEIGHT. We recommend the use of raw weights when software developed for the analysis of survey data, e.g., Stata or SAS, is used for estimation. The only exception is when software documentation specifically instructs users to normalize the weights before estimation. It is the data user’s responsibility to determine whether raw or normalized weights should be used in an analysis.
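The practical difference between the two weights can be illustrated with a short sketch (a hypothetical illustration: the data values below are made up, and only the variable roles mirror RWEIGHT and NWEIGHT). Normalizing the raw weights to a mean of 1 leaves weighted point estimates unchanged; it only rescales the sum of the weights.

```python
import numpy as np

# Hypothetical illustration of raw vs. normalized longitudinal weights.
# The values are made up; rweight plays the role of RWEIGHT and
# nweight the role of NWEIGHT (normalized to a mean of 1).
rng = np.random.default_rng(0)
rweight = rng.uniform(0.5, 3.0, size=10)        # raw weights
nweight = rweight / rweight.mean()              # normalized weights

y = rng.integers(0, 2, size=10).astype(float)   # a made-up 0/1 outcome

# Weighted point estimates are identical under either weight;
# only the sum of the weights (the apparent sample size) differs.
est_raw = np.average(y, weights=rweight)
est_norm = np.average(y, weights=nweight)
print(round(est_raw, 6) == round(est_norm, 6))  # True
print(round(nweight.mean(), 6))                 # 1.0
```

This is why the choice matters mainly for software that uses the sum of the weights as the sample size when computing standard errors; survey-aware estimators such as those in Stata or SAS handle raw weights correctly.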
- Data Collection
- Date Collected: June 9, 2005 through November 24, 2005
- Original Survey (Instrument)
- Original NSYR Survey
- Funded By
- The Lilly Endowment, Inc.
- Collection Procedures
- Wave 2 of the NSYR telephone survey was fielded from June 9, 2005 through November 24, 2005. Telephone calls to respondents were spread out over varying days and times, including weekends. Every effort was made to contact and survey all original NSYR respondents, including those out of the country, in the military, or on religious missions. In addition, respondents were able to initiate the completion of their survey interview at their convenience by calling a toll-free number provided in the mailings and in occasional voice mail messages left by interviewers.
Since survey interviewing began in the summer months, most respondents were not in school at the time they were interviewed. However, in the later months of data collection many respondents were beginning the 2005-2006 school year. With this in mind, the survey instrument was designed so that questions regarding education referred to specific time periods (“In Spring 2005…”, for example). In addition, interviewers were trained to remind respondents that they should be responding with respect to the specific time frame asked about in the wording of the questions. On average, 14 phone calls were made to each respondent (including those who completed the survey). In cases where no contact with a human was ever made, an average of 65 calls were made to the household. In cases where contact was established with the household but the survey was not completed, there were on average 46 attempts to reach the respondent by telephone. A total of 2,604 respondents participated in the survey, for a final NSYR Wave 2 overall non-weighted response rate of 78.6 percent.
Prior to conducting each Wave 2 survey interview, the respondent’s verbal consent was obtained. In addition, for all respondents under age 18, parental consent was obtained either through the return of a signed form before the start of the survey or verbally, on the phone, with a survey interviewer. The respondent’s identity was confirmed using name and date of birth. If a respondent was unable to correctly answer one or all of the screening questions, a call-center supervisor was notified. The supervisor then attempted to determine whether the answer discrepancy was due to a keying error made in the Wave 1 survey (if the birth date was off by one day, for example) or whether it was, in fact, questionable that the interviewer was speaking with the correct respondent. If the supervisor had any doubt about the identity of the person on the phone, the interviewer broke off the survey and notified the respondent that s/he would be called back at a later time. The supervisor then recorded the details of the situation and informed project researchers. The NSYR researchers made the final determination of whether the survey interview should be completed with the respondent.
- Sampling Procedures
- An RDD telephone survey sampling method was chosen for this study because of the advantages it offers compared to alternative survey sampling methods. Unlike school-based sampling, for example, our RDD telephone method was able to survey not only school-attending youth, but also school dropouts, home-schooled youth, and students frequently absent from school. Using RDD, we were also able to ask numerous religion questions which many school principals and school boards often disallow on surveys administered in school.
For more information, see http://youthandreligion.nd.edu/assets/102496/master_just_methods_11_12_2008.pdf
- Principal Investigators
- Dr. Christian Smith
Department of Sociology
University of Notre Dame
Dr. Lisa Pearce
Department of Sociology
University of North Carolina, Chapel Hill
- Related Publications
- Smith, Christian and Melinda Lundquist Denton. 2003. “Methodological Design and Procedures for the National Survey of Youth and Religion (NSYR).” Chapel Hill, NC: The National Study of Youth and Religion.
Smith, Christian and Melinda Lundquist Denton. 2005. Soul Searching: The Religious and Spiritual Lives of American Teenagers. Oxford: Oxford University Press.
See http://youthandreligion.nd.edu/research-findings/ for a list of publications.
- All publications using NSYR data must contain the following acknowledgement:
“The National Study of Youth and Religion, http://youthandreligion.nd.edu/, whose data were used by permission here, was generously funded by Lilly Endowment Inc., under the direction of Christian Smith, of the Department of Sociology at the University of Notre Dame and Lisa Pearce, of the Department of Sociology at the University of North Carolina at Chapel Hill.”
- Retention and Response Rates
- In Wave 2, every attempt was made to re-survey the 3,364 original teen respondents (the original 3,370 minus 6 respondents whose contact information was not collected), including those who were out of the country or serving in the military. The entire survey was completed in Wave 2 by 2,581 respondents, for a full retention rate of 77.9 percent (2,581 of the 3,312 eligible respondents). The overall retention rate, which includes the 23 respondents who partially completed the survey, is 78.6 percent. Therefore, the total attrition in Wave 2 was 766 respondents, with non-located respondents the predominant source of attrition. The overall combined response rate for Waves 1 and 2 of the NSYR telephone survey, calculated by multiplying the W1 and W2 response rates, is 44.8 percent. Of the original 80 Jewish oversample respondents, 74 completed the W2 survey (92.5 percent).
The Wave 2 cooperation rate was 89.9 percent, calculated by dividing the number of completed cases (including partials) by the number of respondents who were successfully contacted (N = 2,895). The categories making up the non-contacted cases (N = 505) are: no human contact (38), non-located military or Job Corps (15), non-located out of the country (4), other non-locates (390), and ineligibles (58). The ineligible group consists of deceased respondents (5), respondents who were institutionalized throughout the entire data collection period (16), respondents with language barriers (4), cases in which the identity of the respondent was too questionable to proceed with the survey (2), and one case in which the respondent was incapable of completing the survey.

In addition to those referenced above are 30 cases that were removed from the Wave 2 data after the survey had been fielded, when it was discovered that the dates of birth (DOB) reported by these respondents in the second wave put them outside the age criteria for the study. When the DOB reported in Wave 1 and Wave 2 differed, the birth date was checked with the respondent several times by the survey interviewers. Additionally, before removing these 30 respondents from the data, the study project manager contacted them to confirm their dates of birth one last time. Each data user can decide whether or not to use these respondents in Wave 1 data analyses. Comparisons on key variables (never drinking alcohol, never using marijuana, and regular cigarette smoking) reveal only minor differences between W1 and W2 responders, as do comparisons of the percentages of W1 and W2 responders who attend religious services weekly or more and who never attend religious services.
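As a quick consistency check, the retention and cooperation figures reported above can be reproduced from the counts stated in this codebook:

```python
# Arithmetic check of the retention and cooperation rates reported in
# this codebook; all counts come from the text above.
completes_full = 2581          # respondents who completed the entire survey
partials = 23                  # partial completes
completes = completes_full + partials   # 2,604 total participants
eligible = 3312                # eligible respondents
contacted = 2895               # respondents successfully contacted

full_retention = completes_full / eligible     # full retention rate
overall_retention = completes / eligible       # overall retention rate
cooperation = completes / contacted            # cooperation rate

print(f"{full_retention:.1%}")     # 77.9%
print(f"{overall_retention:.1%}")  # 78.6%
print(f"{cooperation:.1%}")        # 89.9%
```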
However, when analyses were run comparing W2 responders and W2 non-respondents, it was found that non-respondents are more likely to never attend religious services and less likely to attend religious services weekly or more than responders. This is consistent with findings from other social science research indicating that more religious study participants are more likely to cooperate with research studies than non-religious participants. On key demographic characteristics, only small differences between W1 and W2 respondents were found.
- The refusal rate for Wave 2, calculated as the percentage of eligible respondents (N = 3,312) who refused to take part in the survey, was 4.0 percent. For each initial refusal, some attempt was made to persuade the respondent to participate. The reason(s) for the refusal was clearly noted, and any conversion attempts were tailored to the respondent’s (or guardian’s) concerns. When appropriate, a project staff member would contact a respondent directly to give more detailed information about the project. Letters were drafted to address concerns about data security, sensitive questions, and confidentiality, as well as to persuade those respondents who reported being “too busy” or “not interested”. Each respondent was reminded that his/her participation was important to the success of the project, and every attempt was made to address each particular respondent’s questions. Of the 132 refusals, 45 were indirect, meaning that the refusal was given by a parent, guardian, or other adult household member rather than the respondent. In situations where the person giving the refusal was not a legal guardian of an underage respondent, the case was treated as “blocked”. All efforts were made to contact the respondents directly in these situations, and when those attempts were unsuccessful, project staff tried their best to persuade the “blocking” adult to allow us to communicate with the respondent.
- Missing Data
- With the exception of a few created variables, the standard “.” indicator of missing data has not been used. In the actual dataset (but not the codebook survey instrument), for all variables, DON’T KNOW=777, REFUSED=888, and NOT ASKED=999. All missing values are reported as 3-digit numbers, except for those that had more digits to start with (year, for example). The 999–NOT ASKED response indicates a valid skip of the question: the respondent has no response because s/he was intentionally skipped past the question. In a few cases there is a value of 666, which indicates an INVALID SKIP; these are cases in which a respondent was incorrectly skipped out of a question due to computer or programmer error, or partial cases. Because these codes are used instead of traditional missing data indicators, analysts must be very careful to account for them. Stata, for example, will not recognize 777, 888, or 999 as missing data; unless told otherwise, the software will treat 777, 888, and 999 as actual values in your analysis. Always pay attention to the value of skip code indicators.
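As an illustration, here is a minimal pandas sketch of recoding these values before analysis (the column name `attend` and its values are hypothetical; apply the same recoding to the actual NSYR variables):

```python
import numpy as np
import pandas as pd

# Hypothetical column illustrating the NSYR missing-data codes:
# 777 = DON'T KNOW, 888 = REFUSED, 999 = NOT ASKED (valid skip),
# 666 = INVALID SKIP.
df = pd.DataFrame({"attend": [1, 3, 777, 888, 5, 999, 666]})

# Left as-is, the codes are treated as real values and distort estimates.
naive_mean = df["attend"].mean()   # inflated by 666/777/888/999

missing_codes = [666, 777, 888, 999]
df["attend"] = df["attend"].replace(missing_codes, np.nan)
clean_mean = df["attend"].mean()   # mean of 1, 3, 5

print(naive_mean > clean_mean)  # True
print(clean_mean)               # 3.0
```

(In Stata, `mvdecode varlist, mv(666 777 888 999)` accomplishes the same recoding.)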
- Religion Variables
- The religion questions in the NSYR survey are complex. We have worked hard to create interpretable religion variables to be used in analysis. All of the original variables have been left in the dataset. However, many of these are incomplete because they were asked of only a subset of the respondents or because they do not include open-ended verbatim responses. For consistency across analyses, we ask that all analysts use the standard integrated religion variables created by NSYR as the starting point for their analysis. These variables include the following:
W2 Teen Re-Created Integrated Religion Variables:
W2 New Teen Integrated Religion Variables:
RELAFF (relaff_w2) is the variable categorizing teens into major religious types. RELAFF is similar to reltrad in wave 1, with two key differences. First, there were no data from parents to consult for this variable, so the assignment of affiliation is based on less information in this wave. Second, we split African American Protestants into two groups (Mainline and Evangelical), so analysts can combine African American and White Mainline Protestants and/or Evangelical Protestants when necessary. RELAFF was created based on the type of religious congregation that a teen said s/he attends. The variables used to determine RELAFF included: ATTREG (attreg_w2), CHURTYPE (churtype_w2), BAPTIST (baptistgrp_w2), METHST (methstgrp_w2), PRSBIAN (prsbiangrp_w2), LUTHAN (luthangrp_w2), REFMED (refmedgrp_w2), CONGAL (congalgrp_w2), CHCHST (chchstgrp_w2), RELIG0 (relig0_w2), ATHEIST1 (atheist1_w2), and teenrace.
- Jewish Oversample
- In addition to the original Wave 1 national sample of 3,290 cases, the NSYR also conducted surveys with a modest oversample of Jewish households (80 Jewish oversample completes in all) in order to obtain a large enough number of cases with which to conduct meaningful statistical analyses of Jewish youth. This Jewish oversample was included in the Wave 2 re-survey. For a complete description of the oversample see the “Methodological Design and Procedures for the National Survey of Youth and Religion (NSYR)” (Smith and Denton, 2003). This oversample is NOT nationally representative and is meant to be used primarily to bolster Jewish-specific analyses. Therefore, we generally recommend that the oversample be excluded from most analyses of the general sample.
Researchers using simple descriptive statistics to make claims about the characteristics (but not the size) of the Jewish population may include the Jewish oversample cases in analyses but should use the simple probability of selection weight (rweight1) to correct for the number of teenagers in the household. Researchers using multivariate statistics with religious category variables who feel comfortable using the Jewish oversample data do not, in our view, need to use the national weight (rweight2 or rweight2_w2) if they control for region and family income, which these weights adjust for. However, they should use the simple selection probability weight (rweight1) and should include the Jewish oversample dummy variable in their models to statistically remove any possible unidentified effect of sampling bias inherent in those cases net of the other independent variables the models control for. Alternatively, researchers may simply exclude the Jewish oversample cases from analysis and work with a significantly smaller number of Jewish cases.
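The two options above can be sketched as follows (a hypothetical illustration: the dummy variable name `joversamp`, the outcome `y`, and the five data rows are made up; check the codebook for the actual name of the Jewish oversample indicator in your file):

```python
import pandas as pd

# Made-up miniature dataset: joversamp flags (hypothetical) oversample
# cases, rweight1 is the selection-probability weight, y is an outcome.
df = pd.DataFrame({
    "joversamp": [0, 0, 0, 1, 1],
    "rweight1":  [1.0, 2.0, 1.0, 1.0, 2.0],
    "y":         [1, 0, 1, 1, 0],
})

# Option 1: exclude the oversample cases and analyze the national sample.
national = df[df["joversamp"] == 0]
print(len(national))  # 3 of the 5 illustrative cases remain

# Option 2: keep the oversample, weight by rweight1, and include the
# dummy as a model covariate (a regression would add df["joversamp"]).
w_mean = (df["y"] * df["rweight1"]).sum() / df["rweight1"].sum()
print(round(w_mean, 3))  # 0.429
```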
- Wave 2 Gender
- For those data users working with both W1 and W2 survey data sets, please note that there are three cases in which respondents reported themselves to be different genders in W2 than they did in W1. We have created a W2 gender variable, GENDER (gender_w2). When discrepancies arose during the W2 survey, interviewers took care to confirm that we had correctly recorded gender in the W2 dataset; thus we recommend relying on GENDER (gender_w2) as the gender variable in longitudinal analyses.
- Wave 2 Age
- In W2 of the survey, respondents were asked to confirm the date of birth they reported in W1. In a few cases the W2 date of birth was different. In almost all cases, the date was off by one day or a year, likely the result of a W1 interviewer keying error. In W2 the survey interviewers were instructed to carefully confirm the date of birth, particularly when there was a discrepancy between the W1 and W2 date. For this reason, NSYR researchers advise that analysts use AGECATS (agecats_w2), the variable for the age the respondent gave at the time of the W2 survey, for respondent age. Further, in confirming dates of birth for W2 we discovered thirty respondents whose true birthdates were outside the age range of the original sample. We have therefore removed these cases from the W2 dataset as “ineligibles” and advise that analysts remove them from W1 analyses.
- School Variables
- The Wave 2 survey took place from June of 2005 through November 2005. Therefore, some of the respondents completed the survey over the summer and others in the fall. The researchers took care to word questions in the Education section so that correct school year information was captured by asking the respondent to report specifically on either the spring of 2005 or the fall of 2005. However, data users should take note that there are other sections where respondents’ answers may be influenced by the time of year that they completed the survey. For example, in the survey respondents are asked to report how many hours a week they were currently working. For some respondents this answer would be different depending on whether they completed the survey in the summer or during the school year.
- Open-ended coded responses
- Fourteen variables have open-ended responses that have been coded by the researchers. A PDF file containing the codes is available for download. Below is a list of the variables: