

The Israel Literacy Measurement Project 2015 Report

Topic:

Israel Literacy

Principal Investigators:

Annette Koren, Shira Fishman, Janet Krasner Aronson, Leonard Saxe

Study Date: 

2015

Source:

Cohen Center for Modern Jewish Studies, Brandeis University

Key Findings:

The purpose of this research was to create a bank of questions to assess student knowledge about Israel. The question bank developed by the research team was designed for use with college-aged young adults to assess the extent and content of their Israel-related knowledge. The survey population was limited to Jewish college students, and the focus shifted from a single diagnostic instrument to a bank of reliable, validated questions related to the larger construct of Israel literacy.

Ultimately, a total of 628 student responses were used in the analysis. Most of the respondents (61%) were in their first or second year of college and the vast majority (83%) had some formal Jewish education growing up. Overall, students had an average score of 46% correct; the median score was 44%; and 92% of all students scored 75% or lower. All questions were based on a multiple-choice format except map questions, which relied on matching names of places with positions on a map. All questions in the test bank were categorized in one of six domain areas: conflict, geography, government, history, religion, and society (which includes culture and economy).
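The summary statistics reported above (mean, median, and the share of students scoring at or below a threshold) can be reproduced from a list of percent-correct scores. The sketch below uses a small hypothetical score list, not the study's data, purely to illustrate the computation.

```python
from statistics import mean, median

# Hypothetical percent-correct scores for illustration only;
# the actual study analyzed 628 student responses.
scores = [30, 41, 44, 44, 46, 52, 75, 80]

avg = mean(scores)                                   # average score
med = median(scores)                                 # median score
share_at_or_below_75 = sum(s <= 75 for s in scores) / len(scores)
```

With the real data, these three quantities correspond to the reported 46% average, 44% median, and the 92% of students scoring 75% or lower.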

Scores were computed for each school, and average scores differed by school: the lowest-scoring school averaged 27% and the highest-scoring school averaged 59%. Students from the 10 schools with the highest average SAT scores averaged 49%, compared with 43% for students from the 10 schools with the lowest average SAT scores, indicating that students at more selective schools scored higher overall.

The tests also asked about the students’ level of formal Jewish education, college classes about Israel, and visits to Israel. On average, students who had Jewish education (part-time, day school, or both) scored better on the exam (average score of 47%) compared to those who had no Jewish education (average score of 42%), a significant difference t(626) = -2.78, p < .01. Students who had been to Israel scored significantly higher (52%) than those who had never been to Israel (43%), t(610) = -5.60, p < .001.
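The comparisons above are independent-samples t-tests (the degrees of freedom, 626 and 610, equal the combined group sizes minus two, consistent with a pooled-variance Student's t-test). A minimal sketch of that statistic, computed on hypothetical score lists rather than the study data:

```python
from statistics import mean, variance

def pooled_t(group_a, group_b):
    """Student's t statistic for two independent samples, pooled variance.

    Returns (t, df) where df = len(group_a) + len(group_b) - 2.
    """
    na, nb = len(group_a), len(group_b)
    # Pooled sample variance: weighted average of the two group variances
    sp2 = ((na - 1) * variance(group_a) + (nb - 1) * variance(group_b)) / (na + nb - 2)
    t = (mean(group_a) - mean(group_b)) / (sp2 * (1 / na + 1 / nb)) ** 0.5
    return t, na + nb - 2

# Hypothetical percent-correct scores, ordered as in the report
# (no Jewish education first, so a higher-scoring second group yields t < 0)
no_jewish_ed = [40, 38, 45, 41, 44, 39]
jewish_ed = [48, 46, 50, 44, 47, 49]
t, df = pooled_t(no_jewish_ed, jewish_ed)
```

The negative sign of the reported statistics (e.g., t(626) = -2.78) simply reflects which group was entered first; significance is then judged from the t distribution with the given degrees of freedom.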

The testing to date has demonstrated a less than acceptable level of knowledge about Israel among students about to embark on Birthright Israel. More than half of all students answered fewer than half of the questions correctly, and over 90% scored below 75%. This information deficit prevents students from contributing meaningfully to discourse about Israel on campus. The researchers also believe that the test scores from The Israel Literacy Measurement Project raise concerns about potential disillusionment with Israel education prior to college.

Methodology:

Qualitative Data Collection 

 

The research team conducted two initial rounds of interviews. The first, with a small group of students, was designed to find out what kinds of information they brought to the interpretation of newspaper articles or videos on the web. For this round, CMJS recruited a convenience sample of six students and young adults in April 2013.

 

In July and August 2013, the second round of interviews was conducted with 42 students from various colleges and universities, also recruited through a convenience sample. Of these, 25 were Jewish and 17 were not; some attended elite schools and others less prestigious ones. All students were asked the same broad questions designed to elicit their knowledge within each domain.

 

Developing the Question Bank

 

In fall 2013, the research team developed a question bank to test with a sample of students in Birthright Israel orientation programs prior to their trip to Israel. This sample provided adequate numbers for quantitative testing while also giving the research team a diverse group of students whose knowledge of Israel varied greatly. A total of 140 questions were prepared by members of the research team. Questions were created within each domain identified by the advisory group. Although some questions overlapped two domains (for example, history and conflict), each question was assigned to a primary domain by the end of the testing period. In designing questions, the research team attempted to follow good practice for multiple-choice test creation (see Haladyna, 2002).

 

In November-December 2013, CMJS administered three questionnaires of about 50 questions each to 311 students from nine universities in the Boston area and one in Philadelphia. Students were randomly assigned one of the three versions of the questionnaire. Students completed the questionnaire in hard copy at their Birthright Israel orientation sessions and received 20 New Israeli Shekels (NIS 20, about $5) to be used on their trip. Students who did not attend their Birthright Israel orientation received the questionnaire electronically through an email invitation.

 

A total of 642 students completed the final round of testing. Of these students, 14 were excluded from the analysis because of errors in their use of the scantron forms (an extra question was filled in, suggesting that, at some point during the test, they had not paired the answer form with the question sheet). A total of 628 student responses were used in the analysis. Students came from 20 universities. 
