SALRC Assessment Institute

Developing Assessments for South Asian Languages

June 12-16, 2006


Annotated Readings

In preparation for the workshop, please consult the following general references:

Boston, Carol (2002). The concept of formative assessment. Practical Assessment, Research & Evaluation, 8(9).

This article addresses the benefits of formative assessment, the diagnostic assessment that provides feedback to teachers and students throughout a course, and offers examples and resources to support its implementation.

Brindley, G. (1997). Assessment and the language teacher: Trends and transitions. The Japan Association for Language Teaching: The Language Teacher Online, 21(9).

The article considers the effects that changes in assessment types in response to increased accountability and a focus on performance have had on the roles of teachers.

Thompson, L. (2000). Foreign language assessment: 30 years of evolution and change. ERIC/CLL News Bulletin, 23(2).

Instruction, Outcomes, and Assessment on the Center for Advanced Research on Language Acquisition's (CARLA) Virtual Assessment Center (VAC) website

Please review and be familiar with:

The American Council on the Teaching of Foreign Languages (ACTFL) Proficiency Guidelines

Hindi Proficiency Guidelines: ILR (The Interagency Language Roundtable) Scale

The Executive Summary of the National Standards for Foreign Language Learning for the 21st Century

Papers by Wells, James and Lange

Recommended Readings


Monday, June 12th, 9 AM - 1 PM

Assessment Concepts and Theory

  • Matching assessment type to purpose
  • Guidelines and Standards
  • Samples of assessment types

Readings:

  • Brualdi, A. (1999). Traditional and modern concepts of validity. ERIC Clearinghouse on Assessment and Evaluation.
  • This article introduces the modern concepts of validity advanced by the late Samuel Messick (1989, 1996a, 1996b) and relates them to the traditional concepts of content-related, criterion-related, and construct-related evidence of validity.

  • Review the pre-workshop readings on the ACTFL, Hindi, and other guidelines

Recommended Readings and Materials:

  • Bailey, K. M. (1998). Learning about language assessment. Cambridge, MA: Heinle and Heinle.
  • The book is an accessible introduction to language assessment. It provides practical examples, theory, and clear explanations of concepts fundamental to assessing language.

  • Cohen, A. D. (1994a). Assessing language ability in the classroom. 2nd Edition. Boston: Newbury House/Heinle & Heinle.
  • This second edition presents various principles for guiding teachers through the assessment process (oral interview, role plays, portfolio assessment, cloze, summaries). It deals with issues in assessment in general, beyond traditional tests. It is accessible to novices in the field of language assessment: terms are defined within the text and all terminology is indexed.

  • Genesee, F. and J. Upshur. (1996). Classroom-based evaluation in second language education. Cambridge: Cambridge University Press.
  • Written to help foreign and second language educators in planning and carrying out effective and comprehensive evaluations in their classroom, the book emphasizes the use of classroom-based assessments as a tool for improving teaching and learning. Non-technical in its presentation, this book requires no previous background in psychometrics, statistics or research. Suggested procedures are useful for a range of proficiency levels, teaching situations and instructional approaches.

  • Hall, R. Virtual lessons on Behaviorist Theory, Information Processing/Cognitive Theory, Constructivist Theory, and Assessment. http://web.umr.edu/~rhall/ed_psych/index.htm
  • Richard H. Hall, Ph.D., Professor of Information Science and Technology, Associate Dean for Research in the School of Management and Information Systems, and Director of the Laboratory for Information Technology Evaluation (LITE) at the University of Missouri - Rolla, provides web-based educational psychology modules. The modules address active learning, behaviorist theory, learning in groups, information processing theory, learning styles, constructivist theory, metacognition, learning strategies, web learning, and assessment.

  • Video FAQs Introducing Topics in Language Testing from Glenn Fulcher & Randy Thrasher are featured on the International Language Testing Association's web site. Topics covered by the seven-minute videos include reliability, validity, test impact, test specifications, writing test items, pre-testing, testing listening, speaking, reading, and writing, statistics, and testing for specific purposes.

Tuesday, June 13th, 9 AM - 1 PM

Assessing for Proficiency

  • Models for Speaking and Writing Assessment
  • Portfolio Assessment
  • Rating Scales

Readings:

  • Chapman, C. (1990). Authentic writing assessment. Practical Assessment, Research & Evaluation, 2(7).
  • This article discusses some of the ways authentic writing assessment can be used in education. Using the Illinois Writing Program as an example, this article also looks at some of the goals, solutions, and experiences of a program that is implementing authentic writing assessment.

  • Forgette-Giroux, Renée & Marielle Simon (2000). Organizational issues related to portfolio assessment implementation in the classroom. Practical Assessment, Research & Evaluation, 7(4). http://pareonline.net/getvn.asp?v=7&n=4
  • This paper explores organizational issues that arose when implementing portfolio assessment in eleven classrooms during the field trial of a generic content selection framework. Some researchers have already examined, to various degrees, the organizational process teachers go through when implementing portfolios within their classrooms to assess learning as opposed to showcasing or reporting achievement.

  • Moskal, B. (2003). Recommendations for Developing Classroom Performance Assessments and Scoring Rubrics. Practical Assessment, Research & Evaluation, 8(14).
  • This paper provides a set of recommendations for developing classroom performance assessments and scoring rubrics similar to the sets of recommendations for multiple choice tests provided in this journal by Frary (1995) and Kehoe (1995a, 1995b)…. This article draws from this base to provide a set of recommendations that guide the classroom teacher through the four phases of the classroom assessment process — planning, gathering, interpreting and using (Moskal, 2000a). Each section concludes with references for further reading.

  • Wiggins, G. (1990). The case for authentic assessments. Practical Assessment, Research & Evaluation, 2(2).
  • The author, a researcher and consultant on school reform issues, is a widely known advocate of authentic assessment in education. The article is based on materials that he prepared for the California Assessment Program. He defines authentic assessment and makes a comparison with traditional standardized tests to clarify the meaning of authentic. He explains the need for these labor-intensive forms of assessment. Reliability issues in the judgment-based scores of authentic assessments are addressed, as is the subjectivity in traditional standardized tests.

Recommended Readings:

  • Guidelines for Portfolio Assessment in Teaching English provides an extensive explanation of portfolios and of assessing skills, including an overview of portfolio assessment, a rationale for its use, and student samples with resources.
  • Messerklinger, J. (1997). Evaluating oral ability. The Japan Association for Language Teaching: The Language Teacher Online, 21(9).
  • The paper explores the assessment of oral ability, provides an overview of ready-made oral assessments, and examines their scoring as well.

  • Moskal, B. (2000). Scoring rubrics: What when and how? Practical Assessment, Research & Evaluation, 7(3).
  • The paper “describes the different types of scoring rubrics, explains why scoring rubrics are useful, and provides a process for developing scoring rubrics…. [It] concludes with a description of resources that contain examples of the different types of scoring rubrics and further guidance in the development process.”

Wednesday, June 14, 9 AM - 1 PM

Assessing for Proficiency cont'd.

  • Developing Reading and Listening Assessments
  • Backward Design (Wiggins & McTighe, 1998)

Readings:

  • ACTFL. (2003). Integrated performance assessment project.
  • Integrated Performance Assessments (© ACTFL) were developed “to enable foreign language educators to better implement assessments that address standards-based performance across all three modes identified in the ACTFL Performance Guidelines for K-12 Learners and the National Standards for Foreign Language Learning in the 21st Century” (ACTFL, 2003, p. 8).

  • Badger, E. & Thomas, B. (1992). Open-ended questions in reading. Practical Assessment, Research & Evaluation, 3(4).
  • The article argues that “open-ended questions focus on students' understanding, their ability to reason, and their ability to apply knowledge in less traditional contexts,” and provides a rationale and examples for their use.

  • Rudman, H. C. (1989). Integrating testing with teaching. Practical Assessment, Research & Evaluation, 1(6).
  • Testing and teaching are not separate entities. Teaching has always been a process of helping others to discover "new" ideas and "new" ways of organizing that which they learned. Whether this process took place through systematic teaching and testing, or whether it was through a discovery approach, testing was, and remains, an integral part of teaching. This article discusses ways teaching and testing can be integrated. Just as there is no "best" teaching method, neither is there only one best approach to testing. The digest discusses the use of tests as an instructional tool, the use of tests as an administrative tool, teacher attitudes towards testing, and teacher competency with regard to testing.

Recommended Readings:

  • Alderson, J. C. (2000). Assessing reading. Cambridge, UK: Cambridge University Press.
  • Topics of this volume include the nature of reading, its key concepts, and their implications for test design: keeping tests content-focused; covering a range of skills and strategies rather than just one; including longer texts; encouraging use of background knowledge; allowing multiple interpretations; group tasks; extensive reading; the relationship between speech and comprehension; enjoyment; weighing discrete versus integrated coverage of the components of reading; and building on the research base.

  • Buck, G. (2001). Assessing listening. Cambridge, UK: Cambridge University Press. Chapter 5: Creating Tasks, pp. 116-153.
  • Chapter 5, on creating tasks, provides a step-by-step guide through the features one would want to consider. The chapter ends with a helpful summary of text characteristics and their effect on difficulty.

  • Wiggins, G. & McTighe, J. (1998). Understanding by Design. Alexandria, VA: Association for Supervision and Curriculum Development.
  • The book presents backward design, which “may be thought of as purposeful task analysis: Given a task to be accomplished, how do we get there? Or, one might call it planned coaching: What kinds of lessons and practices are needed to master key performances?… Rather than creating assessments near the conclusion of a unit of study (or relying on the tests provided by textbook publishers, which may not completely or appropriately assess our standards), backward design calls for us to operationalize our goals or standards in terms of assessment evidence as we begin to plan a unit or course.”

Thursday, June 15, 9 AM - 1 PM

  • Integrated Performance Assessments
  • Integrating Technology with Assessments, 11:00 AM – 1:00 PM
  • Participant presentations of sample assessments using technology

Readings and Resources:

Recommended Readings:

  • Roever, C. (2001). Web-based language testing. Language Learning & Technology 5(2).
  • This article describes what a Web-based language test (WBT) is, how WBTs differ from traditional computer-based tests, and what uses WBTs have in language testing. After a brief review of computer-based testing, WBTs are defined and categorized as low-tech or high-tech. Since low-tech tests are the more feasible, they constitute the focus of the paper. Item types for low-tech WBTs are then described, and validation concerns specific to WBTs are discussed.

Friday, June 16, 9 AM - 1 PM

  • Planning for Comprehensive Assessment
  • Participants' presentations – 11:30 AM
  • Demonstration of Speaking and Writing Assessment
  • Demonstration of Integrated performance assessments

Recommended Readings and Resources:

  • CARLA’s Virtual Assessment Center is a web-based learning module that provides teachers with background information, step-by-step guidance, and many practical resources on developing second language assessments, standards, and teaching. It links to articles, other useful sites, student work, and state standards web sites.

    The components of the VAC provide you with background information, examples to download, a reflection and a resources section that provides annotated links to useful sites on assessment. Featured also is a theory and test development section and a link to an annotated assessment bibliography.

  • The current issue of Language Learning and Technology features several interesting articles including ON THE NET: Wikipedia: A Multilingual Treasure Trove by Jean LeLoup and Robert Ponterio.

  • Lee, Jin Sook. (2006). Exploring the relationship between electronic literacy and heritage language maintenance. Language Learning & Technology.
  • This paper focuses on the electronic literacy practices of two Korean-American heritage language learners who manage Korean weblogs. Online users deliberately alter standard forms of written language and play with symbols, characters, and words to economize typing effort, mimic oral language, or convey qualities of their linguistic identity such as gender, age, and emotional states. The paper explores the linguistic development … [of two siblings].