Gerry's Home Page Position Papers

Jason Ravitz

University of California, Irvine

jravitz@uci.edu  

I'm currently evaluating the Irvine Experiment with Margaret Riel here at UCI and trying to develop guidelines for an "interactive project vita", a notion that I think holds some promise.

Here is a new paper I wrote, available at http://www.gse.uci.edu/Ravitz/ept.html:

 

Emerging Practical Theory of Assessment: Interactive Portfolios

Introduction

What is an emerging practical theory? It is an evolving conceptualization of practice, one based on experiences that can be applied to the work of others in varied situations. This paper is the outgrowth of a course on classroom assessment at Syracuse University taught by Dr. Gerald Mager. It builds on the idea of the Interactive Project Vita for project evaluation that was developed previously (Ravitz, 1997a). It attempts to take a specific assessment strategy developed for Internet-based projects and generalize principles of learning that might be applied in any learning environment. In this case, technology lets us study how learning occurs without detracting from learning that takes place without technology. Education in online settings becomes a "specialized application" of technology (Smith, 1983), with learners, teachers, and researchers potentially developing new understandings about learning and assessment, especially learning that meets the needs of society and the workplace (SCANS, 1991).

The current discussion relates most closely to assessing the outcomes of student projects. Experience shows that online work is often very difficult to assess because of a lack of information about the learner's thinking processes. At the same time, data show that group work and project-based activities in U.S. schools, even non-technological projects, are often not accompanied by the cognitive challenges we would like to see (Ravitz & Snow, 1999). What we would like to see is learners developing the habits of lifelong learning and scholarship and the capacity to improve their understandings with effort (Honey & Moeller, 1990; Linn & Muilenburg, 1996). The interactive portfolio is offered as an approach for supporting learning across various subjects and levels.

Definitions

Assessment -- a process designed to give learners information about their performance on learning-related tasks. This information can be used to judge the quality of a learner's work, as well as to suggest remediation or alternative ideas for learning activities. Assessment can be shared with others to provide a better understanding of the learner's performance.

Authentic assessment -- assessment undertaken with the assumption that the task being assessed is meaningful to others besides the learner -- others within the learner's life (family, friends, community members) -- and that the work may have potential implications that extend further (e.g., into the world of K-4 mathematics). Authentic assessment takes into consideration the value of the work for others who may be interested in the same question or problem.

Technology -- human endeavor to resolve life's problems and seek betterment through the acquisition and application of attained knowledge (e.g., crop rotation, flu shots, beepers, online surveys, textbooks, pen and paper).

Interactive portfolio assessment -- authentic assessment that includes a conversational, or interactive, component between the learner and another person involved in the assessment. Interactions can occur via facilitating technologies or through written or spoken communication. Reviewers become engaged in assessing the work with the learner.

Requirements

Interactive portfolio assessment is offered as a way to promote deeper thinking about student work among students and teachers, and beyond. When the work of a learner is offered for review with the purpose of obtaining helpful criticism (not so different from a teacher grading a test or paper), it quickly becomes clear that the performance or product does not stand alone; it needs to be presented with information (say, a story) that traces effort and progress -- changes in direction, obstacles encountered, theories tested, and where the learner went wrong. Often we are interested not only in the products that learners are creating but in questions of meaning and interpretation (Guba & Lincoln, 1983).

A portfolio could start with examples of writing, a video documentary, an art piece, a lesson plan, the best question you ever asked, the hardest (or easiest) math problem you ever encountered, or a solution. Key components of the theory -- really, essential conditions under which the usefulness of the emerging practical theory can be tested -- include the following:

    1. Work is presented in a way that expresses the meaning given to the work by the learner (i.e., intrinsic motivation is clearly evident).
    2. It is apparent how the work (performance, product, idea) can be made of use by others, e.g., how others can draw the same conclusion, test the same theory, or apply the same framework to the solution of problems.
    3. The learner seeks help from those who view the work and are willing to share their assessment, e.g., feedback, suggestions, new ideas.
    4. The assessment expresses a "point of view", e.g., that of a research scientist, an English teacher concerned about the ability of a student to communicate, a policeman concerned about how the student's work might be used to maintain the peace, or a business person assessing an idea's commercial viability.
Using interactive portfolio assessment, the work of students and teachers (sometimes working together) can be responded to by external reviewers. If the person providing feedback makes clear what perspective he or she represents, and what set of standards he or she is using, these standards themselves can also become the subject of examination. Assessments can be "directly linked to the artifact being judged, with confirming and dissenting commentary attached" (Kozma & Quellmalz, 1996). The criteria for judging a presentation can also be negotiated with the learners, so that the learner knows what is expected and agrees about the use of standards for assessment.

In some cases the teacher may be more of a partner in student activities than an impartial external judge -- for example, if she is joining students in the production of a yearbook or newspaper, a ThinkQuest project, using Knowledge Forum™ together, or leading a Listserv for a university course. In these cases, the value of the effort might best be judged by someone who is outside the learning process, someone who can talk about the meaning of the work beyond the teacher-student relationship. This might be a peer (another student trying to understand the same scientific principle) or someone who is involved in a different level of activity but working on the same problem (e.g., a policymaker studying school violence).

Though knowledge is culturally defined in this way, at the same time it must also be idiosyncratic, as one's notion of reality and truth is still rooted in one's own experience. These thoughts would go into the individual portfolio being produced as a product of learning, along with statements from others who had shared the experience or who were getting a glimpse of the situation from different viewpoints, and an interpretation and summary of comments from the author, along with new questions for investigation.

In creating an interactive portfolio of work, individuals are forced to consider what they are doing and why, and to recognize that they may be held accountable for their work by others and have to change the course of their thinking about a topic. Requirements for student work should be put in place so that work can be explained, offered for review, and discussed. But these requirements can also be negotiated to match the level of the learners, and individual learning contracts should be used when one's learning needs are substantially different from that of the group.

Imagine that instead of studying textbooks, students in study-teams would write their own textbooks on what was important to them at the time, with each student taking responsibility for composing a few sections. The teacher would help make it unique, perhaps revising the approach with each new cohort of students. At the same time, there would be interactions with peers and teachers who can serve as a "review board" before which one's beliefs come to be accepted as knowledge. If ideas are intended to be useful for others, trying them in another classroom could provide a very good assessment of their validity -- but it would only be a good assessment, and the feedback would only be useful, if word of the results made its way back.

Support for the theory

Respect for alternative ideas and different views remains critically important, in keeping with a fundamental value of science that opposes any exclusionary belief or practice where questions of knowledge are involved. Too often, the classroom has been a closed system, with knowledge limited to a pre-specified domain of information and teaching offered from one point of view. Popper (1962), Kuhn (1962), and Rogers (1983) all make quite clear the disadvantages of thinking that we have arrived at "the answer" to any of our most meaningful questions. Followers of a closed knowledge system generally: do not seek out the possibility of scientific evidence disproving their theories; rely on a single paradigm for finding answers to new problems; and do not take advantage of the weak communication links that often carry the most information.

A reflective process that involves the support of others can produce learning around almost any kind of effort undertaken, as long as there is sustained attention over time -- the type of attention that teachers in crowded classrooms with little technology may not be able to give to their students. The proposed approach permits (does not cause) learners to become experts in a topic over time, capable of assessing themselves and others, and of demonstrating this fairly quickly to others in the area of their choosing. Being able to assess the evolution of an individual's or group's work can then be used instructionally (CRESST, 1996) year after year, with technology helping to provide progress markers of learning (Jonassen, Carr, & Hsui-Ping, 1998; Riel & Harasim, 1994).

Although originally conceived in terms of electronic environments, these principles extend to face-to-face environments. First and foremost, the presentation of a student's work in a portfolio should support student self-reflection (Lankes, 1995). However, getting learners, even the most capable adult learners, to be reflective in practice (Schön, 1983) is a very difficult and important challenge. This is especially true if we hope to see the professionalization of teaching. One way to better understand student work is through personal communication, a particularly useful assessment method for examining reasoning skills or attitudes, for example (Stiggins, 1997). The problem with personal communication is that it tends to disappear without a trace, and its lessons often cannot be reproduced in artificial settings at the time a test is administered. One way technology can be used, then, is as a tool for creating shared remembrances, stories of struggle, and so on, that allow one to better judge what went into a learning experience and what the result has been.

Implications

Lesson plans are presented as part of any teaching portfolio, and teachers-in-training are expected to reflect on their teaching after they have tried something out. This reflection may include what, where, when, how, and why an activity was undertaken in teaching. These are fundamentally important questions that researchers outside the classroom cannot answer without teachers coming up with their own answers first, and then choosing to let others in on this conversation. Often teacher portfolios are presented to an advisor and peers for a one-time review. With the development of interactive portfolios, teachers can create a product of learning that includes rich data for researchers and useful ideas for other practitioners, one that comes to represent the shared expertise of the teacher and others.

Before we might draw conclusions about the potential for success of this approach and work to define it further, it is worth raising a few important issues to be addressed. For example, is the review process perceived as a distraction from the real work of learning, or is feedback more generally viewed as a central part of the experience? It is not clear that people are really willing to discuss and share their work openly with others, and to be reflective (McKinley, 1983; Schön, 1987), and clearly individuals will have different styles of sharing and different cycles of producing knowledge that will have to be supported. Preliminary experience in the Online Internet Institute (Ravitz, 1997b) suggests that some people may want to put the best face on their work and ignore any problems they are having. How does one encourage people to take risks and share more of their struggle around a topic? This probably requires a change in the culture of teaching and learning, so that we view ourselves as involved in addressing complex problems that require scholarship and the involvement of supportive colleagues.

Conclusion

It is challenging to take an idea that was developed for a specific setting and to see how it might inform work in other areas. As new strategies that take advantage of the medium of learning become increasingly available for study and use by educational researchers, approaches to assessment may be affected across the disciplines. Reflecting on the underlying assumptions of the interactive portfolio is a valuable exercise, even among educators who are technology neophytes. Discussion concerning the usefulness of these ideas in offline settings is needed, and I hope this approach will be of interest both to researchers and to the broader population of educators.

Until now, limited technology may not have permitted this idea to be broadly useful (e.g., one might have objected that there was not enough Internet connectivity in homes). However, as we see the rapid proliferation of student and teacher web sites, the question inevitably arises about the quality of the learning that is taking place. Collaborative tools that are beginning to proliferate can be integrated early in the learning process in support of project design and assessment -- e.g., before, during, and after "instruction" takes place. These tools provide the capability to extend the view of the peer, parent, teacher, or researcher into the mind of the learner, at the same time that they help the learner reflect on his or her own understandings and goals (Loh, Radinsky & Russell, 1998; Brophy, Goin, Bransford, Sharp, et al., 1994).

My emerging practical theory builds upon a direct connection between science, technology, learning, and assessment. Technology rests on the advancement of science, which rests on constant effort to learn more, expose our ideas to others, and assess what we know. The same principles of learning apply for researchers, teachers, students, and student-teachers, and across almost any subject area. Just as a production tool that supports presentation (e.g., PowerPoint) can be used by K-12 students and professional educators to present their ideas, so can a tool that supports the creation of an interactive portfolio be used to develop and improve those ideas.

As inquiry proceeds in a meaningful direction, learners can provide valuable information about their own thinking, without which it is difficult (if not impossible) to promote collaboration, provide useful feedback, assess learning, or evaluate the long-term value of a lesson or activity. Returning to the example of teacher web pages: in order to assess the learning that results from the production of web sites, or any research activity, the producers have to speak.

One way to get this conversation going and to make it more generative is through interactive portfolio assessment. By establishing a method for learners' ideas and work to be collected and shared with the teacher and others, one can provide greater opportunities for collaboration, reflective dialogue, and increased student awareness of their own learning processes.
 

 

BIBLIOGRAPHY

Brophy, S. P., Goin, L., Bransford, J. D., Sharp, D. L., Moore, P., Hasselbring, T., & Goldman, S. R. (1994). Software support for instruction in "deep" comprehension and decoding. Presented at the annual meeting of the American Educational Research Association, New Orleans.

Center for Research on Evaluation, Standards, & Student Testing (CRESST) (1996). Creating better student assessments. In Improving America's Schools: A Newsletter on Issues in School Reform, Spring 1996. [WWW Document]. Available: http://www.ed.gov/pubs/IASA/newsletters/.

Guba, E.G. & Lincoln, Y.S. (1983). Epistemological and methodological bases of naturalistic inquiry. In G.F. Madaus, M. Scriven and D. Stufflebeam (Eds.), Evaluation models: Viewpoints on educational and human services evaluation. Boston: Kluwer-Nijhoff Publishing, pp. 311-333.

Honey, M. & Moeller, B. (1990). Teachers' beliefs and technology integration: Different values, different understandings. CTE Technical Report #6. New York: Center for Children and Technology. Available: http://www.edc.org/CCT/ccthome/reports/tr6.html

Jonassen, D., Carr, C., & Hsui-Ping, Y. (1998). Computers as Mindtools for engaging learners in critical thinking. TechTrends, 43(2), 24-32.

Kozma, R. and Quellmalz, E. (1996). Issues and Needs in Evaluating the Educational Impact of the National Information Infrastructure. Paper commissioned by the U.S. Department of Education's Office of Educational Technology.  [WWW Document].  Available: http://www.ed.gov/Technology/Futures/kozma.html

Kuhn, T. (1962). The Structure of Scientific Revolutions. Chicago, IL: University of Chicago Press.

Lankes, A. (1995). Electronic portfolios: A new idea in assessment. ERIC Digest. Syracuse University. Syracuse, NY. (ERIC Document Reproduction No. ED 390 377).

Linn, M. & Muilenburg (1996). Creating lifelong science learners: What models form a firm foundation? Educational Researcher, 25(5), 18-24.

Loh, B., Radinsky, J., Russell, E., et al. (1998). The progress portfolio: Designing reflective tools for a classroom context. In Proceedings of CHI 98. Los Angeles, CA: ACM Press. [WWW Document]. Available: http://www.ls.sesp.nnwu.edu/sible/papers/loh_chi98.pdf

McKinley, J. (1983). Training for effective collaborative learning. In R.M. Smith (Ed.). Helping adults learn how to learn. New Directions for Continuing Education, no. 19. San Francisco: Jossey-Bass, September.

Popper, K. (1962). Conjectures and refutations: The growth of scientific knowledge. New York : Basic Books.

Ravitz, J. (1997a). Evaluating learning networks: A special challenge for Web-based instruction? In Badrul Khan (Ed.), Web-based instruction. Englewood Cliffs, NJ: Educational Technology Publications, 361-368. Supplementary page: http://nsn.bbn.com/Ravitz/ipv.html

Ravitz, J. (1997b). Summary of First Year Evaluation Report for the Online Internet Institute. Proceedings of the Edward F. Kelly Evaluation Conference. SUNY, Albany, April 11, 1997.  [WWW Document]. Available: http://www.gse.uci.edu/Ravitz/oii_summary.html

Ravitz, J. & Snow, J. (1999). Constructivist-compatible teacher beliefs and practices: Prevalence and differences by instructional setting. Poster session presented at meetings of the American Educational Research Association, April 24, 1999, Montreal, Canada. [WWW Document]. Available: http://www.gse.uci.edu/Ravitz/constructivism/index.htm

Riel, M. and Harasim, L. (1994). Research Perspectives on Network Learning. Machine-Mediated Learning, 4 (2-3), 91-113.

Rogers, E. (1983). Diffusion of Innovations (3rd ed.). New York: Free Press.

Secretary's Commission on Achieving Necessary Skills (SCANS) (1991). What work requires of schools: A SCANS report for America 2000. Washington, DC: U.S. Department of Labor.

Schön, D. (1983). The reflective practitioner: How professionals think in action. New York: Basic Books.

Schön, D. (1987). Educating the reflective practitioner. San Francisco: Jossey-Bass.

Smith, N.L. (1983). The progress of educational evaluation: Rounding the first bends in the river. In G.F. Madaus, M. Scriven and D.L. Stufflebeam (Eds.), Evaluation models. Boston: Kluwer-Nijhoff Publishing, 381-392.

Stiggins, R. (1997). Student-centered classroom assessment (2nd ed.). Upper Saddle River, NJ: Merrill.

======================================================================================

Here is a clip from a talk I gave at NECC on this topic:

Building assessment into the design of online projects

http://www.gse.uci.edu/Ravitz/Necc98_assess/1.html

A variety of structures have been developed that are intended to help learners demonstrate learning outcomes as a result of their participation in online projects.  Collaborative tools that are beginning to proliferate can be integrated early in the learning process in support of project design and assessment -- e.g., before, during and after "instruction" takes place.

As learners work on a project, they can provide valuable information about their own thinking without which it is difficult (if not impossible) to promote collaboration, provide useful feedback, assess learning, or evaluate the benefits of project activities over time.

With its flexible participation structures, the Internet allows learners to pace their own activities, share information at opportune times, receive feedback and interact with others, all while building a record of their activities and what they have learned.

By establishing the expectation that assessment data will be collected and shared along the way, as part of the learning process, and by more effectively building this expectation into a project's design, one can provide greater opportunities for collaboration, reflective dialogue, and increased student awareness of their own learning processes.

Finally, by systematically collecting this information into online archives and portfolios, teachers and project designers will be better able to demonstrate the types of learning outcomes that justify the efforts made to participate in online projects.

*********************************************************

Jason Ravitz  <jravitz@uci.edu>      VOICE (949) 824-5850    

Education, Univ. of Calif., Irvine    FAX  (949) 824-2965   

2001 Berkeley Place Bldg                         

Irvine, CA 92697-5500     http://www.gse.uci.edu/Ravitz/home.html