Proceedings of the 2nd Annual JALT Pan-SIG Conference.   May 10-11, 2003. Kyoto, Japan: Kyoto Institute of Technology.
Testing communicative competence using mobile phone digital photos
 携帯電話の写真機能を使ったコミュニケーション能力の測定
by Steven E. Quasha    Sugiyama Women's University

Abstract

In this qualitative study, students capture their daily activities in the form of digital photo journals using mobile phones. For curriculum-related testing purposes, students were required to establish pragmatic competence skill objectives jointly with the instructor at the beginning of the course and to revise their individual goals throughout the semester. Three weeks before the end of the semester, communicative tests were conducted with randomly assigned partners and the results were recorded on audiotape. Next, the language partners reviewed the recorded conversations to identify and correct mistakes. Finally, reflective personal interviews and discussion of learning logs with the teacher concluded the study.

Keywords: communicative competence, mobile phones (keitai), sha-mail, highly personalized language content (HPLC), reflective learning, self-assessment (SA)

 本稿では、質的研究によって学生が携帯電話の電子写真日記(写メール)で日々の活動を記録する様子を分析する。
カリキュラムに関連したテスト目的のため学生はまずコース最初に担当講師と語用能力の目標を設定し、学期を通して
その修正を行なう。学期終了の3週間前に無作為に選ばれたパートナーを相手にコミュニカティブテストが行なわれ、それ
は録音される。次にパートナーが録音を聞きなおしてエラーを探し、直す。最後に教師とそれを反映した個人面談を
行い、学習記録を話し合うことにより学習は完了する。

キーワード:コミュニケーション能力、携帯電話、写メール、個人的言語内容、反映学習


Most L2 researchers tend to agree that a skill-oriented approach is ideal: one that places students in authentic situations where they create, critically analyze, make mistakes, repair them and, most importantly, learn to function in the target language. The reality is that many teachers at Japanese universities face large class sizes, so accomplishing the classroom activities proposed by skill-oriented theorists is dubious at best. This does not, however, mean that a more skill-oriented approach must be abandoned altogether. In fact, one way to pursue it is to find an object in students' lives that they value, enjoy and want to talk about. If educators remain aware of this factor and constantly seek out realia that students value, foreign language competence based on a communicative approach will enrich the learning process (Hurtado, 2000; Matsuta, 1998; Nunan, 1989).

[ p. 63 ]

Test Purpose

In Search of Authentic Materials

The implementation of authentic materials in the second language classroom has garnered considerable attention over the past decade. One key advantage of using materials such as photos, newspapers, advertisements and television commercials is that foreign language students are likely to encounter them during their overseas experiences. Thus, authentic EFL materials help ease students into a more English-oriented universe and away from the narrow world of EFL textbooks. In a preliminary survey for this research, students reported that when visiting English-speaking countries they usually do not attempt to read newspapers, nor do they examine advertisements. On a personal level, they felt unprepared to discuss information about their own families and personal details concerning their lives back in Japan.
While reading newspapers is certainly excellent for expanding vocabulary and watching television is superb for listening and cultural assimilation, both are quite passive as second language learning tools. Interaction is minimal, and realistically students could accomplish similar forms of academic training back in their own countries. Therefore, this research commenced with the hope that it would empower students to develop a communicative skill set for engaging in more authentic conversations. The medium for accomplishing this task came in the form of digitalized personal photos. One advantage of this approach is that even after the course finished, the photos could be taken overseas and used as topic activators while studying and living abroad. The students agreed that this activity would be a nice supplement to our communicative English class and might prove useful for future homestay participants. Prior to implementation, a needs analysis was conducted and revealed the following data.

Needs Analysis

Based on this information (see Appendix 1), it is apparent that most participants possessed a keen desire to continue visiting English-speaking countries. For this reason, they needed to expand their conversational and sociopragmatic skills. Beyond the course textbook, many teachers are searching for methods and materials that will help students perform better in L2 environments.
In-class EFL video and DVD activities are certainly viable authentic alternatives (Quasha, 2001). However, it is sometimes difficult to personalize the content to the extent that students want to go out and actually use the target language. While television programs can entertain, their usefulness in terms of vocabulary is not always evident. Therefore, educators need to reflect more on the learning process and ask themselves a number of guiding questions. These questions can serve as a starting point for developing students' second language and pragmatic ability to communicate effectively in English-speaking countries. The challenge is to find authentic materials that will assist us in this pedagogic process.

[ p. 64 ]

Test Items and Tasks

The Ultimate Authentic Tool


Here in Japan, there is one indispensable device for today's university student. No, it is not a pen, nor is it an electronic dictionary. It is the mobile phone, known in Japanese as keitai denwa, although this has been shortened to simply keitai, which means "to carry." Japan boasts one of the highest per-capita rates of mobile phone use in the world. Consequently, the keitai is ubiquitous, and using e-mail via the mobile phone is an integral part of most students' daily lives. Walk down any street in Japan or ride a subway and you will almost surely catch sight of someone rapidly thumbing away, as if playing a video game, on a mobile phone checking or sending mail. In fact, I have witnessed entire subway cars of people staring at their phone screens without the slightest regard for life outside their digital solitude. Recent technological advances have brought us sha-mail (photo mail), which enables users to take photos with a built-in camera located on the back of many mobile phones. Accordingly, it has become commonplace to exchange pictures and even video clips via the network.
Over the past few years, I have seen firsthand how essential this device has become to students. During class, students leave their phones on their desks no matter how often I disapprove of the practice; some even try to hide the phone under the desk and cleverly write e-mails to friends while the teacher is not looking. As a penalty, I began collecting students' phones and hoped this would serve as a deterrent against using them in class. Not a chance, since the keitai is a social addiction for so many. Besides, I felt more like a watchdog, having to play the role of disciplinarian collecting phones, and less like a teacher far too often. I wondered what could change this situation. I came to the conclusion that the phone is such an integral part of students' lives that, unless the entire university banned carrying them into classes, the keitai was here to stay.
With this in mind, I decided to find ways to make the mobile phone part of an EFL course. Slowly, as an in-class opening activity, I asked students to conduct simulated communication in English based on recent keitai conversations. This was fun and simple: they looked at the last few numbers they had dialed and had to discuss the content with classroom partners in English. This activity worked well for a semester or two, and then technology changed the rules on me. Along came mobile e-mail, and my once communicative students began to claim that they no longer have telephone conversations. Apparently, e-mail is much cheaper than voice, so its appeal among students on a limited budget is obvious. This led to having students discuss recent e-mail exchanges, which frankly was something many of them were loath to do. Their written words were not something many students felt at liberty to share with one another. Fortunately, along came sha-mail, and this has opened up a new venue for my teaching and for student learning.
Sha-mail permits students to capture pictures of their daily lives. These pictures can then become photo logs that serve as personal narratives: where they live, who they see and meet, where they go or hang out, and where they work part-time. In my teaching experience, obtaining this amount of personal information outside of written journals can be like pulling teeth. Most students are unwilling to open up and feel threatened when having to talk about themselves. Yet using photo journals seemed to place students in a comfortable position to openly discuss and exchange this information on their own terms. This may be because today's students were brought up in an era when puri-kura (print club) photo-booth pictures were all the rage. Therefore, they feel at ease making small talk and asking each other questions about personal photos. This step, where the language content becomes highly personal, was the level I was trying to achieve.

Test Administration

By the tenth week of the course, students have used photo journals in class every week for approximately 15-20 minutes. They have submitted weekly photo learning journals that are checked and graded (see Appendix 2). Participants have a set of ten pictures that they can expound upon and are confident about discussing with others. In-class practice has included changing partners, so students have received a variety of questions concerning their photos from at least four people. They have also learned how to ask detailed questions in English to enhance communication. Through these task-based activities, we have explored and practiced how to keep conversations going and how to end them, hedging strategies, and ways to negotiate meaning. Personally, I believe teaching these techniques with student-generated material, namely digital photos, is far more effective than anything offered in a textbook.
The test was administered during week ten of the course, and students were encouraged beforehand to practice asking each other many questions concerning their photos. At the beginning of class, names were drawn out of a hat, like a raffle or lottery, to determine the pairings for the test. Each draw was conducted only after the previous group had finished, which eliminated any chance for pairs to rehearse together before the oral test. For the actual test, each pair was asked to enter another classroom and, under the observation of the teacher, who gave brief instructions and only kept notes, the ten-minute test was recorded on audiotape.
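
For readers who wish to replicate the pairing procedure, the short Python sketch below simulates the raffle-style draw. It is a minimal illustration rather than part of the original study: the roster names are placeholders, and the optional seed value exists only so the example output is repeatable.

    import random

    def draw_test_pairs(roster, seed=None):
        """Simulate the raffle draw: shuffle the class roster and pair students off."""
        names = list(roster)
        rng = random.Random(seed)      # a fixed seed makes the example repeatable
        rng.shuffle(names)
        # If the roster is odd-numbered, the final group will contain one student.
        return [tuple(names[i:i + 2]) for i in range(0, len(names), 2)]

    # Placeholder roster; in the actual course the names were literally drawn from a hat.
    roster = ["Student %d" % n for n in range(1, 15)]   # fourteen students = seven pairs
    for pair in draw_test_pairs(roster, seed=2003):
        print(pair)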

[ p. 65 ]

Scoring and Rating

Student competence tests were loosely based on Bachman and Palmer's scales of ability in vocabulary and cohesion (Brindley, 1989). The main difference here is that an additional level covering an extensive vocabulary and excellent cohesion has been left out. The reason is that my students are not English majors and simply do not have the skill to perform at that level. Many of the participants have an inferiority complex when speaking English and continuously ask the teacher for confirmation. Consequently, I saw no reason to include such an unattainable goal for this group.

Level 1 (grade D)
    Vocabulary: Extremely limited vocabulary. A few words and formulaic phrases; not able to sustain discussions.
    Cohesion: No cohesion. Utterances completely disjointed, or discourse too short to judge.

Level 2 (grade C)
    Vocabulary: Small vocabulary. Difficulty in talking with the other examinee(s) because of vocabulary limitations.
    Cohesion: Very little cohesion. Relationship between utterances not adequately marked; frequently confuses ideas.

Level 3 (grade B)
    Vocabulary: Vocabulary of moderate size. Frequently misses or searches for words.
    Cohesion: Moderate cohesion. Relationship between utterances generally marked; sometimes confuses ideas.

Level 4 (grade A)
    Vocabulary: Large vocabulary. Seldom misses or searches for words.
    Cohesion: Good cohesion. Relationship between utterances well-marked.

Table 1: Evaluation rating scale criteria for student photo discussions

Next, a grade was assigned to each level of the scale, ranging from D at the top to A at the bottom. Thus, an "A" student would possess a large vocabulary and have good cohesion. For statistical purposes, these letter grades were converted to numerical scores to help determine the standard deviation. The following system was used:
	   'D' grade = 55-69   	'C' grade = 70-79   	'B' grade = 80-89   	'A' grade = 90-100
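
As a rough illustration of this conversion, the Python sketch below maps letter ratings to numbers and back. It is an assumption-laden example rather than the study's actual procedure: the paper gives the numeric bands but not the exact number assigned to each letter rating, so the band midpoints used here are placeholder choices.

    # Hypothetical helper for the letter-to-number conversion described above.
    # The numeric bands come from the text ('D' = 55-69, 'C' = 70-79, 'B' = 80-89,
    # 'A' = 90-100), but the exact number assigned to each letter rating is not
    # specified in the paper, so the band midpoints below are an assumption.
    GRADE_BANDS = {"D": (55, 69), "C": (70, 79), "B": (80, 89), "A": (90, 100)}

    def letter_to_number(letter):
        """Return the midpoint of the numeric band for a letter grade (assumed conversion)."""
        low, high = GRADE_BANDS[letter]
        return (low + high) / 2.0

    def number_to_letter(score):
        """Map a numeric score back to its letter band."""
        for letter, (low, high) in GRADE_BANDS.items():
            if low <= score <= high:
                return letter
        raise ValueError("Score %s falls outside the D-A bands" % score)

    # Example: average twelve peer ratings expressed as letters.
    peer_ratings = ["B", "C", "B", "A", "C", "B", "C", "B", "B", "C", "A", "B"]
    mean_score = sum(letter_to_number(r) for r in peer_ratings) / len(peer_ratings)
    print(mean_score, number_to_letter(mean_score))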

Since each student graded all classmates except themselves, each examinee's score was based on N = 12 peer ratings. The mean of the student-generated scores was 71.54. Comparatively, the mean of the teacher's scores for the students' cumulative projects was 74.62. The standard deviation of the participants' scores was then calculated as shown in Table 2:

STUDENT    SCORE (X)    MEAN (M)    X - M    (X - M) SQUARED
1          78.75        71.54        7.21     51.98
2          62.50        71.54       -9.04     81.72
3          68.75        71.54       -2.79      7.78
4          65.00        71.54       -6.54     42.77
5          66.25        71.54       -5.29     27.98
6          70.00        71.54       -1.54      2.37
7          71.25        71.54       -0.29      0.08
8          82.50        71.54       10.96    120.12
9          75.00        71.54        3.46     11.97
10         71.25        71.54       -0.29      0.08
11         83.75        71.54       12.21    149.00
12         67.50        71.54       -4.04     16.32
13         67.50        71.54       -4.04     16.32
SUM:       930.00                   -0.02    528.57
TOTAL MEAN = 71.54                           S.D. = 6.64

Table 2. Student-generated mean & standard deviation scores
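
As an arithmetic check on Table 2, the Python sketch below recomputes the mean and standard deviation from the thirteen student-generated scores listed above. The reported S.D. of 6.64 appears to correspond to the sample formula (dividing the summed squared deviations by n - 1 = 12), whereas the 6.92 reported for the teacher ratings in Table 3 appears to correspond to the population formula (dividing by n = 13), so both versions are printed for comparison.

    import math

    # Student-generated mean scores copied from Table 2.
    scores = [78.75, 62.5, 68.75, 65, 66.25, 70, 71.25,
              82.5, 75, 71.25, 83.75, 67.5, 67.5]

    n = len(scores)                                   # 13 examinees
    mean = sum(scores) / n                            # about 71.54, as reported
    squared_devs = [(x - mean) ** 2 for x in scores]  # the (X - M) squared column

    sd_population = math.sqrt(sum(squared_devs) / n)        # divide by n:   about 6.38
    sd_sample = math.sqrt(sum(squared_devs) / (n - 1))      # divide by n-1: about 6.64

    print("mean = %.2f" % mean)
    print("population S.D. = %.2f" % sd_population)
    print("sample S.D. = %.2f" % sd_sample)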

[ p. 66 ]

Table 3 specifies how the mean score and standard deviation of the teacher ratings were calculated.

STUDENT    SCORE (X)    MEAN (M)    X - M    (X - M) SQUARED
1          85           74.62       10.38    107.74
2          70           74.62       -4.62     21.34
3          70           74.62       -4.62     21.34
4          70           74.62       -4.62     21.34
5          70           74.62       -4.62     21.34
6          70           74.62       -4.62     21.34
7          70           74.62       -4.62     21.34
8          85           74.62       10.38    107.74
9          85           74.62       10.38    107.74
10         70           74.62       -4.62     21.34
11         85           74.62       10.38    107.74
12         70           74.62       -4.62     21.34
13         70           74.62       -4.62     21.34
SUM:       970                      -0.06    623.02
TOTAL MEAN = 74.62                           S.D. = 6.92

Table 3. Teacher-generated mean & standard deviation scores

One way this test differs from many other oral tests is that students evaluate each other. In class, the teacher demonstrated the scoring process with detailed examples for each performance level. The rater training process was quite time-consuming because students needed additional simulated examples to build confidence in their rating abilities. Overall, the student graders produced a mean score of 71.54, while the teacher's grades reflected a slightly higher mean of 74.62.
The small difference between the two means suggested that the students valued their role as raters. This proved to be an important departure point for promoting self-responsibility in their evaluation of EFL. None of them had ever experienced peer rating before, and most seemed to relish the opportunity. The purpose of self-assessment (SA) and its rationale in L2 instruction are fittingly summed up in the learner-centered approach described below.

In my own teaching, I have asked students to take more self-responsibility for their learning. Yet this never materialized to the extent that I intended. More often than not, I was disappointed and perplexed about what went wrong. This learner-centered grading approach, in which the students actually determine the scores and set the standards, is more empowering for learners because they are completely involved in the process. A positive side effect was that students prepared more for the actual test, knowing that not only their teacher but also their peers were going to judge their oral production.

[ p. 67 ]

Reporting Results

Student Logs

Self-assessment (SA) can be used for different purposes, ranging from placement testing to the diagnosis of individual problems. It is this latter function that serves as the focal point for this section. Having students submit weekly learning logs based on the lesson and their progress helps diagnose individual problems and establishes a line of communication with the teacher that assists the learning process. Moreover, the logs were the conduit for revising individual students' goals throughout the semester and were responsible for the evolution of the course curriculum.
Logs fall under one of the many alternative assessment methods delineated by Brown and Hudson (1998) that encourage real-world contexts and higher-level thinking. For our class, the learner logs served as reflective notes for the students and were used each week. It was the instructor's hope that, at the very least, the logs would serve as effective study guides for the final oral exam.

Audio Review & Grading

Once the testing was complete for all fourteen students (seven pairs), the grading process began the following week. Test partners were asked to evaluate two other groups in the teacher's office during lunchtime. This did not take away from class time, and the students did not seem to mind. For the student graders, it usually took fifteen minutes to assess another group, so no more than thirty minutes for two. The names of the other group members were kept anonymous; instead, numbers were displayed on each tape and grading sheet. Most students, however, quickly recognized the voices of their classmates. Obviously, the problem is that students have certain preconceived notions of others' abilities based on classroom observation. This can produce positive or negative bias in the grading process, although, much to my surprise, the students were stricter in their assessments than I was. Given an opportunity to do this again, I think having two classes grade each other might produce a more valid grading system.
One particular advantage of implementing learner-centered grading was that students were the primary decision-makers when evaluating the recorded conversations. Recent work has shown that conversation analysis (CA) can be a formidable tool for validating L2 oral tests (Lazaraton, 2002). Having students transcribe and analyze their conversations is a rigorous, inductive approach, and one that puts them in a position to learn more about their own EFL idiosyncrasies.

Test Taker Feedback

The week following the test, pairs were asked to listen to their own audiotape and write down any mistakes or questions they had. Again, the learner log was used to report this information, and it was submitted to the teacher for review. At the beginning of the next lesson, I was able to open the class by reviewing some common errors on the board and answering a few of the student-generated questions.
During the final week of instruction, students met individually with the teacher to discuss their learner logs, the oral tests, and what they had learned from the course. This reflective element gave students the chance to think critically about the course content and to explore learner autonomy for future EFL study. They also submitted a questionnaire, which revealed that students overall enjoyed the digital photo testing component of the course (see Appendix 3).
Based on this information, the mean score of 1.92 seemed to indicate that students approved of the learning methodology and found it useful.
Oral interviews revealed that, compared to other classes, eleven participants felt this particular course provided them with many more opportunities to speak about and expand on their lives. This positive feedback led me to coin the acronym HPLC, which stands for highly personalized language content. The focus of the language learning is based strictly on the lives and activities of the students and not on fictitious characters in a textbook. HPLC is crucial in the learning process because it puts students into a cycle where the target language is very real and they have much more at stake. Three participants did complain that the extra work of taking pictures seemed fun at first; however, as their semesters got busier, it became more of a burden. All three were also working more part-time hours than their classmates.
One of the problems with any form of testing is the potential washback it can produce for the examinees. This effect on teaching and learning can be either negative or positive (McNamara, 2000; Hughes, 1989). For the students in this study, the test's impact was shaped by its familiarity: they had practiced most of the material and knew what to expect by the time the test was administered. If there was any flaw in this approach, it was that validity was weakened because the test was a bit too predictable. Aside from this, the washback was largely positive, as students walked away from the course more empowered in communicative English.

[ p. 68 ]

Conclusion

As EFL teachers, we need to stay abreast of what is important to our students' linguistic and sociopragmatic development. We also cannot lose sight of the things that are important to them as individual language learners or, for that matter, of what is important in their lives. The mobile phone, or keitai, is ubiquitous, and establishing sha-mail as an in-class activity or for testing purposes requires very little technical preparation. In fact, students will likely know more about its use and functions than the teacher, so most instruction can remain focused on the EFL teaching objectives.
Students without sha-mail can borrow a friend's phone or use the family digital camera. If one is not available, most universities have cameras that students can borrow from the audio-visual center. After all, if a picture is worth a thousand words, think how much more we have given our students to talk about. The outside assignments can be printed as a photo collage and taken along to a homestay program; photos saved on a Sony Memory Stick serve the same purpose. What better way to break the ice and introduce a student to their overseas host family?
Based on my observations in comparison with other oral communication classes, digital photo testing pushes students to develop the skills needed to explain personal information. This empowers learners, promotes self-responsibility and, most importantly, has real-world applications.

References

Alderson, J. (1981). Report of the discussion on communicative language testing in J.C. Alderson & A. Hughes (Eds.), Issues in language testing. ELT Documents 111. London: The British Council.

Bachman, L. (1990). Fundamental considerations in language testing. London: Oxford University Press.

Bachman, L., & Palmer, A. (1996). Language testing in practice. Oxford: Oxford University Press.

Bachman, L. (2000). Modern language testing at the turn of the century: Assuring that what we count counts. Language Testing, 17, (1), 1-42.

Brindley, G. (1989). Assessing achievement in the learner-centred curriculum (pp. 56-84). Sydney: National Centre for English Language Teaching and Research, Macquarie University.

Brindley, G. (1991). Defining language ability: The criteria for criteria. In S. Anivan (Ed.), Current developments in language testing. Singapore: Regional Language Centre.

Brown, J., & Hudson, T. (1998). The alternatives in language assessment. TESOL Quarterly, 32, (4), 653-675.

Brown, J., & Hudson, T. (2001). Criterion-referenced testing. Cambridge: Cambridge University Press.

Fulcher, G. (2000). The communicative legacy in language testing. System, 28, 483-497.

Griffiths, G., & Keohane, K. (2000). Personalizing language learning. Cambridge: Cambridge University Press.

Hoelker, J. (2000, November). Kolb applied to three EFL classrooms. The Language Teacher, 24, (11), 13-21.

Hudson, T. (2001). Indicators for pragmatic instruction: Some quantitative tools. In K. Rose & G. Kasper (Eds.), Pragmatics in language teaching. Cambridge: Cambridge University Press.

Hughes, A. (1989). Testing for language teachers. Cambridge: Cambridge University Press.

Hurtado, O. (2000). Empowering existing materials and realia to reinforce language learning skills development. Presentation at the British Council Colombia National ELT Conference.

Kasper, G. (2001). Classroom research on interlanguage pragmatics. In K. Rose & G. Kasper (Eds.), Pragmatics in language teaching. Cambridge: Cambridge University Press.

Lazaraton, A. (2002). A qualitative approach to the validation of oral tests. Cambridge: Cambridge University Press.

Lewkowicz, J. (2000). Authenticity in language testing. Language Testing, 17, (1), 43-64.

Matsuta, K. (1998). Applications for using authentic materials in the second language classroom. CELE Journal, 6, 41-44.

McNamara, T. (2000). Language testing. London: Oxford University Press.

Newfields, T. (2003). Voices in the field: An interview with J.D. Brown. JALT Testing & Evaluation SIG Newsletter, 7, (1), 10-13.

Nunan, D. (1989). Designing tasks for the communicative classroom. Cambridge: Cambridge University Press.

Pienemann, M., Johnson, J. & Brindley, G. (1988). Constructing an acquisition-based procedure for language assessment. Studies in Second Language Acquisition, 10, 217-43.

Quasha, S. (2001). DVD subtitles: A dynamic approach to code-switching. In Proceedings of the November 23-25, 2001 JALT Conference in Kita-Kyushu, Japan. Tokyo: Japan Association for Language Teaching.

Realia Project. (2003). About REALIA Project. Retrieved August 5, 2003, from www.realiaproject.org/realia_pages/about.html

Shohamy, E. (1996). Competence and performance in language testing. In G. Brown, K. Malmkjaer & J. Williams (Eds.), Performance & competence in second language acquisition. Cambridge: Cambridge University Press.

Spolsky, B. (1985). What does it mean to know how to use a language? An essay on the theoretical basis of language testing. Language Testing, 2, 180-191.

Weir, C.J. (1990). Communicative language testing. London: Prentice Hall.



[ p. 69 ]
