Since 2013, the Virginia Tech Language and Culture Institute has been using iTEP Academic-Plus to help accurately place incoming students, administering the test upon the students’ arrival to get an instant baseline score of their English language proficiency. Testing and Assessment Coordinator Eric Moore says the relationship has been highly beneficial, praising iTEP as a user-friendly English test.
“The quickness of scoring is great, the database is great, and to be able to go in and locate a student’s test score right away is very valuable,” Moore says. “iTEP has really made things simple and very user-friendly.”
iTEP Academic-Plus is broken down into five segments: grammar, listening, reading, writing, and speaking. The first three parts of the test are multiple-choice, and are scored electronically in real-time. “The quick turnaround is really helpful,” Moore says. “I can log in and see if someone is done and go in and look at the scores right away to get an idea of where the incoming students are with grammar or listening.”
The speaking and writing sections of iTEP are graded by native English-speaking ESL professionals. Institutions can also opt to have their own staff grade the test if needed. Virginia Tech uses this option: during semester breaks, the same ESL-trained instructors who teach the university's courses come in to read the essays and listen to the speaking sections of the test, which Moore says gives them a good idea of the students' abilities before classes start. (See iTEP Business Development Manager Cerise Santoro's explanation of why iTEP uses human graders.)
Small company, big heart
A large institution like Virginia Tech needs a user-friendly English test that is also flexible. Before each semester, the university has to get all the incoming international students to take a placement exam, but managing the schedules of so many college students can be very difficult. The school values iTEP because of the personal, hands-on customer service it offers.
“We don’t have to go schedule something with iTEP and say we’d like to have a test on a certain day, and then have to jump through hoops if we want to add another test,” Moore says. “If I call iTEP, I get someone helpful right away. The ease of scheduling is great—if we have one in the morning, we can schedule another in the afternoon and it’s no problem.”
Every school, business, or organization that uses a test has its own unique set of circumstances and obstacles to overcome to get the best results from an English proficiency test. iTEP understands that every situation is different, and works with institutions to customize the exam to different settings or to test different skills.
For example, Virginia Tech found that a few essay topics for its placement test were appearing more frequently than others. Moore contacted iTEP, and the randomization of essay prompts was quickly improved. Each individual iTEP test is now assembled from a live, rotating "item bank" of thousands of questions that are served randomly, decreasing the chances of seeing the same question twice.
Troubleshooting made easy
It’s been said that excellent customer service helps strengthen your brand. At iTEP, we strive to offer our customers the best experience possible, and treat everyone with total respect. Moore repeatedly mentions iTEP’s excellent support team and how they’ve always been there, no matter the time, to help resolve problems. “When we’ve had issues, we’ve been able to reach a tech person with iTEP and been able to receive great customer service to help walk us through the issue,” he says. “The troubleshooting has always been handled very well.”
Created by education professionals
iTEP was founded in 2002 by two individuals with deep roots in the international education field. They wanted to create a user-friendly English test that addressed the needs of the international education community. The company wasn’t created as a business ploy, but as a true labor of love, something that Moore says makes it easy to believe in iTEP. “It’s nice knowing the background of a lot of the individuals that are a part of the organization,” he says. “What [iTEP Executive Vice President] Dan Lesho says, I trust. They have what’s important for the students in mind. Being smaller than other test companies, they have the ability to offer really tremendous support to any school or organization they work with.”
What makes an English assessment test effective?
English proficiency testing is crucial for educational institutions looking to admit qualified international students, and for companies that employ speakers of English as a second language. There are all sorts of English assessment tests out there, so what distinguishes a great English assessment test from a weak one? Here are a few things to look for when deciding on a test for your school or organization:
Tests all relevant skills
To get an overall picture of someone's English language abilities, it's important to test all of the language skills relevant to the test-taker's study or work. For many industries, a simple overview of a prospect's grammar skills is not enough. iTEP offers comprehensive exams that measure a test-taker's command of the English language both formally and informally, through the verbal and written communication that occurs naturally in the workplace and in the classroom.
The proliferation of smartphones and the internet has given rise to a number of quick online tests that purport to give a baseline measure of a person's English abilities. However, these tests typically don't evaluate skills in depth, and even those that do usually fail to measure speaking and writing, both crucial skills to include when evaluating job prospects or potential students. Testing both speaking and writing helps showcase a test-taker's command of voice and tone, the hardest things to master in written language.
The flagship iTEP exams, iTEP Academic, iTEP SLATE, and iTEP Business, all have five sections that assess speaking, writing, listening, reading, and grammar. The score reports are intricately detailed, allowing for data gathering that tracks even the smallest improvements in a test-taker's English proficiency. These reports are also very useful in helping identify areas that need more work.
Graded by man or machine?
Some English language tests seek to evaluate all language skills using artificial intelligence or non-native English speakers to grade the tests. Of course, there's no reason multiple-choice sections of an English test shouldn't be graded automatically and instantly. The difficulty arises in grading the active skills of speaking and writing, in which the test-taker generates original content. Grading these sections automatically with artificial intelligence would be fast and inexpensive, but our research has found there to be no substitute for ESL-trained native English speakers.
Grading is an extremely complex task. Proponents of automatic grading argue that it's more objective than human graders. To minimize subjectivity, iTEP graders go through "norming" exercises, a type of calibration in which all the graders score the same test and then compare and adjust their standards based on community consensus, grading history, and expected performance per question. This ensures that results are consistent whenever the test is administered. Someday, AI technology may advance to the point of providing accurate scores, but presently, only trained humans can reliably judge the intricacies and quirks that distinguish one level of English speaker from the next.
The test should speak for itself
The nature of an English assessment test demands that its structure be sufficiently intuitive that test takers can understand the questions without any extra explanation in their local language. All iTEP exams have a similar structure, a convenient administration procedure, and a standardized scoring rubric. Each type of question is formatted to be easily understood at first glance, even by a beginner English speaker.
Secure and convenient
Online English proficiency assessments are convenient, affordable, and accessible, but how do we know they are secure?
Naturally, the most secure environment in which to administer an English assessment test is a staffed test center. However, even in this setting, the top English tests on the market have seen imposters taking the test on behalf of others. iTEP's answer to this is a feature called FotoSure, software that makes cheating by impersonation virtually impossible. FotoSure snaps and stores digital photographs of the test-taker throughout the exam period. Institutions can match the photos with the student arriving on campus.
In addition, iTEP utilizes a live, rotating item-bank that serves test questions randomly.
Each individual test is assembled from hundreds of random questions, decreasing the chances of ever seeing the same question twice. iTEP graders also conduct plagiarism scans, check testing history, and analyze speaking samples for security breaches.
Not all settings require a maximally secure test. For placement purposes, for instance, intensive English programs often find it acceptable for test-takers to take iTEP on their home computers. When both the convenience of an at-home test and security are needed, iTEP has partnered with Examity to offer remote proctoring, in which both the test-taker and his or her screen are monitored via webcam throughout the exam.
Just the right amount of time
Reports show that anxiety among test takers, especially students, is on the rise. Taking a long, taxing English test can be exhausting for any non-native speaker, and this type of stress can skew results and have negative impacts on test takers. To help combat fatigue, iTEP conducted years of research and found that 90 minutes is the ideal test length: comprehensive but not unnecessarily long, while still collecting enough data to provide reliable, detailed scores.
Evaluates a range of levels
Perhaps the most crucial aspect of an effective English assessment test is that it can accurately evaluate the skills of a wide range of people. iTEP's exams are laid out so that even someone with a very minimal grasp of English can answer at least a few questions. The writing section is open-ended, giving fluent or near-fluent students the chance to flex their muscles and really show how much they know. The graders will recognize the use of complex structures, difficult verb tenses, and other language nuances.