Writing Tools for the Self-Directed Learner, Part 2

In part 1 of this four-part series, I introduced a free vocabulary tool that helps learners profile the words in their essays. This week, in part 2, the tool under review is a grammar checker called PaperRater.

PaperRater: Grammar checker

PaperRater is a web-based tool that checks your grammar and vocabulary usage. It is free to use, though it also offers a “premium service” at a cost, and it’s quite user friendly. After you click the “Use Now FREE!” button, you add a title, paste your essay into a blank template, select an education level (e.g. 12th grade, college, graduate school), select the type of paper (e.g. essay or research report), fill out a captcha, agree to the terms, and click “Get Report” at the bottom of the page. (See image below.)

PaperRater Template for adding essay.

There is an optional area to paste your works cited. The “originality detection” (i.e. plagiarism check) is set to “skip” by default. You can set it to “include” that feature; however, it will slow down the process.

For this review, I submitted one of my first-year students’ book reports, set the level of writing to “college (undergraduate)”, and did not include a plagiarism check.  It took less than 5 seconds to generate a report.  The UI is split between the submitted essay on the left-hand side and an itemized list of notes on the right-hand side (see figure below).  PaperRater uses the same color scheme as Microsoft Word to highlight grammar errors (green) and spelling mistakes (red).  Similarly, when you click on a highlighted area, you can choose from several alternatives.

PaperRater results screenshot.

In addition to grammar and spelling, the item list also evaluates word choice, style, and vocabulary. It provides a grade with a commentary and even offers the user the opportunity to print the results.  The “word choice” item checks for “inappropriate words and phrases”, supposedly based on the genre of writing.  It also provides a numerical score.  For example, the essay I submitted scored 3.386, with a comment in parentheses saying “lower is better”, and a follow-up comment reporting that the score was “above average”.  Unfortunately, the developers do not provide any in-depth explanation of what the numerical scores signify, saying only that they are derived from a combination of “statistics, machine learning and natural language processing”.

The “style” item shows a breakdown of word usage such as verb phrases, conjunctions, pronouns, and nominalizations.  The “vocabulary” item provides scores, a commentary on the quality of the words used in the submitted text, and some follow-up tips.  While the scoring is a bit vague, there is a link to a “vocab builder” in the comments section.  Unfortunately, the vocab builder is a bit disappointing, as it doesn’t offer the user the option to see their chosen words used in context.

Finally, the “grade” item gives both numerical and letter grades.  This item is followed by a very long disclaimer that, in essence, advises the user to weigh all the notes in the item list when looking at the grade (i.e. take it with a grain of salt).

I have used PaperRater with my first-year ESP (English for Specific Purposes) science students. They similarly remarked on the ambiguity of the grading scale but found some benefit in the grammar and style comments.  More notably, PaperRater’s comments raised their metacognitive awareness of aspects of their writing such as the use of pronouns and nominalizations.  Although PaperRater doesn’t provide good examples of collocations, it could be a good bridge to introducing a concordancer like COCA.  Interested in giving PaperRater a trial run in your classes? Let us know how it turns out if you do.
