Mentafy offers a new and reliable method to confirm original student writing

Our unique method

Chronological text composition

Mentafy's writing pattern analysis requires just one additional piece of information per word: when it entered the text. To gather this data, Mentafy looks at the incremental differences between auto-save versions of a text document (resolution: roughly one auto-save version per minute). Each word that is changed or added in a new version is tagged with the corresponding version number; deleted words are removed, while moved (or re-inserted) words keep their original version number.
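Mentafy's actual implementation is not public, but the core idea can be sketched with a word-level diff between successive auto-save snapshots. The following Python sketch (using the standard-library `difflib`; function names and the simplified word tokenization are illustrative assumptions) tags each word of the final text with the version in which it first appeared. Note that this simple sketch does not detect moved words, which the real system additionally handles.

```python
# Illustrative sketch only: tag each word with the auto-save version in
# which it first appeared, by diffing successive snapshots word by word.
from difflib import SequenceMatcher

def tag_versions(snapshots):
    """Return (word, version) pairs for the final snapshot.

    Words that survive unchanged between versions keep the version
    number of the snapshot in which they first appeared; new or
    replaced words get the current version number.
    """
    tagged = []  # (word, version) pairs for the current snapshot
    for version, text in enumerate(snapshots, start=1):
        words = text.split()  # naive tokenization, for illustration
        old_words = [w for w, _ in tagged]
        matcher = SequenceMatcher(a=old_words, b=words, autojunk=False)
        new_tagged = []
        for op, i1, i2, j1, j2 in matcher.get_opcodes():
            if op == "equal":
                new_tagged.extend(tagged[i1:i2])  # keep original version
            elif op in ("replace", "insert"):
                new_tagged.extend((w, version) for w in words[j1:j2])
            # "delete": dropped words are simply not carried over
        tagged = new_tagged
    return tagged

snapshots = [
    "AI texts are common",
    "AI generated texts are very common now",
]
tagged = tag_versions(snapshots)  # e.g. ("generated", 2): added in version 2
```

In this example the words surviving from the first snapshot keep version 1, while "generated", "very", and "now" are tagged with version 2.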

Activity time series

Mentafy's activity pattern analysis looks at the author's actions over time. Actions include the number of inserted, moved, changed, or deleted words per version (resolution: roughly one auto-save version per minute). A genuine writing process shows a shifting mix of these actions, depending on the writing phase.
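Such an activity time series can be sketched by counting word-level edit operations between successive auto-save versions. This is an illustrative assumption about the approach, not Mentafy's actual code, and it omits move detection for brevity:

```python
# Illustrative sketch: per-version counts of inserted, deleted, and
# changed words, derived from word-level diffs of auto-save snapshots.
from difflib import SequenceMatcher

def activity_series(snapshots):
    """Return one dict of edit-action counts per auto-save version."""
    series = []
    prev = []
    for text in snapshots:
        words = text.split()
        counts = {"inserted": 0, "deleted": 0, "changed": 0}
        matcher = SequenceMatcher(a=prev, b=words, autojunk=False)
        for op, i1, i2, j1, j2 in matcher.get_opcodes():
            if op == "insert":
                counts["inserted"] += j2 - j1
            elif op == "delete":
                counts["deleted"] += i2 - i1
            elif op == "replace":
                counts["changed"] += max(i2 - i1, j2 - j1)
        series.append(counts)
        prev = words
    return series
```

Plotting these counts over the version index yields the activity time series: drafting phases show mostly insertions, revision phases a mix of changes and deletions, while a paste shows a single large insertion spike.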

Student writing patterns are special

Focusing on the final text no longer works, as AI-generated texts become increasingly human-like.
Mentafy instead tracks the writing process, without invading privacy: the system analyses only the text changes made during writing.
When the work is submitted, it is clear for each text section when it was added to the work and how it was revised.
The results from our pilot are clear:
The writing patterns of pupils and students reflect the development of their thoughts and texts, and are clearly distinct from those of a chatbot, copy-typing, or copy & paste.

Student Support

  • Minimal changes to the workflow allow fast adoption
  • Limited data collection alleviates privacy concerns
  • Revolutionary real-time writing project support

Instructor Insights

  • Transparency about plagiarism and AI usage
  • Student comments on unusual text passages prevent misunderstandings
  • Minimizing overhead with concise reports

Can the process be cheated?

No algorithm is perfect, but all obvious ways to cheat the system are blocked, and the remaining workarounds are very time-intensive: we aim to make cheating at least as hard as writing the text properly yourself. The message is simple: cheating does not pay off!
We monitor all documents in the folder and recognize text moved between them. If other text is pasted or manually copy-typed, we detect that as well.
In the future, cheaters might use scripts that copy text over word by word in a randomized order; but the data we have collected so far shows that students do not revise randomly, they follow language structure and meaningful units. We will therefore train machine-learning models to recognize such advanced cheating patterns too. For collaborating authors, we can also distinguish individual contributions, provided each author accesses the file with a separate account.
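One simple detection signal mentioned above, a paste, can be sketched as follows: a large contiguous run of words all first appearing in the same auto-save version is a likely paste. This heuristic, the function name, and the threshold are illustrative assumptions, not Mentafy's actual detection logic:

```python
# Illustrative paste-detection heuristic: flag any auto-save version that
# contributed an unusually long contiguous run of new words at once.
def flag_paste_versions(tagged_words, threshold=30):
    """tagged_words: (word, version) pairs in final-text order.

    Return the sorted version numbers containing a contiguous run of
    at least `threshold` words that all entered in that version.
    """
    flagged = set()
    run_version, run_length = None, 0
    for _, version in tagged_words:
        if version == run_version:
            run_length += 1
        else:
            run_version, run_length = version, 1
        if run_length >= threshold:
            flagged.add(version)
    return sorted(flagged)
```

Genuine typing spreads new words across many versions in small increments, so it stays well below any such threshold, whereas pasting a paragraph lands dozens of new words in a single version.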
Do you get false positives?
Very rarely: the system automatically recognizes many straightforward cases such as citations, and students can categorize special cases themselves (e.g. text they wrote elsewhere, such as a transcribed audio recording).
Evidence is always available, and in contentious cases a human can review the document's evolution.