At Mentafy we have embarked on a large-scale test phase involving more than a dozen schools across Germany. This endeavor, engaging roughly 300-400 students, aims to transform the way young learners approach and improve their writing skills, particularly at a time when generative AI is becoming ever more prevalent.

Objectives and Expectations

Primarily, the test phase seeks to evaluate Mentafy's efficacy as a writing mentor, helping students navigate the complexities of producing quality academic work. We aim to test and refine the platform so that it meets the specific needs of students preparing their ‘Facharbeit’, the first academic paper written by 12th-year A-Level students in the German school system. These students typically dedicate about three months to their papers, making this period critical for their academic development. During the evaluation we will assess the tool’s functionality, usability, and overall impact on student learning.

In these test scenarios, Mentafy is more than a writing assistant. It is a guide that helps students take their first steps in academic writing, providing the tools and support needed to produce quality work. This involves tackling the challenges posed by generative AI technologies, which have added a layer of complexity to academic integrity and the authenticity of student work. The goal is a writing process that is more efficient, more intuitive, and better aligned with academic requirements.

Core Features of Mentafy

The platform is designed to streamline the writing process, making it less daunting for students. Mentafy aims to enhance the overall quality of students’ academic papers, ensuring they are well-equipped to tackle this significant challenge.

Mentafy stands out with features designed to aid both students and teachers. The platform includes a writing diary, offering students a structured way to document their writing process. Its project management tools help students organize and plan their writing projects, making extensive research and writing more manageable. One of the most crucial features is the writing recorder, designed to ensure correct citation and transparent use of AI, addressing the growing concern about academic integrity in the age of AI-driven writing aids.

Feedback and Problem Awareness

Currently, the test phase is in full swing, with feedback collection ongoing. While comprehensive feedback has yet to be gathered and analyzed, early indications point towards a positive impact on both students and teachers. Teachers, at the forefront of this technological integration, are particularly focused on ensuring that students not only use AI tools like Mentafy effectively but also maintain academic integrity.

One of the major problems highlighted by teachers is the difficulty in evaluating student work with the rise of generative AI technologies. Mentafy is being tested as a solution to ensure that while students leverage AI for their work, the authenticity and integrity of their submissions are maintained. This is key in a landscape where distinguishing between student-generated and AI-assisted work is becoming increasingly challenging.

Conclusion and Next Steps

As the test phase progresses, the onboarding of schools and registration of students continue to gain momentum. The enthusiastic participation of both is a testament to the perceived value of this initiative.

The results of this test phase will be examined in greater detail in a follow-up article, slated for April 2024. It will offer insight into Mentafy’s efficacy and its effects on education, especially with regard to incorporating AI into teaching and learning.
