Editor’s Note: Today’s post is by Janet Salmons. Janet is a free-range scholar. She edits the Substack newsletter, When the Field is Online, and is the author of 12 books, most recently Doing Qualitative Research Online (2022).
Q: Rewrite with AI?
A: Why do you ask?
The blank page beckons, inviting me to write. But when I start to put thoughts into words, embedded AI features cajole me into allowing their invisible hands to rewrite it. What are websites, text messaging apps, and writing programs trying to tell me when they pose this question?
Your words aren’t good enough… It would be better if you used our words.
Your style isn’t right… It would be better if you used our style.
Your voice doesn’t merit attention… It would be better if you used our voice.
What you have to say isn’t right… It would be better if you said what we suggest.
The implicit message is less than subtle: use the words we tell you to use, in the style we tell you to use, to say what we tell you to say, in the voice we tell you to use.
Somehow in the past, through all sorts of hardship, with the most basic tools, human creativity came through and left legacies that still inspire us. Somehow Indigenous people were able to scratch their stories into cave walls. Somehow Shakespeare wrote sonnets and Beethoven wrote sonatas, with quills. Yet we are meant to believe that, in our time, basic literacy and human imagination are inadequate. We are meant to believe that we are not up to the simplest written task without AI assistance. Are we so easily fooled into giving up agency as creative, inquisitive humans?
What if people succumb to the persistent “rewrite with AI” query and use these tools to produce or revise their student or scholarly papers? The presence of hallucinations and the inclusion of fake citations are well known at this point. We are starting to see complaints about the “flattening” of contemporary writing. Scholarly Kitchen editor David Crotty asked: “What happens to innovation when everyone is using the same tool with the same biases that is essentially built to offer us more of the same stuff that we already like/know? Is this a recipe for a similar homogenization and stagnation of science and knowledge building in general?” Nina Begus found just that in a study that compared human- and AI-generated storytelling: “GPT-generated stories … were thematically homogeneous to an extent that they hardly differed from each other. … GPT stories are predictable in their plot and message” (p. 6). Perhaps one reason is that, as Michael Gerlich observed, “AI tools might inadvertently reinforce biases and limit exposure to diverse perspectives”. Yet those are the very perspectives we need as writers in a multidisciplinary, global community of scholars!
We represent something of ourselves in scholarly writing: where we are coming from, what we value and respect, what we think is important enough to merit the time and effort it takes to conduct research and put our thoughts into words. Ken Hyland observes that “academic writing… is an act of identity: it not only conveys disciplinary ‘content’ but also carries a representation of the writer”. The APA Publication Manual advises that “in describing your research, present the ideas and findings directly, but aim for an interesting and compelling style and a tone that reflects your involvement with the problem.” How can we speak about our research, in our own style, tone, and voice, if we are using AI-generated writing? The present risk is the loss of the individual’s unique voice, the product of their lived experiences and cultures. Is that what we want or need in scholarly publishing?
As AI is embedded into commonly used programs and platforms, and as it plies writers with ready-made answers, suggestions, or full-blown text, those of us who are committed to scholarly publishing face numerous challenges. Will emerging writers miss the opportunity to learn the skills necessary to develop ideas because it is easier to shortcut the process? Instead of thinking and rethinking with each iteration of a draft, they accept what AI gives them to say. Gerlich calls this “cognitive offloading,” relying on external agents to do your thinking, “delegating tasks [to AI] such as memory retention, decision-making, and information retrieval to external systems.” Will experienced writers, after their intellectual property has been taken for AI training without consent or compensation, be too demoralized to continue? This was a view expressed to me recently: “there is the existential angst…why bother writing at all?” Needless to say, scholarly publishing cannot operate without writers, and it cannot be a channel for disseminating breakthroughs and new knowledge without writers who can think critically and convey their insights to readers.
AI is not neutral. Whose voices will the algorithms exclude or ignore? The tech industry behind LLMs and AI has seemingly endless amounts of money to spend and, increasingly, political power and influence over the media. But as professors, mentors, writers, reviewers, and yes, bloggers, we have power too. We can encourage the use of tools that assist writers’ efficiency without taking away their unique voices. We can celebrate critical thinking, welcome underrepresented scholars, and encourage thoughtful, original scholarly writing so our colleagues and students feel empowered to say no to the “rewrite with AI?” query and put their own insights into their own words.
Discussion
7 Thoughts on "Guest Post: Finding Your Voice in a Ventriloquist’s World – AI and Writing"
Great post, Janet (and lovely to see your experienced voice on the Kitchen!) I’m somewhat in the opposite camp here, though, seeing opportunities for AI assistance in multiple ways, both for researchers and for those like me who write in numerous settings for publishers and societies. For instance, take Niki Scaplehorn’s recent article reflecting on feedback on engaging researchers with AI to support writing (TLDR: speed, and equity, especially where English isn’t their first language): https://www.researchinformation.info/analysis-opinion/can-ai-create-high-quality-publishable-research-articles/. Having heard first-hand from researchers and editors how these tools can level the playing field (and hearing from publishers how much doesn’t even make it past the first hurdle, where the basic problem is the way a paper is written rather than the research), it’s clear that there are gains to be made. As for those of us who write not as researchers but as part of the Schol Comms community, I defer to the fantastic Ethan Mollick, whose book, Co-Intelligence, sets out exactly some of the issues you describe here alongside some of the other opportunities for harnessing AI tools to our advantage, provided there is ALWAYS a human collaborating. I’m finding that I already fall into what he describes as the cyborg model, where AI and I work alongside one another to kick-start those sticky moments, or to provide suggestions on where my work can be improved. I don’t take AI’s contributions as the final, finished product, nor do I always agree with its suggestions, but it does mean I have a powerful assistant at my side. I strongly recommend Ethan’s book for some excellent discussion on this topic! Hope you’re keeping well, Janet!
Thanks Mithu! As a confident writer, you might find the cyborg model works for you. But for emerging writers, those still trying to develop their confidence, and for scholars whose voices have been ignored, I think these prompts can be demoralizing.
I am a fan of humans using technology, not the other way around! Let’s not let these tools take away our unique voices, suppress cultural nuances, and narrow the ways we share our experiences. Instead let’s affirm each other’s worth and celebrate the richness of our diverse stories.
Thank you for this post. Also worth reading is Shannon Vallor’s book, The AI Mirror. The central takeaway (for me at least) was the idea that large language models can only ever provide an image of past content. They cannot ever generate anything truly novel, because they are trained on historical content.
Thanks Gerry. You point out a real problem, and it seems to me that in our rapidly changing world we need new solutions and new understandings, and we need to welcome original insights. For those who teach, a question is: how do we encourage aspiring researchers to think outside the proverbial box? Or, shall we say, how do we encourage aspiring researchers to step outside the AI box? I wrote about ways to affirm originality in this series for Academic Writing Month:
https://tinyurl.com/4hm58vym
Originality and Academic Writing Month
Originality and Our Scholarly Voice
Communicate Your Insights
Encourage originality: Create a culture of inquiry in the classroom
As a publishing professional and as a writer (of scholarly work and fiction), I really appreciate this post. I too have wrung my hands over AI-generated writing, especially in creative fiction and narrative non-fiction. I don’t so much worry about digital missives in business correspondence, but the homogenization of storytelling (be it non-fiction or fiction, letter writing, and beyond) is a concern. “Voice” is critical to storytelling, for all the reasons (and more) mentioned in this post. Our perspectives, choice of words, pacing, what we choose to emphasize, etc., are based on our experiences and positionality, which makes our writing unique and serves to advance our humanity as we read and learn (from memoir, narrative non-fiction, fiction, biography, letters — digital or otherwise — from a friend, neighbor, or colleague describing an experience). And right now, we especially need Voice. Shameless plug alert: For those of you who love non-AI-generated upmarket/commercial/women’s fiction, my book, WEIGHT OF A WOMAN, is out this week with Odyssey Books. https://www.judithjacksonpomeroy.com/
Thanks Judith, and I will look for your book!
Agreed, “voice” is what makes writing interesting to read. A few years ago, when I was managing Sage Methodspace, we did a whole series about Indigenous research methods. The researchers who contributed emphasized the need to bring their cultures and communities into their scholarship. Would the power of their voices be conveyed if they were to “rewrite with AI”?
(See: https://tinyurl.com/2t4283h3)
Hi Janet, Thanks for this link. Very appreciated!