This is a short but hopefully powerful post that underscores my points thus far about ChatGPT-3 being an excellent editing tool.

As many of you know, I dictate my stories. Dragon is fantastic, but it makes some really goofy errors sometimes, even when you speak clearly.

The following are actual sentences I wrote over the last 48 hours in which Dragon made mistakes that are obvious to the human eye, but that Word’s Editor, Grammarly, and ProWritingAid all missed.

I won’t put the actual corrected sentences here. Just the wrong ones. Copy them into Word (or your favorite writing app) and use your favorite spellchecker to see if it suggests the right edit. It will not.

The team at OpenAI has clearly figured something out with ChatGPT-3 that current app developers have not. ChatGPT-3 understands the context of a sentence and makes corrections based on that understanding. It understands proper nouns, sentence clauses, and so much more that, until now, has been exclusively human editor territory.

Let me put it this way: I haven’t seen ChatGPT-3 flag a false positive. Not once. Do I AGREE with all of its suggestions? No. But they’re not wrong, unlike current tools, which often recommend edits that are nonsensical (not going to name names, Word…).

Is it perfect? No. There are still errors in the text after ChatGPT-3 is done with it. But there will be fewer errors if you use it than if you don’t. I can prove it.

It’s just unfortunate that it is such a PITA to use right now. But that will change.

Read the sentences below and spot the errors. If something looks wrong to you, it is. And whatever you think the correct answer is, ChatGPT-3 nails it.

And if you want to see how ChatGPT-3 fixes these sentences, go to https://chat.openai.com/chat (you may have to create an account), paste the sentences in with the prompt “Edit this text for typos only: (paste)”, and see what happens.
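If you’d rather not babysit the browser tab, here’s a rough sketch of how the same workflow could be scripted against OpenAI’s regular GPT-3 API (the ChatGPT-3 model itself isn’t in the API yet, as Example #3 below mentions). Everything in this sketch is an assumption on my part: the openai Python package, the text-davinci-003 model, and the 500-word chunking are just one way to do it, not an official recipe.

# A minimal sketch, not a polished tool. Assumes: pip install openai (pre-1.0),
# an API key in the OPENAI_API_KEY environment variable, and access to a
# GPT-3 completions model. Model name and chunk size are my assumptions.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

PROMPT = "Edit this text for typos only:\n\n"

def edit_chunk(text):
    # Send one chunk of dictated text and return the model's edited version.
    response = openai.Completion.create(
        model="text-davinci-003",
        prompt=PROMPT + text,
        max_tokens=1024,
        temperature=0,  # we want corrections, not creativity
    )
    return response.choices[0].text.strip()

def edit_manuscript(paragraphs, words_per_chunk=500):
    # Group paragraphs into roughly 500-word chunks (about the size I paste
    # into the chat window) and edit them one at a time.
    chunk, count, edited = [], 0, []
    for para in paragraphs:
        chunk.append(para)
        count += len(para.split())
        if count >= words_per_chunk:
            edited.append(edit_chunk("\n\n".join(chunk)))
            chunk, count = [], 0
    if chunk:
        edited.append(edit_chunk("\n\n".join(chunk)))
    return "\n\n".join(edited)

Once OpenAI does bring ChatGPT-3 to its API, the only thing that should need to change in a sketch like this is the model name.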

To be fair to Grammarly, there are two errors in Example #3. Grammarly does indeed catch the first error…but not the second.

EXAMPLE #1: For example, there's a reason why tools like grammar way and providing aid still can't recognize the difference between the words to and to.

EXAMPLE #2: I ran the original sentences through Microsoft Word's Editor, grandmotherly, and ProWritingAid.

EXAMPLE #3: OpenAI has committed to bringing the program to its API, but for now, users have to copy and paste text and 500-word chunks into the software and weight around 30 to 45 seconds before ChatGPT-3 finish is announced. Then, users must be the finished text back into their writing app.

EXAMPLE #4: Maybe Danielle steel did something similar in one of her novels.

EXAMPLE #5: Perhaps John Grisham, Ken Follett, and James Patterson all do X, but only Nora Roberts does why.

Anyhoo, there’s your daily dose of tech. These sentences are real examples. Even if you don’t dictate, I’m sure you’ve got some real typo doozies lying somewhere in your manuscript waiting to be discovered. We all do.

It’s just a matter of time before this gets more attention. Just remember that you heard it here first.

Help a brother out and share this content.