Could you even tell the difference?

The words ‘ARTIFICIAL INTELLIGENCE’ are printed by an ancient typewriter.
Image by Markus Winkler from Unsplash

I recently read an article by Sudharshan Ravichandran encouraging writers to use the GPT-3 artificial intelligence to increase their writing output and sidestep the ever-looming writer’s block.

GPT-3, the Generative Pre-trained Transformer in its third generation, is described by its developers, OpenAI, like this: ‘Given any text prompt like a phrase or a sentence, GPT-3 returns a text completion in natural language. Developers can “program” GPT-3 by showing it just a few examples or “prompts.”’
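To make that quote concrete, here is a rough sketch of what few-shot ‘programming’ looks like in practice. This is not OpenAI’s actual API; the helper name and the prompt layout are my own illustration of how examples are stacked into a single text prompt that the model then completes.

```python
# Minimal sketch of few-shot prompting (illustrative only, not OpenAI's API).
# Each example pair is written out, then the new input is appended, and the
# model is asked to continue the text after the final "Output:".

def build_few_shot_prompt(examples, query):
    """Assemble a few-shot prompt from (input, completion) pairs plus a new input."""
    lines = []
    for text, completion in examples:
        lines.append(f"Input: {text}")
        lines.append(f"Output: {completion}")
    # The model's job is to fill in what comes after this final "Output:".
    lines.append(f"Input: {query}")
    lines.append("Output:")
    return "\n".join(lines)

examples = [
    ("The sky is", "blue."),
    ("Grass is", "green."),
]
prompt = build_few_shot_prompt(examples, "Snow is")
print(prompt)
```

The point is that nothing resembling traditional programming happens: the ‘program’ is just a handful of worked examples, and the model infers the pattern.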

The article led me to ask:

Q: Is collaborating with an AI on your writing a good idea?

A: Immediately after considering this question, I had the intuition that something was wrong with doing so. I took to Twitter and polled my followers: “Do you find anything wrong with authors using AI to generate ‘their’ work?” The choices were yes or no. After 24 hours and a modest four votes (sad, I know), three voted yes and one voted no. Apparently, others shared my intuition.

Now, however, after examination and introspection, I think collaborating with AI is a good idea.

Certainly, using AI to write the majority of your blog posts will not make you leaps and bounds better at grammar. But what if you aren’t trying to grow your writing skill? What if you just want to disseminate your ideas to the masses? Take Elon Musk, for example. It seems he hasn’t the time to write articles, or at least doesn’t care to find it. Perhaps if he fed his ideas to GPT-3, many brilliant posts could be published. How many other incredibly smart people don’t take the time to write and share their ideas? How many life-enhancing ideas and methods aren’t being shared?

But is it morally wrong in some way? Where does this initial intuition of negativity emerge?

Wrong for whom, Zach?

The Legacy Human author. A Legacy Human is one who has not integrated with AI, such as via a brain implant; an Enhanced Human is a Legacy Human who has. These distinctions come from Dr. Ben Goertzel, a prominent AI researcher.

To examine morality, let’s imagine what a Legacy Human’s (LH) article production process might be when collaborating with AI. For this example, the LH author has the intention of improving their writing skill, in general.

First, the Legacy Human author writes the entire blog post from scratch. This includes performing all the research and writing the headline, subheadline, topic sentences, and supporting paragraphs. Then, the LH feeds only the topic sentences to the AI, GPT-3 in this case, and asks it to write an entire post, with an accompanying headline, based on those sentences. Finally, much as an editor at any publishing house does, the LH synthesizes the two articles, sewing together the best bits from both authors.
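The middle step of that workflow can be sketched as a small prompt-building routine. The helper name and wording are hypothetical; the sketch only assembles the text the LH author would hand to GPT-3, not the model call itself.

```python
# Illustrative sketch of the drafting step: the LH author's topic sentences
# are folded into a single prompt asking the AI to expand each one.

def build_draft_prompt(topic_sentences):
    """Turn the author's topic sentences into one drafting prompt."""
    bullets = "\n".join(f"- {s}" for s in topic_sentences)
    return (
        "Write a blog post, including a headline, that expands each of "
        "these topic sentences into a full paragraph:\n" + bullets
    )

prompt = build_draft_prompt([
    "Collaboration has always powered human progress.",
    "AI is simply the newest collaborator.",
])
print(prompt)
```

The AI’s draft would then sit beside the human draft, and the editorial pass picks the best of each.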

Is there anything morally wrong with a full human stack proceeding in this way, that is, two LH authors plus an LH editor? I see no issue with this collaboration.

Then, what is the difference between the above and the stack: LH author + AI author + LH editor?

It seems to me that collaborating with our AI kin should be seen much like the full human stack: morally neutral, if not positive. I say positive because collaboration creates trust. If we don’t want the AI revolution to swallow us up, as many have foretold, would it not be beneficial to build a good relationship with our AI counterparts?

The full human stack editorial process often produces higher-quality articles than any single LH author. This is achieved both by doing away with the single point of failure, i.e., one LH author, and by the adage ‘Two minds are better than one.’ Breaking down the adage, what are the two minds if not two data sets curated by intelligent systems? When talking about AI algorithms, a general, yet caveat-ridden, rule is that more data means better algorithm performance. This is also why you don’t want to surround yourself with ‘yes men’. You want people to push back against your ideas when they suck. If your peers only agree and don’t share their data sets, are you truly collaborating?

Therefore, in the spirit of providing top quality to their audience, authors should not be derided for collaborating with AI.

Collaboration is a powerful tool for solving problems, and has played a big part in humanity’s survival. The old African proverb “If you want to go fast, go alone. If you want to go far, go together” comes to mind.

Humans and AI should seek common ground with one another. By working together, who knows how far we can go…