
Hrmbee OP t1_iydik85 wrote

>On Monday, OpenAI announced a new model in the GPT-3 family of AI-powered large language models, text-davinci-003, that reportedly improves on its predecessors by handling more complex instructions and producing longer-form content. Almost immediately, people discovered that it could also generate rhyming songs, limericks, and poetry at a level GPT-3 could not previously produce.
>
>...
>
>Introduced in 2020, GPT-3 gained renown for its ability to compose text in various styles at a similar level to a human, thanks to extensive training on text scraped from the Internet and data pulled from books. It uses statistical associations between learned word positions to predict the next best word in the sequence while reading from the prompt.
>
>Of course, generating poetry with a machine is hardly a new pastime. Even as far back as 1845, inventors have been crafting ways to write expressive verse through automation. But in particular, experts note that GPT-3's latest update feels like a step forward in complexity that comes from integrating knowledge about a wide variety of subjects and styles into one model that writes coherent text.
>
>Beyond poetry, GPT-3 still has its flaws, as some have examined in detail. While its factual accuracy has reportedly increased over time, it can still easily generate false information, limiting its applications. And GPT-3's short-term memory is generally limited to what you've recently fed it within a prompt. But when it comes to purely creative fictional output, GPT-3 hits the mark fairly well.
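The quoted bit about "statistical associations between learned word positions" predicting the next word can be illustrated with a toy bigram model — a drastic simplification of GPT-3's transformer architecture (the corpus and function names here are made up for illustration), but the same basic statistical idea:

```python
from collections import Counter, defaultdict

# Toy corpus standing in for the web-scale text GPT-3 trains on.
corpus = "the cat sat on the mat and the cat slept".split()

# Count how often each word follows each other word.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict_next(word):
    """Return the statistically most frequent successor of `word`."""
    return follows[word].most_common(1)[0][0]

print(predict_next("the"))  # "cat" follows "the" twice, "mat" only once
```

GPT-3 does this with billions of learned parameters and long-range context instead of raw pair counts, which is why it can stay coherent across whole poems rather than just one word ahead.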

It will be interesting to see how this technology will initially be (ab)used, especially by those in the commercial creative sectors.
