AI can make you more creative—but it has limits

The findings make sense, given that people who are already creative don’t really need to use AI to be creative, says Tuhin Chakrabarty, a computer science researcher at Columbia University who specializes in AI and creativity but wasn’t involved in the study.

There are also potential drawbacks to relying on the model’s help. AI-generated stories are similar across the board in terms of semantics and content, Chakrabarty says, and AI-generated writing is full of telltale giveaways, such as very long, exposition-heavy sentences laden with stereotypes.

“These kinds of idiosyncrasies probably also reduce the overall creativity,” he says. “Good writing is all about showing, not telling. AI is always telling.”

Because stories generated by AI models can only draw from the data that those models have been trained on, those produced in the study were less distinctive than the ideas the human participants came up with entirely on their own. If the publishing industry were to embrace generative AI, the books we read could become more homogenous, because they would all be produced by models trained on the same corpus.

This is why it’s essential to study what AI models can and, crucially, can’t do well as we grapple with what the rapidly evolving technology means for society and the economy, says Oliver Hauser, a professor at the University of Exeter Business School, another coauthor of the study. “Just because technology can be transformative, it doesn’t mean it will be,” he says.
