“People simply do not understand what they are missing.”

I think you can say this about people who haven't learned to write well. To them, ChatGPT seems like a great thing because they can get it to excrete a content-like substance that reads well, and they can call it writing.

Prompting ChatGPT to produce a text lets you move faster when you have an idea and want to cut to the finish.

The problem is that part of writing is starting with an idea, thinking about how to express it, and then realizing that your initial idea might be flawed or need work.

Sure, ChatGPT seems fine for spitting out web copy or anything else where the role of the writer isn't to examine an idea, but just to get from start to done as quickly as possible.

But I think he’s talking about the types of writing that involve wrestling with an idea and, for example, trying to persuade others. ChatGPT will happily spit out copy in that form. But it won’t trip over a paragraph and say “hey, shit, now that I write this out, I think it’s wrong or not well-expressed.”

People bemoan co-workers who coast by on people skills and bullshit over people who have expertise and do the work. ChatGPT is likely to be similar - helping people who can’t write (express or work with ideas effectively) sound authoritative and generate more content that they don’t really even understand.



Basically: another active thinking activity goes passive. As always with society-level shifts, the fallout will only become apparent after it has happened.



