- cross-posted to:
- aboringdystopia@lemmy.world
No.
It is not.
Yeah, wotsisname’s Law of Headlines. If it ends in a question mark, the answer to the question is no.
Betteridge
Why not?
Because it too frequently gives plausible-sounding but completely unfounded statements.
It can also go wrong in darker ways, and all the extra checks and safeguards don’t always prevent it.
Why is this different from talking to a human?
Because a human can understand the situation, and the person they’re talking to, and reply with wisdom, rather than just parroting whatever sounds like what it has heard before.
Therapy isn’t about what the therapist says.
Some of it is, as I can personally attest. And well-dressed lies can certainly do a person much harm.
Link to Jacob Geller’s thoughts from a year ago. Not about ChatGPT, but I like his long-form stuff, and this loosely relates, maybe.
Yes, it’s a good idea, because it only costs $20/mo and it’s better than nothing.
Therapy is about the patient articulating things and the therapist reflecting.
Therapy is an excellent use of LLMs.
They use conversational AI too.