
While they certainly can do that, there are large chunks of workflows where hallucinations are rare to nonexistent. Even then, I find LLMs quite useful for asking questions in areas I'm not familiar with: the answers are easy to verify, and I get to the answer much quicker.

Spend some more time working with them and you might realize the value they offer.


