If answers need to be accurate, asking an LLM is risky, and the less expertise you have in the subject at hand, the riskier it is. Don't let LLMs tell you what to do; they can, however, be very useful for telling you what to try, provided you can easily and safely vet and verify their suggestions.
Modern-Day Oracles or Bullshit Machines, by Carl T. Bergstrom and Jevin D. West