Just a caveat: any training (fine-tuning, really) that you do on an LLM is NOT guaranteed to give you correct answers. The answers to your questions will seem plausible, and may well be correct, but you'll need some way to verify them before you act on them.
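For what it's worth, that verification step can start out very simple: keep a handful of questions whose answers you already know and spot-check the fine-tuned model against them before trusting it on anything else. Here's a rough sketch in Python; `ask_model` and the example questions are placeholders for whatever your setup actually looks like, not any particular API:

```python
def ask_model(question: str) -> str:
    # Placeholder -- replace with a real call to your fine-tuned model.
    return "I don't know"

# A few questions you can already answer without the model.
known_good = {
    "What port does our staging DB listen on?": "5433",
    "Which team owns the billing service?": "payments",
}

def spot_check() -> float:
    """Return the fraction of known-good answers the model gets right."""
    hits = 0
    for question, expected in known_good.items():
        answer = ask_model(question)
        # Naive check: does the known answer appear in the model's reply?
        if expected.lower() in answer.lower():
            hits += 1
    return hits / len(known_good)

if __name__ == "__main__":
    accuracy = spot_check()
    print(f"spot-check accuracy: {accuracy:.0%}")
    if accuracy < 1.0:
        print("don't act on the model's answers without a second source")
```

It's crude (substring matching, a tiny question set), but even something this basic will catch a model that sounds confident while being wrong.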