
KerfuffleV2 t1_jecbxy7 wrote

It's based on LLaMA, so it has basically the same problem as anything based on LLaMA. From the repo: "We plan to release the model weights by providing a version of delta weights that build on the original LLaMA weights, but we are still figuring out a proper way to do so." edit: Nevermind.

You will still probably need a way to get hold of the original LLaMA weights (which isn't the hardest thing...)
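For anyone unfamiliar with the "delta weights" approach the repo describes: the fine-tuned model is distributed as the element-wise difference from the base LLaMA weights, so you reconstruct it by adding the deltas back onto the base tensors you obtained yourself. A minimal sketch of the idea (the tensor names and values here are made up for illustration, not from the actual release):

```python
import numpy as np

# Hypothetical base LLaMA tensors (the ones you have to source yourself).
base = {"layer0.weight": np.array([0.10, -0.20, 0.30])}

# Hypothetical released "delta" tensors (fine-tuned minus base).
delta = {"layer0.weight": np.array([0.05, 0.05, -0.10])}

# Reconstruct the fine-tuned weights: element-wise addition per tensor.
merged = {name: base[name] + delta[name] for name in base}
```

Real merge scripts do the same per-tensor addition over the full checkpoint, just with torch tensors and some bookkeeping.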

−5

wind_dude t1_jecct1i wrote

Ahh, sorry, I was referring to the dataset pulled from ShareGPT that was used for fine-tuning. ShareGPT has disappeared since the media hype about Google using it for Bard.


Yes, the LLaMA weights are everywhere, including on HF in converted form for HF Transformers.

5