Submitted by Zetsu-Eiyu-O t3_10q45pr in MachineLearning
Is it possible to finetune a generative model (like T5) to do something like this:
{
inputs: "XYZ <eot> XYZ was born in ABC. They now live in DEF.",
targets: "XYZ <t> born in <t> ABC <f> XYZ <t> lives in <t> DEF"
}
Like the transformer model from this paper.
If so, how should I go about approaching the problem?
Is this task as simple as feeding the model the inputs and targets, or do you think there is more to it?
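If you do go the fine-tuning route, most of the prep work is serializing your fact triples into the flat target string and parsing the model's output back. A minimal sketch of that encoding/decoding, assuming (as in the example above) that `<t>` separates the fields of a fact and `<f>` separates facts — the helper names here are illustrative, not from any library:

```python
# Serialize fact triples to/from the flat target format used above:
# fields joined by " <t> ", separate facts joined by " <f> ".

def triples_to_target(triples):
    """[("XYZ", "born in", "ABC"), ...] -> 'XYZ <t> born in <t> ABC <f> ...'"""
    return " <f> ".join(" <t> ".join(fields) for fields in triples)

def target_to_triples(target):
    """Inverse of triples_to_target: parse a generated string back to triples."""
    return [tuple(fact.split(" <t> ")) for fact in target.split(" <f> ")]

triples = [("XYZ", "born in", "ABC"), ("XYZ", "lives in", "DEF")]
target = triples_to_target(triples)
# target == "XYZ <t> born in <t> ABC <f> XYZ <t> lives in <t> DEF"
assert target_to_triples(target) == triples
```

With pairs in this shape, training is the standard seq2seq recipe: tokenize input and target, pass the target token IDs as labels, and minimize the cross-entropy loss the model returns. One practical note: if you use literal `<t>`/`<f>` strings, consider registering them as additional special tokens so the tokenizer does not split them.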
MysteryInc152 t1_j6o62z6 wrote
This is what in-context learning is for.
Giving the model a few examples of a text input and the corresponding fact extraction is usually all that's necessary.
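To make that concrete, a few-shot prompt could be assembled like this — the template wording and the demonstration pairs are made up for illustration, using the `<t>`/`<f>` format from the original question:

```python
# Build a few-shot prompt for in-context fact extraction.
# The demonstration pairs and the "Text:/Facts:" template are illustrative.

examples = [
    ("XYZ was born in ABC. They now live in DEF.",
     "XYZ <t> born in <t> ABC <f> XYZ <t> lives in <t> DEF"),
    ("QRS works at TUV.",
     "QRS <t> works at <t> TUV"),
]

def build_prompt(examples, query):
    """Concatenate demonstrations, then the query with an empty Facts slot."""
    parts = [f"Text: {text}\nFacts: {facts}" for text, facts in examples]
    parts.append(f"Text: {query}\nFacts:")
    return "\n\n".join(parts)

prompt = build_prompt(examples, "JKL moved to MNO.")
# The model is then asked to continue the prompt after the final "Facts:".
```

The model completes the string after the final "Facts:", and you parse its continuation with the same `<t>`/`<f>` conventions.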