ExchangeStrong196 t1_irw93ux wrote on October 11, 2022 at 2:49 PM
Reply to comment by sonudofsilence in Bert - word embeddings from a text by sonudofsilence
Yes. To make the contextual token embeddings attend over longer text, you need a model that accepts longer sequence lengths. Check out Longformer.
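For example, a minimal sketch of pulling per-token contextual embeddings from a long document with the public `allenai/longformer-base-4096` checkpoint via Hugging Face `transformers` (the function name and example text here are illustrative, not from the original comment):

```python
# Sketch: contextual token embeddings for long text with Longformer.
# Assumes the `transformers` and `torch` packages are installed; the
# checkpoint name is the public "allenai/longformer-base-4096" model.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "allenai/longformer-base-4096"
MAX_LEN = 4096  # Longformer's max sequence length, vs. 512 for BERT-base


def token_embeddings(text: str) -> torch.Tensor:
    """Return one contextual embedding vector per input token."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    model = AutoModel.from_pretrained(MODEL_NAME)
    inputs = tokenizer(
        text, truncation=True, max_length=MAX_LEN, return_tensors="pt"
    )
    with torch.no_grad():
        outputs = model(**inputs)
    # last_hidden_state has shape (batch, seq_len, hidden_size):
    # one contextual vector per token, attending across the long input.
    return outputs.last_hidden_state[0]


if __name__ == "__main__":
    embeddings = token_embeddings("A long document. " * 500)
    print(embeddings.shape)  # (seq_len, hidden_size)
```

A BERT-base model would truncate the same input at 512 tokens, so tokens past that point could not influence (or receive) any embedding; Longformer's sparse attention is what makes the 4096-token window tractable.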