neuralbeans t1_irw2mza wrote

If you're talking about the contextual embeddings that BERT is known for, those change depending on the sentence the word appears in, so you need to supply the full sentence.
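To illustrate, here's a minimal sketch using Hugging Face `transformers` (assuming `bert-base-uncased` is available locally or can be downloaded): it extracts the vector for the same surface word "bank" from two different sentences, and the two vectors come out different because BERT conditions on the whole sentence.

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def word_vector(sentence: str, word: str) -> torch.Tensor:
    """Return the contextual embedding of the first occurrence of `word`.

    Assumes `word` survives WordPiece tokenization as a single token,
    which holds for common whole words like "bank".
    """
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (seq_len, 768)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return hidden[tokens.index(word)]

v1 = word_vector("he sat on the river bank", "bank")
v2 = word_vector("she deposited cash at the bank", "bank")
# Cosine similarity is well below 1.0: same word, different contexts,
# different vectors.
sim = torch.cosine_similarity(v1, v2, dim=0)
```

This is why there is no single "the vector for *bank*" in BERT, unlike static embeddings such as word2vec.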

sonudofsilence OP t1_irw4bmr wrote

Yes, that's why I want to pass all the text into BERT: a word in one sentence should end up with a vector similar to the same word (with the same meaning) in another sentence. How can I accomplish that, given that BERT's maximum input length is 512 tokens?
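A common workaround (a sketch of the general idea, not a BERT-specific API) is to split the tokenized text into overlapping windows of at most 512 tokens, run each window through the model separately, and then pool (e.g. average) the vectors a word receives across the windows it appears in. The windowing step itself is model-independent; the `stride` value below is an arbitrary choice:

```python
def chunk_with_overlap(token_ids, max_len=512, stride=256):
    """Split a long token sequence into overlapping windows of at most
    `max_len` tokens, advancing `stride` tokens each step, so every token
    is covered and most tokens get context from two windows."""
    if len(token_ids) <= max_len:
        return [token_ids]
    chunks = []
    start = 0
    while start < len(token_ids):
        chunks.append(token_ids[start:start + max_len])
        if start + max_len >= len(token_ids):
            break  # last window already reaches the end
        start += stride
    return chunks

# Example: a 1200-token document becomes four overlapping 512-token
# (or shorter, for the tail) windows covering the whole sequence.
windows = chunk_with_overlap(list(range(1200)))
```

Note that a word near a window boundary only sees the context inside its window, so its vector is an approximation of what a full-document pass would give; larger overlap reduces this effect at the cost of more forward passes.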
