In the code there is a mention of trying "cosine similarity from BERT-based embeddings but observed longer inference time and similar performance". This version of the code uses SBERT similarity, but the file behind "from Layer2.Fine_tuned_BERT import get_similarity_from_SBERT" is missing.
So, in order to reproduce the paper's settings, we need to replace this SBERT similarity function with the LDA similarity function.
(
DGen/Conceptualizer.py, Line 68 in ddbe0e4:
    probabilities_of_concepts = self.__calculate_probs_of_concepts_bert(
->
DGen/Conceptualizer.py, Line 169 in ddbe0e4:
    def __calculate_probs_of_concepts(self, concepts, sentence, debug):
)
The code is fragmented, but it seems to contain all the important information. Thank you for providing the code for the feature similarity calculation.
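To sketch what the LDA setting would compute: the similarity between a concept and a sentence becomes the cosine similarity of their LDA topic distributions, rather than of SBERT embeddings. The following is a minimal, library-free sketch; the function name `lda_similarity` and the plain-list inputs are my own illustration, and in the actual code the two vectors would come from an LDA model's inference (e.g. gensim's `LdaModel`) on the concept and the sentence:

```python
import math

def lda_similarity(topic_dist_a, topic_dist_b):
    """Cosine similarity between two LDA topic distributions.

    Each argument is a dense topic-probability vector of equal length.
    In practice these would be produced by an LDA model's inference
    step; here they are plain lists so the sketch is self-contained.
    """
    dot = sum(a * b for a, b in zip(topic_dist_a, topic_dist_b))
    norm_a = math.sqrt(sum(a * a for a in topic_dist_a))
    norm_b = math.sqrt(sum(b * b for b in topic_dist_b))
    # Guard against an all-zero vector (no topic mass inferred).
    if norm_a == 0.0 or norm_b == 0.0:
        return 0.0
    return dot / (norm_a * norm_b)
```

Inside `__calculate_probs_of_concepts`, a call like `get_similarity_from_SBERT(concept, sentence)` would then be swapped for this topic-distribution comparison.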