Apologies for asking about code released a few years ago, but it appears that the text encoder loads its weights directly from BioClinicalBERT rather than from a checkpoint pretrained through contrastive learning. As I understand it, the text encoder was not frozen during training. Are there any pretrained weights available for MedCLIP's text encoder other than the BioClinicalBERT ones?