Relation Model Freezing Transformer #11740
Unanswered
Anastasiia-Khab asked this question in Help: Model Advice
Replies: 1 comment
There can be situations where it would make sense to freeze one of the models, but can you explain why you are training two different but related relation extraction models? I would expect it to usually be better to combine them. Normally, specifying the transformer as frozen like in your example would be enough, but unfortunately there is a bug and that doesn't work right now: #11547. The workaround is described in detail there, but basically you can set […]
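As a sketch, the "specify the transformer as frozen" approach referred to above normally looks like this in the spaCy training config (assuming the pipeline component is named "transformer" and the relation component listens to it; component names here are illustrative):

```ini
[nlp]
pipeline = ["transformer","relation_extractor"]

[training]
# Components listed here are excluded from weight updates during training.
# Per the bug in #11547, this alone does not currently stop transformer
# updates; see that issue for the full workaround.
frozen_components = ["transformer"]
# Frozen components whose output downstream components rely on during
# training should also be listed as annotating components.
annotating_components = ["transformer"]
```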
-
Currently I am training two relation extraction models with similar tasks and architectures, based on the rel-component tutorial. Could you please help me understand:
1) Does it make sense to freeze the `create_instance_tensor.tok2vec` layer (the Transformer) and train only the `classification_layer` (a somewhat larger version than in the tutorial)? I would like to share the Transformer layer between the two models later.
2) If it does make sense, would it be possible to do this just by adding to `rel_trf.cfg`, or should I adjust `def backprop(d_relations: Floats2d) -> List[Doc]` as well?
Thank you
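On question 2: conceptually, freezing a sub-layer means its forward pass still runs but its weights receive no update in the backward pass. Below is a minimal NumPy sketch of that idea, independent of spaCy/Thinc; all names (`forward`, `W_frozen`, `W_head`) are illustrative stand-ins, not the rel component's actual API.

```python
import numpy as np

rng = np.random.default_rng(0)
W_frozen = rng.normal(size=(4, 4))  # stands in for the transformer weights
W_head = rng.normal(size=(4, 2))    # stands in for classification_layer

def forward(X):
    """Run both layers; return output plus a backprop callback."""
    H = X @ W_frozen                 # "frozen" encoder layer
    Y = H @ W_head                   # trainable classification head
    def backprop(dY):
        dW_head = H.T @ dY           # the head still gets a real gradient
        # Frozen layer: its weight gradient is zeroed, so an optimizer
        # step would leave W_frozen unchanged.
        dW_frozen = np.zeros_like(W_frozen)
        return dW_frozen, dW_head
    return Y, backprop

X = rng.normal(size=(3, 4))
Y, backprop = forward(X)
dW_frozen, dW_head = backprop(np.ones_like(Y))
# dW_frozen is all zeros; dW_head is not.
```

In Thinc/spaCy terms this is what the frozen-component machinery is meant to do for you via the config, which is why adjusting your own `backprop` by hand should not normally be necessary once the #11547 bug is worked around.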