Some dictionaries are too big: Hungarian, for example, takes >1 GB of RAM with our current setup.
To support the options below, we will need to split our affix computations into `create_word`, our current approach that generates words in advance, and `test_word`, which checks whether an affix applies to a candidate word. A minimal sketch of the split follows.
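This sketch assumes a simplified suffix-only rule; real Hunspell-style rules also carry prefixes, conditions, and cross-product flags, so the `AffixRule` shape here is illustrative only:

```rust
/// Simplified suffix-only affix rule; the real affix format is richer.
struct AffixRule {
    strip: usize, // bytes to strip from the end of the stem (ASCII assumed)
    add: String,  // suffix to append after stripping
}

/// Eager path (current behavior): materialize the derived word up front.
fn create_word(stem: &str, rule: &AffixRule) -> Option<String> {
    if stem.len() < rule.strip {
        return None;
    }
    Some(format!("{}{}", &stem[..stem.len() - rule.strip], rule.add))
}

/// Lazy path: decide whether `candidate` could be derived from `stem`
/// via `rule`, without allocating the derived word.
fn test_word(candidate: &str, stem: &str, rule: &AffixRule) -> bool {
    match candidate.strip_suffix(rule.add.as_str()) {
        Some(base) => stem.len() >= rule.strip && base == &stem[..stem.len() - rule.strip],
        None => false,
    }
}
```

For any stem/rule pair, `test_word(w, stem, rule)` should agree with `create_word(stem, rule) == Some(w.to_string())`, which gives a cheap property to test the two paths against each other.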
It would be best to control how lazy or eager we want this to be. For example:
- **Always lazy:** use `test_word` for everything, no internal state
- **Dynamic:** precompute nothing, cache results as we create words
- **Fully eager:** precompute everything and store it (like we do now)
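One hedged sketch of how that knob might be exposed; the `Strategy` name, the `HashSet` caches, and caching on successful lookups are all assumptions rather than existing API:

```rust
use std::collections::HashSet;

/// Illustrative laziness knob; `test` stands in for running `test_word`
/// over the relevant stem/rule pairs.
enum Strategy {
    /// Always lazy: call test_word every time, keep no internal state.
    Lazy,
    /// Dynamic: precompute nothing, remember words already confirmed.
    Dynamic(HashSet<String>),
    /// Fully eager: everything precomputed via create_word (current behavior).
    Eager(HashSet<String>),
}

impl Strategy {
    fn check(&mut self, candidate: &str, test: impl Fn(&str) -> bool) -> bool {
        match self {
            Strategy::Lazy => test(candidate),
            Strategy::Dynamic(cache) => {
                if cache.contains(candidate) {
                    return true; // cache hit: no affix work needed
                }
                let hit = test(candidate);
                if hit {
                    cache.insert(candidate.to_owned());
                }
                hit
            }
            Strategy::Eager(all) => all.contains(candidate),
        }
    }
}
```

The three variants trade memory for lookup cost: `Eager` is what we do today (fast lookups, >1 GB for Hungarian), `Lazy` is minimal memory with repeated affix work, and `Dynamic` sits in between.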