Pascal de Buren
left a comment
I personally love the ideas around adding an external memory to transformers, since it's a simple way to add new vocabulary without retraining a 100bn+ parameter transformer model. I found this paper by Google on Memorizing Transformers especially insightful: https://arxiv.org/abs/2203.08913 (see the sketch below).
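For anyone curious how this works in practice, here is a minimal sketch of the kNN-augmented attention idea from Memorizing Transformers: keys and values from earlier segments are stored in a non-differentiable memory, and each query attends both to its local context and to its top-k retrieved neighbours, with the two pathways blended by a gate. The class and function names, the fixed `gate` scalar, and the exact nearest-neighbour search are my own illustrative assumptions; the paper uses a learned per-head gate and an approximate kNN index at scale.

```python
# Minimal sketch of kNN-augmented attention in the spirit of
# Memorizing Transformers (https://arxiv.org/abs/2203.08913).
# Names, the fixed gate, and exact search are assumptions, not
# the paper's exact implementation.
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

class KNNMemory:
    """Non-differentiable store of (key, value) pairs from past segments."""
    def __init__(self, dim):
        self.keys = np.empty((0, dim))
        self.values = np.empty((0, dim))

    def add(self, k, v):
        # Append keys/values from an earlier segment (no gradients kept).
        self.keys = np.vstack([self.keys, k])
        self.values = np.vstack([self.values, v])

    def lookup(self, queries, top_k=4):
        # Exact kNN by dot-product similarity; the paper uses an
        # approximate index for scale, which this sketch omits.
        sims = queries @ self.keys.T                  # (n_q, n_mem)
        idx = np.argsort(-sims, axis=-1)[:, :top_k]   # top-k per query
        return self.keys[idx], self.values[idx]       # (n_q, top_k, d)

def memory_attention(q, local_k, local_v, memory, gate=0.5, top_k=4):
    """Blend local attention with attention over retrieved memories.

    `gate` stands in for the learned per-head gate in the paper."""
    d = q.shape[-1]
    # Standard attention over the local context window.
    local_scores = softmax(q @ local_k.T / np.sqrt(d))
    local_out = local_scores @ local_v
    # Attention restricted to the top-k retrieved (key, value) pairs.
    mem_k, mem_v = memory.lookup(q, top_k)
    mem_scores = softmax(np.einsum("qd,qkd->qk", q, mem_k) / np.sqrt(d))
    mem_out = np.einsum("qk,qkd->qd", mem_scores, mem_v)
    # The gate blends the memory and local pathways.
    return gate * mem_out + (1 - gate) * local_out

# Usage: store one past segment, then attend from a new one.
rng = np.random.default_rng(0)
dim = 16
memory = KNNMemory(dim)
memory.add(rng.normal(size=(128, dim)), rng.normal(size=(128, dim)))
q = rng.normal(size=(8, dim))
out = memory_attention(q, rng.normal(size=(8, dim)),
                       rng.normal(size=(8, dim)), memory)
print(out.shape)  # (8, 16)
```

The appeal is that the memory is read-only at training time: new documents can be added to the index after training, which is what makes it attractive as an alternative to retraining a very large model.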
What are the newest techniques for improving NLP/NLU transformer model performance?
Owen L
Pascal de Buren
left a comment
Co-founder and ML engineer here. Would love to get the community's feedback. Ask me anything about the tech, product, and market!
Caplena
From text feedback to insights.
Automate the tedious text analysis process. Gain actionable insights and share your results in dashboards. Powered by Augmented Intelligence. All in one platform.