Making Pre-Trained Language Models Better Few-Shot Learners
Tianyu Gao (Princeton University), Adam Fisch (Massachusetts Institute of Technology), Danqi Chen (Princeton University)

Prevailing methods for mapping large generative language models to supervised tasks may fail to sufficiently probe models' novel capabilities.