
Train Large, Then Compress: Rethinking Model Size for Efficient Training and Inference of Transformers #10

Open
seopbo opened this issue Jul 2, 2020 · 0 comments
seopbo commented Jul 2, 2020

What is this paper about? 👋

Please briefly describe what the paper is about! (1-2 short lines are fine!)

Abstract 🕵🏻‍♂️

Please paste the paper's original abstract!

Please briefly note what you can learn from reading this paper! 🤔

What knowledge can be gained from a careful reading of this paper?

Write down the paper's main idea. (Either a brief summary or a detailed write-up is fine.)

Feel free to include external links.

Write down the paper's conclusion.

seopbo added the queue label Jul 2, 2020