diff --git a/promptdistill2024/index.html b/promptdistill2024/index.html
index 24876d0..9635725 100644
--- a/promptdistill2024/index.html
+++ b/promptdistill2024/index.html
@@ -1015,7 +1015,7 @@
Quantitative results of prompt distillation compared to scratch learning and full-weight transfer learning on three domains.
@@ -1024,7 +1024,7 @@
Analysis of the enhancement ability for already-trained networks.
@@ -1033,7 +1033,7 @@
Comparing distinct knowledge compression strategies.
@@ -1056,13 +1056,13 @@
Effects of the number of prompt embeddings and distillation epochs on prompt projection types.
Effects of the number of prompt embeddings and distillation epochs on prompt compression strategies.