From 0339e082b5a88e0dae13b0d917ce507c24502ed5 Mon Sep 17 00:00:00 2001 From: gynchoi Date: Fri, 20 Sep 2024 09:38:40 +0900 Subject: [PATCH] change img width to 100 --- promptdistill2024/index.html | 10 +++++----- 1 file changed, 5 insertions(+), 5 deletions(-) diff --git a/promptdistill2024/index.html b/promptdistill2024/index.html index 24876d0..9635725 100644 --- a/promptdistill2024/index.html +++ b/promptdistill2024/index.html @@ -1015,7 +1015,7 @@

Quantitative Results


Transfer Learning via Prompt Distillation

- transfer learning table + transfer learning table

Quantitative results of prompt distillation compared to scratch learning and full-weight transfer learning on three domains.

@@ -1024,7 +1024,7 @@

Transfer Learning via Prompt Distillation

Knowledge Enhancement

- knowledge enhancement table + knowledge enhancement table

Analysis of the ability to enhance already-trained networks.

@@ -1033,7 +1033,7 @@

Knowledge Enhancement


Knowledge Compression

- knowledge compression table + knowledge compression table

Comparing distinct knowledge compression strategies.

@@ -1056,13 +1056,13 @@

Quantitative Ablation


The Number of Prompts and Distillation Epochs

- transfer learning table + transfer learning table

Effects of the number of prompt embeddings and distillation epochs on prompt projection types.

- knowledge compression table + knowledge compression table

Effects of the number of prompt embeddings and distillation epochs on prompt compression strategies.