Is the linear layer initialized from LLaVA's linear layer? I found that the `pretrain_mm_mlp_adapter` parameter is not set in the script. Does that mean the linear layer is not initialized from LLaVA?
@vhzy Hello, I have the same question. According to the paper, "We use LLaVA (Liu et al., 2023) as our baseline model and finetune it on our 100K video instruction pairs. We only update the linear layer projecting the video features to the LLMs' input space, while the rest of the architecture is kept frozen." So the linear layer should be initialized from LLaVA, but there is no `mm_projector.bin` in LLaVA-7B-Lightening-v1-1.
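For context, here is a minimal sketch of what passing a pretrained adapter checkpoint (such as `--pretrain_mm_mlp_adapter mm_projector.bin`) would roughly do at training start: load the saved projector weights into the linear layer instead of leaving it randomly initialized. The dimensions, file name, and key prefix below are illustrative assumptions, not the repo's actual values.

```python
import torch
import torch.nn as nn

# Illustrative dimensions (assumed, not the real model's).
vision_dim, llm_dim = 8, 16

# Simulate a pretrained adapter checkpoint: a state dict whose keys
# carry an assumed "mm_projector." prefix, saved the way LLaVA-style
# repos typically save it.
pretrained = nn.Linear(vision_dim, llm_dim)
ckpt = {f"mm_projector.{k}": v for k, v in pretrained.state_dict().items()}
torch.save(ckpt, "mm_projector.bin")

# What loading the adapter would (roughly) look like: strip the prefix
# and copy the weights into a freshly constructed projection layer.
projector = nn.Linear(vision_dim, llm_dim)
weights = torch.load("mm_projector.bin", map_location="cpu")
projector.load_state_dict(
    {k.replace("mm_projector.", ""): v for k, v in weights.items()}
)

# The projector now matches the pretrained one; without this step it
# would keep its default random initialization.
assert torch.equal(projector.weight, pretrained.weight)
```

If the script never sets `pretrain_mm_mlp_adapter`, the loading step above is skipped and the projection layer starts from random initialization, which is exactly the discrepancy being asked about.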