Hi Sylvestre Rebuffi,
It was inspiring to read your work "Efficient parametrization of multi-domain deep neural networks", but I have trouble understanding Section 3.4, "Cross-domain adapter compression".
As I understand it, you use a decomposition to compress the number of adapter parameters and thereby share information across the different datasets. However, I am confused about how this parameter sharing is realized in practice, i.e. how it allows the target tasks to communicate. Is this part (the SVD) also included in your code, or is there important information I have missed? It would be very kind of you to reply.
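To make my understanding concrete, here is a minimal sketch of what I imagine the cross-domain compression could look like: stack the per-domain adapter matrices, take an SVD of the concatenation, and keep a low-rank shared basis with per-domain coefficients. The shapes, names, and rank below are my own assumptions for illustration, not taken from your paper or code.

```python
import numpy as np

# Assumed setup: D domains, each with a 1x1 conv adapter of shape (C, C).
rng = np.random.default_rng(0)
C, D, k = 64, 4, 16  # channels, domains, retained rank (assumed values)
adapters = [rng.standard_normal((C, C)) for _ in range(D)]

# Concatenate all domain adapters column-wise: shape (C, D*C).
stacked = np.concatenate(adapters, axis=1)

# The SVD yields a left basis U shared by every domain; truncating to
# rank k means U_k is common while the coefficients are per-domain.
U, s, Vt = np.linalg.svd(stacked, full_matrices=False)
U_k = U[:, :k]                    # shared basis, shape (C, k)
coeffs = s[:k, None] * Vt[:k]     # per-domain coefficients, shape (k, D*C)

# Reconstruct each domain's compressed adapter from the shared basis.
compressed = [U_k @ coeffs[:, d * C:(d + 1) * C] for d in range(D)]

# Parameter count: one shared C*k basis plus D per-domain k*C blocks,
# versus D*C*C for the uncompressed adapters.
orig_params = D * C * C
new_params = C * k + D * k * C
print(orig_params, new_params)
```

If this is roughly the mechanism you intend, my question reduces to whether this truncated-SVD step is performed anywhere in the released code, or whether it was only used for the experiments in the paper.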