Add attention_bias argument in transformer block and transformer layer modules, addressing change in MCore #4880

Triggered via pull request November 14, 2024 19:47
@yaoyu-33 opened #11289
Status: Success
Total duration: 26s
Artifacts

secrets-detector.yml

on: pull_request_target