
Conversation

Copilot AI (Contributor) commented on Jan 27, 2026

Addresses review feedback requesting assertions that catch label attention wiring and shape regressions.

Changes

  • Added assertions in the run_full_pipeline test helper to validate label attention attributions (see the snippet below):
    • When label attention is enabled: verify the attributions exist and have shape (batch_size, n_head, num_classes, seq_len)
    • When disabled: verify the attributions are None
# Test label attention assertions
if label_attention_enabled:
    assert predictions["label_attention_attributions"] is not None
    label_attention_attributions = predictions["label_attention_attributions"]
    expected_shape = (
        len(sample_text_data),  # batch_size
        model_params["n_head"],  # n_head
        model_params["num_classes"],  # num_classes
        tokenizer.output_dim,  # seq_len
    )
    assert label_attention_attributions.shape == expected_shape
else:
    assert predictions.get("label_attention_attributions") is None
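For context on why that shape is expected, here is a minimal, hypothetical sketch (not taken from this repository, assuming PyTorch) of a label cross-attention layer whose per-head attention weights naturally come out as (batch_size, n_head, num_classes, seq_len); the class name, embed_dim, and the learned-query construction are illustrative assumptions.

# Hypothetical sketch, not the repository's actual module: one learned query
# per class attends over the token sequence, and the per-head attention
# weights serve as the "label attention attributions" the test asserts on.
import torch
import torch.nn as nn

class LabelAttentionSketch(nn.Module):
    def __init__(self, embed_dim: int, n_head: int, num_classes: int):
        super().__init__()
        # One learned query vector per class; tokens act as keys and values.
        self.label_queries = nn.Parameter(torch.randn(num_classes, embed_dim))
        self.attn = nn.MultiheadAttention(embed_dim, n_head, batch_first=True)

    def forward(self, token_embeddings: torch.Tensor):
        # token_embeddings: (batch_size, seq_len, embed_dim)
        batch_size = token_embeddings.size(0)
        queries = self.label_queries.unsqueeze(0).expand(batch_size, -1, -1)
        # average_attn_weights=False keeps the per-head dimension, so the
        # weights have shape (batch_size, n_head, num_classes, seq_len).
        pooled, attributions = self.attn(
            queries, token_embeddings, token_embeddings,
            need_weights=True, average_attn_weights=False,
        )
        return pooled, attributions

# Shape check mirroring the test assertion above (illustrative sizes).
module = LabelAttentionSketch(embed_dim=64, n_head=4, num_classes=10)
_, attributions = module(torch.randn(2, 128, 64))
assert attributions.shape == (2, 4, 10, 128)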


Co-authored-by: meilame-tayebjee <114609737+meilame-tayebjee@users.noreply.github.com>
Copilot AI changed the title from "[WIP] Update implementation of cross attention labels" to "Add test assertions for label attention attributions shape and presence" on Jan 27, 2026
@meilame-tayebjee marked this pull request as ready for review on January 27, 2026 at 10:31
@meilame-tayebjee merged commit 0558f97 into 24-add-cross-attention-labels-text on Jan 27, 2026
@meilame-tayebjee deleted the copilot/sub-pr-60-one-more-time branch on January 27, 2026 at 10:31