
Conversation

Contributor

Copilot AI commented Jan 27, 2026

Addresses a potential UnboundLocalError raised when return_label_attention_matrix=True but self.text_embedder is None. The variable label_attention_matrix was only initialized inside the text_embedder branch, so the code raised a runtime error when it later tried to return the variable.

Changes

  • Initialize label_attention_matrix = None before the text_embedder conditional branch to ensure the variable exists in all code paths
def forward(self, ...):
    encoded_text = input_ids
    label_attention_matrix = None  # added: ensure the variable exists on every path
    if self.text_embedder is None:
        x_text = encoded_text.float()
    else:
        # ... sets label_attention_matrix from text_embed_output
        ...

    # ... later
    if return_label_attention_matrix:
        return {"logits": logits, "label_attention_matrix": label_attention_matrix}

When text_embedder is None, the returned dict now contains label_attention_matrix: None instead of raising UnboundLocalError.
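The failure mode can be reproduced with a minimal sketch; the function and argument names below are illustrative stand-ins, not the actual TextClassificationModel code:

```python
# Minimal reproduction of the UnboundLocalError pattern fixed in this PR.
# "text_embedder" here is any object (or None) standing in for the real embedder.

def forward_buggy(text_embedder, return_label_attention_matrix=True):
    if text_embedder is not None:
        # Only assigned on this branch -- the bug.
        label_attention_matrix = "attention"
    if return_label_attention_matrix:
        # Raises UnboundLocalError when text_embedder is None.
        return {"label_attention_matrix": label_attention_matrix}
    return {}

def forward_fixed(text_embedder, return_label_attention_matrix=True):
    label_attention_matrix = None  # initialized on every code path
    if text_embedder is not None:
        label_attention_matrix = "attention"
    if return_label_attention_matrix:
        return {"label_attention_matrix": label_attention_matrix}
    return {}
```

With text_embedder=None, forward_buggy raises UnboundLocalError while forward_fixed returns {"label_attention_matrix": None}, matching the behavior described above.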



Co-authored-by: meilame-tayebjee <114609737+meilame-tayebjee@users.noreply.github.com>
Contributor Author

Copilot AI commented Jan 27, 2026

Just a heads up: I was blocked by some firewall rules while working on your feedback. Details below.

Warning

Firewall rules blocked me from connecting to one or more addresses

I tried to connect to the following addresses, but was blocked by firewall rules:

  • astral.sh
    • Triggering command: /usr/bin/curl curl -LsSf REDACTED (dns block)

If you need me to access, download, or install something from one of these locations, you can either:

Copilot AI changed the title [WIP] Add changes to cross attention labels based on PR feedback Fix UnboundLocalError in TextClassificationModel when return_label_attention_matrix=True with no text_embedder Jan 27, 2026
@meilame-tayebjee meilame-tayebjee marked this pull request as ready for review January 27, 2026 10:41
@meilame-tayebjee meilame-tayebjee merged commit a721397 into 24-add-cross-attention-labels-text Jan 27, 2026
@meilame-tayebjee meilame-tayebjee deleted the copilot/sub-pr-60-again branch January 27, 2026 10:41