
Where's the encoder_freeze parameter? #1197


Description

@ricber
Contributor

Hello,

I've read in some previous issues that it's possible to specify the parameter encoder_freeze to freeze the encoder weights. However, I couldn't find this parameter anywhere in the repository, and it's also not mentioned in the documentation.
Is this parameter still valid? If so, why doesn't it appear in the code?

Thank you!

Activity


qubvel commented on Jul 18, 2025

@qubvel
Collaborator

Hello, this is not something that was added to the API, but you can always do it manually:

for param in model.encoder.parameters():
    param.requires_grad = False

P.S. It might be a bit more complicated for batchnorm layers. Would appreciate a PR in case you have bandwidth! 🤗
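As the P.S. hints, freezing the parameters alone is not enough for BatchNorm: in train mode those layers keep updating their running mean/variance even with `requires_grad = False`. A minimal sketch that also handles this (`freeze_encoder` is a hypothetical helper name, not part of the library):

```python
import torch.nn as nn

def freeze_encoder(encoder: nn.Module) -> None:
    """Freeze encoder weights and stop BatchNorm running-stat updates."""
    for param in encoder.parameters():
        param.requires_grad = False
    # BatchNorm layers update running_mean/running_var in train mode even
    # when their affine weights are frozen, so switch them to eval mode.
    for module in encoder.modules():
        if isinstance(module, (nn.BatchNorm1d, nn.BatchNorm2d, nn.BatchNorm3d)):
            module.eval()

# Toy stand-in for a real encoder, just to illustrate the call:
encoder = nn.Sequential(nn.Conv2d(3, 8, 3), nn.BatchNorm2d(8), nn.ReLU())
freeze_encoder(encoder)
```

Note that a later `model.train()` call flips the BatchNorm layers back to train mode, so the freeze would need to be re-applied after it (or `train()` overridden).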


ricber commented on Jul 21, 2025

@ricber
Contributor (Author)

What’s your idea? To introduce an encoder_freeze parameter in the API? If so, in which class should the parameter be introduced?


qubvel commented on Jul 22, 2025

@qubvel
Collaborator

This method could be added to the base SegmentationModel class.
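A sketch of what such a method on a base class might look like; the class name, method name, and structure here are illustrative assumptions, not the library's actual API:

```python
import torch.nn as nn

class SegmentationModelSketch(nn.Module):
    """Hypothetical base class exposing encoder freezing (names illustrative)."""

    def __init__(self, encoder: nn.Module, decoder: nn.Module):
        super().__init__()
        self.encoder = encoder
        self.decoder = decoder
        self._encoder_frozen = False

    def freeze_encoder(self) -> None:
        self._encoder_frozen = True
        for param in self.encoder.parameters():
            param.requires_grad = False
        self._set_encoder_bn_eval()

    def train(self, mode: bool = True):
        # model.train() normally re-enables BatchNorm running-stat updates;
        # override it so a frozen encoder's BatchNorm stays in eval mode.
        super().train(mode)
        if self._encoder_frozen:
            self._set_encoder_bn_eval()
        return self

    def _set_encoder_bn_eval(self) -> None:
        for m in self.encoder.modules():
            if isinstance(m, (nn.BatchNorm1d, nn.BatchNorm2d, nn.BatchNorm3d)):
                m.eval()
```

Overriding `train()` is the design choice worth debating in a PR: without it, every `model.train()` call silently unfreezes the BatchNorm statistics.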



          Where's the encoder_freeze parameter? · Issue #1197 · qubvel-org/segmentation_models.pytorch