Conversation

@stevenabreu7
Collaborator

As pointed out by Terry, there is no need to support string padding values such as "same" or "valid": "valid" is equivalent to padding=0, and "same" cannot always be implemented correctly with the symmetric PyTorch-style padding that we're currently using.

Question: should we move to Tensorflow-style padding, where the padding at the start and end of each dimension may be different? I.e. instead of (pad_x, pad_y), we specify ((pad_top, pad_bottom), (pad_left, pad_right))?
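To illustrate why "same" doesn't always fit the symmetric `(pad_x, pad_y)` scheme, here is a sketch (my own helper, not part of the IR) of how TensorFlow computes "same" padding per dimension; when the required total padding is odd, the two sides must differ:

```python
import math

def same_padding_1d(in_size: int, kernel: int, stride: int) -> tuple[int, int]:
    """TF-style "same" padding for one dimension: output size = ceil(in/stride)."""
    out_size = math.ceil(in_size / stride)
    total = max((out_size - 1) * stride + kernel - in_size, 0)
    # TF places the extra element at the end when the total padding is odd.
    return total // 2, total - total // 2

# Symmetric case: a single PyTorch-style value per dimension suffices.
print(same_padding_1d(5, 3, 1))  # (1, 1)

# Asymmetric case: not expressible as a single per-dimension value.
print(same_padding_1d(5, 2, 2))  # (0, 1)
```

This is exactly the case the TensorFlow-style `((pad_top, pad_bottom), (pad_left, pad_right))` representation can express and the PyTorch-style pair cannot.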

@matjobst
Collaborator

I don't really see the issue with the string paddings. Would you like the string padding values left out to reduce complexity?
Regarding the four-sided padding: that is more general than the PyTorch style, so I would be in favor of supporting it.

@stevenabreu7
Collaborator Author

Yeah, supporting string padding is not really necessary (it's a higher level of abstraction, and an IR is not the place for that), plus it adds more complexity. It also becomes a bit arbitrary what string values we support - there are many: same, valid, full, wrap, reflect, constant, mirror, nearest, etc.

I also agree with TensorFlow-style padding. We can still allow for PyTorch-style padding, and just cast that to the TensorFlow-style in the __post_init__ (similarly to how we already do it to allow integer padding for Conv2D).
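A minimal sketch of what that `__post_init__` normalization could look like (a hypothetical `Conv2d` dataclass for illustration, not the actual class in the repo):

```python
from dataclasses import dataclass


@dataclass
class Conv2d:
    # padding may arrive as an int, a PyTorch-style pair (pad_h, pad_w),
    # or the canonical TF-style ((top, bottom), (left, right)).
    padding: object

    def __post_init__(self):
        p = self.padding
        if isinstance(p, int):                    # 3 -> ((3, 3), (3, 3))
            p = ((p, p), (p, p))
        elif all(isinstance(x, int) for x in p):  # (1, 2) -> ((1, 1), (2, 2))
            p = tuple((x, x) for x in p)
        # Canonical form: nested tuples, one (before, after) pair per dimension.
        self.padding = tuple(tuple(pair) for pair in p)


print(Conv2d(3).padding)       # ((3, 3), (3, 3))
print(Conv2d((1, 2)).padding)  # ((1, 1), (2, 2))
```

With this, PyTorch-style inputs remain accepted at construction time while the stored representation is always the more general TF-style form.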

@stevenabreu7 stevenabreu7 changed the title Remove string padding values Conv padding: remove string values, move to TF-style values Oct 18, 2023
@Jegp
Collaborator

Jegp commented Feb 10, 2025

Is this still a feature we would want to have? I agree that it's a high level feature, but it's pretty convenient and I imagine that a lot of users rely on a 1:1 conversion from PyTorch. An alternative could also be that we use some kind of wrapper function which converts "same" to a specific (lower-level) set of parameters.

Thoughts? @matjobst @stevenabreu7
