
Reconcile TensorShape functionality with tensorflow's Shape #248

Open
knok16 opened this issue Sep 27, 2021 · 2 comments
Labels: enhancement (New feature or request)
knok16 (Contributor) commented Sep 27, 2021

TensorFlow already provides a class that encapsulates the idea of a tensor shape. The TensorShape class stores the same data and supports the same mental model. Throughout the code, one or the other is used without a notable difference. What do you think about eliminating the TensorShape class and exposing any additional functionality it currently provides in the form of extension methods on Shape?

Pros:

  • A single class to represent a shape (following the principle that "entities should not be multiplied beyond necessity").

Cons:

  • Shape has a non-obvious equals implementation: it requires all dimensions to be known before declaring two shapes equal.

Example: knok16@8fa3a91
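A hedged sketch of what the extension-method approach might look like (this is illustrative and not taken from the linked commit). Since `org.tensorflow.Shape` lives in the TF artifact, a minimal stand-in with the same `numDimensions()`/`size()` surface (unknown dimensions reported as -1) is declared here so the example is self-contained:

```kotlin
// Minimal stand-in mimicking org.tensorflow.Shape's numDimensions()/size()
// API (-1 = unknown dimension), used only to keep this sketch self-contained.
class Shape(private val dims: LongArray) {
    fun numDimensions(): Int = dims.size
    fun size(i: Int): Long = dims[i]
}

// Extension methods that could expose TensorShape's extra functionality on Shape.

/** Dimensions of this shape as a LongArray (-1 for unknown dimensions). */
fun Shape.toLongArray(): LongArray = LongArray(numDimensions()) { size(it) }

/** True if every dimension of this shape is known. */
fun Shape.isFullyDefined(): Boolean =
    (0 until numDimensions()).all { size(it) >= 0 }

/** Total element count, or -1 if any dimension is unknown. */
fun Shape.numElements(): Long {
    var total = 1L
    for (i in 0 until numDimensions()) {
        val d = size(i)
        if (d < 0) return -1L
        total *= d
    }
    return total
}
```

With this approach, call sites would keep working with the TF Shape directly instead of converting back and forth through TensorShape.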

zaleslaw (Collaborator) commented
The main idea, which is not implemented yet: have a separate API layer (a module), with the API implementation made with TF.
TensorShape is a draft (or lost building block) of this layer of abstraction. Also, we need a KotlinDLTensor which could be implemented with TFTensor/ONNXTensor/PyTorchTensor and so on.

The initial idea was to avoid working with the TF Shape class due to some limitations and its inconvenient conversion to dimensions.

I agree that this is one of the ugliest parts of the internal code and that it should be changed, but I disagree with extending the TF class. It would be better to make TensorShape clearer to the KotlinDL developer and carefully rewrite all the places where it is used.
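The layering described here could be sketched roughly as below; all names (`KotlinDLTensor`, `TFBackedTensor`) are hypothetical assumptions for illustration, not the actual KotlinDL API:

```kotlin
// Illustrative sketch of a backend-neutral tensor abstraction (names are
// hypothetical): user-facing code depends only on KotlinDLTensor, while each
// backend (TF, ONNX, PyTorch) supplies its own implementation.
interface KotlinDLTensor {
    /** Dimensions of the tensor; -1 marks an unknown dimension. */
    val shape: LongArray
}

/**
 * A TensorFlow-backed implementation would wrap org.tensorflow.Tensor;
 * a plain field is used here so the sketch stays self-contained.
 */
class TFBackedTensor(override val shape: LongArray) : KotlinDLTensor

/** Backend-agnostic helper, written once against the interface. */
fun KotlinDLTensor.rank(): Int = shape.size
```

The point of the design is that helpers like `rank()` are written once against the interface, so swapping the TF backend for ONNX or PyTorch would not touch user-facing code.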

knok16 (Contributor, Author) commented Sep 29, 2021

Got it, thanks for the explanation. I agree that an additional layer of abstraction will be helpful in this case. Then what do you think about:

  1. replacing usages of Shape with TensorShape wherever possible (for example in Layer::computeOutputShape), or at least moving it as close to the TensorFlow calls as possible;
  2. making TensorShape immutable?
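Point 2 could be sketched as follows. This is a hypothetical immutable design, not the current KotlinDL class (which is mutable); unknown dimensions are encoded as -1:

```kotlin
// Hedged sketch of an immutable TensorShape (hypothetical design).
// Unknown dimensions are encoded as -1; every "mutation" returns a new instance.
class TensorShape(private val dims: LongArray) {
    init { require(dims.all { it >= -1L }) { "Dimensions must be >= -1" } }

    val rank: Int get() = dims.size

    operator fun get(i: Int): Long = dims[i]

    /** True if every dimension is known. */
    fun isFullyDefined(): Boolean = dims.none { it == -1L }

    /** Returns a copy with dimension [i] replaced; the receiver is unchanged. */
    fun with(i: Int, value: Long): TensorShape =
        TensorShape(dims.copyOf().also { it[i] = value })

    override fun equals(other: Any?): Boolean =
        other is TensorShape && dims.contentEquals(other.dims)

    override fun hashCode(): Int = dims.contentHashCode()

    override fun toString(): String =
        dims.joinToString(prefix = "[", postfix = "]") {
            if (it == -1L) "?" else it.toString()
        }
}
```

Because every "mutation" returns a fresh instance, shapes could be shared between layers without defensive copying.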

@zaleslaw zaleslaw added the enhancement New feature or request label Oct 18, 2021
@zaleslaw zaleslaw self-assigned this Oct 18, 2021