Add basic tests for ONNX module #207

Open
6 tasks
zaleslaw opened this issue Aug 31, 2021 · 4 comments
Labels: good second issue (Good for advanced contributors)

Comments
zaleslaw (Collaborator) commented Aug 31, 2021

Add tests for the following classes:

  • ONNXModelHub
  • OnnxInferenceModel
  • ONNXModels (for each preprocessing)
  • SSDObjectDetectionModel
  • ONNXModelPreprocessor
  • Transpose

All tests should be located in the test directory of the ONNX module.
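
A bare skeleton of where such a test class could live, assuming the module follows the standard Gradle layout and uses JUnit 5 (which the discussion below also assumes via @TempDir); the package, class, and test names here are placeholders only:

```kotlin
// Hypothetical location: onnx/src/test/kotlin/.../OnnxInferenceModelTest.kt
import org.junit.jupiter.api.Disabled
import org.junit.jupiter.api.Test

class OnnxInferenceModelTest {

    @Test
    @Disabled("Stub: to be implemented")
    fun `loads a tiny model from test resources and exposes its properties`() {
        // TODO: load a small .onnx model from src/test/resources and assert on its properties
    }

    @Test
    @Disabled("Stub: to be implemented")
    fun `predict returns output of the expected shape`() {
        // TODO: run prediction on a fixed input and assert on the output shape/values
    }
}
```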

zaleslaw changed the title from "basic tests for ONNX module" to "Add basic tests for ONNX module" Aug 31, 2021
zaleslaw added the "good second issue" label Aug 31, 2021
zaleslaw added this to the 0.3 milestone Aug 31, 2021
femialaka (Contributor) commented

I would like to work on this one.

zaleslaw (Collaborator, Author) commented Sep 1, 2021

Great, @femialaka! Feel free to create a draft PR and ask questions, or create a list of tests before implementation to discuss here in the issue's comments.

femialaka (Contributor) commented Sep 16, 2021

For the first three tests I have the following suggestions:

  • ONNXModelHubTest
      - assert that ONNXModelHub creates a directory for a pre-trained ONNX model
      - assert that ONNXModelHub can load the model

  • OnnxInferenceModelTest
      - assert that the correct ONNX model is loaded - is this enough?

  • ONNXModels (for each preprocessing)
      - For this test I would like to ask about the best way to test it. For example, would it be OK to test the accuracy? What would be the best approach, using the "predict" value?

I'd appreciate your guidance on this, thanks.

zaleslaw (Collaborator, Author) commented Sep 17, 2021

Hi @femialaka, I will try to comment on your ideas.

assert that ONNXModelHub creates a directory for a pre-trained ONNX model - better to check this in a temp directory (use the @TempDir JUnit feature).

assert that ONNXModelHub can load the model - do you want to check the model availability on S3?

assert that the correct ONNX model is loaded - is this enough?
First, we need to check that the directory is not empty and the appropriate file is created (maybe check the file name and file size in bytes; unfortunately, we have no checksums for the files here).
Also, I suppose that loading the model is part of the ModelHub test cases.
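
A rough sketch of the @TempDir idea combined with the file-name/size checks, written as a self-contained JUnit 5 test. The actual ONNXModelHub call is left as a comment because its exact constructor and method signature are assumptions to verify against the module; a placeholder file stands in for the downloaded model so the sketch runs as-is:

```kotlin
import org.junit.jupiter.api.Assertions.assertTrue
import org.junit.jupiter.api.Test
import org.junit.jupiter.api.io.TempDir
import java.io.File
import java.nio.file.Path

class ONNXModelHubTest {

    @Test
    fun `model files appear in the cache directory`(@TempDir tempDir: Path) {
        // In the real test, this step would be the (assumed) hub call, e.g.
        //   ONNXModelHub(cacheDirectory = tempDir.toFile()).loadModel(...)
        // Here a placeholder file stands in for the downloaded model.
        File(tempDir.toFile(), "model.onnx").writeBytes(ByteArray(16))

        val files = tempDir.toFile().walkTopDown().filter { it.isFile }.toList()
        assertTrue(files.isNotEmpty(), "cache directory should not be empty after loading a model")
        // No checksums are available, so check the file name and size in bytes instead.
        assertTrue(files.any { it.name.endsWith(".onnx") && it.length() > 0L })
    }
}
```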

OnnxInferenceModel has a few properties and methods that should be tested. Use a tiny model (LeNet, for example) for test purposes. If required, it could be added to the test resources in the ONNX module.

Accuracy checking could be a good test, assuming ONNX Runtime predicts the same outputs for the same input data; I hope so.

For ONNXModels (for each preprocessing), I suggest defining test inputs and outputs for each preprocessing step hidden in ONNXModels.
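
For example, a per-preprocessing case could pin a tiny, hand-computed input/output pair. A minimal sketch for an HWC-to-CHW transpose, where a local reference implementation stands in for the module's actual Transpose preprocessing (the real test would apply that operation to `input` instead):

```kotlin
import org.junit.jupiter.api.Assertions.assertArrayEquals
import org.junit.jupiter.api.Test

class TransposePreprocessingTest {

    @Test
    fun `HWC to CHW transpose produces the expected values`() {
        // A fixed 2x2 image with 3 channels, flattened in HWC order.
        val input = floatArrayOf(
            1f, 2f, 3f, 4f, 5f, 6f,
            7f, 8f, 9f, 10f, 11f, 12f
        )
        // Hand-computed expected output in CHW order.
        val expected = floatArrayOf(
            1f, 4f, 7f, 10f, // channel 0
            2f, 5f, 8f, 11f, // channel 1
            3f, 6f, 9f, 12f  // channel 2
        )

        // The real test would call the module's Transpose preprocessing here instead.
        val actual = hwcToChw(input, height = 2, width = 2, channels = 3)

        assertArrayEquals(expected, actual, 1e-6f)
    }

    // Reference transpose used only to make this sketch self-contained.
    private fun hwcToChw(data: FloatArray, height: Int, width: Int, channels: Int): FloatArray {
        val out = FloatArray(data.size)
        for (h in 0 until height) for (w in 0 until width) for (c in 0 until channels) {
            out[c * height * width + h * width + w] = data[h * width * channels + w * channels + c]
        }
        return out
    }
}
```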

zaleslaw modified the milestones: 0.3, 0.4 Sep 20, 2021
zaleslaw modified the milestones: 0.4, 0.5 Dec 15, 2021
ermolenkodev removed this from the 0.5 milestone Oct 10, 2022