
Different metrics to evaluate the dataset #62

Open
NamburiSrinath opened this issue Apr 27, 2019 · 0 comments
Hi jzbontar,

We ran a set of stereo images through the provided .lua code and obtained the disparity maps. From visualization we can judge whether the results look good, but are there any metrics that can quantitatively evaluate them on this dataset?

An example can be as follows:

"For object detection/classification, we have metrics such as accuracy, precision, recall, F1-score, IoU, mAP, confusion matrix, etc. What are the corresponding metrics for stereo? Are they implemented in your code, and if so, how do we use them?"
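For reference, two metrics commonly used to evaluate disparity maps are the average end-point error (EPE, mean absolute disparity error) and the bad-pixel rate (fraction of pixels whose error exceeds a threshold, e.g. 3 px as in the KITTI benchmark). A minimal sketch in Python/NumPy, assuming `pred` and `gt` are disparity arrays and pixels with `gt == 0` have no ground truth (the `stereo_metrics` helper is hypothetical, not part of this repository):

```python
import numpy as np

def stereo_metrics(pred, gt, tau=3.0):
    """Compute average end-point error (EPE) and the bad-pixel rate
    (fraction of valid pixels with |pred - gt| > tau, in pixels).
    Only pixels with valid ground truth (gt > 0) are evaluated."""
    valid = gt > 0
    err = np.abs(pred - gt)[valid]
    epe = err.mean()           # average absolute disparity error
    bad = (err > tau).mean()   # fraction of pixels exceeding tau
    return epe, bad

# toy 2x2 disparity maps; the bottom-right pixel has no ground truth
pred = np.array([[10.0, 20.0], [30.0, 40.0]])
gt   = np.array([[10.5, 16.0], [30.0,  0.0]])
epe, bad = stereo_metrics(pred, gt)
```

On ground-truth datasets such as KITTI or Middlebury, these two numbers are what the official leaderboards report; without ground-truth disparities, only qualitative inspection or proxy measures (e.g. left-right consistency) are possible.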

Thank you,
Srinath
