improve comments for test_scale_to #280

Merged: 6 commits merged into diffpy:main on Dec 27, 2024

Conversation

yucongalicechen (Contributor):

closes #274
@bobleesj do you wanna give it a read first to see if the comments make sense to you?

codecov bot commented Dec 26, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 100.00%. Comparing base (969ca54) to head (07938c9).
Report is 29 commits behind head on main.

Additional details and impacted files
@@             Coverage Diff             @@
##             main      #280      +/-   ##
===========================================
+ Coverage   98.68%   100.00%   +1.31%     
===========================================
  Files           8         8              
  Lines         379       408      +29     
===========================================
+ Hits          374       408      +34     
+ Misses          5         0       -5     
Files with missing lines | Coverage Δ
tests/test_diffraction_objects.py | 100.00% <ø> (ø)

... and 1 file with indirect coverage changes

@bobleesj (Contributor):

> closes #274
> @bobleesj do you wanna give it a read first to see if the comments make sense to you?

Yup, currently out shopping; I will review in 1-2 hrs.

@@ -191,7 +191,7 @@ def test_init_invalid_xtype():
"org_do_args, target_do_args, scale_inputs, expected",
[
# Test whether scale_to() scales to the expected values
Contributor:

"scale_to() scales to" feels a bit circular?

Contributor Author:

Oops, yes - I forgot to change this. Thanks!

@@ -191,7 +191,7 @@ def test_init_invalid_xtype():
"org_do_args, target_do_args, scale_inputs, expected",
[
# Test whether scale_to() scales to the expected values
- ( # C1: Same x-array and y-array with 2.1 offset, expect yarray shifted by 2.1 offset
+ ( # C1: Same x-array and y-array with 2.1 offset, expect y-array to shift up by 2.1
Contributor:

clear

@@ -212,7 +212,7 @@ def test_init_invalid_xtype():
},
{"xtype": "tth", "yarray": np.array([4.1, 5.1, 6.1, 7.1, 8.1, 9.1])},
),
- ( # C2: Same length x-arrays with exact x-value match
+ ( # C2: Same x-arrays, x-value matches at tth=60, expect y-array to divide by 10
Contributor:

If I understand correctly, we compare the yarray_1 value of 60 against the yarray_2 value of 6 (located at xarray value 60), and we expect the y-array to be divided by 10 based on this ratio?

Just a comment - since both xarray and yarray use 60 for the first input, it's a bit hard to follow.

Contributor Author:

Yes! It finds the closest index in each x-array and scales based on that. In this case the scaling factor is 60/6=10 and the y-array for the original diffraction object is divided by 10
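For reference, here is a minimal sketch of the closest-index scaling logic described above. This is not the actual diffpy.utils implementation; the function name scale_y_to, its signature, and the example arrays are illustrative only.

```python
import numpy as np


def scale_y_to(orig_x, orig_y, target_x, target_y, x_anchor):
    """Rescale orig_y so it matches target_y at the x-value closest to x_anchor.

    Illustrative stand-in for the scaling behavior discussed above.
    """
    i = np.abs(orig_x - x_anchor).argmin()    # closest index in the original x-array
    j = np.abs(target_x - x_anchor).argmin()  # closest index in the target x-array
    scale = target_y[j] / orig_y[i]           # e.g. 6 / 60 = 0.1, i.e. divide by 10
    return orig_y * scale


# C2-style example with made-up values: y = 60 in the original and y = 6 in the
# target at the point closest to tth = 60, so the original y-array is divided by 10.
orig_x = np.array([10.0, 25.0, 30.1, 40.2, 61.3, 120.4])
orig_y = np.array([10.0, 20.0, 30.0, 40.0, 60.0, 100.0])
target_x = np.array([20.0, 25.5, 32.0, 45.0, 60.0, 110.0])
target_y = np.array([2.0, 3.0, 4.0, 5.0, 6.0, 7.0])
print(scale_y_to(orig_x, orig_y, target_x, target_y, x_anchor=60))
# [ 1.  2.  3.  4.  6. 10.]
```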

( # C3: Different x-arrays with same length,
# x-value has closest match at q=0.12 (y=10) and q=0.14 (y=1)
# for original and target diffraction objects,
# expect y-array to divide by 10
Contributor:

Ok - the y-array values 10 and 1 are compared since they are the nearest values at q=0.1. Since the second is 1/10 of the original, we expect the y-array to be scaled by 1/10.

@@ -254,7 +257,10 @@ def test_init_invalid_xtype():
},
{"xtype": "q", "yarray": np.array([1, 2, 4, 6])},
),
- ( # C4: Different x-array lengths with approximate x-value match
+ ( # C4: Different x-array lengths
Contributor:

Okay - clear, I understand the function now. Yeah, for C2 it was hard for me to understand what the function does by reading the test alone since the numbers were similar, but the other cases are clear.

Contributor Author:

Thanks! I'll edit C2

@bobleesj left a comment (Contributor):

@yucongalicechen please see my comments - thanks a lot, Yucong!

@yucongalicechen (Contributor Author):

@sbillinge ready for review

@sbillinge left a comment (Contributor):

This is great, thanks @yucongalicechen

My one request might be to reorder the cases so the offset case goes to the end. In general, I think things are easiest to understand if we test the main, obvious functionality first and the edge cases later. That way, the reader first gets an idea of what the function is supposed to do when everything is working, and only then digs into funky edge cases and unusual behavior, once they already have a clear understanding of the function's main purpose.

@bobleesj maybe we can add this to our "best practices"? What do you think? Here the offset is a funky piece of extra functionality that I would rather have at the end.

@bobleesj (Contributor):

> @bobleesj maybe we can add this to our "best practices"? What do you think? Here the offset is a funky piece of extra functionality that I would rather have at the end.

  1. Order test cases from the most general to edge cases. This helps readers understand the basic function behavior first before utilizing or encountering unusual features or behaviors.

@sbillinge Yup, added to our GitLab page.
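As a rough illustration of that ordering guideline, a parametrized test could list the plain scaling cases first and the offset edge case last. This is a hypothetical sketch, not the actual diffpy test file; the toy scale_to_toy helper, the case labels, and all array values below are made up.

```python
import numpy as np
import pytest


def scale_to_toy(orig_y, target_y, anchor_index, offset=0.0):
    """Toy stand-in for scale_to(): rescale at one matched index, then add an offset."""
    scale = target_y[anchor_index] / orig_y[anchor_index]
    return orig_y * scale + offset


@pytest.mark.parametrize(
    "orig_y, target_y, offset, expected_y",
    [
        # General case first: values match at index 1, expect y-array divided by 10
        (np.array([10.0, 60.0]), np.array([1.0, 6.0]), 0.0, np.array([1.0, 6.0])),
        # Another general case: expect y-array multiplied by 2
        (np.array([1.0, 3.0]), np.array([2.0, 6.0]), 0.0, np.array([2.0, 6.0])),
        # Edge case last: identical arrays plus a 2.1 offset, expect y-array shifted up by 2.1
        (np.array([1.0, 2.0]), np.array([1.0, 2.0]), 2.1, np.array([3.1, 4.1])),
    ],
)
def test_scale_to_ordering(orig_y, target_y, offset, expected_y):
    result = scale_to_toy(orig_y, target_y, anchor_index=1, offset=offset)
    np.testing.assert_allclose(result, expected_y)
```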

@sbillinge merged commit 3f2d0a4 into diffpy:main on Dec 27, 2024 (5 checks passed).
@yucongalicechen deleted the test-scaleto branch on December 28, 2024 at 00:13.
Successfully merging this pull request may close these issues: Improve comments for @pytest.mark.parametrize test cases in test_scale_to function