One other thought circling in my mind is to have the DOs carry around a master array that goes from 0 to q = 45 (about the widest range we will ever encounter) on a fine grid, and we interpolate the values onto it (linear or quadratic, whichever gives us sufficient accuracy). We could use it for various things, but this would be one of them, because we could compute the scaling for the arrays on this grid (but apply the scaling to the data on all the grids). Don't make this change, but let's think about it. The pro is that it is cleaner and nicer and could be used for other things. The con is that we have to carry around another largish array in memory.
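A minimal sketch of that idea, assuming each object exposes its pattern as plain (q, intensity) arrays; the names `q_master` and `scale_factor_on_master` are illustrative, not an existing API:

```python
import numpy as np

# Fine master grid spanning roughly the widest q-range we expect to encounter.
q_master = np.linspace(0, 45, 45001)  # ~0.001 A^-1 spacing

def scale_factor_on_master(q1, i1, q2, i2, q_master):
    """Compute a scale that matches pattern 1 to pattern 2 on the master grid."""
    # Interpolate both patterns onto the common fine grid (linear here;
    # quadratic could be swapped in if linear is not accurate enough).
    i1_m = np.interp(q_master, q1, i1)
    i2_m = np.interp(q_master, q2, i2)
    # Restrict to the overlapping q-range so extrapolated points don't bias the fit.
    lo, hi = max(q1.min(), q2.min()), min(q1.max(), q2.max())
    mask = (q_master >= lo) & (q_master <= hi)
    # Least-squares scale factor computed on the master grid ...
    return np.dot(i1_m[mask], i2_m[mask]) / np.dot(i1_m[mask], i1_m[mask])

# ... but applied to the data on its original grid:
# i1_scaled = scale_factor_on_master(q1, i1, q2, i2, q_master) * i1
```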
@sbillinge the new commit is ready for review. I like this idea and I think it will be very helpful for comparing absorption-corrected curves. We could also have a function that allows the user to specify which x-array to interpolate onto?
yes, that had occurred to me too. I think that was where my functions came from for building arrays using arange etc. But for the most flexibility with the least code, I would suggest we just allow the user to pass in a master array. We can force them to do it on "q". If they want to specify a tth value to scale to, we will have to convert.
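A hedged sketch of the "user passes a master array on q" idea; the helper names `tth_to_q` and `resolve_master_q` and the keyword handling are hypothetical, not the actual diffpy API:

```python
import numpy as np

def tth_to_q(tth_deg, wavelength):
    """Convert two-theta (degrees) to q (inverse angstroms) for a given wavelength."""
    return 4 * np.pi * np.sin(np.radians(tth_deg) / 2) / wavelength

def resolve_master_q(q_master=None, tth=None, wavelength=None):
    """Return the master grid on q, converting user-specified tth values if needed."""
    if q_master is not None:
        # The user supplied the master array directly, already on q.
        return np.asarray(q_master, dtype=float)
    if tth is not None and wavelength is not None:
        # The user specified tth values, so convert them to q first.
        return np.atleast_1d(tth_to_q(np.asarray(tth, dtype=float), wavelength))
    raise ValueError("provide either q_master, or tth together with a wavelength")
```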
Originally posted by @sbillinge in #211 (comment)