I'm currently exploring the limits of fitting larger groups of sources, where the fit gets painfully slow. I noticed that about 80% of the time was spent in the evaluate() method of the ePSF model. Digging into this, the call to the interpolator seemed really slow compared to hand-written linear interpolation.
It turns out it can be sped up massively by calling RectBivariateSpline differently: ev() is made for querying unstructured points, but with the grid=True argument to __call__() a regular grid of points can be interpolated much more efficiently; see the snippet below.
What to do about it?
I'm not sure how to make proper use of this, though: passing a sparse grid as input is somewhat at odds with the interface provided by the modelling framework. The fitter expects x, y, and function-value arrays of the same size, and one might also want to use the model as a plain function object, so I'm unsure about the ramifications for other use cases.
The logic that selects the pixels used in BasicPSFPhotometry.do_photometry() would also have to change, and I'm not sure what the most elegant solution is or what unpleasant side effects might arise, which is why this isn't a PR.
I'm starting to think that the model abstraction is more of a roadblock for this use case, since it also requires the whole parameter-name matching machinery.
I'm in the process of creating a variant of the PsfPhotometry class that uses the optimizer more directly, both to set bounds on parameters and to avoid the cost of a deep call stack through the compound-model evaluation layers (see also #1217). I think this is actually easier to implement and reason about. I'll link it here or open a PR when I have something presentable.
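As a rough sketch of that direction (all names below are illustrative, and a Gaussian stands in for the real ePSF evaluation), fitting a single source by handing residuals and bounds straight to scipy.optimize.least_squares could look like:

```python
import numpy as np
from scipy.optimize import least_squares

def psf_eval(x, y, x0, y0, flux):
    # Stand-in for a fast ePSF evaluation; a real implementation would
    # sample the oversampled ePSF grid instead of this Gaussian.
    return flux * np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / 2.0)

def fit_source(data, x, y, p0, half_width=2.0):
    """Fit (x0, y0, flux) for one source directly with least_squares."""
    x0, y0, flux = p0
    # Bounds keep the centroid inside the fitting region and the flux positive,
    # something the plain astropy fitter interface doesn't expose as directly.
    lo = (x0 - half_width, y0 - half_width, 0.0)
    hi = (x0 + half_width, y0 + half_width, np.inf)
    res = least_squares(
        lambda p: (psf_eval(x, y, *p) - data).ravel(),
        p0, bounds=(lo, hi))
    return res.x

# Usage: recover the parameters of a noiseless synthetic source.
yy, xx = np.mgrid[0:25, 0:25].astype(float)
truth = (12.3, 11.7, 50.0)
data = psf_eval(xx, yy, *truth)
print(fit_source(data, xx, yy, p0=(12.0, 12.0, 40.0)))
```

The residual function talks to the optimizer with a flat parameter vector, so no parameter-name matching or compound-model call stack is involved.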
Demo
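A minimal reconstruction of the comparison, assuming a synthetic 2-D Gaussian as the interpolated surface (the exact setup of the original snippet is not shown here):

```python
import numpy as np
import timeit
from scipy.interpolate import RectBivariateSpline

# Stand-in for an oversampled ePSF: a 2-D Gaussian sampled on a 101x101 grid.
x = np.linspace(-5, 5, 101)
y = np.linspace(-5, 5, 101)
z = np.exp(-(x[:, None] ** 2 + y[None, :] ** 2))
spline = RectBivariateSpline(x, y, z)

# Evaluation points: a 50x50 cutout around a source.
xi = np.linspace(-2, 2, 50)
yi = np.linspace(-2, 2, 50)
xg, yg = np.meshgrid(xi, yi, indexing="ij")

# Unstructured query: one (x, y) pair per output value.
t_ev = timeit.timeit(lambda: spline.ev(xg.ravel(), yg.ravel()), number=100)

# Grid query: pass only the two 1-D axes; the spline fills the full grid.
t_grid = timeit.timeit(lambda: spline(xi, yi, grid=True), number=100)

print(f"ev():      {t_ev:.4f} s")
print(f"grid=True: {t_grid:.4f} s")

# Both call styles agree when the points really do form a regular grid.
assert np.allclose(spline.ev(xg.ravel(), yg.ravel()).reshape(50, 50),
                   spline(xi, yi, grid=True))
```

The grid call evaluates the spline separably along each axis instead of point by point, which is where the speedup comes from.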