The underlying Fortran code can become unstable when presented with very large particles. It's a known issue with the code. There has been some discussion between various people about fixing it, but I'm not sure anyone has had the time to take it on.
You could increase the value of the ndgs attribute in the Scatterer class. By default ndgs=2; try increasing it to 100 (and experiment with this value). This attribute controls the convergence of the T-matrix computations.
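For reference, a minimal sketch of what that looks like, assuming the usual pytmatrix Scatterer interface; the radius, wavelength, refractive index and axis ratio below are placeholders, not values from this issue:

```python
from pytmatrix.tmatrix import Scatterer

# Placeholder particle: substitute your own radius, wavelength,
# refractive index and axis ratio.
scatterer = Scatterer(radius=2.0, wavelength=6.5,
                      m=complex(1.5, 0.5), axis_ratio=1.0/0.6)

# ndgs sets the number of division points used in the surface integrals
# of the T-matrix computation (default 2). Large size parameters may
# need a considerably higher value before the computation converges.
scatterer.ndgs = 100

# get_S() returns the 2x2 complex amplitude scattering matrix.
print(scatterer.get_S())
```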
I'll comment now on this long-standing issue since there hasn't been any progress: Unfortunately this is a problem with the core Fortran code that PyTMatrix uses (a code originally from NASA GISS).
The best solution to this problem would be to rewrite the T-Matrix core in Python/NumPy. This would also eliminate the compilation problems that many people have had with PyTMatrix. However, this is a serious effort, and I don't really expect to have the time to do it in the foreseeable future, especially as my own career is gravitating away from scattering. I'll leave this issue open in case some brave soul wants to tackle it.
Hi, I am running into a problem when I try to create a scatterer with a diameter that is ~40x larger than the wavelength. Is this a bug?