Subject: Working with large volumes of data in Igor
Message-ID: <CAGjoBTUaSjy7z6OaAv4AXXt=+Jaj9bQ_uff63GMFN84nG0dm8w@mail.gmail.com>
Hi All,
This is a somewhat broad question, but I wonder whether anyone in the Igor
community routinely works with large volumes of data in Igor. I am mostly
concerned with speeding up analyses that involve large numbers of waves or
traces.
The kind of scenario I am talking about is working with 50000 waves or
having 10000 traces in a graph. My previous experience was with roughly
100x fewer objects. As I scaled up the amount of data to analyze, I
noticed that many of my routines and visualization tools were starting to
lag. I have gone through some O-notation analysis as well as the function
profiler, but have reached the end of my current programming knowledge.
I realize the solution might mean shifting to a different tool altogether
(Python-based or MATLAB) or perhaps PC hardware improvements (I am on Igor
Pro 8 with Win10/16GB RAM/Intel Core i7). However, I have a feeling that
there may be better ways of working in Igor that I am not yet aware of.
Any tips or resources would be great!
Regards,
Albert