segfault when processing a large number of files sequentially
Assignee: Chris Cannam
A friend spotted this package the other day and I've been using it for a few days now - this is so awesome!
But... I have a bug to report. The following code will eventually crash Python with "Segmentation fault: 11" (memory leak?):
    import librosa, vamp

    data, rate = librosa.load("audiofile.wav")
    for i in range(100000):
        onsets = vamp.collect(data, rate, "qm-vamp-plugins:qm-onsetdetector", output='onsets')
        print i, len(onsets['list'])
In my case it crashed after iteration 2265.
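For anyone wanting to confirm this kind of per-iteration leak without librosa or the vamp plugins installed, the stdlib tracemalloc module can track allocation growth across iterations. This is a minimal sketch: leaky_step is a hypothetical stand-in for the leaking vamp.collect call, not the plugin code itself.

```python
import tracemalloc

# Hypothetical stand-in for the leaking call: each iteration keeps a
# reference alive, so traced memory grows without bound.
leaked = []
def leaky_step():
    leaked.append(bytearray(10_000))

tracemalloc.start()
before, _ = tracemalloc.get_traced_memory()
for i in range(100):
    leaky_step()
after, _ = tracemalloc.get_traced_memory()
tracemalloc.stop()

# For a genuine leak, growth scales with the iteration count
# (here roughly 100 * 10_000 bytes); for healthy code it stays flat.
growth = after - before
```

Running the real loop under a wrapper like this makes the leak visible long before the process actually segfaults.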
#2 Updated by Chris Cannam almost 5 years ago
- Status changed from New to Resolved
Fixed in v1.0.2 (now on PyPI). Thanks for the report; that was the prompt I needed to properly research the various refcounting semantics instead of just winging it.
I'd appreciate it if you could check and report whether your test case now works properly for you (and whether anything else appears to be broken).
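For anyone curious what "refcounting semantics" means here: CPython objects carry a reference count, and a C extension that takes a reference without releasing it leaks that object on every call. The effect can be observed from pure Python with sys.getrefcount; this is an illustrative sketch, not the actual fix.

```python
import sys

obj = object()
base = sys.getrefcount(obj)          # includes the temporary argument ref
holders = [obj for _ in range(5)]    # five extra references, never released
extra = sys.getrefcount(obj) - base  # -> 5

# A C extension that forgets Py_DECREF behaves like `holders` here:
# the count only ever goes up, and the memory is never returned.
```

In the original loop, each vamp.collect call leaking even one object is enough to exhaust memory after a few thousand iterations.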