Bug #1308

segfault when processing a large number of files sequentially

Added by Justin Salamon over 8 years ago. Updated over 8 years ago.

Status: Closed
Start date: 2015-06-26
Priority: High
Due date:
Assignee: Chris Cannam
% Done:
Target version: -


Hello folks,

A friend spotted this package the other day and I've been using it for a few days now - this is so awesome!
But... I have a bug to report. The following code will eventually crash Python with "Segmentation fault: 11" (memory leak?):

import librosa, vamp
data, rate = librosa.load("audiofile.wav")
for i in range(100000):
    onsets = vamp.collect(data, rate, "qm-vamp-plugins:qm-onsetdetector", output='onsets')
    print(i, len(onsets['list']))

In my case it crashed after iteration 2265.
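One way to confirm a leak like this without waiting for the segfault is to watch the process's resident set size climb across iterations. Below is a minimal sketch using the standard `resource` module (Unix-only); the `vamp.collect` call is replaced by a hypothetical stand-in that deliberately retains memory, since the point is the measurement technique, not the plugin itself:

```python
import resource

def rss_kb():
    # Peak resident set size so far (kilobytes on Linux, bytes on macOS)
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

# Stand-in for the leaking call: each invocation retains ~100 KB
sink = []
def leaky_step():
    sink.append(bytearray(100_000))

before = rss_kb()
for i in range(200):
    leaky_step()
after = rss_kb()

# A genuine leak shows up as RSS growing monotonically with the
# iteration count rather than plateauing after warm-up
print(before, after)
```

In the real test, one would call `vamp.collect(...)` in place of `leaky_step()` and log `rss_kb()` every few hundred iterations; a steadily rising curve points at a leak long before the crash.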


#1 Updated by Chris Cannam over 8 years ago

  • Assignee set to Chris Cannam
  • Priority changed from Normal to High

Mm, yes, this does look like a memory leak. Seems to be independent of which plugin you're using.

#2 Updated by Chris Cannam over 8 years ago

  • Status changed from New to Resolved

Fixed in v1.0.2 (now on PyPI). Thanks for the report, that was the prompt I needed in order to properly research the various refcounting semantics instead of just winging it.

I'd appreciate it if you could check & report whether your test case now works properly for you (& whether anything else appears to be broken).
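For anyone curious what getting the "refcounting semantics" right means in practice: a C extension that calls Py_INCREF on an object without a matching Py_DECREF leaks one reference (and the object's memory) per call, which is exactly the pattern that produces slow growth over thousands of iterations. From pure Python, `sys.getrefcount` can catch such a bug. The sketch below uses a hypothetical stand-in `process` function, not the actual vamp host code:

```python
import sys

def process(buf):
    # Stand-in for a C-extension call; a correct implementation
    # leaves the caller's refcount on `buf` unchanged after return
    return len(buf)

data = bytearray(1024)
before = sys.getrefcount(data)
for _ in range(1000):
    process(data)
after = sys.getrefcount(data)

# A function that leaked a reference per call would make `after`
# grow with the number of calls; a correct one leaves it equal
print(before, after)
```

Running a loop like this against each entry point is a cheap regression check that the reference accounting stays balanced.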


#3 Updated by Justin Salamon over 8 years ago

Nice one! I updated the package and re-ran the test; the problem seems to be solved (the loop is currently on iteration ~5k and still going...). Cheers!

#4 Updated by Chris Cannam over 8 years ago

  • Status changed from Resolved to Closed
