[PD] big files with textfile

Mathieu Bouchard matju at artengine.ca
Sat Oct 27 21:46:39 CEST 2007


On Sat, 27 Oct 2007, marius schebella wrote:

> I tried something different, which also did not work: split the files in 
> three parts, each around 100MB, and load them into 3 separate textfiles. 
> when I try to load the second file, I get an error saying "pd: 
> resizebytes() failed -- out of memory". If I try to load the 3rd file 
> after that, my system freezes completely, and I have to reboot.

Is Pd somehow allocating memory in a no-swap zone? That way, you can run 
out of memory much more quickly. But this normally requires root permissions.

You may also be running out of general memory. If you don't have a swap 
file then all your memory is no-swap all of the time.

On 32-bit Linux there's a per-process limit of somewhere between 1GB and 
3.5GB. I don't know exactly how much, but it's not below 1GB for sure, and 
someone told me he had allocated 2GB. You'd probably hit an actual memory 
allocation error well before reaching that absolute maximum, but even that 
error shouldn't occur below 1GB.

So, I'm really puzzled. Does your system have anything special about RAM? 
Any special limits (settings that say "max 256MB per process" and such)?

  _ _ __ ___ _____ ________ _____________ _____________________ ...
| Mathieu Bouchard - tél:+1.514.383.3801, Montréal QC Canada
