[PD] Best way to deal with many tables.

B. Bogart ben at ekran.org
Sat Feb 21 18:52:53 CET 2009


Thanks all for your comments.

Roman, The major "inelegance" of dynamic patching is the massive CPU cost
of generating a graph containing thousands of objects. In theory one only
needs to do this once, but I'm always changing the patch and the number
of tables. Dynamic patching works great with hundreds of tables, but
starts getting ugly in the thousands.
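For what it's worth, one way to sidestep the per-session CPU hit is to generate the patch file offline rather than sending dynamic-patching messages to a running Pd. A minimal sketch in Python (the filename, table names, and canvas layout are made up for illustration):

```python
# Sketch: generate a Pd patch containing N [table] objects by writing
# the .pd file directly, instead of dynamic patching at runtime.
def write_table_patch(path, n_tables, size=768):
    with open(path, "w") as f:
        f.write("#N canvas 0 0 450 300 10;\n")
        for i in range(n_tables):
            # One [table hist<i> 768] object per image.
            f.write(f"#X obj 10 {10 + i * 20} table hist{i} {size};\n")

write_table_patch("hists.pd", 1000)
```

Opening the generated file is then an ordinary patch load, so the object-creation cost is paid outside the running patch.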

Cyrille, This is an interesting idea, though pix_histo outputs tables. I'm
not sure what you mean by 2D tables; shouldn't they be 3D, with one
dimension per colour channel? Even if there were an object to dump a hist
into an image rather than into a table, I don't think the Gem pixel
operations would be any faster than the equivalent table operations
(slicing out, concatenating, listing, etc.). Please share if you have
already implemented something that does this kind of thing.

David, How fast was your Python code at generating the tables? It seems
like it would run into the same issues as dynamic patching.

It looks like I should just use the old tried-and-true dynamic patching
for now (at least until a better method comes along).

The reason I removed the dynamic patching from the rest of the patch was
that it simply didn't scale. My current 75x75-unit SOM is the biggest so
far, but I would like to go as big as the available RAM allows, say
100x100 or more.
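As a rough sanity check on whether RAM is really the limit: assuming each hist is 768 single-precision floats at about 4 bytes per element (Pd's actual per-element overhead may be higher), the raw hist data stays modest even at those sizes:

```python
# Back-of-envelope RAM for the hist data alone: one 768-bin hist per
# unit/image, ~4 bytes per element (Pd's real overhead may be larger).
bytes_per_hist = 768 * 4
for side in (75, 100):
    n = side * side
    mib = n * bytes_per_hist / 2**20
    print(f"{side}x{side}: {n} hists, {mib:.1f} MiB")
```

That works out to roughly 16 MiB for 75x75 and 29 MiB for 100x100, so the bottleneck is more likely object-creation time than the hist data itself.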

Another option would be to use tables to store the RGB hists, read them
directly in Python, and concatenate them into numpy arrays. Operations on
those are fast and flexible. Has anyone tried this approach? (I believe
this is how vasp does it.) That way I could just dump a list right into
ann_som and let Python store all the hist data. I think I've convinced
myself here.
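The numpy side of that idea might look something like this (the array shape, names, and the ann_som hand-off are assumptions for illustration):

```python
import numpy as np

# Sketch: hold every image's concatenated RGB hist as one row of a
# single (n_images, 768) array instead of thousands of Pd tables.
n_images = 100  # placeholder; scale toward 10,000 as RAM allows
hists = np.zeros((n_images, 768), dtype=np.float32)

# Concatenating the three 256-bin channel hists into one 768-bin row:
r, g, b = (np.random.rand(256) for _ in range(3))
hists[0] = np.concatenate([r, g, b])

# Dumping one row as a flat list (e.g. to pass on to ann_som):
row = hists[0].tolist()
```

Row slicing and `.tolist()` are constant-overhead numpy operations, so there is no per-element iteration in Pd at all.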

Thanks all for your comments!

.b.

cyrille henry wrote:
> hello,
> 
> your RGB hists are 1D tables, so what you need is a 2D table.
> the best 2D tables are probably images.
> if the 8-bit limitation is not a problem, you can store your arrays in
> one (or more) big image (e.g. 1000x768).
> pix_crop + pix_pix2sig will get a row of your image into a table.
> 
> Cyrille
> 
B. Bogart wrote:
>> Hey all.
>>
>> I've managed to get my patches to use fewer objects and more messages.
>>
>> Problem I have now is storing data in an organized way.
>>
>> Basically the system I'm working on needs to store the RGB hists of many
>> images (10,000 ideally, RAM permitting). RGB hists are concatenated into
>> tables of 768 elements each.
>>
>> What is the best way to deal with this number of tables? There are the
>> usual thoughts of using dynamic patching and such, but really I'd like a
>> more elegant solution.
>>
>> Has anyone worked on something like a multi-table or nested table?
>>
>> I could put everything in one giant table, but each chunk needs to end
>> up as a list, and it seems that iterating over a section of the table
>> to dump it as a list would be a lot slower than using [tabdump].
>>
>> Just wondering if anyone has any suggestions.
>>
>> I've already mentioned my wish to have a generic storage system (similar
>> to data-structures but independent of any graphical representation)
>> namely:
>>
>> tables of floats (done), tables of symbols, and most importantly tables
>> of tables!
>>
>> .b.
>>
>> _______________________________________________
>> Pd-list at iem.at mailing list
>> UNSUBSCRIBE and account-management ->
>> http://lists.puredata.info/listinfo/pd-list
>>
> 
