[PD] Retrieving Text from a URL | Webpage

errordeveloper at gmail.com
Sat Apr 26 13:32:14 CEST 2008


You could do the same with curl; plus, curl supports other protocols, not
just HTTP. And it has libcurl, which could probably be ported as an external,
so we'd have "pd<->web" communication going on. That'd be great for certain
kinds of applications!
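
As a rough idea of what such a libcurl-based external would have to wrap,
here is a minimal fetch in plain C. It's only a sketch: the callback name and
the default URL are made up for illustration, and a real external would
buffer the data and output it as a Pd message instead of printing it.

/* minimal libcurl fetch -- the kind of call a pd<->web external could wrap */
#include <stdio.h>
#include <curl/curl.h>

/* illustrative callback: print received bytes (a real external would
   buffer them and output them as a Pd message instead) */
static size_t write_cb(void *data, size_t size, size_t nmemb, void *userp)
{
    (void)userp;
    fwrite(data, size, nmemb, stdout);
    return size * nmemb;
}

int main(int argc, char *argv[])
{
    const char *url = (argc > 1) ? argv[1] : "http://puredata.info/";
    CURL *curl;
    CURLcode res;

    curl_global_init(CURL_GLOBAL_DEFAULT);
    curl = curl_easy_init();
    if (curl) {
        curl_easy_setopt(curl, CURLOPT_URL, url);
        curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, write_cb);
        res = curl_easy_perform(curl);
        if (res != CURLE_OK)
            fprintf(stderr, "curl error: %s\n", curl_easy_strerror(res));
        curl_easy_cleanup(curl);
    }
    curl_global_cleanup();
    return 0;
}

Build with something like "cc fetch.c -lcurl" (assuming the libcurl
development headers are installed).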

On Fri, Apr 25, 2008 at 04:15:14PM +0200, IOhannes m zmölnig wrote:
> mark edward grimm wrote:
> > Hello,
> > 
> > Didn't find any results on search...
> > 
> > Do we know a way to retrieve/grab Text from a
> > webpage/url in PD...
> > 
> > ... then write it to a textfile maybe? or insert into
> > a list.
> 
> you don't have to use Pd; you could just do
> "wget -O myfile.txt [URL]"
> 
> you can run this from within Pd (using [shell]) or create a small server
> that communicates with Pd using netsend/netreceive (or whatever you
> prefer). I have posted examples of how to do the latter on this list before
> (see the archives).
> I have used this method extensively to display websites within Pd (e.g.
> for http://umlaeute.mur.at/projects/surf-the-net).
> 
> 
> An alternative method is to use mrpeach's [tcpclient] object; its
> help patch even shows how to access a website.
> 
> 
> fgmasdr
> IOhannes
> 
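
To follow the wget/[shell] suggestion above and get the text back into a
patch, one option is a tiny helper that forwards lines from stdin to a
[netreceive] over TCP, using Pd's FUDI framing (a message is just atoms
terminated by a semicolon). The following is only a sketch under the
assumption of a [netreceive 3000] listening on localhost; it does not escape
FUDI specials such as ';' or '$' that may occur in real web text. (Pd also
ships a pdsend utility that does essentially this already.)

/* pdsend_lines.c -- hypothetical helper: forward stdin lines to Pd.
 * Each input line becomes one FUDI message ("... ;\n") sent over TCP
 * to a [netreceive <port>] in the patch.
 *
 * Example (the port is an assumption matching [netreceive 3000]):
 *   wget -O - http://example.com/ | ./pdsend_lines 3000
 */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>
#include <netinet/in.h>
#include <arpa/inet.h>
#include <sys/socket.h>

int main(int argc, char *argv[])
{
    int port = (argc > 1) ? atoi(argv[1]) : 3000;
    int fd = socket(AF_INET, SOCK_STREAM, 0);
    struct sockaddr_in addr;
    char line[4096], msg[4100];

    memset(&addr, 0, sizeof(addr));
    addr.sin_family = AF_INET;
    addr.sin_port = htons(port);
    addr.sin_addr.s_addr = inet_addr("127.0.0.1");

    if (fd < 0 || connect(fd, (struct sockaddr *)&addr, sizeof(addr)) < 0) {
        perror("connect to pd");
        return 1;
    }
    while (fgets(line, sizeof(line), stdin)) {
        line[strcspn(line, "\r\n")] = '\0';   /* strip trailing newline */
        if (line[0] == '\0')
            continue;                         /* skip empty lines */
        int n = snprintf(msg, sizeof(msg), "%s;\n", line);
        if (write(fd, msg, n) < 0) {
            perror("write");
            break;
        }
    }
    close(fd);
    return 0;
}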



