Lars Aronsson wrote:
> On 07/20/2010 04:30 PM, Roan Kattouw wrote:
>> This does need client-side support, e.g. using the Firefogg extension
>> for Firefox or a bot framework that knows about chunked uploads.
> Requiring special client software is a problem. Is that really
> the only possible solution?
It appears to be the only good solution for really large files. Anything with a
progress bar requires client-side support. Flash or a Java applet would be
enough, but both suck pretty badly.
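For the curious, what such a client has to do boils down to something like the following Python sketch: split the file into fixed-size pieces and send each piece together with its byte offset, so the server can reassemble them and the client can resume after a failed piece. This is an illustration only, not the actual Firefogg code; all names and the chunk size are made up.

```python
CHUNK_SIZE = 1024 * 1024  # 1 MiB per chunk -- an arbitrary illustrative choice

def iter_chunks(data, chunk_size=CHUNK_SIZE):
    """Client side: yield (offset, chunk) pairs covering the whole byte string."""
    for offset in range(0, len(data), chunk_size):
        yield offset, data[offset:offset + chunk_size]

def reassemble(chunks, total_size):
    """Server side: put the chunks back together by offset."""
    buf = bytearray(total_size)
    for offset, chunk in chunks:
        buf[offset:offset + len(chunk)] = chunk
    return bytes(buf)
```

Because every chunk carries its own offset, a client that loses its connection can simply resend from the last acknowledged offset instead of starting the whole upload over.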
I'd say: if people really need to upload huge files, it's ok to ask them to
install a browser plugin.

> I understand that a certain webserver or PHP configuration can
> be a problem, in that it might receive the entire file in /tmp (that
> might get full) before returning control to some upload.php script.
IIRC, PHP even tends to buffer the entire file in RAM(!) before writing it to
/tmp. Which is totally insane, but hey, it's PHP. I think that was the original
reason behind the low limit, but I might be wrong.

> But I don't see why HTTP in itself would set a limit at 100 MB.
HTTP itself doesn't. I guess as long as we stay in the 31-bit range (about 2 GB),
HTTP will be fine. Larger files may cause overflows in sloppy software.
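To illustrate why 2 GB is the magic number: if a program stores a byte count in a signed 32-bit integer, anything past 2^31 - 1 wraps around to a negative value. A small Python model of that wraparound (the function is illustrative, not from any real HTTP stack):

```python
def as_int32(n):
    """Model what happens when a byte count is stored in a signed
    32-bit integer: values past 2**31 - 1 wrap around to negative."""
    n &= 0xFFFFFFFF               # keep only the low 32 bits, as the hardware would
    return n - 2**32 if n >= 2**31 else n

print(as_int32(2**31 - 1))    # 2147483647 -- the largest "safe" size, just under 2 GiB
print(as_int32(2 * 1024**3))  # -2147483648 -- a 2 GiB Content-Length goes negative
```

A server that treats such a negative length as "empty" or "invalid" will mysteriously reject or truncate the upload.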
However, HTTP doesn't allow people to resume uploads or watch progress (the
latter could be done by browsers - sadly, I have never seen it). Thus, it sucks
for very large files.

> What decides this particular limit? Why isn't it 50 MB or 200 MB?
I think it was raised from 20 MB to 100 MB a year or two ago. It could be raised
a bit again, I guess, but a real solution for really large files would be better,
don't you think?

> Some alternatives would be to open a separate anonymous FTP
> upload ("requires special client software" -- from the 1980s,
> still in use by the Internet Archive) or a get-from-URL
> (server would download the file by HTTP GET from the user's
> server at a specified URL).
To make this sane and safe, while making sure we always know which user did what,
etc., would be quite expensive. I have been thinking about this kind of thing for
mass uploads (i.e. upload a TAR via FTP, have it unpack on the server, import).
But that's another kettle of fish. Finishing chunked upload is better for the
average user (using FTP to upload stuff is harder on the average guy than
installing a Firefox plugin...)
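Just to illustrate one part of the "sane and safe" problem with get-from-URL: at minimum the server has to stream the remote file with a hard size cap, so a hostile or mistyped URL can't fill its disk. A rough Python sketch (the cap, names, and exception are invented for illustration):

```python
import io

MAX_FETCH = 100 * 1024 * 1024  # hypothetical server-side cap, 100 MB

class TooLarge(Exception):
    """Raised when the remote file exceeds the configured cap."""

def fetch_capped(stream, limit=MAX_FETCH, bufsize=64 * 1024):
    """Copy from an open stream (e.g. an HTTP response body) into a
    buffer, aborting as soon as the size cap is exceeded."""
    out = io.BytesIO()
    copied = 0
    while True:
        block = stream.read(bufsize)
        if not block:
            return out.getvalue()
        copied += len(block)
        if copied > limit:
            raise TooLarge(f"remote file exceeds {limit} bytes")
        out.write(block)
```

And that still leaves the harder parts: attributing the upload to a logged-in user, validating the file type, and deciding whose URLs to trust at all.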
Wikitech-l mailing list