When exporting the catalogue via tools/export, the download works fine, but if the catalogue is large, the user has no idea how long it will take. Koha does not tell the browser the size of the file (via the Content-Length HTTP header, I assume), so the browser cannot estimate the remaining time. It would be nice if it did.
I'm no Perl expert, but I had a look and, apparently, tools/export.pl calls Koha::Exporter::Record::export to export the bibliographic/authority records. Unless given a filename, the export() subroutine in Koha/Exporter/Record.pm writes to STDOUT, hence the inability to know the size of the file beforehand. So I think that, with the current set of Perl scripts and modules, it's not possible to tell the browser the size of the file in advance...
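To illustrate the problem, here is a hypothetical sketch of the streaming pattern (not the actual code in tools/export.pl): the HTTP headers have to be printed before the first record is written, so there is nothing to put into a Content-Length header at that point.

    use Modern::Perl;
    use MARC::Record;

    # Headers go out first; the total size of the dump is unknown
    # here, so no Content-Length can be emitted.
    print "Content-Type: application/octet-stream\r\n";
    print "Content-Disposition: attachment; filename=\"koha.mrc\"\r\n";
    print "\r\n";

    my @records = ();    # placeholder: in reality fetched from the database
    for my $record (@records) {
        print $record->as_usmarc;    # stream each record as ISO2709
    }

The browser only learns the total size when the connection closes, which is exactly why it cannot show a remaining-time estimate.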
Linking to bug 6952, which suggests showing the number of records before the download.
Yeah, this is a tough one. As Andreas suggests, it's impossible to output the Content-Length unless we fetch the entire data dump first and only then send it out. For a large database you're not going to be able to do that in RAM, so you're going to need a temporary file. (tools/export.pl is actually problematic in general; see Bug 26791.)

If we wanted to use a temporary file instead of streaming out the response record by record, we'd be best off using a BackgroundJob to prepare the file, although then you have potential disk space issues for large data dumps.

For a large file it would be more efficient to have Apache httpd serve it as a static file than to have Starman do it, but then you wouldn't have authentication and authorization protecting the file. So we'd probably still use Starman, but we'd need to make sure it went through either a CGI script or a Mojolicious controller and not Plack, since CGI::Emulate::PSGI buffers the entire HTTP response before sending it out.

But it's something on my mind 😅
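For what it's worth, here is a rough sketch of that temporary-file approach. The export() parameter names follow how I recall tools/export.pl calling it, so treat them as illustrative, and @record_ids is a placeholder:

    use Modern::Perl;
    use File::Temp qw( tempfile );
    use Koha::Exporter::Record;

    my @record_ids = ();    # placeholder: whatever the user selected

    # Write the full dump to a temporary file first, by passing a
    # filename so export() does not default to STDOUT.
    my ( $tmp_fh, $tmp_name ) = tempfile( UNLINK => 1 );
    close $tmp_fh;
    Koha::Exporter::Record::export(
        {
            record_type => 'bibs',
            record_ids  => \@record_ids,
            format      => 'iso2709',
            filename    => $tmp_name,
        }
    );

    # Now the size is known, so a Content-Length can be sent.
    my $size = -s $tmp_name;
    print "Content-Type: application/octet-stream\r\n";
    print "Content-Length: $size\r\n";
    print "Content-Disposition: attachment; filename=\"koha.mrc\"\r\n\r\n";

    # Stream the file back in chunks to keep memory usage flat.
    open my $fh, '<:raw', $tmp_name or die "Cannot open $tmp_name: $!";
    while ( read( $fh, my $buffer, 64 * 1024 ) ) {
        print $buffer;
    }
    close $fh;

That still ties up a Starman worker for the whole export, which is why preparing the file in a BackgroundJob first would be the better fit.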