[ic] Out of memory problems with Downloadable products.
> Quoting Nathan Pitts (npitts@cybernet.com):
> > Hello.
> > I have recently put Interchange online as the ecommerce application
> > for my company. We are offering our software products for download,
> > available as CD-ROM ISO images.
> > Each time that a customer attempts to download a product, the
> > webserver runs out of memory.
> > I have figured out what the problem is, but not how to fix it.
> > It appears that in /usr/lib/interchange/bin/interchange, a download
> > is handled by a subroutine called do_deliver. This do_deliver sub
> > makes a call to /usr/lib/interchange/lib/Vend/Util.pm->readfile and
> > then a call to /usr/lib/interchange/lib/Vend/Server.pm->respond.
> > The problem is that Util::readfile performs a "slurp" on the file to
> > be delivered, meaning it tries to read the entire file into memory
> > before delivering it. With a 650 MB ISO image, this is not a viable
> > solution, especially considering that more than one user may be
> > downloading at one time.
> > Has anyone out there used Interchange to serve large downloads? Is it
> > possible?
>
> You have to have Interchange symlink the ISO into the document root and
> hand it off to HTTP or FTP. There have been multiple examples of this on
> this list over the years.
>
> --
> Mike Heins
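For the archives, here is roughly why the slurp blows up, and what a
streaming read looks like by contrast. This is only a sketch of the two
patterns with a placeholder path, not the actual Vend::Util::readfile or
Vend::Server::respond code; Mike's hand-off approach below sidesteps the
issue entirely.
#!/usr/bin/perl
use strict;
use warnings;

my $iso = '/tmp/example.iso';    # placeholder path, not a real product file

## Slurp, roughly what a readfile-style helper does: the entire file
## ends up in a single Perl scalar, so a 650 MB ISO costs 650+ MB of RAM
## per concurrent download.
open my $in, '<', $iso or die "open $iso: $!";
binmode $in;
my $contents = do { local $/; <$in> };
close $in;

## Streaming alternative: memory stays at one buffer's worth, but the
## simpler route is to let the HTTP/FTP daemon serve the file directly.
open $in, '<', $iso or die "open $iso: $!";
binmode $in;
while ( read($in, my $buf, 65536) ) {
    print $buf;                  # pass each 64 KB chunk straight through
}
close $in;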
Mike wrote one for us, Nathan. Here is the usertag:
UserTag enable-download Order    resource
UserTag enable-download addAttr
UserTag enable-download Routine  <<EOR
sub {
    my ($resource, $opt) = @_;
    my $docroot = $opt->{document_root} || $::Variable->{DOCROOT};
    my $dl_dir  = $opt->{base_dir}      || 'tmp_download';
    my $server  = $opt->{server}        || $::Variable->{SERVER_NAME};

    ## We use the session ID for the directory name; random_string()
    ## (from Vend::Util) would make it completely random instead
    #my $random = random_string();
    my $random = $Vend::Session->{id};

    require File::Path;
    my $dir = "$docroot/$dl_dir/$random";
    if(-d $dir) {
        # Must have been made previously
    }
    elsif( File::Path::mkpath($dir) ) {
        ## Need to ensure readable for HTTP download
        chmod 0755, $dir;
    }
    else {
        logError("Unable to make user download directory %s", $dir);
        return undef;
    }

    # file_name_is_absolute() comes from Vend::Util
    unless( file_name_is_absolute($resource) ) {
        $resource = "$Vend::Cfg->{VendRoot}/$resource";
    }

    # Get the base file name
    my $filebase = $resource;
    $filebase =~ s,.*/,,;

    ## Now do the symlink
    symlink $resource, "$dir/$filebase"
        or do {
            logError("Unable to symlink %s to %s", $resource, $dir);
            return undef;
        };

    ## Return the URL of the now-symlinked resource
    return "http://$server/$dl_dir/$random/$filebase";
}
EOR
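One thing to check: the symlinked files only get served if the web server is
willing to follow symlinks under that directory. On a stock Apache that
usually means something like the fragment below; the path and setup are
assumptions about a typical install, not part of Mike's tag.
# Hypothetical httpd.conf fragment -- match the path to your DOCROOT setting.
# Without FollowSymLinks (or SymLinksIfOwnerMatch), Apache returns 403
# for the symlinked ISO images.
<Directory "/var/www/html/tmp_download">
    Options FollowSymLinks
</Directory>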
This should be accompanied by a cron job which runs every half hour
or so:
## Wherever that temporary base is
BASEDIR=/var/www/html/tmp_download
find "$BASEDIR" -type l -mmin +60 -exec rm '{}' ';'
find "$BASEDIR" -depth -mindepth 1 -type d -empty -exec rmdir '{}' ';'
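For the "every half hour" schedule, a line like this in root's crontab
(crontab -e as root) would do; the script path is just a placeholder for
wherever you put those two find commands:
# placeholder path -- a small shell script holding the two find commands above
*/30 * * * * /usr/local/sbin/clean_tmp_download.sh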
You can call the download tag with:
[enable-download resource="/path/to/download/dir/[item-code].zip"]
-------------
You will probably have to adjust that /path/to/download/dir/[item-code].zip path to fit your needs.
-------------
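For what it's worth, on a results or receipt page the tag usually just ends
up inside an ordinary link. The path and the use of [item-code] and
[item-description] inside an [item-list] loop are placeholders here, not
something prescribed by Mike's tag:
[item-list]
  <!-- placeholder path; point it at wherever the ISO images actually live -->
  <a href="[enable-download resource="/var/files/isos/[item-code].iso"]">
    Download [item-description]
  </a>
[/item-list]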
This was all written by Mike Heins, in like 4 seconds :)
Paul