Limit the number of parallel downloads per IP

We have an Amazon S3 bucket which generates time-limited web URLs using the following code:

<?php
define('WP_USE_THEMES', false);
require 'wp-blog-header.php';
require_once './sdk-1.5.3/sdk.class.php';
require_once './sdk-1.5.3/services/s3.class.php';

$FILENAME = $_GET['filename'];

S3GetFile($FILENAME);

function S3GetFile($filename){
    if(is_user_logged_in()){
        $url = getS3SignedURL($filename);
        header('Location: '.$url);
    }else{
        $url = 'http://myurl.com/';
        header('Location: '.$url);
    }
}

function getS3SignedURL($filename){
    $s3 = new AmazonS3();
    $bucket = 'bucketname';
    $url = $s3->get_object_url($bucket,$filename,'2 minutes');
    return $url;
}

So, that's the code. We're using the AWS SDK for PHP (v1.5.3).

This is a membership site. We're having problems with some of the members downloading all the files from the server in parallel. The host keeps shutting us down, citing heavy resource usage.

I'm trying to figure out the best way to limit the number of simultaneous downloads per IP or per user (possibly both, since I'd like to test different scenarios). I'm checking the documentation as well, but pointers from anyone here who already knows would be appreciated.
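One caveat worth noting up front: because the script redirects to a signed S3 URL, the PHP side never sees when a download actually finishes, so truly counting *simultaneous* transfers isn't possible from here. A practical approximation is to limit how many signed URLs each user (or IP) can request per time window. Below is a minimal, hedged sketch of that idea using the WordPress Transients API (available since `wp-blog-header.php` is loaded). The function name `can_request_download`, the limit of 3, and the 120-second window are all my own placeholders, not anything from the SDK:

```php
<?php
// Hypothetical rate-limit helper: caps signed-URL requests per user
// (falling back to per-IP for anonymous visitors) per time window.
// Uses WordPress transients, which expire automatically.
function can_request_download() {
    $limit  = 3;    // max signed URLs per window -- tune to taste
    $window = 120;  // window in seconds, matching the 2-minute URL expiry

    // Key on the user ID when logged in, otherwise on the client IP.
    $who = is_user_logged_in()
        ? 'user_' . get_current_user_id()
        : 'ip_' . $_SERVER['REMOTE_ADDR'];

    $key   = 'dl_count_' . md5($who);
    $count = (int) get_transient($key);

    if ($count >= $limit) {
        return false;  // over the limit for this window
    }
    set_transient($key, $count + 1, $window);
    return true;
}
```

It would then slot into `S3GetFile()` before the redirect, e.g. `if (!can_request_download()) { header('HTTP/1.1 429 Too Many Requests'); exit; }`. Note that transients may be backed by the options table or an object cache, so under heavy concurrency the read-increment-write above can race; a single atomic counter (Redis/Memcached `INCR`) would be more robust if that matters.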

Many Thanks, Sachin.

asked September 23, 2012 at 03:09

Per user should be a good start. If you go by IP, you'll probably need to invalidate user sessions when their IP changes, e.g. if they're behind a proxy. -

@endyourif, exactly... that was my thought as I finished writing this question... but I figured per IP would be good to know too :) -

0 Answers
