neurofuzzy, flash game development, rich internet applications, free source code – *alt.neurotica.fuzzy*

8/26/2006

Amazon S3 PHP Class Update

Filed under: PHP — geoff @ 4:14 pm

I've made some small updates to my Amazon S3 PHP Class. It now supports the prefix, max-keys, and marker parameters for listing objects in buckets; delimiter support is still on the to-do list. The main reason I am posting is that I've heard some people are seeing SignatureDoesNotMatch errors when authenticating requests. It's probably just a parameter ordering problem, but I am unable to replicate the error. If anyone has experienced this problem, please post a comment here. If anyone has found a fix, please let me know!

The latest class will always be available via the link, Amazon S3 PHP Class Download.  The class is free to use and open source.
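If you just want a quick start, here is a minimal usage sketch, assuming your AWS keys are already configured in the class file. The method signatures below match the examples that appear further down in the comments, but check the source of the version you download:

    require_once('s3.class.php');

    $s3 = new S3();
    $s3->putBucket('mybucket');   // create the bucket if it doesn't already exist

    // upload a local file, readable by anyone, served as JPEG
    $data = file_get_contents('/path/to/photo.jpg');
    $s3->putObject('photo.jpg', $data, 'mybucket', 'public-read', 'image/jpeg');

    // fetch it back
    $copy = $s3->getObject('photo.jpg', 'mybucket');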

I'd also like to use this post as a place to collect links to all the cool interfaces and tools people have created using this class. Please post a comment with linkage. Thanks!

UPDATE 8/27/2006: Thanks to Bryan Kennedy for figuring out that some versions of PHP do not correctly report gmdate('r'); I'm now defining DATE_RFC822 in the class and passing it as the format string to gmdate(). I've also added a Content-MD5 header to PUT object requests. If object data is being sent, the class will compute the MD5 checksum of that data and send it to Amazon for verification. This is an optional header, but it makes the service more robust. Keep in mind Amazon has noted in their updates that this comparison is weak, and future updates may include headers in the checksum.
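The checksum itself boils down to roughly the following (a sketch; the $headers array here is only illustrative, but note that S3 expects the base64-encoded raw binary MD5 digest, not the usual hex string):

    // Content-MD5 for the PUT request body
    $headers['Content-MD5'] = base64_encode(md5($data, true));  // true = raw binary digest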

UPDATE 8/30/2006: Fixed a bug where I forgot to declare delimiter in the class (oops!). I am now refreshing the date in the sendRequest method, to be sure the date is current if the class is used for multiple requests over a long period of time (thanks to Chris Shepherd). Chris also says I need to fix a bug that occurs when using metadata with numeric keys: ksort doesn't sort them the way Amazon wants them. Looking into this…
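(If anyone wants to experiment in the meantime, one untested possibility is to force ksort to compare the keys as strings, so numeric metadata keys sort lexicographically the way Amazon canonicalizes them; $amzHeaders below is just a stand-in for whatever array the class actually sorts:

    // untested sketch: compare keys as strings rather than integers
    ksort($amzHeaders, SORT_STRING);

)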

UPDATE 9/9/2006: Thanks to Nathan Schmidt for discovering that I had misspelled delimiter as delimeter. I've changed this to the proper 'delimiter' throughout the class. If you are using delimiters in your GET operations, please update.

51 Comments

  1. >If anyone has experienced this problem

    If you get "SignatureDoesNotMatch", it is due to a bug in HTTP_Request 1.3.0 at line 765.

    Comment by Andrzej — 9/5/2006 @ 12:06 pm

  2. Andrzej is speaking of the PEAR HTTP_Request class.

    Andrzej,

    Is this documented, and do you happen to have a fix for it?

    Comment by geoff — 9/5/2006 @ 1:19 pm

  3. The HTTP_Request bug is documented in the Amazon S3 php example. The example code can be found at: http://developer.amazonwebservices.com/connect/entry.jspa?externalID=126&categoryID=47

    Comment by Eric Davis — 9/17/2006 @ 1:23 pm

    You still have a bug in the DATE_RFC822 formatting. The current updated format string will still exhibit inconsistency problems on some PHP installations.

    Fix:

    $this->httpDate = gmdate('D, d M Y H:i:s \G\M\T', time());

    Comment by Xing — 10/14/2006 @ 4:31 am

  5. Question: Does use of this class to send large objects (i.e. large media files, 1GB+) require that the objects get loaded into memory first? Initial use of this class to backup several large files to S3 has driven my server into the ground, and it appears that memory usage might be the cause.

    Comment by Peter Rukavina — 10/17/2006 @ 3:05 pm

    The S3 PHP class uses the PEAR HTTP_Request package, so any memory issues you are seeing would stem from that package's implementation.

    Another alternative would be to use cURL instead. I've had good luck with cURL in other situations, but have not tried it with S3. If you have any luck with it, let me know!
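    Roughly (untested, and assuming you compute the date and signature the same way the class does), a streaming cURL PUT would look something like this:

    $fh = fopen($filePath, 'r');
    $ch = curl_init('http://s3.amazonaws.com/mybucket/myobject');
    curl_setopt($ch, CURLOPT_PUT, true);                    // HTTP PUT
    curl_setopt($ch, CURLOPT_INFILE, $fh);                  // stream the file instead of loading it
    curl_setopt($ch, CURLOPT_INFILESIZE, filesize($filePath));
    curl_setopt($ch, CURLOPT_HTTPHEADER, array(
        'Date: ' . $httpDate,
        'Authorization: AWS ' . $accessKey . ':' . $signature,
    ));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);         // capture the response body
    $response = curl_exec($ch);
    curl_close($ch);
    fclose($fh);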

    Comment by geoff — 10/17/2006 @ 3:40 pm

  7. Hi,

    How is it possible to stream files to the S3 servers? I mean something like:

    $handle = fopen("…");

    while(!feof($handle))

    {

    // Put the data to S3

    }

    ???

    Comment by Sven — 11/23/2006 @ 10:09 am

  8. I’ve also written my own S3 class. Broke it down into an S3Service class that has all static methods for helping and stuff. S3Bucket class that has simple methods, since it doesn’t get used in production much. And, an S3Object class that does all the major S3 object stuff.

    But, anyway, the point is that I'm not impressed with HTTP_Request (and other PEAR packages), so instead I rely on PHP streams! I think you'll find PHP's streams much easier to work with. All you have to do is set the context when opening the stream.
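    Very roughly, something like this (the variable names are only illustrative, and you still have to build the Date header and signature yourself, just as the class does):

    $context = stream_context_create(array(
        'http' => array(
            'method'  => 'PUT',
            'header'  => "Date: $httpDate\r\n" .
                         "Authorization: AWS $accessKey:$signature\r\n" .
                         "Content-Type: image/jpeg\r\n",
            'content' => $data,
        ),
    ));
    // the http wrapper sends the PUT and hands back the response body
    $response = file_get_contents('http://s3.amazonaws.com/mybucket/myobject', false, $context);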

    BTW: you should experiment with fork’ing multiple processes for batch saving. For me 1 proc would go at about 300KB/s, but when using 4 procs the combined throughput goes up to 1.3MB/s. Very useful for mass uploading directories.

    Comment by joel — 11/26/2006 @ 5:43 pm

    I have also been trying to stream large files. The PEAR module breaks down when it tries to load the entire body into a variable
    ($req->setBody($objectdata));
    you run into the memory limits set in your php.ini. Does anyone have a working example using the PHP streaming functions? I have had no success with this.

    Comment by Bart — 11/28/2006 @ 12:13 pm

    When using your PHP class to put content to an S3 bucket, how can I programmatically tell if the put went through correctly? In a nutshell, my code is:

    if ($data = file_get_contents($filePath)) {
        $transfer = new s3();
        $transfer->putObject($object_id, $data, $bucket, "public-read", "image/jpeg");
    } else {
        die("error.");
    }

    This would only error out if $filePath was invalid. But I'm getting close to a 5% failure rate for saving to S3 in one of my buckets and I can't figure out why. Is there some way I can get an error code from the S3 call?

    Comment by Bob — 12/20/2006 @ 2:39 am

  11. Bob,

    This should work better:

    $response = $transfer->putObject(…

    if ($response) {

    // rejoice!

    } else {

    // weep

    }

    Comment by geoff — 12/21/2006 @ 11:58 am

  12. Geoff. That's really helpful. Thank you.

    Comment by Bob — 12/22/2006 @ 8:50 am

  13. Hi
    Is there any way to upload large files without using up all the memory? Uploading large objects (1GB and up) with this code causes them to be loaded into memory first, so uploading large objects is difficult. I have searched a lot for a solution to this problem (I also saw a post here about it); I hope one of you can come up with a solution in PHP.

    Thanks in advance

    Comment by Steve — 1/5/2007 @ 5:58 pm

  14. First, thanks for taking the time to create (and share) this S3 class.

    I have been trying to implement it on my website, but I've been having problems. I can use some of the GET functions, but when I try PUT functions I get "signature doesn't match" errors.

    I imagine that the part of the signature that contains the content, the details, etc. is not encoding correctly.

    I am using v1.4.0 of HTTP_Request, so I'm assuming it's not affected by the same bug as 1.3.0; in any case, I have also tried 1.3.0 (with the modification) and get the same error.

    Has anyone else had the same problem?

    Comment by Robin — 3/11/2007 @ 9:13 am

  15. Like the posts above, thanks for this great code. I am having a little problem though. Whenever I try to perform an action, I always get the following response:

    Signing String: PUT Mon, 23 Apr 07 21:17:03 +0000 /[Bucket name goes here]

    <Code>AccessDenied</Code>

    AWS authentication requires a valid Date or x-amz-date header

    If it helps any, I am using v1.5.1 of HTTP_Request.

    Comment by Ravi — 4/23/2007 @ 11:23 am

  16. Thanks Geoff. Keep up the good work.

    Comment by James — 4/24/2007 @ 10:16 pm

  17. This was very helpful Geoff, thanks! Saw some questions about streaming an object without loading its entire contents into memory, so here is a version of SendRest that does just that. Note that instead of passing in the entire request, you pass in only the header and a file name.

    private static function SendRest($header, $fileName = NULL, $debug = false)
    {
        if ($debug)
        {
            echo "\nQUERY>\n";
        }

        // open a socket to S3
        $s3Sock = fsockopen(URL, PORT, $errno, $errstr, TIMEOUT);
        if (!$s3Sock)
        {
            return FALSE;
        }

        // write the header
        fwrite($s3Sock, $header);

        // write file data if needed
        if (isset($fileName))
        {
            if (! $fileSock = fopen($fileName, "r"))
            {
                return FALSE;
            }
            if (! self::StreamFile($s3Sock, $fileSock, 8192))
            {
                return FALSE;
            }
            // close file handle
            fclose($fileSock);
        }

        // check the response
        $r = '';
        $check_header = true;
        while (!feof($s3Sock))
        {
            $tr = fgets($s3Sock, 256);
            if ($debug)
            {
                echo "\nRESPONSE>";
            }
            $r .= $tr;

            if (($check_header) && (strpos($r, "\r\n\r\n") !== false))
            {
                // if content-length == 0, return query result
                if (strpos($r, 'Content-Length: 0') !== false)
                {
                    return $r;
                }
            }

            // keep-alive responses do not return EOF;
            // they end with a \r\n0\r\n\r\n string
            if (substr($r, -7) == "\r\n0\r\n\r\n")
            {
                return $r;
            }
        }

        // close S3 socket
        fclose($s3Sock);

        return $r;
    }

    private static function StreamFile($dest, $src, $chunkSize)
    {
        while (! feof($src))
        {
            if (! fwrite($dest, fread($src, $chunkSize)))
            {
                return FALSE;
            }
        }
        return TRUE;
    }
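    For completeness, a hypothetical call from inside the class might look like this (the date and $signature are assumed to be computed exactly as the class already does them; the bucket, key, and content type here are made up):

    $header = "PUT /mybucket/bigfile.mov HTTP/1.0\r\n"
            . "Host: s3.amazonaws.com\r\n"
            . "Date: " . $httpDate . "\r\n"
            . "Content-Type: video/quicktime\r\n"
            . "Content-Length: " . filesize($fileName) . "\r\n"
            . "Authorization: AWS " . $accessKey . ":" . $signature . "\r\n"
            . "\r\n";
    $response = self::SendRest($header, $fileName);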

    Comment by rajesh subramanian — 6/28/2007 @ 3:12 am

  18. Geoff:

    Thanks for providing access to the script. It works. I am having trouble displaying images that have "public-read" access. Does anyone have similar issues displaying images? We have tens of thousands of images. We are getting the image names properly, they have "public-read" access, and we still cannot display them. Some images get displayed properly (I am guessing they are within the first 1000). Has anyone seen this behavior? Any comments/recommendations for this problem are appreciated. Thanks again.

    Comment by vasu — 7/4/2007 @ 1:26 am

  19. To anyone getting the “AWS authentication requires a valid Date or x-amz-date header” error:

    Rename DATE_RFC822 to DATE_RFC822_S3 everywhere and it should go away. It’s happening because DATE_RFC822 is already defined somewhere else in PHP and can’t be re-defined.
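    In other words, something along these lines (a sketch, reusing the GMT format string Xing posted above):

    // instead of the class's define('DATE_RFC822', ...):
    define('DATE_RFC822_S3', 'D, d M Y H:i:s \G\M\T');

    // ...and wherever the class builds the date:
    $this->httpDate = gmdate(DATE_RFC822_S3, time());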


    Ben
    http://www.infiniteftp.info
    FTP access to Amazon S3 – $4.95 monthly

    Comment by Ben Allfree — 7/20/2007 @ 2:02 pm

    [...] Now came the big one: bandwidth consumption for images. If you look, every chuza page has umpteen avatars (who commented, who voted, the listings) which, although each one weighs little, together form a legion that multiplies the hits on the server with every page load and keeps adding kilobytes of transfer while the hosting company rubs its hands. To solve this I decided I had to move the images off the server to another one. That other server had to charge less for transfer, be fast and reliable, and offer a simple way for the two servers to pass users' images between them. After a lot of looking around I went with Amazon S3, the storage service offered by the well-known Amazon, which is relatively cheap (there is no minimum usage; you can check the cost with their calculator), extremely fast and reliable, and provides users with an API for communicating with it easily. After just a little searching I found this PHP class, which let me easily integrate uploading user images to Amazon's servers. Fast, clean, and simple. [...]

    Pingback by chuza! blog » Blog Archive » Speeding up chuza! — 8/22/2007 @ 3:07 pm

  21. [...] Amazon S3 PHP Class by Geoffrey P. Gaudreault found @: http://www.neurofuzzy.net/2006/08/26/amazon-s3-php-class-update/ [...]

    Pingback by Free Photo Hosting (Yet Another Place to Upload Your Photos) | JuanJose Galvez :: Programmer — 9/18/2007 @ 1:49 am

  22. Thanks for making this awesome class. I blogged about it over on my site

    http://notpopular.com/blogs/josh/2007/10/02/amazon-s3-php-awesome-image-hosting-solution

    keep up the good work

    Comment by Josh Giese — 10/2/2007 @ 2:31 pm

    Hi there, does this S3 API actually work with CakePHP? Please help. I am trying to work out whether CakePHP's HTTP_REQUEST is the same as the PEAR package this S3 class uses.

    Comment by cathei_mecholz — 10/8/2007 @ 3:30 am

  24. [...] I’d recommend taking a look at the Amazon S3 library from neurofuzzy.net and phpFlickr to name but two. If you’re working in PHP, you’ll also find a number of web services have PEAR libraries that can prove a good starting point. [...]

    Pingback by Hacking on Open APIs « Untitle Dream — 10/29/2007 @ 6:02 am

  25. I wanted to cover some of the changes I had to make to get this working.

    First, if you’re getting:

    “AWS authentication requires a valid Date or x-amz-date header”

    replace all occurrences of DATE_RFC822 with S3_DATE_RFC822 (just the constant name, no quotes)

    Also, I use PHP 5's autoload, so I commented out all the includes and requires in class s3 and the associated classes (PEAR, HTTP_Request, etc.). You don't want to comment out the require_once 'request.class.php' in the constructor of class s3, because there are important defines included there as well. Commenting out this line created signing errors for me.
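    (For anyone who hasn't used it, the autoloader in question is just the standard PHP 5 hook; a minimal sketch, with an illustrative file-naming convention:

    function __autoload($className)
    {
        require_once strtolower($className) . '.class.php';
    }

    PHP calls it automatically the first time an undefined class is referenced.)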

    Now it’s working great. Thanks Geoff for this great class!

    Comment by Talat Imran — 11/19/2007 @ 5:06 pm

  26. [...] http://neurofuzzy.net/2006/08/26/amazon-s3-php-class-update/ I am working to implement Geoff’s S3 class using info and file from here: http://www.ibm.com/developerworks/library/os-php-amzmm/index.html [...]

    Pingback by .:. gotblogua .:. joshua gottdenker’s personal blog .:. » Amazon S3 upload via PHP — 12/1/2007 @ 10:55 am

    I've been trying to make your class work with files that are on a fileserver and need to be backed up to Amazon S3. Where I keep hitting a snag is reading the file contents: it keeps choking on any file over 1MB. Actually, it just chokes, period.

    here is what I’m doing, please advise:

    require_once('s3.class.php');
    define('NAME', 'foo');
    define('DIR', '/bar/');

    $s3 = new S3();
    $s3->setBucketName(NAME);
    $s3->putBucket(NAME);

    if ($handle = opendir(DIR)) {
        while (false !== ($filename = readdir($handle))) {
            if ($filename != "." && $filename != "..") {
                $FILE = DIR . $filename;
                $fh = fopen($FILE, 'r');
                $data = fread($fh, filesize($FILE)) or die("Trouble getting data for $FILE\n");
                $attempt = $s3->putObject($filename, $data, NAME);
                fclose($fh);

                if ($attempt) {
                    echo "Successful s3 backup: $filename\n";
                } else {
                    echo "Failed s3 backup: $filename\n";
                }
            }
        }
        closedir($handle);
    }

    Comment by Tom Myer — 12/14/2007 @ 4:40 am

  28. Hi,
    Thanks for the class. It's definitely made my life easier. I have been experimenting with S3 and your class. One thing I am not sure about is how to display an image from a bucket using PHP. I tried the getObject method, but all I get is lines and lines of junk. This means it's getting the data from the bucket properly but not displaying the image in the right manner. Your help will be much appreciated!!!

    Thanks

    Comment by Amrith — 1/30/2008 @ 11:17 pm

  29. Hi Geoffrey,

    I read your Amazon S3 PHP class; it is really good.

    I have a question regarding Amazon S3:

    I want to create a folder inside a bucket, and a subfolder inside that folder, using a PHP script or PHP class.

    Question: can we create a folder inside a bucket using PHP code with Amazon S3? And can we create a folder inside a folder?

    Can you please confirm this, or point me to a link or any materials that can help?

    Thank you,

    Comment by Bivek — 3/26/2008 @ 1:49 am

  30. [...] Amazon S3 PHP Class Update (PHP) [...]

    Pingback by Resources and Tools for Amazon Services « mindstorms — 3/26/2008 @ 7:07 pm

  31. How do I use this class? There are no instructions or samples anywhere that are current. The only examples I could find are from 2006 and none of them work with this version.

    Comment by monty — 5/1/2008 @ 8:50 pm

  32. I tried to run the example code and while I didn’t get any errors, my file was never put in the bucket. The debug output in the browser shows the following:

    HTTP Request sent to: http://s3.amazonaws.com/media.pocketlink

    Signing String: PUT Fri, 02 May 2008 14:35:31 GMT /media.pocketlink

    response header: Array

    HTTP Request sent to: http://s3.amazonaws.com/media.pocketlink/Big Bang.mov

    MD5 HASH OF DATA: BVPbeZ0hjO9fYAEjd9Y8iw==

    Setting content type to video/quicktime

    Setting acl string to public-read

    Signing String: PUT BVPbeZ0hjO9fYAEjd9Y8iw== video/quicktime Fri, 02 May 2008 14:35:31 GMT x-amz-acl:public-read /media.pocketlink/Big Bang.mov

    response header: Array

    any ideas on what might be wrong???

    thx!

    Comment by andrea — 5/2/2008 @ 9:36 am

    Can we get an update of this class that includes:

    /*
    * Method: copyObject
    * Copies an object from one bucket to another
    *
    * Request Syntax
    * PUT /destination_object HTTP/1.1
    * Host: destination_bucket.s3.amazonaws.com
    * x-amz-copy-source: /source_bucket/sourceObject
    * x-amz-metadata-directive: metadata_directive
    *
    * Authorization: signature
    * Date: date
    */
    function copyObject ($objectname, $source_bucket, $destination_object){
    …some logic…
    }

    function moveObject ($objectname, $source_bucket, $destination_object){
    copyObject ($objectname, $source_bucket, $destination_object);
    deleteObject ($objectname, $source_bucket);
    }

    thanks

    Comment by Sam Beckett — 5/13/2008 @ 9:30 am

  34. So how does this work? Are there any examples of how to use this class anywhere? I can’t find any.

    Comment by monty — 5/23/2008 @ 7:58 pm

  35. Yes, the S3 class needs some love. Sorry I haven’t had time to update it recently!

    Comment by geoff — 5/24/2008 @ 1:18 pm

    [...] Here are a couple of handy PHP bits for streaming to the S3 server (useful with large files). http://www.ogleearth.com/2007/07/kmls_region_ama.html http://neurofuzzy.net/index.php?s=s3 + http://www.neurofuzzy.net/2006/08/26/amazon-s3-php-class-update/ http://www.missiondata.com/blog/linux/49/s3-streaming-with-php/ http://cesarodas.com/2007/09/php-amazon-s3-stream-wrapper.html http://www.phpclasses.org/browse/package/4144.html [...]

    Pingback by cacanov web(r)log » 2008 » June » 04 — 6/4/2008 @ 8:16 am

  37. Can you please tell me how I RETRIEVE and display a JPEG image that is set to Private using this class? There’s no info about that in any of the docs. A simple example is all that’s needed.

    Comment by monty — 6/4/2008 @ 6:41 pm

  38. I’ve done the following:

    $image = $srvc->getObject('myphoto.jpg', 'awsbucket');

    Now what? How do I display the image on screen?

    Comment by monty — 6/4/2008 @ 6:43 pm

  39. Monty – if you set your script’s MIME type in the header to image/jpeg you should just be able to echo $image and voila!

    Comment by geoff — 6/5/2008 @ 1:16 pm

    Thank you, Geoff! But I'm not sure what you mean; I'm a bit of a novice at this stuff. Do you by any chance have an example of how this is done? I'd REALLY appreciate it! Thanks!

    Comment by monty — 6/5/2008 @ 3:24 pm

  41. Yes, to set your header, at the start of your script do:

    header('Content-type: image/jpeg');

    then, get the jpeg data and echo it:

    $image = $srvc->getObject('myphoto.jpg', 'awsbucket');

    echo $image;

    Comment by geoff — 6/7/2008 @ 1:03 pm

    Here's another beginner question… ;)

    After doing $ob = $s3->getObject(…, …);
    how can I send $ob to the client's web browser?

    I mean, how can the client download that object with a pop-up save-as dialog?

    I can attach the object to the browser page using a Content-Disposition header after the SERVER gets the object from S3 (but this doubles the time it takes for the file to reach the user's PC).

    I want to send the private object in S3 to the user's browser directly… (only the server-side PHP knows the access key and secret key.)

    How can I do that?

    Comment by Randy — 7/23/2008 @ 12:50 pm

    I've gotten a good handle on most of the features, but I still can't serve up a file using getObject(), because the request headers are also being output. When I execute:

    $file = $conn->getObject("myfile.txt", "mybucket");

    Even without echoing $file, getObject returns the following:

    HTTP Request sent to: http://s3.amazonaws.com/mybucket/myfile.jpg

    Signing String: GET

    Tue, 07 Oct 2008 16:19:00 GMT
    /mybucket/myfile.jpg

    After this header is printed, the data within the file is also output. If I could suppress the header portion, everything would be fine. If I could suppress everything output by getObject, then all I’d have to do is set the header type and echo $file. Any help is appreciated.

    Comment by Todd Hudgens — 10/7/2008 @ 12:37 pm

    I fixed my problem; all I had to do was set the $debug flag to false. So I ended up writing a web interface for adding and deleting buckets and objects. It's on my website if anyone is interested. Thanks Geoff, you've made my life easier!

    Comment by Todd Hudgens — 10/15/2008 @ 6:10 am

    For users of PHP 5.1 or greater: if you are getting the error "AWS authentication requires a valid Date or x-amz-date header", rename the value DATE_RFC822 to some other value, e.g. new_DATE_RFC822, because in PHP 5.1 it is already defined and the class therefore fails to define it. So modify s3.class.php, rename all occurrences of DATE_RFC822 to something else, and it will be fixed.

    Comment by Haroon Ahmad — 11/28/2008 @ 5:27 am

    Hi, we are using S3 class version 0.3.9 (23rd Dec 2008). Is this the genuine version?

    Comment by jivadeveloper1 — 3/2/2009 @ 2:59 pm

    Please note that this class will not work with buckets based in the EU. You will have to tweak the code to make the API call to http://bucketname.s3.amazonaws.com/… instead of
    http://s3.amazonaws.com/bucketname/… Apart from that: thank you for the code!

    Comment by lgeeh — 6/18/2009 @ 10:42 am

    I am using this class with great success, but when I try to launch it over an SSH connection it no longer works.
    The first arrays, which are supposed to have info like Array ( [x-amz-id-2] =>, are empty :-s

    Any ideas ?

    Comment by Etienne — 7/13/2009 @ 12:50 pm

  49. If you are getting "SignatureDoesNotMatch" errors on putObject, make sure you are using HTTP_Request v1.4.4 – I was getting the error with v1.4.3
    http://pear.php.net/package/HTTP_Request/

    Comment by GregJ — 9/10/2009 @ 11:32 am

    PHP doesn't really work well for such things; that's why I used Python for this requirement.

    Comment by charlieevatt — 4/29/2010 @ 7:52 pm

  51. I mistakenly put this:
    }

    function moveObject ($objectname, $source_bucket, $destination_object){

    copyObject ($objectname, $source_bucket, $destination_object);

    deleteObject ($objectname, $source_bucket);

    }

    at the introductory part. Now working on the same arrays. Share me some helps….. :p

    Comment by Fightflicks — 5/3/2010 @ 5:10 am
