Best Practices for Uploading Large Files over 100MB to Amazon S3 Using Laravel

When dealing with large file uploads, especially to remote services like Amazon S3, timeouts are a common issue because of how long such files take to transfer over the internet. Here are some best practices and solutions for handling large file uploads to S3 in a Laravel application:

    1. Chunked Uploads: Implement chunked uploads where you split the large file into smaller chunks and upload them individually. This reduces the likelihood of timeouts because smaller chunks are uploaded separately. Laravel provides support for chunked file uploads through packages like Laravel Chunk Upload.
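For illustration, a minimal sketch of a chunk-receiving endpoint. The field names (`uploadId`, `chunkIndex`, `totalChunks`, `filename`) are assumptions about what the client sends, not part of any package, and chunks are assumed to arrive in order:

```php
use Illuminate\Http\Request;
use Illuminate\Support\Facades\Storage;

class ChunkUploadController extends \App\Http\Controllers\Controller
{
    public function store(Request $request)
    {
        // Temp file keyed by an upload id supplied by the client (assumed field);
        // the storage/app/chunks directory is assumed to exist.
        $tempPath = storage_path('app/chunks/'.$request->input('uploadId'));

        // Append this chunk's bytes in arrival order.
        file_put_contents($tempPath, $request->file('chunk')->get(), FILE_APPEND);

        // On the final chunk, stream the assembled file to S3 and clean up.
        if ((int) $request->input('chunkIndex') + 1 === (int) $request->input('totalChunks')) {
            Storage::disk('s3')->put(
                'uploads/'.$request->input('filename'),
                fopen($tempPath, 'r')
            );
            unlink($tempPath);
        }

        return response()->json(['received' => true]);
    }
}
```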


    2. Direct-to-S3 Uploads: Instead of uploading the file directly from your Laravel application server to S3, consider using direct-to-S3 uploads. In this approach, the client browser uploads the file directly to S3, bypassing your server. This is usually done using pre-signed URLs generated by your Laravel application, allowing clients to upload files directly to S3.

  References:
  1. AWS S3 documentation
  2. Uploading Photos to Amazon S3 from a Browser
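As a sketch of generating such a pre-signed URL with the AWS SDK for PHP (the region, bucket, and key below are placeholders):

```php
use Aws\S3\S3Client;

// Build a pre-signed PUT URL that the browser can upload to directly.
$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1', // placeholder region
]);

$command = $s3->getCommand('PutObject', [
    'Bucket' => 'my-bucket',          // placeholder bucket
    'Key'    => 'uploads/video.mp4',  // placeholder key
]);

// The URL is valid for 20 minutes; the client PUTs the file bytes to it,
// so the large upload never passes through the Laravel server.
$presigned = $s3->createPresignedRequest($command, '+20 minutes');
$url = (string) $presigned->getUri();
```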

    3. Increase Timeout Settings: If you're still experiencing timeouts, you may need to adjust the timeout settings in your Laravel application and web server configuration. Increase the timeout values to accommodate the longer upload times required for large files.

  3.1 Apache: open the Apache configuration file, e.g. nano apache/conf/httpd.conf.
  The Timeout directive is usually set to 300 seconds (5 minutes):
      Timeout 300
  Increase it to 600 seconds (10 minutes):
      Timeout 600
  Then save the changes to the httpd.conf file and restart the Apache server:
      sudo systemctl restart apache2

  3.2 Nginx: open your site's configuration file under /etc/nginx/sites-available/ and add:
      proxy_connect_timeout 600;
      proxy_send_timeout 600;
      proxy_read_timeout 600;
  Then reload the server:
      sudo systemctl reload nginx
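Note that PHP enforces its own limits independently of the web server, so a 100MB+ upload can still fail after the changes above. The values below are illustrative examples for php.ini, not required settings:

```ini
; php.ini - example limits for large uploads (values are illustrative)
upload_max_filesize = 200M
post_max_size = 210M        ; should be >= upload_max_filesize
max_execution_time = 600
max_input_time = 600
```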
    4. Asynchronous Processing: Implement asynchronous processing for file uploads. Instead of making the user wait for the entire upload process to complete synchronously, accept the upload request, store the file temporarily, and then process the upload asynchronously in the background. This ensures that the user's request returns quickly without being affected by the upload process.
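A minimal sketch of that flow using a queued job, assuming the local disk for temporary storage and a configured queue worker (the class name and paths are illustrative):

```php
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Support\Facades\Storage;

class MoveUploadToS3 implements ShouldQueue
{
    use Dispatchable, Queueable;

    public function __construct(public string $localPath) {}

    public function handle(): void
    {
        // Stream the temporary file to S3, then remove the local copy.
        Storage::disk('s3')->put(
            $this->localPath,
            Storage::disk('local')->readStream($this->localPath)
        );
        Storage::disk('local')->delete($this->localPath);
    }
}

// In the controller: store the upload locally and return immediately;
// the queued job moves it to S3 in the background.
// $path = $request->file('video')->store('pending');
// MoveUploadToS3::dispatch($path);
```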


    5. Resumable Uploads: Implement resumable uploads so that if the upload process is interrupted due to a timeout or network issue, it can be resumed from where it left off rather than starting from the beginning. This improves the reliability of large file uploads.
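With S3 multipart uploads, resuming can be sketched by asking S3 which parts it already holds and re-sending only the rest. Here `$s3`, `$bucket`, `$key`, and `$uploadId` are assumed to come from the interrupted multipart upload:

```php
// Ask S3 which parts of this multipart upload it has already received.
$result = $s3->listParts([
    'Bucket'   => $bucket,
    'Key'      => $key,
    'UploadId' => $uploadId,
]);

// Part numbers that are already safely stored on S3.
$alreadyUploaded = array_column($result['Parts'] ?? [], 'PartNumber');

// Re-send only the chunks whose part numbers are missing from
// $alreadyUploaded, then call completeMultipartUpload() as usual.
```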


    6. Optimize S3 Configuration: Ensure that your S3 bucket is properly configured for handling large file uploads. Check bucket policies, CORS rules (required for browser-based direct uploads), and any other restrictions or limitations that may affect the upload process.
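For example, browser-based direct uploads (point 2) only work if the bucket's CORS configuration allows them. A minimal example, where the allowed origin is a placeholder for your own domain:

```json
[
  {
    "AllowedOrigins": ["https://example.com"],
    "AllowedMethods": ["PUT", "POST"],
    "AllowedHeaders": ["*"],
    "MaxAgeSeconds": 3000
  }
]
```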


    7. Use AWS SDK for PHP: If you're not already using it, make sure you're using the AWS SDK for PHP in your Laravel application for interacting with S3. It provides various features and optimizations for working with S3, including support for multipart uploads, which can help with large file uploads.
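For instance, the SDK's MultipartUploader handles splitting the file into parts and retrying for you. A sketch, where `$s3` is an `Aws\S3\S3Client` instance and the bucket, key, and file path are placeholders:

```php
use Aws\S3\MultipartUploader;
use Aws\Exception\MultipartUploadException;

// $s3 is an Aws\S3\S3Client instance configured elsewhere.
$uploader = new MultipartUploader($s3, '/path/to/large-file.mp4', [
    'bucket' => 'my-bucket',         // placeholder bucket
    'key'    => 'uploads/large.mp4', // placeholder key
]);

try {
    $uploader->upload();
} catch (MultipartUploadException $e) {
    // $e->getState() can be passed to a new MultipartUploader to resume
    // the upload instead of restarting it from scratch.
}
```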

