How to send files with Guzzle without loading them into memory


I have a form where I can upload multiple files to my Laravel backend, and I want to forward all of those files with Guzzle to an external API.

I'm running into an issue where my script runs out of memory whenever I upload more megabytes than the available memory allows. The error message is

Allowed memory size of ... bytes exhausted (tried to allocate ... bytes)

Unfortunately, I cannot change the memory limit dynamically.

Here is the code that I use

// in laravel controller method

/** @var \Illuminate\Http\Request $request */
$files = $request->allFiles();

$filesPayload = [];

foreach ($files as $key => $file) {
    $filesPayload[] = [
        'name'     => $key,
        'contents' => file_get_contents($file->path()),
        // 'contents' => fopen($file->path(), 'r'), // memory issue as well
        'filename' => $file->getClientOriginalName(),
    ];
}

$client = new \GuzzleHttp\Client([
    'base_uri' => '...',
]);

$response = $client->post('...', [
    'headers' => [
        'Accept'         => 'application/json',
        'Content-Length' => ''
    ],
    'multipart' =>  $filesPayload,
]);

I'm using Guzzle 6. The docs show an example using fopen, but that was also throwing a memory error.

Is there a way to send multiple files with Guzzle without loading them into memory?

There are 4 answers

ljubadr (accepted answer)

I finally managed to make this work by changing

'contents' => file_get_contents($file->path()),

to

'contents' => \GuzzleHttp\Psr7\stream_for(fopen($file->path(), 'r'))

With this change the files were no longer loaded into memory, and I was able to send much bigger files.
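Putting the accepted fix back into the controller code from the question, the loop could look like the sketch below. The endpoint URLs are placeholders carried over from the question, and the `stream_for()` call assumes Guzzle 6 / guzzlehttp/psr7 1.x (on psr7 2.x it is `\GuzzleHttp\Psr7\Utils::streamFor()`):

```php
<?php
// Sketch: build the multipart payload with lazy PSR-7 streams instead of
// file_get_contents(), so file contents are never buffered in memory.
// $request is the \Illuminate\Http\Request instance from the question.
$filesPayload = [];

foreach ($request->allFiles() as $key => $file) {
    $filesPayload[] = [
        'name'     => $key,
        // stream_for() wraps the resource in a PSR-7 stream that Guzzle
        // reads in chunks while sending the request body.
        'contents' => \GuzzleHttp\Psr7\stream_for(fopen($file->path(), 'r')),
        'filename' => $file->getClientOriginalName(),
    ];
}

$client = new \GuzzleHttp\Client([
    'base_uri' => '...', // placeholder from the question
]);

$response = $client->post('...', [
    'headers'   => ['Accept' => 'application/json'],
    // Let Guzzle compute Content-Length (or fall back to chunked
    // transfer encoding) rather than setting the header manually.
    'multipart' => $filesPayload,
]);
```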

Asfandyar Khan

In order to get around this, I've added a curl option that you can specify on a request that will send the request body as a string rather than stream it from the entity body of the request. You can enable this behavior like so:

$options = $client->getConfig()->get('curl.options');
$options['body_as_string'] = TRUE;
$client->getConfig()->set('curl.options', $options);
cogis

I was also struggling with Guzzle when sending large files (2 GB to 5 GB). I finally used PHP's cURL extension directly, and it works like a charm:

<?php
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, '/url');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$headers = [];
// Note: cURL sets the multipart/form-data Content-Type (including the
// boundary parameter) automatically when CURLOPT_POSTFIELDS is an array,
// so it should not be set manually here.
$headers[] = 'Cookie: something...';
curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);
curl_setopt($ch, CURLOPT_POST, 1);
$path = '/full/path/to/file';
$file = curl_file_create(realpath($path));
$post = [   
    'file' => $file,
    'other_field' => 'value',
];
curl_setopt($ch, CURLOPT_POSTFIELDS, $post);
$result = curl_exec($ch);
$httpcode = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);
var_dump($result,$httpcode);
Invis1ble

I've tried the accepted answer, but there is no stream_for function in the \GuzzleHttp\Psr7 namespace anymore (it was removed in guzzlehttp/psr7 2.x). Instead I found the \GuzzleHttp\Psr7\Utils::streamFor() method, and it works. You can try it:

'contents' => \GuzzleHttp\Psr7\Utils::streamFor(fopen($file->path(), 'r')),