Short Version:
Sending a file from S3 to an SFTP server fails with a Net::SFTP::StatusException (5, "bad message") error whenever the file is large enough to require more than one chunk.
Long Version:
I'm trying to send a large file from my RoR application to a customer's SFTP server.
I'm using Fog::Storage, since I'm reading the file from S3.
Here is my example code:
file_paths = get_file_names()

Net::SFTP.start(sftp_domain, sftp_user, :password => sftp_password) do |sftp|
  file_paths.each do |file_to_upload|
    if Rails.env == "production"
      filename = file_to_upload.file.path.split("/").last
      # Create S3 connection:
      connection = Fog::Storage.new(provider: 'AWS'...)
      # Find root directory:
      bucket = connection.directories.get(ENV['MY_S3'])
      # Send the file, chunk-by-chunk:
      sftp.file.open(sftp_folder + "/" + filename, "w") do |sftp_file|
        bucket.files.get(file_to_upload.file.path) do |chunk|
          sftp_file.write(chunk)
        end
      end
    end
  end
end
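To illustrate the loop shape: Fog's `files.get` with a block streams the body chunk-by-chunk rather than loading it into memory, and each chunk is written in order. Here is a minimal, self-contained sketch of that same pattern using a local StringIO stand-in for the SFTP handle (the chunk sizes and data are made up for illustration):

```ruby
require "stringio"

# Simulated S3 body, pre-split into chunks the way Fog yields them to the block.
chunks = ["a" * 1024, "b" * 1024, "c" * 512]

# Local stand-in for the remote SFTP file handle (any object with #write works).
sftp_file = StringIO.new

# Same shape as the real loop: write each yielded chunk in order.
chunks.each do |chunk|
  sftp_file.write(chunk)
end

sftp_file.string.bytesize  # total bytes written (1024 + 1024 + 512 = 2560)
```

The write loop itself produces the concatenated bytes in order against a plain IO object; the failure only appears when the destination is the real SFTP file.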
I get the following error ONLY when a single file exceeds the default chunk size:
Excon::Error::Socket: Net::SFTP::StatusException (5, "bad message") (Net::SFTP::StatusException)
More details:
- Using rails 4.2.5.1.
- Using ruby 2.3.4p301 (2017-03-30 revision 58214) [aarch64-linux]
- Gem locked at net-sftp (3.0.0) and net-ssh (6.1.0).
- Files that are small enough to be sent in one chunk are sent correctly.
What I tried:
- Enforcing an encoding didn't work (same error).
- Reading the large file in one chunk didn't work (memory error).
I will be happy to add more details as needed, sorry if I missed something!
Thanks in advance to everyone!