Server Fault Asked by 0x4a6f4672 on December 9, 2021
There seems to be a file size limit for sftp transfers from the command line: if I try to download a 20 MB file, it stops at about 22% once the transferred data exceeds 4 MB:
/myfile.xml 22% 4480KB 410.4KB/s 00:38 ETA
Is there a limit somewhere, and how can it be changed? I use a scripted sftp connection driven by expect. Downloading the file via the Nautilus file manager, or with sftp from the console without the expect script, both work fine. The script is as follows:
#!/usr/bin/expect
# usage: ./sftp_import.sh username host password filepath localpath
set username [lindex $argv 0]
set host [lindex $argv 1]
set password [lindex $argv 2]
set filepath [lindex $argv 3]
set localpath [lindex $argv 4]
spawn sftp $username@$host
expect "password:"
send "$passwordr";
expect "sftp>"
send "get $filepath $localpathr"
expect "sftp>"
send "bye r";
exit 0
There is no limit by default (as far as I am aware).
I would guess you are getting packet loss and timing out. Try adding -D.
-D sftp_server_path Connect directly to a local sftp server (rather than via ssh(1)). This option may be useful in debugging the client and server.
Source: http://www.cl.cam.ac.uk/cgi-bin/manpage?1+sftp
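For illustration, a direct debugging session with -D would look roughly like the following (the server binary path is an assumption and varies by distribution; on Debian/Ubuntu it is typically /usr/lib/openssh/sftp-server):
sftp -D /usr/lib/openssh/sftp-server
sftp> get /myfile.xml /tmp/myfile.xml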
I would probably copy it with rsync instead: rsync -avrz --ignore-existing /folder/folder/folder user@remotehost:/folder/folder/folder
Answered by Sc0rian on December 9, 2021
It seems that adding a timeout resolves the problem. The default expect timeout is 10 seconds, so at a rate of about 400 KB/s the expect "sftp>" after the get command times out once roughly 10 s × 400 KB/s ≈ 4000 KB = 4 MB has been transferred; the script then continues, sends bye, and kills the transfer. Increasing the timeout fixes it:
set timeout 300
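For reference, a sketch of the same script with the timeout set before the expect/send exchanges (300 seconds is just an example value; set timeout -1 would disable the timeout entirely):
#!/usr/bin/expect
# usage: ./sftp_import.sh username host password filepath localpath
# Wait up to 300 seconds for each expected pattern instead of the default 10.
set timeout 300
set username [lindex $argv 0]
set host [lindex $argv 1]
set password [lindex $argv 2]
set filepath [lindex $argv 3]
set localpath [lindex $argv 4]
spawn sftp $username@$host
expect "password:"
send "$password\r"
expect "sftp>"
send "get $filepath $localpath\r"
expect "sftp>"
send "bye\r"
exit 0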
Answered by 0x4a6f4672 on December 9, 2021