Net::SFTP::Operations::Download
A general purpose downloader module for Net::SFTP. It can download files into IO objects, or directly to files on the local file system. It can even download entire directory trees via SFTP, and provides a flexible progress reporting mechanism.
To download a single file from the remote server, simply specify both the remote and local paths:
downloader = sftp.download("/path/to/remote.txt", "/path/to/local.txt")
By default, this operates asynchronously, so if you want to block until the download finishes, you can use the 'bang' variant:
sftp.download!("/path/to/remote.txt", "/path/to/local.txt")
Or, if you have multiple downloads that you want to run in parallel, you can employ the wait method of the returned object:
dls = %w(file1 file2 file3).map { |f| sftp.download("remote/#{f}", f) }
dls.each { |d| d.wait }
To download an entire directory tree, recursively, simply specify :recursive => true:
sftp.download!("/path/to/remotedir", "/path/to/local", :recursive => true)
This will download "/path/to/remotedir", its contents, its subdirectories, and their contents, recursively, to "/path/to/local" on the local host. (If you specify :recursive => true and the source is not a directory, you'll get an error!)
If you want to pull the contents of a file on the remote server, and store the data in memory rather than immediately to disk, you can pass an IO object as the destination:
require 'stringio'

io = StringIO.new
sftp.download!("/path/to/remote", io)
This will only work for single-file downloads. Trying to do so with :recursive => true will cause an error.
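Once the blocking call returns, the downloaded data can be read back out of the IO object. For example, with the StringIO above (a usage sketch, not part of the downloader API):

data = io.string
puts "downloaded #{data.length} bytes"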
The following options are supported:
:progress - either a block or an object to act as a progress callback. See the discussion of "progress monitoring" below.
:requests - the number of pending SFTP requests to allow at any given time. When downloading an entire directory tree recursively, this will default to 16. Setting this higher might improve throughput. Reducing it will reduce throughput.
:read_size - the maximum number of bytes to read at a time from the source. Increasing this value might improve throughput. It defaults to 32,000 bytes.
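As an illustration, a recursive download tuned with the options above might look like this (the specific values are arbitrary examples, not recommendations):

sftp.download!("/path/to/remotedir", "/path/to/local",
  :recursive => true,
  :requests  => 32,
  :read_size => 64_000)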
Sometimes it is desirable to track the progress of a download. There are two ways to do this: either using a callback block, or a special custom object.
Using a block is straightforward:
sftp.download!("remote", "local") do |event, downloader, *args| case event when :open then # args[0] : file metadata puts "starting download: #{args[0].remote} -> #{args[0].local} (#{args[0].size} bytes}" when :get then # args[0] : file metadata # args[1] : byte offset in remote file # args[2] : data that was received puts "writing #{args[2].length} bytes to #{args[0].local} starting at #{args[1]}" when :close then # args[0] : file metadata puts "finished with #{args[0].remote}" when :mkdir then # args[0] : local path name puts "creating directory #{args[0]}" when :finish then puts "all done!" end
However, for more complex implementations (e.g., GUI interfaces and such) a block can become cumbersome. In those cases, you can create custom handler objects that respond to certain methods, and then pass your handler to the downloader:
class CustomHandler
  def on_open(downloader, file)
    puts "starting download: #{file.remote} -> #{file.local} (#{file.size} bytes)"
  end

  def on_get(downloader, file, offset, data)
    puts "writing #{data.length} bytes to #{file.local} starting at #{offset}"
  end

  def on_close(downloader, file)
    puts "finished with #{file.remote}"
  end

  def on_mkdir(downloader, path)
    puts "creating directory #{path}"
  end

  def on_finish(downloader)
    puts "all done!"
  end
end

sftp.download!("remote", "local", :progress => CustomHandler.new)
If you omit any of those methods, the progress updates for those missing events will be ignored. You can create a catchall method named "call" for those, instead.
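For example, a minimal catchall handler might look like the sketch below. It receives the same arguments as the block form shown earlier (the event name, the downloader, and any event-specific arguments); the LoggingHandler name is just illustrative:

class LoggingHandler
  # invoked for any event that does not have a dedicated on_* method
  def call(event, downloader, *args)
    puts "#{event}: #{args.inspect}"
  end
end

sftp.download!("remote", "local", :progress => LoggingHandler.new)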
DEFAULT_READ_SIZE - The default read size (32,000 bytes).
Entry - A simple struct for encapsulating information about a single remote file or directory that needs to be downloaded.
Instantiates a new downloader process on top of the given SFTP session. local is either an IO object that should receive the data, or a string identifying the target file or directory on the local host. remote is a string identifying the location on the remote host that the download should source.
This will return immediately, and requires that the SSH event loop be run in order to effect the download. (See wait.)
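For instance, a minimal sketch of driving the asynchronous form looks like this; wait runs the SSH event loop until the download is no longer active:

download = sftp.download("/path/to/remote.txt", "/path/to/local.txt")
download.wait # blocks until the transfer has finished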
# File lib/net/sftp/operations/download.rb, line 146
def initialize(sftp, local, remote, options={}, &progress)
  @sftp = sftp
  @local = local
  @remote = remote
  @progress = progress || options[:progress]
  @options = options
  @active = 0
  @properties = options[:properties] || {}

  self.logger = sftp.logger

  if recursive? && local.respond_to?(:write)
    raise ArgumentError, "cannot download a directory tree in-memory"
  end

  @stack = [Entry.new(remote, local, recursive?)]
  process_next_entry
end
Returns the property with the given name. This allows Download instances to store their own state when used as part of a state machine.
# File lib/net/sftp/operations/download.rb, line 192
def [](name)
  @properties[name.to_sym]
end
Sets the property with the given name to the given value. This allows Download instances to store their own state when used as part of a state machine.
# File lib/net/sftp/operations/download.rb, line 198
def []=(name, value)
  @properties[name.to_sym] = value
end
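A sketch of how these accessors might be used to carry per-download state. The :properties option is read by the constructor shown above; the :queue and :started_at keys are made up for this example:

download = sftp.download("/path/to/remote.txt", "/path/to/local.txt",
  :properties => { :queue => "reports" })
download[:started_at] = Time.now
puts "#{download[:queue]} download started at #{download[:started_at]}"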
Forces the transfer to stop.
# File lib/net/sftp/operations/download.rb, line 178
def abort!
  @active = 0
  @stack.clear
end
Returns true if there are any active requests or pending files or directories.
# File lib/net/sftp/operations/download.rb, line 173
def active?
  @active > 0 || stack.any?
end
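Together, active? and abort! can be combined with a progress callback to cancel a transfer early. A rough sketch (the 1_000_000-byte limit is arbitrary, and requests already in flight may still complete after abort!):

bytes = 0
download = sftp.download("/path/to/remote.bin", "/path/to/local.bin") do |event, dl, *args|
  if event == :get
    bytes += args[2].length
    dl.abort! if bytes > 1_000_000 # stop once enough data has been seen
  end
end
download.wait # returns once the download is no longer active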