Enables the fetching of (potentially large) result sets in chunks.
Creates a new fetcher.
connection: the current ProxyConnection
options: hash of select options as described under ProxyConnection#select_cursor
# File lib/rubyrep/proxy_connection.rb, line 33
def initialize(connection, options)
  self.connection = connection
  self.options = options.clone
end
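A minimal construction sketch, assuming this class is RR::ResultFetcher (the class name does not appear in this excerpt) and a hypothetical table 'scanner_records'; :table, :query and :row_buffer_size are the option keys referenced by next? below:

  # Batched mode: rows are read from the named table in chunks of :row_buffer_size.
  fetcher = RR::ResultFetcher.new(connection,
    :table => 'scanner_records', :row_buffer_size => 1000)

  # Direct query mode: the given statement is executed as-is, in a single batch.
  fetcher = RR::ResultFetcher.new(connection,
    :query => 'select id, name from scanner_records order by id')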
Frees up all resources.
# File lib/rubyrep/proxy_connection.rb, line 88
def clear
  self.rows = nil
end
Returns true if there are more rows to read.
# File lib/rubyrep/proxy_connection.rb, line 39
def next?
  unless self.rows
    # Try to load some records

    if options[:query] and last_row != nil
      # A query was directly specified and all its rows were returned
      # ==> Finished.
      return false
    end

    if options[:query]
      # If a query has been directly specified, just directly execute it
      query = options[:query]
    else
      # Otherwise build the query
      if last_row
        # There was a previous batch.
        # Next batch will start after the last returned row
        options.merge! :from => last_row, :exclude_starting_row => true
      end
      query = connection.table_select_query(options[:table], options)
      if options[:row_buffer_size]
        # Set the batch size
        query += " limit #{options[:row_buffer_size]}"
      end
    end
    self.rows = connection.select_all query
    self.current_row_index = 0
  end
  self.current_row_index < self.rows.size
end
Returns the row as a column => value hash and moves the cursor to the next row.
# File lib/rubyrep/proxy_connection.rb, line 75
def next_row
  raise("no more rows available") unless next?
  self.last_row = self.rows[self.current_row_index]
  self.current_row_index += 1
  if self.current_row_index == self.rows.size
    self.rows = nil
  end
  self.last_row
end
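Taken together, next? and next_row support a simple read loop. A minimal sketch, assuming a fetcher constructed as in the earlier example; next? transparently loads the next batch whenever the current buffer is exhausted, and clear releases the buffered rows:

  while fetcher.next?
    row = fetcher.next_row    # column => value hash
    puts row['id']            # 'id' is a hypothetical column name
  end
  fetcher.clear               # free the row buffer (e.g. when stopping early)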