top | item 35891923

foooobaba | 2 years ago

Not bad (ChatGPT can do this too), but it seems about par for the course for this “simple” task:

> Write me a python function that accepts a list of filenames and uses threads and queues to load them in parallel with the pickle library

> Also, please output the results to a queue as well

Final result:

  import pickle
  import threading
  import queue
  
  def load_files_in_parallel(filenames, results_queue):
    """Loads a list of files in parallel with the pickle library and outputs the results to a queue.
  
    Args:
      filenames: A list of filenames to load.
      results_queue: A queue to hold the results of the file loads.
  
    Returns:
      None.
    """
  
    # Create a queue to hold the file handles.
    file_handles = queue.Queue()
  
    # Create a thread pool to load the files.
    threads = []
    for _ in range(len(filenames)):
      thread = threading.Thread(target=load_file, args=(file_handles, filenames.pop(), results_queue))
      threads.append(thread)
      thread.start()
  
    # Wait for all threads to finish.
    for thread in threads:
      thread.join()
  
    # Return the list of objects loaded from the files.
    # No need to return anything since the results are already in the queue.
  
  def load_file(file_handles, filename, results_queue):
    """Loads a file and adds it to the results queue.
  
    Args:
      file_handles: A queue to hold the file handles.
      filename: The name of the file to load.
      results_queue: A queue to hold the results of the file loads.
  
    Returns:
      None.
    """
  
    # Open the file.
    with open(filename, 'rb') as f:
      obj = pickle.load(f)
  
    # Add the object to the results queue.
    results_queue.put(obj)
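One thing the generated function leaves entirely to the caller is draining the results queue: it returns None, and each filename contributes exactly one item, so the caller has to `get()` exactly `len(filenames)` times after the function returns. A minimal self-contained sketch of that drain pattern (toy producer threads stand in for the pickle loads; this is not from the model's output):

```python
import queue
import threading

results_queue = queue.Queue()
items = [1, 2, 3]

# Stand-in producers: each thread puts one item, like each load_file call would.
threads = [threading.Thread(target=results_queue.put, args=(i,)) for i in items]
for t in threads:
    t.start()
for t in threads:
    t.join()

# After all joins, exactly len(items) results are waiting; arrival order
# is nondeterministic, so sort if order matters.
loaded = [results_queue.get() for _ in range(len(items))]
print(sorted(loaded))
```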

mdaniel | 2 years ago

Matches my experience with LLMs: they force the onus of accuracy onto the code reviewer.

    >>> alpha = ["a", "b", "c"]
    >>> def load_files_in_parallel(filenames):
    ...     for _ in range(len(filenames)):
    ...         _ = filenames.pop()
    ... 
    >>> load_files_in_parallel(alpha)
    >>> alpha
    []
And I mean, just damn: it was already iterating over the list of filenames with such a convoluted mechanism, so what a great opportunity to put the filename into a local var without mutating the caller's list.
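For contrast, here is the non-mutating rewrite being suggested, as my own sketch rather than anything the model produced: iterate over the list directly, bind each filename in the loop, and drop the `file_handles` queue the original never used. The quick round-trip check at the bottom is illustrative only.

```python
import os
import pickle
import queue
import tempfile
import threading

def load_file(filename, results_queue):
    # Unpickle a single file and push the object onto the results queue.
    with open(filename, 'rb') as f:
        results_queue.put(pickle.load(f))

def load_files_in_parallel(filenames, results_queue):
    # Iterate over the list directly: no pop(), so the caller's list survives.
    threads = [
        threading.Thread(target=load_file, args=(name, results_queue))
        for name in filenames
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

# Round-trip three pickled objects through temp files to check the behavior.
with tempfile.TemporaryDirectory() as tmp:
    paths = []
    for i in range(3):
        path = os.path.join(tmp, f'{i}.pkl')
        with open(path, 'wb') as f:
            pickle.dump(i * 10, f)
        paths.append(path)

    results = queue.Queue()
    load_files_in_parallel(paths, results)

    loaded = sorted(results.get() for _ in range(len(paths)))
    print(len(paths))  # the caller's list still has all 3 entries
    print(loaded)
```

Note that one thread per file only makes sense for small lists; for many files a fixed pool of workers pulling filenames from an input queue would be the usual shape.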