To test the modified script, let’s assume you’ve saved it as process_stream.sh and made it executable with chmod +x process_stream.sh. We’ll build a couple of simple pipelines that demonstrate both adding to the data stream and, with the -r flag, removing from it.

Scenario:

  1. Adding to the Data Stream:
  • We’ll echo several lines of text plus the name of a file whose contents we want to add to our data stream.
  2. Removing from the Data Stream:
  • We’ll use the -r flag to remove certain lines from our data stream, including lines read from a file.

Setup:

  • Create two files, add.txt and remove.txt, for the demonstration (the exact commands are shown after this list).
  • add.txt contains:
  Line to add 1
  Line to add 2
  • remove.txt contains:
  Line to remove
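
The two files can be created directly from the shell, for example with printf (their contents simply mirror the lists above):

printf 'Line to add 1\nLine to add 2\n' > add.txt
printf 'Line to remove\n' > remove.txt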

Test Pipeline:

  1. Adding Lines and File Contents:

First, let’s add lines directly and from add.txt:

{ echo "Line to add 1"; echo "Line to add 2"; echo "add.txt"; } | ./process_stream.sh

This command adds the two echoed lines to the data stream and then, because add.txt exists as a file, adds each of its lines as well. Since every run starts with an empty stream, the output is simply everything that was just added.
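
Given how the script processes its input, the expected output is each added line in order; the echoed lines happen to match the contents of add.txt, so every line appears twice:

Line to add 1
Line to add 2
Line to add 1
Line to add 2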

  2. Removing Lines and File Contents:

Next, to demonstrate removing lines, including those listed in remove.txt, we’ll invoke the script a second time with the -r flag:

{ echo "Line to remove"; echo "remove.txt"; } | ./process_stream.sh -r

For this to work as intended in a single pipeline, where you’d see the effect of adding and then removing, the two operations would have to act on the same stream. With the script’s current design, however, each execution is independent and starts from an empty data stream, so the -r call above has nothing to remove.
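
Run in isolation, the removal pipeline therefore produces no stream content at all; the final printf is handed an empty array and emits a single blank line, which you can confirm by counting the output lines:

{ echo "Line to remove"; echo "remove.txt"; } | ./process_stream.sh -r | wc -l
# prints 1: the only output is the blank line printf emits for the empty stream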

To demonstrate adding and then removing as one scripted sequence, consider the following hypothetical example, which only makes sense if the script keeps state from one call to the next (or is modified to do so):

# Hypothetical: assumes state persists across calls
{ echo "Initial line"; echo "add.txt"; } | ./process_stream.sh
{ echo "Line to remove"; echo "remove.txt"; } | ./process_stream.sh -r

This sequence doesn’t work as shown because every invocation of the script is a fresh process: it starts with an empty data_stream array and has no memory of previous calls. To genuinely test adding and then removing, you’d need either to maintain state across calls, such as with a temporary file, or to modify the script so a single execution handles both operations.
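
As a rough illustration of the temporary-file approach, here is a minimal sketch of a hypothetical wrapper. The name stream_state.sh and the .stream_state file are made up for this example, and it reproduces the add/remove behavior with grep rather than reusing the original script:

#!/bin/bash
# stream_state.sh: hypothetical sketch that keeps the data stream in a file between calls.
# Usage:  ... | ./stream_state.sh       add stdin lines (or file contents) to the stream
#         ... | ./stream_state.sh -r    remove stdin lines (or file contents) from it

state_file=".stream_state"
touch "$state_file"

mode=add
if [ "$1" = "-r" ]; then
    mode=remove
fi

while IFS= read -r item; do
    if [ "$mode" = add ]; then
        if [ -f "$item" ]; then
            cat "$item" >> "$state_file"            # add every line of the file
        else
            printf '%s\n' "$item" >> "$state_file"  # add the literal line
        fi
    else
        if [ -f "$item" ]; then
            # drop any stream line that exactly matches a line of the file
            grep -v -x -F -f "$item" "$state_file" > "$state_file.tmp" || true
        else
            # drop any stream line that exactly matches the literal item
            grep -v -x -F -e "$item" "$state_file" > "$state_file.tmp" || true
        fi
        mv "$state_file.tmp" "$state_file"
    fi
done

# Show the stream as it stands after this call
cat "$state_file"

With state kept in .stream_state, the add pipeline and the -r pipeline from earlier could be run back to back against this wrapper, and the second call would actually operate on what the first one added (delete .stream_state to start over).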


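For reference, here is the process_stream.sh script that the pipelines above are exercising:
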
#!/bin/bash

# Initialize an empty array to hold the data stream
declare -a data_stream

# Function to add input to the data stream
add_to_stream() {
    data_stream+=("$1")
}

# Function to remove input from the data stream
remove_from_stream() {
    # Temporarily store the data stream in another array to avoid modification issues during iteration
    local temp_stream=("${data_stream[@]}")
    data_stream=() # Clear the original data stream

    # Iterate over the temporary stream
    for line in "${temp_stream[@]}"; do
        # Only add back lines that do not match the input
        if [[ "$line" != "$1" ]]; then
            data_stream+=("$line")
        fi
    done
}

# Function to process each input line
process_input() {
    # Check if the input is a file
    if [ -f "$1" ]; then
        while IFS= read -r line; do
            "$operation" "$line"
        done < "$1"
    # Check if the input is a directory
    elif [ -d "$1" ]; then
        # If it's a directory, list all files and process each file found
        for file in "$1"/*; do
            # Recursively call this function for each file in the directory
            process_input "$file"
        done
    else
        # If it's neither, treat it as a string and apply the operation (add/remove)
        "$operation" "$1"
    fi
}

# Default operation
operation=add_to_stream

# Process command-line options
while getopts ":r" opt; do
  case ${opt} in
    r )
      operation=remove_from_stream
      ;;
    \? )
      echo "Invalid option: $OPTARG" 1>&2
      exit 1
      ;;
  esac
done
shift $((OPTIND -1))

# Read from STDIN line by line if no arguments are provided
if [ "$#" -eq 0 ]; then
    while IFS= read -r line; do
        process_input "$line"
    done
else
    # If the script receives arguments directly, process each argument
    for arg in "$@"; do
        process_input "$arg"
    done
fi

# Output the final data stream
printf "%s\n" "${data_stream[@]}"
