This article was originally published on AI Study Room. For the full version with working code examples and related articles, visit the original post.
Bash Scripting Best Practices
Bash scripting remains one of the most critical skills for developers, DevOps engineers, and system administrators. Despite its age, Bash is everywhere -- from CI/CD pipelines to deployment scripts and system automation. Writing robust, maintainable shell scripts requires discipline and adherence to proven practices.
Start with Strict Mode
Every production Bash script should begin with strict mode settings that catch errors early:
#!/usr/bin/env bash
set -euo pipefail
IFS=$'\n\t'
set -e causes the script to exit immediately when a command fails.
set -u treats unset variables as an error.
set -o pipefail makes a pipeline fail if any command in it fails, not just the last one.
Setting IFS to newline and tab prevents word-splitting issues with filenames containing spaces.
A more advanced pattern combines set -e (equivalently, set -o errexit) with a custom ERR trap for error handling:
error_handler() {
local line=$1
echo "Error on line $line" >&2
exit 1
}
trap 'error_handler $LINENO' ERR
Use Functions for Modularity
Avoid writing long linear scripts. Break logic into functions with clear names:
validate_input() { ... }
process_file() { ... }
send_notification() { ... }
Declare all functions at the top of the script, followed by argument parsing, followed by the main execution flow. This makes the script readable and testable.
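That layout can be sketched as follows, reusing the function names above (the function bodies, the main wrapper, and the demo.txt default are illustrative, not from the article):

```shell
#!/usr/bin/env bash
set -euo pipefail

# --- functions first ---
validate_input() {
  [[ -n "$1" ]] || { echo "empty input" >&2; return 1; }
}

process_file() {
  printf 'processing %s\n' "$1"
}

# --- then argument handling and the main flow ---
main() {
  local target="${1:-demo.txt}"
  validate_input "$target"
  process_file "$target"
}

main "$@"
```

Because every function is defined before main runs, each one can also be sourced and tested in isolation.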
Prefer [[ ]] Over [ ]
The double-bracket [[ ]] construct is a Bash keyword with fewer surprises:
if [[ -f "$file" && "$name" == "prod" ]]; then
echo "Matched"
fi
Unlike single brackets, double brackets handle empty variables safely, support pattern matching, and avoid word-splitting.
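For example, [[ ]] supports glob patterns with == and regular expressions with =~ (the host value here is illustrative):

```shell
#!/usr/bin/env bash
host="prod-web-01"

# Glob-style pattern match -- the unquoted right side is a pattern
if [[ "$host" == prod-* ]]; then
  echo "production host"
fi

# Regex match with =~
if [[ "$host" =~ ^prod-[a-z]+-[0-9]+$ ]]; then
  echo "matches naming convention"
fi
```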
Quote Everything
Unquoted variables are one of the most common sources of bugs:
Wrong
if [ -f $file ]; then # breaks if file has spaces
Right
if [[ -f "$file" ]]; then
Quote all variable expansions: "$var", "${array[@]}", and command substitutions "$(command)".
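A small demonstration with a hypothetical files array containing a space in one name:

```shell
#!/usr/bin/env bash
files=("report 2024.txt" "notes.txt")

# Unquoted ${files[@]} would split "report 2024.txt" into two words;
# quoted "${files[@]}" preserves each element intact.
count=0
for f in "${files[@]}"; do
  count=$((count + 1))
done
echo "$count"   # prints 2, not 3
```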
Use trap for Cleanup
Always clean up temporary files and resources:
cleanup() {
rm -rf "$TEMP_DIR"
kill "$PID" 2>/dev/null || true
}
trap cleanup EXIT
The EXIT trap fires regardless of why the script exits -- success, failure, or signal. For signal-specific handling, add separate traps for INT and TERM.
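A split between the always-on EXIT trap and interrupt-specific traps might look like this (the on_interrupt name is illustrative; 130 is the conventional exit code after SIGINT):

```shell
#!/usr/bin/env bash
cleanup() {
  rm -rf "${TEMP_DIR:-}"
}

on_interrupt() {
  echo "Interrupted, cleaning up" >&2
  exit 130   # exiting still fires the EXIT trap, so cleanup runs
}

TEMP_DIR="$(mktemp -d)"
trap cleanup EXIT            # always runs
trap on_interrupt INT TERM   # runs only on Ctrl-C or kill
```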
Argument Parsing with getopts
Use getopts for reliable argument parsing instead of manual position checks:
usage() { echo "Usage: $0 -f file -o output [-v]" >&2; exit 1; }
while getopts ":f:o:v" opt; do
case $opt in
f) INPUT_FILE="$OPTARG" ;;
o) OUTPUT_DIR="$OPTARG" ;;
v) VERBOSE=true ;;
*) usage ;;
esac
done
This handles short flags robustly, including missing argument errors.
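One step worth adding after the loop: shift off the parsed options so any remaining positional arguments start at $1. A sketch reusing the same flags:

```shell
#!/usr/bin/env bash
VERBOSE=false
while getopts ":f:o:v" opt; do
  case $opt in
    f) INPUT_FILE="$OPTARG" ;;
    o) OUTPUT_DIR="$OPTARG" ;;
    v) VERBOSE=true ;;
    *) echo "Usage: $0 -f file -o output [-v]" >&2; exit 1 ;;
  esac
done
shift $((OPTIND - 1))          # drop the parsed options
printf 'remaining args: %s\n' "$#"
```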
Use readonly and declare -r
Mark constants and configuration values as readonly:
readonly MAX_RETRIES=3
readonly CONFIG_PATH="/etc/myapp/config.yml"
This prevents accidental overwrites and documents intent.
Prefer printf Over echo
The echo command behaves differently across shells and platforms. Use printf for portable, predictable output:
printf "Processing file: %s\n" "$filename"
Logging with Timestamps
Implement a simple logging function for better observability:
log() {
local level="$1"
shift
printf "[%s] [%s] %s\n" "$(date '+%Y-%m-%d %H:%M:%S')" "$level" "$*" >&2
}
info() { log "INFO" "$@"; }
error() { log "ERROR" "$@"; }
Send logs to stderr so they don't interfere with stdout data output.
Validating Dependencies
Check required commands before proceeding:
require() {
for cmd in "$@"; do
if ! command -v "$cmd" &>/dev/null; then
error "Required command not found: $cmd"
exit 1
fi
done
}
require jq curl openssl
Avoid eval and Backtick Substitution
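In short: eval re-parses its argument as shell code, so untrusted strings become command injection, while legacy backticks require awkward escaping to nest. Prefer quoted expansion and $( ) substitution. A minimal sketch (the user_input string is illustrative):

```shell
#!/usr/bin/env bash
user_input='$(rm -rf /tmp/important)'   # hostile-looking string

# Risky: eval would re-parse the string and execute the embedded command.
# eval "echo $user_input"

# Safe: quoted expansion treats the string as plain data.
printf '%s\n' "$user_input"

# Prefer $( ) over backticks for command substitution: it nests cleanly.
today="$(date '+%Y-%m-%d')"
printf '%s\n' "$today"
```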