
Debugging Your Bash Shell Scripts: Common Issues and Solutions

Learn the most common mistakes that break Bash scripts and the commands, options and workflows to find and fix them quickly - from quoting and word-splitting to set -x tracing, shellcheck linting and trap-based stack traces.

Introduction

Bash is everywhere: simple, powerful, and unforgiving. Small mistakes - missing quotes, unintended word-splitting, or incorrect exit-status handling - can make scripts fail silently or behave unpredictably. This article catalogs common gotchas, gives concrete examples, and shows practical commands and techniques to debug Bash scripts effectively.

Why debugging Bash is special

  • The shell is both a programming language and a glue language. Many failures come from how the shell performs word-splitting, globbing, command substitution, and evaluation order.
  • Scripts are often run in different environments (cron, CI, interactive shell) where PATH, IFS, and environment variables differ.

Common mistakes and how to fix them

  1. Missing or wrong quoting - the number one cause

Problem:

Unquoted variables are split by IFS and subject to filename globbing.

Example:

files="$1"
rm $files

If $1 contains "a b" you'll pass two arguments to rm; if it contains "*.txt", globbing may expand unexpectedly.

Fix:

Always quote expansions unless you specifically want splitting/globbing:

rm "$files"

When dealing with lists/arrays, use the right expansion:

arr=(one two "three four")
# Wrong: "$arr" is the same as "${arr[0]}" - prints the first element only
echo "$arr"
# Right: expands each element as a separate word, preserving spaces
echo "${arr[@]}"

  2. Word splitting in command substitution

Problem:

The result of $(cmd) is subject to word splitting unless quoted.

Example:

list=$(ls *.txt)
for f in $list; do
  echo "$f"
done

If filenames contain spaces this breaks.

Fix:

Quote command substitution or use arrays:

# safe: NUL-delimited filenames into an array, preserving spaces/newlines (mapfile -d needs Bash 4.4+)
mapfile -d '' -t files < <(printf '%s\0' *.txt)
for f in "${files[@]}"; do
  echo "$f"
done

# or simply iterate using glob directly
for f in *.txt; do
  [ -e "$f" ] || continue
  echo "$f"
done

  3. Confusing return codes, set -e surprises, and pipes

Problem:

set -e exits on the first failing command - but behavior in pipelines and conditionals is subtle. A failing command in a pipeline may be masked.

Fix:

Use set -o pipefail so a pipeline's exit status is that of the rightmost command that failed:

set -euo pipefail
# -e: exit on error, -u: treat unset variables as error, pipefail: fail pipelines properly

Be careful with commands used in conditionals or pipelines and check their exit codes explicitly if needed.
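
A quick interactive demonstration (run without set -e; missing.txt is assumed not to exist):

grep foo missing.txt | sort; echo "status: $?"   # status: 0 - sort's success masks grep's failure
set -o pipefail
grep foo missing.txt | sort; echo "status: $?"   # now non-zero - grep's failure propagates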

  4. Using echo for debug or output is fragile

Problem:

echo's handling of options (such as -n) and escape sequences differs between shells and shell modes, so it is not reliably portable.

Fix:

Use printf (more predictable) and direct debugging output to stderr:

printf '%s\n' "debug: var=$var" >&2
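
For example, the same line behaves differently across shells because echo's escape handling varies, while printf does not:

echo 'a\tb'        # Bash builtin echo prints a\tb; under dash/sh it prints a<TAB>b
printf 'a\tb\n'    # always prints a<TAB>b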

  5. CRLF and executable permissions

Problem:

Scripts edited on Windows may have CRLF line endings ("\r\n") and fail with strange errors like "bad interpreter: No such file or directory".

Fix:

Convert the line endings, make the script executable, and use a correct shebang:

dos2unix script.sh
chmod +x script.sh

# first line of the script
#!/usr/bin/env bash

  6. Wrong shebang or invoking the wrong shell

Problem:

Running bash-specific constructs under /bin/sh can fail.

Fix:

Make the shebang explicit (#!/usr/bin/env bash), or invoke the script with Bash explicitly: bash script.sh
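
For example, [[ is a Bash keyword that POSIX sh does not provide, so this fails wherever /bin/sh is dash (and the bug hides on systems where sh is actually Bash):

bash -c '[[ -n x ]] && echo ok'   # ok
sh -c '[[ -n x ]] && echo ok'     # on dash: "[[: not found"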

  7. Subshells and variable scoping

Problem:

Pipelines start subshells; variable assignments inside subshells won’t be available after the pipeline.

Example:

# assigns in subshell, lost after pipeline
echo "value" | while read -r var; do
  echo "in loop: $var"
done
# here var is empty (in many shells)

Fix:

Use process substitution, redirection, here-strings, or temporary files so the read loop runs in the current shell rather than a subshell:

while read -r var; do
  echo "in loop: $var"
done < <(printf '%s\n' value)
# var remains available in the parent shell after the loop
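
Alternatively, Bash 4.2+ offers shopt -s lastpipe, which runs the last stage of a pipeline in the current shell (it takes effect in scripts, where job control is off):

shopt -s lastpipe
echo "value" | while read -r var; do
  echo "in loop: $var"
done
echo "after loop: $var"   # var now survives the pipeline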

  8. Incorrect use of test/[[ and precedence

Problem:

Mixing [ ] and [[ ]] syntax, or forgetting quotes with -n/-z and file tests, leads to subtle bugs.

Fix:

Prefer [[ ... ]] for expressions in Bash and quote where appropriate:

if [[ -n "$var" && -f "$file" ]]; then
  # ...
fi

  9. Relying on PATH or external commands without checking

Problem:

Scripts that depend on commands without absolute paths/availability may fail in cron/CI.

Fix:

Either set PATH at the top of the script or use full paths and check dependencies:

command -v curl >/dev/null || { printf 'curl not found\n' >&2; exit 1; }
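
To verify several dependencies up front, a small loop keeps this readable (the command names here are just examples):

for cmd in curl rsync jq; do
  command -v "$cmd" >/dev/null 2>&1 ||
    { printf '%s: required command not found\n' "$cmd" >&2; exit 1; }
done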

Commands and options to debug quickly

  1. Syntax check fast: bash -n

This checks syntax without executing:

bash -n script.sh

  2. Trace execution: set -x or bash -x
  • Quick usage: bash -x script.sh
  • In-script: set -x; set +x to disable

Example:

#!/usr/bin/env bash
set -x
# commands
set +x

  3. Better, annotated traces: BASH_XTRACEFD and PS4

You can redirect the xtrace output to a separate file descriptor (BASH_XTRACEFD, Bash 4.1+) and add file/line/function info to PS4 for readable traces:

exec 3>&2
BASH_XTRACEFD=3
PS4='+ ${BASH_SOURCE}:${LINENO}:${FUNCNAME[0]}: '
set -x
# your code here
set +x

This prints trace lines prefixed with the script, line number, and function.

  4. Lint your script: shellcheck

ShellCheck analyzes scripts and points out common mistakes and best practices. Use it interactively at https://www.shellcheck.net/ or install locally.

shellcheck script.sh

ShellCheck will catch unquoted variables, unused variables, and many other pitfalls.

  5. Print a runtime stack trace with trap ERR

To get a stack trace when a command fails:

#!/usr/bin/env bash
set -o errexit -o nounset -o pipefail
err_report() {
  local exit_code=$?
  echo "Error on line ${BASH_LINENO[0]} (exit code ${exit_code})" >&2
  for ((i=0; i<${#FUNCNAME[@]}; i++)); do
    # the :-main default guards the top stack frame, whose caller entry is unset (matters under set -u)
    printf '  %d: %s() %s\n' "$i" "${FUNCNAME[$i]}" "${BASH_SOURCE[$i+1]:-main}:${BASH_LINENO[$i]}" >&2
  done
}
trap err_report ERR

# your script here

This prints function names and line numbers when a command fails.

  6. Check for syntax and parse errors early with shellcheck + bash -n

Use shellcheck to get style and correctness warnings, and bash -n to catch parsing errors.

  7. Use set -u to detect unset variables

set -u (or -o nounset) causes using an unset variable to be an error. This helps catch typos like $USE_SRCPATH vs $SRC_PATH.
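
A minimal sketch of the failure mode (the variable names are illustrative):

#!/usr/bin/env bash
set -u
SRC_PATH='/data/src'
# typo: USE_SRCPATH was never assigned - set -u aborts here with
# "USE_SRCPATH: unbound variable" instead of silently expanding to ""
echo "copying from $USE_SRCPATH"

Where an unset variable is legitimate, opt out per expansion with "${VAR:-default}".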

  8. Temporary debug prints and logging

Use printf to write to stderr for debug messages so they don’t interfere with normal stdout:

log() { printf '%s\n' "${*}" >&2; }
log "Starting step X"

  9. Unit testing: bats

Introduce tests for functions using the Bats framework: https://github.com/bats-core/bats-core. Unit tests make regressions easier to catch than ad-hoc debugging.
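
A minimal test file might look like this (it assumes a lib.sh next to the tests that defines a greet function; both names are illustrative; run with: bats test.bats):

#!/usr/bin/env bats

setup() {
  # load the functions under test before each test case
  source "${BATS_TEST_DIRNAME}/lib.sh"
}

@test "greet prints a greeting" {
  run greet "world"
  [ "$status" -eq 0 ]
  [ "$output" = "Hello, world" ]
}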

A short example debugging session (walkthrough)

Problem: A backup script intermittently fails to include some files.

Script snippet:

#!/usr/bin/env bash
set -e
src="$1"
dest="$2"
files=$(find "$src" -type f -name '*.conf')
rsync -av $files "$dest"

Observations:

  • If files have spaces they’re not copied correctly.
  • Sometimes rsync says “No such file”.

Steps to debug:

  1. Run shellcheck: shellcheck script.sh - it flags unquoted expansions.
  2. Run bash -n script.sh - no syntax errors.
  3. Reproduce with set -x or bash -x:
bash -x script.sh '/etc/my dir' /tmp/backup

Trace shows rsync received multiple arguments from a single filename with spaces.

Fix: Avoid storing a whitespace-separated list in a scalar. Use find + rsync with --files-from or an array.

Solution A (rsync --files-from):

cd "$src"
find . -type f -name '*.conf' -print0 > /tmp/files0
rsync -av --files-from=/tmp/files0 --from0 ./ "$dest"

Solution B (array + loop):

mapfile -d '' -t files < <(find "$src" -type f -name '*.conf' -print0)
for f in "${files[@]}"; do
  rsync -av "$f" "$dest/"
done

After applying the fix, re-run the script and trace to verify correct arguments.

Systematic debugging workflow

  1. Reproduce the problem deterministically.
  2. Run shellcheck and bash -n - fix obvious issues.
  3. Add tracing (bash -x or set -x) and customize PS4 for context.
  4. Narrow scope: isolate failing function/block and add printf debugging if needed.
  5. Consider edge cases: quoting, IFS, CRLF, path/permissions, subshells and pipelines.
  6. Add tests (Bats) to prevent regression.

Checklist: Quick things to try

  • bash -n script.sh (syntax check)
  • shellcheck script.sh (static analysis)
  • bash -x script.sh or set -x / PS4 hacks (runtime trace)
  • set -o pipefail and set -u to catch hidden errors
  • use printf and redirect debug prints to stderr
  • check line endings (dos2unix) and shebang
  • ensure script invoked with bash if using Bash extensions
  • inspect variable contents with printf '%q\n' "$var" (shows quoting)

Examples of small useful commands

  • Show how a variable is split and globbed:
var='a b *'
printf '>%s<\n' $var    # unquoted expansion (bad)
printf '>%s<\n' "$var"  # safe
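  • Inspect exactly what a variable contains, with quoting made visible:
var=$'a b\tc'
printf '%q\n' "$var"    # prints a shell-quoted form, e.g. $'a b\tc'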
  • Pretty-printed trace with filenames/lines:
exec 3>&2
BASH_XTRACEFD=3
PS4='+ ${BASH_SOURCE}:${LINENO}:${FUNCNAME[0]}: '
set -x
# code
set +x

When to rewrite parts in a more robust language

If your script is increasingly complex (lots of data structures, complicated error handling, large parsing tasks), consider rewriting critical parts in a language with better tooling (Python, Go, Rust). But even then, the debugging techniques above remain useful for the shell glue that remains.

Conclusion

Bash scripts fail most often because of subtle semantics (quoting, splitting, subshells) rather than obscure language bugs. A combination of static analysis (shellcheck), non-executing checks (bash -n), execution tracing (set -x with PS4/BASH_XTRACEFD), and runtime checks (set -euo pipefail plus trap ERR) will save hours. Add small unit tests with Bats to prevent regressions and prefer safe idioms ("${var}", "${array[@]}", full test expressions).
