From: Robert E. Griffith
Subject: Re: tokenize honoring quotes
Date: Fri, 5 Aug 2022 18:36:14 -0400
User-agent: Mozilla/5.0 (X11; Linux x86_64; rv:91.0) Gecko/20100101 Thunderbird/91.11.0
On 8/5/22 16:28, Koichi Murase wrote:
I think you may use history expansions to safely cut tokens. For example,

    function tokenize {
      eval "tokens=($(
        local str=$1
        while history -s "$str" &&
            word=$(history -p '!:0' 2>/dev/null) &&
            [[ $word && $str == *"$word"* ]]
        do
          printf '%q\n' "$word"
          str=${str#*"$word"}
        done
      ))"
    }
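The primitive behind the trick is that `history -p`'s word designators split a stored line the way the shell lexer would, keeping quoted strings together. A minimal sketch (the sample line is mine, not from the email):

```shell
#!/usr/bin/env bash
# history -s pushes a line onto the in-memory history list without
# executing it; history -p then applies history expansion to its
# argument and prints the result.  The word designator !:N picks the
# Nth word of the most recent entry, honoring shell quoting, so a
# quoted argument comes back as a single token.
history -s 'grep -e "foo bar" notes.txt'
history -p '!:2'      # the quoted word, returned as one token
```

This works in a non-interactive script because `history -s` and `history -p` operate on the shell's in-memory history list directly; no history file or interactive session is needed.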
Thanks, this is a great new technique for me. I had never really given much thought to history and would never have thought to use it in a script!
FYI, the final version I am using is below. I found that by reusing a single history line it got faster.
    # usage: tokenize2 <arrayVar> <input ...>
    function tokenize2() {
        local -n arrayRet="$1"; shift
        local input="$*"
        history -s "$input"
        local word i=0
        arrayRet=()
        while word=$(history -p '!:'$i 2>/dev/null); do
            arrayRet+=("$word")
            ((i++))
        done
        history -d 1
    }

--BobG
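For readers trying this out, a sketch of calling tokenize2 with a sample input (the input string is illustrative, not from the thread); it needs bash 4.3+ for the `local -n` nameref:

```shell
#!/usr/bin/env bash
# tokenize2 as posted above, annotated; the sample invocation at the
# bottom is mine, not from the thread.
# usage: tokenize2 <arrayVar> <input ...>
function tokenize2() {
    local -n arrayRet="$1"; shift    # nameref: caller's array (bash 4.3+)
    local input="$*"
    history -s "$input"              # stash the line in the history list
    local word i=0
    arrayRet=()
    # history -p '!:N' expands to the Nth word of the last history entry,
    # tokenized like the shell lexer; it fails once N is past the last word.
    while word=$(history -p '!:'$i 2>/dev/null); do
        arrayRet+=("$word")
        ((i++))
    done
    history -d 1                     # drop the scratch entry again
}

tokenize2 toks 'ls -l "my file.txt"'
printf '<%s>\n' "${toks[@]}"         # one token per line, quotes kept
```

The speedup over the quoted version comes from writing the line into history once and walking its words by index, rather than re-storing the shrinking remainder on every iteration.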