Nushell enjoyers represent
cut -d ' ' -f1
master race
Hey I throw a
/^regexp.*/ {print $NF}
in there sometimes!
…but yes, it’s mostly
print $1
—but only because I mix up the parameters whenever I try to use cut!
Why spend 30 seconds manually editing some text when you can spend 30 minutes clobbering together a pipeline involving awk, sed and jq
Or 60 minutes making it all work just with jq functions.
to be fair, out of those three, jq invokes the least existential dread in me
The important part is to learn the limits of any tool. Nowadays I no longer use jq for any long or complicated tasking. Filter and view data? jq is fine. Anything more and I just cook up a python script.
i’m more of a bash fan tbh. Ever since i started using linux, python started to irritate me
How do you get complex data structures to work? I was alienated from scripting on zsh because I wanted something like a dict and realised I would have to write my own implementation. Is there a work around for that?
I mean, there’s a point in data structure complexity where it’s useful to use Python.
But as to dicts, sure. You’re looking for zsh’s “associative array”. Bash has it too.
zsh
$ typeset -A mydict
$ mydict[foo]=bar
$ echo $mydict[foo]
bar
$
bash
$ typeset -A mydict
$ mydict[foo]=bar
$ echo ${mydict[foo]}
bar
$
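A quick sketch of how the bash version extends beyond a single lookup (the keys and values here are invented):

```shell
#!/usr/bin/env bash
# Associative arrays need bash 4+; declare -A is equivalent to typeset -A
declare -A mydict
mydict[foo]=bar
mydict[baz]=qux

# Look up a single key
echo "${mydict[foo]}"    # bar

# Iterate over all keys (iteration order is unspecified)
for key in "${!mydict[@]}"; do
    echo "$key -> ${mydict[$key]}"
done
```

The zsh version is the same except that the braces around `${mydict[foo]}` are optional there.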
This will do nicely - I had several workflows where I’d hit an API and get a massive super nested JSON as output; I’d use jq to get the specific data from the whole thing and do a bunch of stuff on this filtered data. I pretty much resigned to using python because I’d have successively complicated requirements and looking up how to do each new thing was slowing me down massively.
TIL I am an OP wizard.
joke so dark I had to turn up my screen brightness to enjoy it.
You can even do sum with awk, you don’t need excel
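For example, summing a column is a one-liner in awk (the input numbers here are made up):

```shell
# Accumulate field 1 of every line, print the total at the end
printf '3\n4\n5\n' | awk '{ sum += $1 } END { print sum }'
# prints 12
```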
num-utils has numsum.
All my homies use dubious regex
my favorite awk snippet is
!x[$0]++
which is like uniq
but doesn’t care about order. basically, it’s equivalent to
print_this_line = line_cache[$current_line] == 0; line_cache[$current_line] += 1; if $print_this_line then print $current_line end
Really useful for those long spammy logs.
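A toy demonstration of that snippet, with invented input:

```shell
# !x[$0]++ is true only the first time a line is seen,
# so duplicates are dropped without disturbing the original order
printf 'b\na\nb\na\nc\n' | awk '!x[$0]++'
# prints:
# b
# a
# c
```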
Oh that’s very interesting. I usually do
sort --unique
or sort [...] | uniq
if I need specific sorting logic (like by size on disk, etc).
Looking at the above awk snippet, it’ll retain order, though. So, sort will normally change the order. The awk snippet won’t, just skip occurrences of a given line after the first. Depending upon the use case, that order retention could be pretty desirable.
same, that statement saved me so much effort
Honestly I think 90% of people would never use awk if there was a simple preinstalled command for “print the nth column”
cut?
To be fair, a lot of programs don’t use a single-character delimiter, have multiple spaces between fields, and
cut
doesn’t collapse whitespace characters, so you probably want something more like
tr -s " "|cut -d" " -f3
if you want behavior like awk’s field-splitting.
$ iostat |grep ^nvme0n1
nvme0n1          29.03     131.52     535.59     730.72    2760247  11240665  15336056
$ iostat |grep ^nvme0n1|awk '{print $3}'
131.38
$ iostat |grep ^nvme0n1|tr -s " "|cut -d" " -f3
131.14
$
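A self-contained version of the same pitfall, using an invented input line instead of iostat output:

```shell
# cut splits on every single space, so runs of spaces become empty fields;
# squeezing them with tr -s first makes cut behave like awk's field-splitting.
line='nvme0n1     29.03   131.52'
echo "$line" | awk '{print $3}'              # 131.52
echo "$line" | tr -s ' ' | cut -d' ' -f3     # 131.52
echo "$line" | cut -d' ' -f3                 # empty: field 3 falls inside the space run
```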
This is awesome! Looks like an LPI1 textbook. Never got the certification but I’ve seen a couple books about it and remember seeing examples like this one.
I never understood why so many bash scripts pipe grep to awk when regex is one of its main strengths.
Like… Why
grep ^nvme0n1 | awk '{print $3}'
over just
awk '/^nvme0n1/ {print $3}'
Because by the time I use
awk
again, I’ve completely forgotten that it supports this stuff, and the discoverability is horrendous.
Though I’d happily fix it if ShellCheck warned against this…
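For the record, the two forms really are equivalent; a toy demonstration with invented device lines:

```shell
# grep piped to awk, versus awk's own pattern matching: same result
input='nvme0n1 10 20
sda 1 2'
echo "$input" | grep '^nvme0n1' | awk '{print $3}'   # 20
echo "$input" | awk '/^nvme0n1/ {print $3}'          # 20
```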
cut and tr are like the wonder twins of text munging
Form of truncated whitespace! Shape of single whitespace! 😂
This is definitely somewhere that PowerShell shines, all of that is built in and really easy to use
People are hating on Powershell way too much. I don’t like its syntax really, but it has a much better approach to handling data in the terminal. We have nu and elvish nowadays, but MS was really early with the concept and I think they learned from the shortcomings of POSIX compatible shells.
Ok, þe quote misplacement is really confusing. It’s
awk '{print $1}'
How can you be so close to right about þis and still be wrong?
I see what you did þere
Ssshhh, maybe no-one will notice!
How can you be so close to right about þis and still be wrong?
Honest answer: I’m sloppy on mobile
Better answer:
Who downvoted this? If you use awk, you know Sxan is using the correct syntax.
People have been downvoting him because he uses the letter thorn in his comments.
Some people will hate on anyone different.
I recently noticed many people on lemmy have that thing rn. Why are they using it? Is that an autocorrect thingy or something? I didn’t downvote them but i hate seeing this. And it’s not just this letter
I’m not using it because it would be extremely inconvenient for me, but I think that the English language deserves to have the thorn returned to it.
Please no
I have a hard enough time with English already
The English alphabet needs to be completely redone. We should bring back thorn, eth, and wynn. We should also increase the vowels to actually represent the crazy amount of vowel sounds we have; diphthongs are dumb. 5 vowels is not sufficient for 15+ phonemes.
Let’s capitalize nouns again while we’re at it.
I used to use it for math notation, so I’d welcome its use again if I can keep using it as a placeholder for “then this happens” in between steps of functions.
It’s to confuse scrapers.
It’s going to be fun for etymologists 100 years from now
In all my years I’ve only used more than that a handful of times. Just don’t need it really
Now jq on the other hand…
jq
is indispensable
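A tiny example of the kind of thing jq is handy for (the JSON document here is invented):

```shell
# Pull one field out of nested JSON; -r emits the raw string without quotes
echo '{"user": {"name": "alice", "id": 7}}' | jq -r '.user.name'
# prints alice
```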
10 PRINT BUTTS
20 GOTO 10
I’ve become a person that uses awk instead of grep, sed, cut, head, tail, cat, perl, or bashisms
I rather do
${line%% *}
and avoid awk.
I could try to learn awk while also trying to debug the annoying problem I’m trying to solve, orrr…
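A small sketch of that parameter expansion (the sample line is invented):

```shell
# ${line%% *} deletes the longest suffix matching " *",
# i.e. everything from the first space onward
line='a3cca2b2aa1e3b5b3b5aad99a8529074 test.txt'
echo "${line%% *}"    # a3cca2b2aa1e3b5b3b5aad99a8529074
echo "${line##* }"    # test.txt (the mirror image: drop everything through the last space)
```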
cut
and grep
it is
The next stage of your degeneracy will involve learning PERL.
Edit: one-liners FTW! 😁🐪
sort | uniq -c has entered the chat 🤣
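That classic counts duplicates as well as removing them (the sample input is invented):

```shell
# Count occurrences of each line, most frequent first;
# sort groups duplicates so uniq -c can count them
printf 'a\nb\na\na\nc\nb\n' | sort | uniq -c | sort -rn
```

Note that, unlike the awk snippet above, this shuffles the original line order.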
I used awk for the first time today to find all the MD5 sums that matched an old file I had to get rid of. Still have no idea what awk was needed for. 😅 All my programming skill is in Python. Linux syntax is a weak point of mine.
Probably the very same thing that the post talks about, which is extracting the first word of a line of text.
The output of
md5sum
looks like this:
> md5sum test.txt
a3cca2b2aa1e3b5b3b5aad99a8529074  test.txt
So, it lists the checksum and then the file name, but you wanted just the checksum.
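One plausible way awk fits into that task (the target checksum below is the well-known md5 of an empty file, and the `*.txt` glob is hypothetical):

```shell
# Print the names of files whose md5 matches a known checksum
target='d41d8cd98f00b204e9800998ecf8427e'   # md5 of an empty file
md5sum *.txt | awk -v t="$target" '$1 == t {print $2}'
```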
Everything you do with
awk
, you can do with python
, and it will also be readable.
Hmm, but you have to install and run the Python environment for that. AWK is typically present on *NIX systems already. Python seems like overkill for basic text processing tasks.
On Debian the
python
is preinstalled.
and
perl
, if you want it less readable
Or you are old and crazy
Or PowerShell if you want it extra verbose