Hi Wei,

Can I request two more new features?

- `join`, supporting reading one of the files from stdin.
- `uniq`, supporting keeping only duplicated or only unique key entries, more like the Unix `uniq`.

Thank you very much!

Best regards,
Wallace
For `join`, reading one of the files from stdin is supported:
```
$ cat testdata/phones.csv \
    | csvtk join -f 1 - testdata/region.csv \
    | csvtk pretty
username   phone    region
--------   ------   ---------
gri        11111    somewhere
ken        22222    nowhere
shenwei    999999   another
```
For `uniq`, keeping only duplicated or only unique key entries, like the Unix `uniq`:

- Only keep unique key entries: that's what `csvtk uniq` does, e.g. `csvtk uniq | csvtk cut -f key`.
- Only keep duplicated entries: for now there's a workaround, for example:
```
# get duplicated keys
$ (seq 5 ; seq 10) | csvtk freq -H | csvtk filter2 -f '$2 > 1' | csvtk cut -f 1 > keys.txt

# retrieve records with duplicated keys
$ (seq 5 ; seq 10) | csvtk grep -f 1 -P keys.txt
```
Thanks a lot for your quick reply; the tips look great.

By keeping only unique key entries, I mean the `-u` flag of Unix `uniq`; the expected behavior is shown below. Sorry for the confusion.
```
$ (seq 3 ; seq 5) | sort | sed '1i Title' | uniq -u
Title
4
5
```
The current csvtk output looks like this, which is different:
```
$ (seq 3 ; seq 5) | sort | sed '1i Title' | csvtk uniq -f1
Title
1
2
3
4
5
```
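Until csvtk has a `-u`-style option, a possible workaround, by analogy with the duplicated-keys example above, is to keep only keys whose frequency is exactly 1. This is an untested sketch that reuses only the commands already shown in this thread; the file name `uniq_keys.txt` is arbitrary.

```
# get keys that appear exactly once
$ (seq 5 ; seq 10) | csvtk freq -H | csvtk filter2 -f '$2 == 1' | csvtk cut -f 1 > uniq_keys.txt

# retrieve only the records with those keys (similar to uniq -u)
$ (seq 5 ; seq 10) | csvtk grep -f 1 -P uniq_keys.txt
```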