0x37 5 hours ago

These may be objectively superior (I haven't tested), but I have come to realize (like so many others) that if you ever change your OS installation, set up VMs, or SSH anywhere, preferring these is just an uphill battle that never ends. I don't want to have to set these up in every new environment I operate in, or even use a mix of these on my personal computer and the traditional ones elsewhere.

Learn the classic tools, learn them well, and your life will be much easier.

  • bonoboTP 5 hours ago

    Some people spend the vast majority of their time on their own machine. The gains of convenience can be worth it. And they know enough of the classic tools that it's sufficient in the rare cases when working on another server.

    Not everybody is a sysadmin manually logging into lots of independent, heterogeneous servers throughout the day.

    • CaptainOfCoit 4 hours ago

      Yeah, this is basically what I do. One example: I use neovim with a bunch of plugins as a daily driver, but whenever I'm on a server that doesn't have it or my settings/plugins, it isn't a huge problem to run vim or even vi; most stuff works the same.

      Same goes for a bunch of other tools that have "modern" alternatives but the "classic" ones are already installed/available on most default distribution setups.

  • tmountain 4 hours ago

    Some are so vastly better that it's worth whatever small inconvenience comes with getting them installed. I know the classic tools very well, but I'll prefer fd and ripgrep every time.

    • BrouteMinou an hour ago

      For my part, the day I got confused about why "grep" couldn't find some files that were obviously there, only to realize that "ripgrep" was ignoring files in the gitignore, was the day I removed "ripgrep" from my system.

      I never asked for such behaviour, and I have no time for pretty "modern" opinions in base software.

      Often, when I read "modern", I read "immature".

      I am not ready to replace my stable base utilities with immature ones that change behaviour.

      The scripts I wrote 5 years ago must work as is.

      • exdeejay_ an hour ago

        Sounds like the problem you have here is that `grep` is aliased to `ripgrep`. ripgrep isn't intended to be a drop-in replacement for POSIX grep, and the subjectively easier usage of ripgrep can never replace grep's maturity and adoption.

        Note: if you want to make ripgrep not do .gitignore filtering, set `RIPGREP_CONFIG_PATH` to point to a config file that contains `-uu`.
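
        A minimal sketch of that setup (the config file location is just an example; any path works as long as the env var points at it):

           # create a ripgrep config that turns off the automatic filtering
           mkdir -p ~/.config/ripgrep
           printf '%s\n' -uu > ~/.config/ripgrep/config
           export RIPGREP_CONFIG_PATH="$HOME/.config/ripgrep/config"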

        Sources:

        - https://github.com/BurntSushi/ripgrep/blob/master/GUIDE.md#c...

        - https://github.com/BurntSushi/ripgrep/blob/master/GUIDE.md#a...

        • BrouteMinou 37 minutes ago

          So I stand corrected. I did indeed use ripgrep as a drop-in replacement.

          That's on me!

      • burntsushi an hour ago

        You did ask for it though. Because ripgrep prominently advertises this default behavior. And it also documents that it isn't a POSIX compatible grep. Which is quite intentional. That's not immature. That's just different design decisions. Maybe it isn't the software you're using that's immature, but your vetting process for installing new tools on your machine that is immature.

        Because hey guess what: you can still use grep! So I built something different.

      • maleldil an hour ago

        The very first paragraph in ripgrep's README makes that behaviour very clear:

        > ripgrep is a line-oriented search tool that recursively searches the current directory for a regex pattern. By default, ripgrep will respect gitignore rules and automatically skip hidden files/directories and binary files. (To disable all automatic filtering by default, use rg -uuu.)

        https://github.com/BurntSushi/ripgrep

  • kokada 4 hours ago

    One of the reasons I really like Nix: my setup works basically everywhere (as long as the host OS is either Linux or macOS, but those are the only 2 environments I care about). I don't even need root access to install Nix since there are multiple ways to install Nix rootless.

    But yes, in the eventual case that I don't have Nix I can very much use the classic tools. It is not a binary choice, you can have both.

    • samtrack2019 2 hours ago

      are you going to install nix in a random docker container?

      • 0x696C6961 2 hours ago

        mise is a good middle ground.

  • thiht 3 hours ago

    That goes against the UNIX philosophy IMO. Tools doing "one thing and doing it well" also means that tools can and should be replaced when a superior alternative emerges. That's pretty much the whole point of simple utilities. I agree that you should learn the classic tools first as it's a huge investment for a whole career, but you absolutely should learn newer alternatives too. I don't care much for bat or eza, but some alternatives like fd (find alt) or sd (sed alt) are absolute time savers.

  • lucasoshiro 5 hours ago

    > that if you ever change your OS installation

    apt-get/pacman/dnf/brew install <everything that you need>

    You'll need to install those and other tools (your favorite browser, your favorite text editor, etc.) anyway if you're changing your OS.

    > or SSH anywhere

    When you connect through SSH you don't have a GUI, and that's not a reason to avoid using GUI tools, for example.

    > even use a mix of these on my personal computer and the traditional ones elsewhere

    I can't see the problem, really. I use some of those tools and they are convenient, but it's not like I can't work without them. For example, bat: it doesn't replace cat, it only outputs data with syntax highlighting; it makes my life easier, but if I don't have it, that's ok.

    • actinium226 2 hours ago

      > apt-get/pacman/dnf/brew install <everything that you need>

      If only it were so simple. Not every tool comes from a package with the same name (delta is git-delta, "z" is zoxide, which I'm not sure I'd remember off the top of my head when installing on a new system). On top of that, you might not like the defaults of every tool, so you'll have config files that you need to copy over or recreate (and hopefully sync between the computers where you use these tools).

      That said, I do think nix provides some good solutions for this. It gives you a nice clean way to list the packages you want in a nixfile and also to set their defaults and/or provide some configuration files. It does still require some maintenance (and I choose to install the config files as editable, which is not very nix-y, but I'd rather edit it and then commit the changes to my configs repo for future deploys than to have to edit and redeploy for every minor or exploratory change), but I've found it's much better than trying to maintain some sort of `apt-get install [packages]` script.

      • kelvinjps10 2 minutes ago

        After installing it, git clone <dotfiles repo> and then stow .
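
        Roughly like this (the repo URL and layout are whatever yours happen to be):

           git clone <your dotfiles repo> ~/dotfiles
           cd ~/dotfiles
           # stow's default target is the parent directory, so this symlinks
           # the repo's contents (.bashrc, .config/..., etc.) into $HOME
           stow .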

      • samtrack2019 2 hours ago

        it takes less than a sec, or less than 10s with a Google search, to adapt...

    • saghm 2 hours ago

      Strongly agreed. I don't understand why I'd want to make the >99% of my time less convenient just so that the <1% of the time I'm on a machine where I can't install things (even in a local directory for the user I'm ssh'd into) feels less bad by comparison. It's not even a tradeoff where I'm choosing which part of the curve to optimize for; it's literally flattening the high part so the overall convenience level is constant but lower.

    • Hackbraten 3 hours ago

      > When you connect through SSH you don't have a GUI, and that's not a reason to avoid using GUI tools, for example.

      One major difference can emerge from the fact that using a tool regularly inevitably builds muscle memory.

      You’re accustomed to a replacement command-line tool? Then your muscle memory will punish you hard when you’re logged into an SSH session on another machine because you’re going to try running your replacement tool eventually.

      You’re used to a GUI tool? Will likely bite you much less in that scenario.

      • lucasoshiro 2 hours ago

        > You’re accustomed to a replacement command-line tool?

        Yes.

        > Then your muscle memory will punish you hard

        No.

        I'm also used to pt-br keyboards, it's easier to type in my native language, but it's ok if I need to use US keyboards. In terms of muscle memory, keyboards are far harder to adapt.

        A non-tech example: if I go to a Japanese restaurant, I'll use chopsticks and I'm ok with them. At home, I use forks and knives because they make my life easier. I won't force myself to use chopsticks everyday only for being prepared for Japanese restaurants.

    • MarsIronPI 4 hours ago

      > You'll need to install those and other tools (your favorite browser, your favorite text editor, etc.) anyway if you're changing your OS.

      The point is that sometimes you're SSHing to a lightweight headless server or something and you can't (or can't easily) install software.

      • w0m 2 hours ago

        Because 'sometimes' doesn't mean you should needlessly handcuff yourself the other 80% of the time.

        I personally have an ansible playbook to set up all my commonly used tooling on ~any CLI environment I'll use significantly; (almost) all local installs to avoid needing root. It runs in about a minute, and I have all the niceties. If it's not worth spending that minute to run, then I won't be on the machine long enough for it to matter.

      • lucasoshiro 2 hours ago

        That's a niche case. And if you need to frequently SSH into a lightweight server, you'll probably be ok with the default commands even though you have the others installed in your local setup.

      • dangus 10 minutes ago

        It does seem like a lot of these tools basically have the same “muscle memory” options anyway.

  • oneeyedpigeon 5 hours ago

    > Learn the classic tools, learn them well, and your life will be much easier.

    Agreed, but that doesn't stop you from using/learning alternatives. Just use your preferred option, based on what's available. I realise this could be too much to apply to something like a programming language (despite this, many of us know more than one) or a graphics application, but for something like a pager, it should be trivial to switch back and forth.

    • acomjean 4 hours ago

      And when those classic tools need a little help:

      Awk and sed.

      I like the idea of new tools though. But knowing the building blocks is useful. The “Unix power tools” book was useful to get me up to speed... there are so many of these useful mini tools.

      Miller is one I’ve made use of (it also was available for my distro)

  • PaulHoule an hour ago

    When I got my first Unix account [1] I was in a Gnu emacs culture and used emacs from 1989 to 2005 or so. I made the decision to switch to vi for three reasons: (1) less clash with a culture where I mostly use GUI editors that use ^S for something very different than what emacs does, (2) vim doesn't put in continuation characters that break cut-and-paste, (3) often I would help somebody out with a busted machine where emacs wasn't installed, the package database was corrupted, etc and being able to count on an editor that is already installed to resolve any emergency is helpful.

    [1] Not like the time one of my friends "wardialed" every number in my local calling area and posted the list to a BBS and I found that some of them could be logged into with "uucp/uucp" and the like. I think Bell security knew he rang everybody's phone in the area but decided to let billing handle the problem because his parents had measured service.

  • mcswell 36 minutes ago

    Not a comment on these particular tools, but I keep non-standard utilities that I use in my ~/bin/ directory, and they go with me when I move to a different system. The tools mentioned here could be handled the same way, making the uphill a little less steep.

  • skydhash 5 hours ago

    I do prefer some of these tools, due to a much better UX, but the only one I do install in every unix box is ripgrep.

  • chimprich 3 hours ago

    I tend to use some of these "modern" tools if they are a drop-in replacement for existing tools.

    E.g. I have ls set up aliased to eza as part of my custom set of configuration scripts. eza pretty much works as ls in most scenarios.
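
    Something like this in a shell rc file does the trick (just a sketch; the guard is one way to keep it harmless on machines without eza):

       # alias only when eza is actually installed; otherwise keep plain ls
       if command -v eza >/dev/null 2>&1; then
           alias ls='eza'
       fi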

    If I'm in an environment which I control and is all configured as I like it, then I get a shinier ls with some nice defaults.

    If I'm in another environment then ls still works without any extra thought, and the muscle memory is the same, and I haven't lost anything.

    If there's a tool which works very differently to the standard suite, then it really has to be pulling its weight before I consider using it.

  • andai 5 hours ago

    I wanted to say we should just stick with what Unix shipped forever. But doesn't GNU already violate that idea?

    • imcritic 4 hours ago

      IMO this is very stupid: don't let the past dictate the future. UNIX is history. History is for historians; it should not be the basis that shapes the environment for engineers living in the present.

      • mprovost 4 hours ago

        The point is that we always exist at a point on a continuum, not at some fixed time when the current standard is set in stone. I remember setting up Solaris machines in the early 2000s with the painful SysV tools that they came with and the first thing you would do is download a package of GNU coreutils. Now those utils are "standard", unless of course you're using a Mac. And newer tools are appearing (again, finally) and the folk saying to just stick with the GNU tools because they're everywhere ignore all of the effort that went into making that (mostly) the case. So yes, let's not let the history of the GNU tools dictate how we live in the present.

    • drob518 4 hours ago

      Well, even “Unix” had some differences (BSD switches vs SysV switches). Theoretically, POSIX was supposed to smooth that out, but it never went away. Today, people are more likely to be operating in a GNU Linux environment than anything else (that's just a market share fact, not a moral judgement, BSD lovers). Thus, for most people, GNU is the baseline.

  • dankobgd 2 hours ago

    I learned Ansible, so I run one command, wait 10 minutes, and a new Linux machine is configured with all the stuff I want.

  • dangus 13 minutes ago

    How hard is it to set up your tooling?

    I have a chef cookbook that sets up all the tools I like to have on my VMs. When I bootstrap a VM it includes all the stuff I want like fish shell and other things that aren’t standard. The chef cookbook also manages my SSH keys and settings.

  • kombine 4 hours ago

    I started a new job and spent maybe a day setting up the tools and dotfiles on my development machine in the cloud. I'm going to keep it throughout my employment so it's worth the investment. And I install most of the tools via the nix package manager so I don't have to compile things or figure out how to install them on a particular Linux distribution.

    • w0m 2 hours ago

      Learn Ansible or similar, and you can be ~OS-agnostic (OSX/Linux/even Windows) with relatively complex setups. I set mine up before agentic systems were as good as they are now, but I assume it would be relatively effortless now.

      IMO, it's worth spending some time to clean up your setup for smooth transition to new machines in the future.

  • pjmlp 5 hours ago

    I know well enough my way around vi, because although XEmacs was my editor during the 1990's when working on UNIX systems, when visiting customers there was a very high probability that they only had ed and vi installed on their server systems.

    Many folks nowadays don't get how lucky they are, not having to do UNIX development on a time-sharing system, although cloud systems kind of replicate the experience.

    • mprovost 42 minutes ago

      Ed is the standard text editor.

      • munchlax 22 minutes ago

        And not installed by default in many distros. FML.

  • landgenoot 5 hours ago

    This is how I feel as well. I spent some time "optimizing" my CLI with oh-my-zsh etc. when I was young.

    Only to feel totally handicapped when logging in into a busybox environment.

    I'm glad I learned how to use vi, grep, sed..

    My only change to an environment is the keyboard layout. I learned Colemak when I was young. Still enjoying it every day.

  • GuB-42 4 hours ago

    I have some of these tools, they are not "objectively superior". A lot of them make things prettier with colors, bargraphs, etc... It is nice on a well-configured terminal, not so much in a pipeline. Some of them are full TUIs, essentially graphical tools that run in a terminal rather than traditional command line tools.

    Some of them are smart but sometimes I want dumb. For example, ripgrep respects gitignore, and often, I don't want that. Though in this case, there is an option to turn it off (-uuu). That's a common theme with these tools too: they try to be smart by default and you need options to make them dumb.

    So no, these tools are not "objectively superior", they are generally more advanced, but it is not always what you need. They complement classic tools, but in no way replace them.

  • 112233 3 hours ago

    Never will I ever set up tools and a home environment directly on the distro. Only in a rootfs that I can proot/toolbx/bwrap into. Not only do I not want to set things up again on a different computer, but distro upgrades have nuked "fancy" tools enough times for it to not be worth it.

  • trebligdivad 4 hours ago

    Agreed, but some are nice enough that I'll make sure I get them installed where I can. 'ag' is my go-to fast grep, and I get it installed on anything I use a lot.

  • jauntywundrkind 2 hours ago

    I indeed would not want to feel stranded with a bespoke toolkit. But I also don't think shying away from good tools is the answer. Generally I think using better tools is the way to go.

    Often there are plenty of paths open to getting a decent environment as you go:

    Mostly, I rely on ansible scripts to install and configure the tools I use.

    One fallback I haven't seen mentioned, that can get a lot of mileage: use sshfs to mount the target system locally. This lets you use your local tools and setup effectively against another machine!
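
    For example (host and paths here are placeholders):

       mkdir -p ~/mnt/somehost
       sshfs user@somehost:/var/log ~/mnt/somehost
       rg "ERROR" ~/mnt/somehost      # your local tools, their files
       fusermount -u ~/mnt/somehost   # unmount when done (umount on macOS)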

  • shmerl an hour ago

    I do it at least for ripgrep.

    • fireflash38 an hour ago

      Fzf has saved me so much time over the years. It's so good.

      • shmerl 14 minutes ago

        Yes, good point. I use it all the time too. Plus fzf-lua in neovim which depends on it.

  • samcat116 5 hours ago

    For some people the "uphill battle" is the fun part

  • ta1243 3 hours ago

    Along those lines, Dvorak layouts are more efficient, but I use qwerty because it works pretty much everywhere (are small changes like AZERTY still a thing? Certainly our French office uses an "international" layout, and generally the main pains internationally are "@" being in the wrong place and \ not working -- for the latter you can use user@domain when logging into a Windows machine, rather than domain\user)

    • doodpants 3 hours ago

      I've been using Dvorak for 24 years. 99% of the time I'm using my own machines, so it's fine. For the other 1% I can hunt-and-peck QWERTY well enough.

  • UltraSane 5 hours ago

    "I don't want to be a product of my environment. I want my environment to be a product of me."

sigio 4 hours ago

As someone who logs into hundreds of servers in various networks, from various customers/clients, there is so little value in using custom tooling, as they will not be available on 90% of the systems.

I have a very limited set of additional tools I tend to install on systems, and they are in my default ansible-config, so will end up on systems quickly, but I try to keep this list short and sweet.

95% of the systems I manage are debian or ubuntu, so they will use mostly the same baseline, and I then add stuff like ack, etckeeper, vim, pv, dstat.

  • twic 3 hours ago

    "servers" is the key word here. Some of the tools listed on that page are just slightly "improved" versions of common sysadmin utilities, and indeed, those are probably not worth it. But some are really development tools, things that you'd install on the small number of machines where you do programming. Those might be.

    The ones that leap out at me are ripgrep (a genuinely excellent recursive grepper), jq (a JSON processor - there is no alternative to this in the standard unix toolkit), and hyperfine (benchmarking).

    • zdc1 2 hours ago

      In my last role rg and jq were included as part of our standard AMI as well as our base container images. It broadens our CVE exposure but it was undoubtedly worth it.

  • jayd16 41 minutes ago

    Is there any tool or ssh extension that would bring these apps into the remote session?

    Is something like that possible? Seems like you could conceivably dump these small tools into a temp folder and use them, and that could be automated.

    Is there a security issue with that? Do any of these tools need more permission than the remote session would have?

    Maybe the main issue is portability of these apps?

    This is certainly a common sentiment (I've felt it myself) so is it at all possible?

  • 63stack 2 hours ago

    What's the relevance of these "as someone who ..." posts? Nobody cares that these tools don't happen to fit into your carefully curated list of tools that you install on remote computers. You can install these on your local computer to reap some benefits.

    • mprovost 34 minutes ago

      It's the bean soup theory ("what if I don't like beans") in action.

  • arminiusreturns 3 hours ago

    Another reason emacs as an OS (not fully, but you know) is such a great way to get used to things you have on systems. Hence the quote: "GNU is my operating system, linux is just the current kernel".

    As a greybeard linux admin, I agree with you though. This is why when someone tells me they are learning linux the first thing I tell them is to just type "info" into the terminal and read the whole thing, and that will put them ahead of 90% of admins. What I don't say is why: Because knowing what tooling is available as a built-in you can modularly script around that already has good docs is basically the linux philosophy in practice.

    Of course, we remember the days where systems only had vi and not even nano was a default, but since these days we do idempotent ci/cd configs, adding a tui-editor of choice should be trivial.

    • e3bc54b2 2 hours ago

      > we remember the days where systems only had vi and not even nano was a default

      What are you talking about? I'm still living those days in modern day AWS with latest EC2 machines!

  • carlosjobim 2 hours ago

    You're again confusing this website with your personal email inbox. This is a public message board, all messages you see haven't been written for you specifically - including this blog post.

bieganski 5 hours ago

i wish there was an additional column in the table, that says "what problem does it solve". oh, and 'it's written in rust' does not count.

  • drob518 4 hours ago

    “It’s written in Rust”

    Actual LOL. Indeed. I was working for a large corporation at one point and a development team was explaining their product. I asked what its differentiators were versus our competitors. The team replied that ours was written in Go. #facepalm

    • demetris 4 hours ago

      The Rust rewrites can become tiresome (they have become a meme at this point), but there are really good tools there too.

      An example from my personal experience: I used to think that oxipng was just a faster optipng. I took a closer look recently and saw that it is more than that.

      See: https://op111.net/posts/2025/09/png-compression-oxipng-optip...

      • yjftsjthsd-h an hour ago

        If a new tool has actual performance or feature advantages, then that's the answer to "what problem does it solve", regardless of what language it's in.

    • IshKebab 4 hours ago

      That is a differentiator if your competitors are written in Python or Ruby or Bash or whatever. But yeah obviously for marketing to normal people you'd have to say "it's fast and reliable and easy to distribute" because they wouldn't know that these are properties of Go.

      • account42 4 hours ago

        You can write slow unmaintainable brittle garbage in any language though. So even if your competition is literally written in Bash or whatever you should still say what your implementation actually does better - and if it's performance, back it up with something that lets me know you have actually measured the impact on real world use cases and are not just assuming "we wrote it in $language therefore it must be fast".

        • IshKebab an hour ago

          > You can write slow unmaintainable brittle garbage in any language though.

          Sure. You can drive really slowly in a sports car. But if you're looking for travel options for a long-distance journey, are you going to pick the sports car or the bicycle?

          Also I have actually yet to find slow unmaintainable brittle garbage written in Go or Rust. I'm sure it's possible but it's vastly less likely.

      • drob518 4 hours ago

        No. The differentiator is whatever benefits such an implementation might deliver (e.g., performance, reliability, etc.). Customers don’t start whipping out checkbooks when you say, “Ours is written in Go.”

        • maeln 4 hours ago

          That is what the post you're responding to is saying

          • drob518 30 minutes ago

            I’m still responding to the first sentence. It’s not a differentiator, even to smart engineers who understand the programming language in question (e.g., Go or Rust). Whenever an engineer leads with the programming language, I know that it’s just their favorite. You can write great Python, Ruby, Go or Rust, or you can write crappy Python, Ruby, Go, or Rust. Yes, some languages make more sense for certain environments (probably don’t want to do kernel programming in Ruby, for instance), but the programming language is never the differentiator itself. It’s what you make happen with it.

  • oneeyedpigeon 5 hours ago

    Many of the entries do include this detail — e.g. "with syntax highlighting", "ncurses interface", and "more intuitive". I agree that "written in rust", "modern", and "better" aren't very useful!

    • account42 4 hours ago

      Some of this just makes me think that they are compared against the wrong tool though. E.g.

      > cat clone with syntax highlighting and git integration

      doesn't make any sense because cat is not really meant for viewing files. You should be comparing your tool with the more/less/most family of tools, some of which can already do syntax highlighting or even more complex transforms.

      • oneeyedpigeon 4 hours ago

        Yup, I made that same point in another comment. Out of interest, though, how do you get syntax highlighting from any of those pagers? None of them give it to me out of the box.

  • pasc1878 4 hours ago

    Also using a non GPL license does not count.

  • dragonelite 3 hours ago

    A lot of those tools are also usable on Windows, which is why I like them.

oslem 4 hours ago

I always enjoy these lists. I think most folks out there could probably successfully adopt at least one or two of these tools. For me, that’s ripgrep and jq. The former is a great drop-in replacement for grep and the latter solves a problem I needed solving. I’ll try out a few of the others on this list, too. lsd and dust both appeal to me.

I just enjoy seeing others incrementally improve on our collective tool chest. Even if the new tool isn’t of use to me, I appreciate the work that went into it. They’re wonderful tools in their own right. Often adding a few modern touches to make a great tool just a little bit better.

Thank you to those who have put in so much effort. You’re making the community objectively better.

  • arminiusreturns 3 hours ago

    I think many of us linux admins have such a list. Mine in particular is carefully crafted around GPL-izing my stack as much as possible. I really like the format of this ikrima.dev one though! The other stuff is great too, worth a peruse.

roger_ an hour ago

I kinda wish there was a modern *suite* of improved tools, developed by one team with consistent designs (parameters, colors, tables, etc.)

fergie 5 hours ago

I basically live in the terminal. However, every single one of these tools offers a solution to a problem that I don't have; aren't installed on my system; and mysteriously have many tens of thousands of github stars.

I genuinely don't know what is going on here.

  • maeln 4 hours ago

    > I basically live in the terminal. However, every single one of these tools offers a solution to a problem that I don't have; aren't installed on my system; and mysteriously have many tens of thousands of github stars.

    > I genuinely don't know what is going on here.

    I basically live in my music library. However, every single pop artist offers songs that I don't like, are not in my library, and mysteriously have many millions of albums sold.

    I genuinely don't know what is going on here.

    Joking aside, have you ever tried to use some of these tools? I used to not understand why people were using vim until I really tried.

    • fergie 4 hours ago

      > Joking aside, have you ever tried to use some of these tools

      No.

      > I used to not understand why people were using vim until I really tried.

      There's your problem. I respectfully suggest installing Emacs.

  • qustrolabe 33 minutes ago

    You never use fzf? What a tough life in the terminal then, huh. It's not as useful to run it directly, but pretty much any shell has a plugin for fzf support that lets you hit Ctrl+R to fuzzy search over bash_history (or fish_history or whatever) and Ctrl+T to fuzzy search files in the current directory.
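
    For bash, the setup is typically one line in .bashrc (the exact incantation depends on fzf version and distro packaging):

       # recent fzf can emit its own shell integration (Ctrl+R, Ctrl+T, Alt+C)
       eval "$(fzf --bash)"
       # older packages ship a script to source instead, e.g. on Debian/Ubuntu:
       # source /usr/share/doc/fzf/examples/key-bindings.bash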

  • esafak 35 minutes ago

    What do you do in the terminal all day that does not leave you with the desire to improve your toolset? Do you write all your own tools?

  • oneeyedpigeon 5 hours ago

    The core Unix toolset is so good that you can easily get by with it. Many of these tools are better, but still not necessary, and they certainly aren't widely available by default.

  • robenkleene 4 hours ago

    Out of curiosity, how would you recursively grep files ignoring (hidden files [e.g., `.git`]), only matching a certain file extension? (E.g., `rg -g '*.foo' bar`.)

    I use the command line a lot too and this is one of my most common commands, and I don't know of an elegant way to do it with the builtin Unix tools.

    (And I have basically the same question for finding files matching a regex or glob [ignoring the stuff I obviously don't want], e.g., `fd '.foo.*'`.)

    • MontyCarloHall 3 hours ago

      Depends on how big the directory is. If it only contains a few files, I'd just enumerate them all with `find`, filter the results with `grep`, and perform the actual `grep` for "bar" using `xargs`:

         find . -type f -name "*.foo" | grep -v '/\.' | xargs grep bar
      
      (This one I could do from muscle memory.)

      If traversing those hidden files/directories were expensive, I'd tell `find` itself to exclude them. This also lets me switch `xargs` for `find`'s own `-exec` functionality:

         find . -path '*/\.*' -prune -o -type f -name "*.foo" -exec grep bar {} +
      
      (I had to look that one up.)
      • robenkleene 3 hours ago

        Thanks, yeah, this is a good example of why I prefer the simpler interface of `rg` and `fd`. Those examples would actually be fine if this were something I only did once in a while (or in a script). But I search from the command line many times per day when I'm working, so I prefer a more streamlined interface.

        For the record, I think `git grep` is probably the best builtin solution to the problem I gave, but personally I don't know off-hand how to only search for files matching a glob and to use the current directory rather than the repository root with `git grep` (both of which are must haves for me). I'd also need to learn those same commands for different source control systems besides git (I use one other VCS regularly).

        • MontyCarloHall 2 hours ago

          >Those examples would actually be fine if this were something I only did once in a while (or in a script). But I search from the command line many times per day when I'm working, so I prefer a more streamlined interface.

          Makes sense. If I had to do this frequently, I'd add a function/alias encapsulating that `find` incantation to my .bashrc, which I keep in version control along with other configuration files in my home directory. That way, when moving to a new environment, I can just clone that repo into a fresh home directory and most of my customizations work out-of-the-box.

          • robenkleene an hour ago

            Yeah I do the same sometimes, at the risk of going too deep into personal preference. A couple of notes about that approach:

            1. I don't recommend using shell functions or aliases for this (e.g., `bashrc`) because then these scripts can't be called from other contexts, e.g., like Vim and Emacs builtin support for shell commands. This can easily be solved by creating scripts that can be called from anywhere (my personal collection of these scripts is here https://github.com/robenkleene/Dotfiles/tree/master/scripts). Personally, I only use Bash functions for things that have to do with Bash's runtime state (e.g., augmenting PATH is a common one).

            2. The more important part though, is I don't always want to search in `*.foo`, I want a flexible, well-designed, API that allows me to on-the-fly decide what to search.

            #2 is particularly important and drifts into the philosophy of tooling: a mistake I used to make was baking my current workflow into customizations like scripts. This is a bad idea because then the scripts stop being useful as your tasks change, and hopefully your tasks are growing in complexity over time. I.e., don't choose your tools based on your workflow today, otherwise you're building in limitations. Use powerful tools that will support you no matter what task you're performing, that scale practically infinitely. "The measure of a bookshelf is not what has been read, but what remains to be read."

    • HankStallone 3 hours ago

        find . -type f -name '*.foo' -not -path '*/.*' -print0 | xargs -0 grep bar
      • MontyCarloHall 2 hours ago

        The one issue with this approach is that it would still traverse all hidden folders, which could be expensive (e.g. in a git repo with an enormous revision history in `.git/`). `-not -path ...` just prevents entities from being printed, not being traversed. To actually prevent traversal, you need to use `-prune`.

    • fergie 4 hours ago

      grep -ri foo ./*

      Hits in hidden files is not really a pain point for me

      • robenkleene 4 hours ago

        Curious if that answers the "I genuinely don't know what is going on here" then? Not searching hidden files (or third-party dependencies, which `rg` also does automatically with its ignore parsing) isn't just a nice to have, it's mandatory for a number of tasks a software engineer might be performing on a code base?

      • hrimfaxi 3 hours ago

        That doesn't apply to the very specific case for which the parent asked a solution.

  • twic 3 hours ago

    How would you filter and transform a large JSON file, without jq?

    • eulers_secret 2 hours ago

      Many of us just don't use JSON in our day jobs, weird I know, but true.

      The only thing I use JQ for at work is parsing the copilot API response so I remember what the model names are - that's it! TBH, I could just skip it and read the json

  • account42 4 hours ago

    They tend to be popular with the "rewrite it in rust/go" crowd as far as I can tell. Or in other words, you are no longer part of the cool kids.

    • anthk 2 hours ago

      I've seen an online radio player in Go which was unusably slow on my Atom N270 due to badly coded ANSI audio visualization FX using floating-point math. Meanwhile, with Cava or another visualizer and mpd+mpc I could do the same using 200x fewer resources.

seplox an hour ago

I'd like to read this list, but the color scheme is among the least accessible that I've ever come across. Dark, greyish-blue text with dark, bluish-grey highlighting over a dark grey background. Wow.

If any fledgling designers are here, then take note and add this to your list of examples to avoid.

Otek 5 hours ago

This is a 2023 article. As with most “modern tools”, half of them probably already have some newer, shinier and more trendy replacements

  • oneeyedpigeon 5 hours ago

    There's a lot of tools here. Half still leaves plenty of value.

  • antegamisou 3 hours ago

    I find the opposite to be true. Most of these are really just reinventing the wheel of foundational GNU tools that are really powerful provided one has spent some time on them.

    • Johanx64 2 hours ago

      It's like people don't even know why people use or want these "modern" tools. It's called "sane defaults", and improved UX.

      Those "foundational GNU tools" just suck, sure, people are familiar with them and they are everywhere, but they just plain suck.

      For many common operations you'd want to do by default with grep/find and so on, you have to type mountains of random gibberish to get it done. And that random gibberish isn't something that rolls off your tongue either, thus at minimum you'd define a truckload of aliases.

      OR you can use a tool(s) that has marginally "sane defaults" and marginally sane UX out of the box.

      It really isn't that complicated. This has nothing to do with "rust".

      • esafak 30 minutes ago

        Some people just like torturing themselves with spells like

           perl -ne 'map{$h{lc$_}++}/(\w+)/g;END{map{print"$h{$_} $_\n"}sort{$h{$b}<=>$h{$a}}keys%h}'
_ZeD_ 5 hours ago

the second item is

exa modern replacement for ls/tree, not maintained

"not maintained" doesn't smell "modern" to me...

  • JohnKemeny 5 hours ago

    That's the Lindy effect: old tools like ls last because they've already lasted, while modern ones often don’t stick around long enough to.

        the future life expectancy is proportional to its current age
    
    https://en.wikipedia.org/wiki/Lindy_effect
  • Hendrikto 5 hours ago

    Literally the next line lists its replacement eza.

  • arccy 5 hours ago

    like good open source, it's now forked by a community instead of having only a single maintainer

    eza: https://github.com/eza-community/eza

    • abenga 4 hours ago

      The README has an ad at the top.

      Yeeeah, nope.

      • CaptainOfCoit 4 hours ago

        For a cloud-based terminal emulator that heavily focuses on AI, no less. And they have the stomach to call it "for developers".

      • selectnull 4 hours ago

        The tool itself has no ads. What's wrong with a README having an ad?

        • _ZeD_ 3 hours ago

          everything.

          • esafak 27 minutes ago

            Why don't you sponsor the project so they can take it down?

          • selectnull 3 hours ago

            So much talk about paying open source developers, and when someone actually does something about it and tries to make some money, it's again not good enough.

            Damned if you do and damned if you don't.

  • throw_a_grenade 5 hours ago

    On the contrary, that's exactly what “modern” sounds like. I wonder when all those tools will go unmaintained. Coreutils, with all their problems, have been maintained since before the authors of many of the listed tools were born.

ed_blackburn 4 hours ago

I’m on a Mac, and some of the default tooling feels dated: GNU coreutils and friends are often stuck around mid-2000s versions. Rather than replace or fight against the system tools, I supplement them with a few extras. Honestly, most are marginal upgrades over what macOS ships with, except for fzf, which is a huge productivity boost. Fuzzy-finding through my shell history or using interactive autocompletion makes a noticeable difference day to day.

  • MontyCarloHall 4 hours ago

    >some of the default tooling feels dated: GNU coreutils and friends are often stuck around mid-2000s versions

    That’s because they’re not GNU coreutils, they’re BSD coreutils, which are spartan by design. (FWIW, this is one of my theories for why Linux/GNU dominated BSD: the default user experience of the former is just so much richer, even though the system architecture of the latter is arguably superior.)

maeln 4 hours ago

Every time such a list is posted, it tends to generate a lot of debate, but I do think there are at least 2 tools that are really a good addition to any terminal:

`fd`: first, I find that the argument semantics are way better than `find`'s, but that is more a bonus than a real killer feature. It being much, much faster than `find` on most setups, I would consider a valuable feature. But the killer feature for me is the `-x` argument. It allows calling another command on each individual search result, which `find` can also do with `xargs` and co. But `fd` provides a very nice placeholder syntax[0], which removes the need to mess with `basename` and co. to parse the filename and make a new one, and it executes in parallel. For example, it makes converting a batch of images a fast and readable one-liner: `fd -e jpg -x cjxl {} {.}.jxl`

`rg` a.k.a. `ripgrep`: Honestly, it is just about the speed. It is so much faster than `grep` when searching through a directory that it opens up a lot of possibilities. Like, searching for `isLoading` on my frontend (~3444 files) is instant with rg (less than 0.10s) but takes a few minutes with grep.

But there is one other thing that I really like about `ripgrep` and that I think should be a feature of any "modern" CLI tool: it can format its output as JSON. Not that I am a big fan of JSON, but at least it is a well-defined exchange format. "Classic" CLI tools just output a "human-readable" format which might happen to be "machine-readable" if you mess with `awk` and `sed` enough, but that makes piping and scripting just that much more annoying and error prone. Being able to output JSON, `jq` it and feed it to the next tool is so much better and feels like the missing link of the terminal.
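
A small sketch of what that enables (the jq filter is just one example of picking fields out of rg's match records):

   # list only the files that contain "isLoading", via rg's JSON output
   rg --json isLoading | jq -r 'select(.type == "match") | .data.path.text' | sort -u

(`rg -l` would give you that list directly; the point is that the structured output survives whatever piping you come up with.)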

The big advantage of the CLI is that it is composable and scriptable by default. But it is missing a common exchange format to pass data, and this is what you have to wrangle with a lot of the time when scripting. Having JSON, never mind all the gripes I have with the format, really joins everything together.

Also, an honorable mention for `zellij`, which I find to be a much saner (UX-wise) alternative to `tmux`, and the `helix` text editor, which for me is neovim but with, again, a better UX (especially for beginners) and a lot more batteries-included features while remaining faster (in my experience) than nvim with the matching plugins for feature parity.

EDIT: I would also add difftastic ( https://github.com/Wilfred/difftastic ), which is a syntax-aware diff tool. I don't use it much, but it does make some diffs so much easier to read.
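
Per difftastic's docs, the git integration looks roughly like this:

   GIT_EXTERNAL_DIFF=difft git diff          # one-off, syntax-aware diff
   git config --global diff.external difft   # or make it the default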

[0] https://github.com/sharkdp/fd?tab=readme-ov-file#placeholder...

  • esafak 13 minutes ago

    Great tools like dasel are format agnostic on input and output.

  • maleldil an hour ago

    > Like, searching for `isLoading` on my frontend (~3444 files) is instant with rg (less than 0.10s) but takes a few minutes with grep.

    grep will try to search inside .git. If your project is Javascript, it might be searching inside node_modules, or .venv if Python. ripgrep ignores hidden files, .gitignore and .ignore. You could try using `git grep` instead. ripgrep will still be faster, but the difference won't be as dramatic.

    • maeln 23 minutes ago

      No, I specifically made sure to run it in a dir without a .git, node_modules, etc. It is just that slow

  • rkomorn 4 hours ago

    I briefly resisted the notion that fd and ripgrep were useful when a friend suggested them.

    Then I tried them and it was such a night and day performance difference that they're now immediate installs on any new system I use.

  • Izkata 4 hours ago

    > But the killer feature for me is the `-x` argument. It allows calling another command on each individual search result, which `find` can also do with `xargs` and co. But `fd` provides a very nice placeholder syntax[0], which removes the need to mess with `basename` and co. to parse the filename and make a new one, and it executes in parallel. For example, it makes converting a batch of images a fast and readable one-liner: `fd -e jpg -x cjxl {} {.}.jxl`

    That was inherited from find, it has "-exec". Even uses the same placeholder, {}, though I'm not sure about {.}

    • maeln 4 hours ago

      `find` only supports `{}`; it does not support `{/}`, `{//}`, `{.}` etc., which is why you often need to do some parsing magic to replicate basic things such as "the full path without the extension", "only the filename without the extension", etc.

      • 149765 2 hours ago

        I think GNU parallel has similar placeholders, but I do prefer to just use `fd`.

        • maeln 22 minutes ago

          I think it does, and tbf, `fd` is basically `find` + `parallel`, but I find it nice that it is just one tool and I don't need GNU parallel :)

tomxor an hour ago

btop is a worthy and missing contender.

It looks quite fancy but I actually like it more for its functionality, particularly its tree view for navigating the process list. I'm not a big fan of full multicolor in these kinds of tools, so I appreciate how easy it is to flip to a greyscale mode from the built-in colour schemes (even from the TUI settings menu).

demetris 4 hours ago

Many are available on Windows too.

I know I have hyperfine, fd, and eza on my Windows 11, and maybe some more I cannot remember right now.

They are super easy to install too, using winget.

oniony 2 hours ago

qq should be on this list. It's like jq but works with multiple file formats, including JSON, YAML, XML, &c. and has a really cool interactive TUI mode.

https://github.com/JFryy/qq

  • mcswell 27 minutes ago

    THAT looks like something worthwhile; I'll be looking into it.

    I was going to top-post that the Unix/Linux command line tools were designed back in the day when data was pretty much line-oriented, e.g. one database record per line. Since then XML, and more recently JSON, have been invented, and tools like grep and sed just don't work for those formats. But you ninja'd me, sort of.

Scotrix 5 hours ago

would be good to have an indicator of whether it's available with your distro by default, or what package you'll need to install it, since all tools are only as useful as they are available…

PaulKeeble 5 hours ago

duf is pretty good for drive space, has some nice colours and graphs. But it's also not as useful for feeding into other tools.

btop has been pretty good for watching a machine to get an overview of everything going on, the latest version has cleaned up how the lazy CPU process listing works.

zoxide is good for cding around the system to the same places. It remembers directories so you avoid typing full paths.
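
A rough sketch of the usual zoxide setup (the shell and directory here are just examples):

   eval "$(zoxide init bash)"      # hook it into the shell; zsh/fish variants exist
   cd ~/work/projects/frontend     # visit a directory once so zoxide records it
   z frontend                      # later: jump straight back by keyword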

bandrami 2 hours ago

Isn't baobab almost 30 years old?

anthk 5 hours ago

Modern doesn't always mean better. A better replacement for mplayer was mpv, and in some cases mplayer was faster than mpv (think about legacy machines).

   - bat: it's a useless cat. Cat concatenates files. ANSI colour breaks that.

   - alias ls='ls -Fh', problem solved. Now you have * for executables, / for directories and so on.

   - ncdu: it's fine, perfect for what it does

   - iomenu: it's much faster than fzf and it almost works the same

   - jq: it's fine, it's a good example of a new Unix tool

   - micro: it's far slower than even vim

   - instead of nnn, sff https://github.com/sylphenix/sff with soap(1) (an xdg-open replacement) from https://2f30.org creates a mega fast environment. Add MuPDF and sxiv, and nnn and friends will look really slow compared to these. Yes, you need to set config.h for both sff and soap, but they will run much, much faster than any Rust tool on legacy machines.

  • oneeyedpigeon 5 hours ago

    > bat: it's a useless cat. Cat concatenates files. ANSI colour breaks that.

    It's useless as a cat replacement, I agree. The article really shouldn't call it that, although the program's GitHub page does self-describe it as "a cat clone". It's more of a syntax highlighter combined with a git diff viewer (I do have an issue with that; it should be two separate programs, not one).

    • mcswell 22 minutes ago

      I have two uses for 'cat':

      1) Piping the contents of some file into a process.

      2) Showing the contents of some short file.

      Now (1) is better done with redirection (< or >). The only time I use cat is when I'm testing some pipeline where I only want a few lines of input, so I use 'head' or something similar. Once I have the pipeline working right, I edit the command line to replace 'head' with 'cat'. Easier than re-arranging whole words.

      And it's rare that (2) is the right solution--too often I find that the file was longer than I thought, and I have to use 'more' (actually 'less').

      So a replacement for 'cat' that does color coding sounds pretty much useless to me.

      • oneeyedpigeon 19 minutes ago

        Right, don't think of it as a cat replacement, think of it as a coloriser. If you never want a coloriser, fair enough, ignore bat! But I find it quite nice when I'm reading through source code or Markdown.

  • stryan an hour ago

    > bat: it's a useless cat. Cat concatenates files. ANSI colour breaks that.

    From the README:

    >Whenever bat detects a non-interactive terminal (i.e. when you pipe into another process or into a file), bat will act as a drop-in replacement for cat and fall back to printing the plain file contents

    bat works as normal cat for normal uses of cat and a better cat for all those "useless cat" situations we find ourselves in.

  • lucasoshiro 5 hours ago

    > bat: it's a useless cat

    I can't see bat as a "useless cat" or a replacement for cat, except for reading source code in the terminal. It's more like a less with syntax highlighting, or a read-only vim.

    • ed_blackburn 4 hours ago

      I agree with this. cat is great for "cating"; bat is great for throwing shit on the terminal in a fashion that makes it semantically easier to reason about. Two different use cases.

      • anthk 3 hours ago

        There's ccze which colorizes stuff without creating a supposed cat(1) replacement.

    • drob518 4 hours ago

      Part of the problem is “naming/marketing.” Bat compares ITSELF to cat, not to more/less. IMO, this confuses the issue.

      • ziml77 3 hours ago

        I think that's because it's super common to use cat to quickly view a file. It has the nice property of using your terminal's scrollback rather than putting you into a pager application. For that use-case it is an alternative to cat.

        That said, I've never really cared much about missing syntax highlighting for cases where I'm viewing file contents with cat. So the tool doesn't really serve a purpose for me and instead I'll continue to load up vim/neovim if I want to view a file with syntax highlighting.

      • listeria an hour ago

        Maybe it should be called "lest"? As in a less/most replacement written in rust. Although it does depart from the theme of more/less/most.

Symmetry 3 hours ago

tldr is an incredible tool and 95% of the time I'll quickly find what I'm looking for there instead of having to search through the man page.