I’ve been trying nushell and words fail me. It’s like it was made for actual humans to use! 🤯 🤯 🤯

It even repeats the column headers at the end of the table if the output takes up more than one screen…

Trying to think of how to do the same thing with awk/grep/sort/whatever is giving me a headache. Actually just thinking about awk is giving me a headache. I think I might be allergic.

I’m really curious, what’s your favorite shell? Have you tried other shells than your distro’s default one? Are you an awk wizard or do you run away very fast whenever it’s mentioned?

  • apt_install_coffee@lemmy.ml · 22 hours ago

    I used nushell for a good 6 months. It was nice having structured data, but the syntax difference from bash, which I use for my day job, was just too jarring to stick with.

    Fish was (for me) the right balance of nice syntactic sugar and being able to reasonably expect a bash idiom will work.

  • priapus@piefed.social · 1 day ago

    I love Nushell, it’s so much more pleasant for writing scripts IMO. I know some people say they’d just use Python if they need more than what a POSIX shell offers, but I think Nushell is a perfect option in between.

    With a Nushell script you get types, structured data, and useful commands for working with them, while still being able to easily execute and pipe external commands. I’ve only ever had two very minor gripes with Nushell: the inability to detach a process, and the lack of a -l flag for cp. Now that uutils supports the -l flag, Nushell support is a WIP, and I realized systemd-run is a better option than just detaching processes when SSH’d into a server.

    I know another criticism is that it doesn’t work well with external cli tools, but I’ve honestly never had an issue with any. A ton of CLI tools support JSON output, which can be piped into from json to make working with it in Nushell very easy. Simpler tools often just output a basic table, which can be piped into detect columns to automatically turn it into a Nushell table. Sometimes strange formatting will make this a little weird, but fixing that formatting with some string manipulation (which Nushell also makes very easy) is usually still easier than trying to parse it in Bash.

  • esa@discuss.tchncs.de · 2 days ago

    I’ve been using fish (with starship for prompt) for like a year I think, after having had a self-built zsh setup for … I don’t know how long.

    I’m capable of using awk but in a very simple way; I generally prefer being able to use jq. IMO both awk and perl are sort of remnants of the age before JSON became the standard text-based structured data format. We used to have to write a lot of dinky little regex-based parsers in Perl to extract data. These days we likely get JSON and can operate on actual data structures.
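    A toy illustration of that shift (the sample data here is made up): the same "names of people over 30" query, first as awk field surgery on whitespace-separated text, then as jq operating on an actual data structure:

    ```shell
    # Before JSON: pick fields by position from whitespace-separated text.
    printf 'alice 34\nbob 27\n' | awk '$2 > 30 { print $1 }'
    # -> alice

    # With JSON: select by named fields from structured data.
    printf '[{"name":"alice","age":34},{"name":"bob","age":27}]' |
      jq -r '.[] | select(.age > 30) | .name'
    # -> alice
    ```

    The awk version silently breaks the moment a name contains a space or a column is added; the jq version keys on field names, so it doesn't care.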

    I tried nu very briefly but I’m just too used to POSIX-ish shells to bother switching to another model. For scripting I’ll use #!/bin/bash with set -euo pipefail, but very quickly switch to Python if it looks like it’s going to have any sort of serious logic.
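    For readers unfamiliar with that header, a minimal sketch of what it buys you (note the flag order is `set -euo pipefail`, since `-o` takes `pipefail` as its argument; the sample pipeline is mine):

    ```shell
    #!/bin/bash
    # -e: exit on any command failure
    # -u: error on references to unset variables
    # -o pipefail: a pipeline fails if ANY stage fails, not just the last
    set -euo pipefail

    # Without pipefail this pipeline would report success, because only
    # the last command's (true's) status would count:
    false | true && echo "pipeline passed" || echo "pipeline failed"
    # -> pipeline failed
    ```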

    My impression is that there are likely more of us who’d like a less wibbly-wobbly, better shell language for scripting purposes, but that efforts to design such a language very quickly go in the direction of nu and oil and whatnot.

    • Overspark@piefed.social · 1 day ago

      nu’s commands also work on JSON, so you don’t really need jq (or xq or yq) any more. It offers a unified set of commands that’ll work on almost any kind of structured data.

    • phantomwise@lemmy.ml (OP) · 2 days ago

      That’s interesting, I hadn’t thought about the JSON angle! Do you mean that you can actually use jq on regular command outputs like ls -l?

      Oil is an interesting project and the backward compatibility with bash is very neat! I don’t see myself using it though; since its syntax is intentionally very close to bash, I’d probably get oil syntax and bash syntax all mixed up in my head and forget which is which… So I went with nushell because it doesn’t look anything like bash. If you know python, what do you think about xonsh?

      • esa@discuss.tchncs.de · 1 day ago

        That’s interesting, I hadn’t thought about the JSON angle! Do you mean that you can actually use jq on regular command outputs like ls -l?

        No, you need to be using a tool which has JSON output as an option. These are becoming more common, but I think still rare among the GNU coreutils. ls output especially is unparseable; there are tons of resources telling people not to parse it, because doing so is pretty much guaranteed to break.

        • elmicha@feddit.org · 1 day ago

          There’s jc (a CLI tool and Python library that converts the output of popular command-line tools, file types, and common strings to JSON, YAML, or dictionaries).

          • cyrl@lemmy.world · 1 day ago

            You’ve opened a rabbit hole I know I’m just going to fall down… thanks netizen!

      • brianary@lemmy.zip · 21 hours ago

        What awful syntax?

        Ffs bash uses echo "${filename%.*}" and substring=${string:0:5} and lower="${var,,}" and title="${var^}" &c. It doesn’t use $ for assignment, only in expressions.
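        For anyone reading along, here’s what those expansions actually do, run on sample values of my own (the case-changing ones need bash 4+):

        ```shell
        filename="report.final.txt"
        echo "${filename%.*}"   # strip shortest trailing .suffix -> report.final

        string="hello world"
        echo "${string:0:5}"    # substring from offset 0, length 5 -> hello

        var="MiXeD Case"
        echo "${var,,}"         # lowercase every character -> mixed case

        var="mixed case"
        echo "${var^}"          # uppercase the first character -> Mixed case
        ```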

  • communism@lemmy.ml · 1 day ago

    Nushell looks cool but I prefer to stick with the POSIXes so that I know my scripts will always work and the syntax always does what I expect it to. I use zsh as a daily driver, and put up with the various bashes, ashes, and dashes that come pre-installed on systems I won’t be using loads (e.g. temporary VMs).

    • nimpnin@sopuli.xyz · 1 day ago

      Always confuses me when people say this. You can use multiple different shells / scripting languages, just as you can use multiple programming languages.

      • Ferk@lemmy.ml · 16 hours ago

        If you want your scripts to “always work” you’ll need to go with the most common/standard language, because the environments you work on might not be able to use all of those languages.

        • nimpnin@sopuli.xyz · 15 hours ago

          I mean if all your scripts are fully general purpose. That just seems really weird to me. I don’t need to run my yt-dlp scripts on the computational clusters I work on.

          Moreover, none of this applies to the interactive use of the shell.

          • Ferk@lemmy.ml · 14 hours ago (edited)

            It’s not only clusters… I have my shell configuration even on my Android phone, which I often connect to by ssh. And also on my Kobo, and on my small portable console running Knulli.

            In my case, my shell configuration is structured in some folders where I can add config specific to each location while still sharing the same base.

            Maybe not everything is general, but the things that are general and useful become ingrained in a way that it becomes annoying when you don’t have them. Like specific shortcuts for backwards history search, or even some readline movement shortcuts that apparently are not standard everywhere… or jumping to the most ‘frecent’ directory based on a pattern, like z does.

            If you don’t mind those scripts not always working, and you have the time to maintain two separate sets of configuration and initialization scripts, aliases, etc., then it’s fine.

            • nimpnin@sopuli.xyz · 14 hours ago (edited)

              those scripts not always work

              This feels like ragebait. I have multiple devices, use fish whenever that can be installed and zsh/bash when not, and have none of these issues.

              EDIT:

              or some methods to jump to most recent directory like z.

              Manually downloading the same shell scripts on every machine is just doing what the package manager is supposed to do for you. I did this once to get some rust utils like eza working without sudo. It’s terrible.

              • Ferk@lemmy.ml · 13 hours ago (edited)

                Manually downloading the same shell scripts on every machine is just doing what the package manager is supposed to do for you

                If you have a package manager available, and what you need is available there, sure. My Synology NAS, my Knulli, my cygwin installs in Windows, my Android device… they are not so easy to have custom shells in (does fish even have a Windows port?).

                I rarely have to manually copy, in many of those environments you can at least git clone, or use existing syncing mechanisms. In the ones that don’t even have that… well, at least copying the config works, I just scp it, not a big deal, it’s not like I have to do that so often… I could even script it to make it automatic if it ever became a problem.

                Also, note that I do not just use things like z straight away… my custom configuration automatically calls z as a fallback when I mistype a directory with cd (or when I intentionally use cd while in a far/wrong location just so I can reach faster/easier)… I have a lot of things customized, the package install would only be the first step.

                • nimpnin@sopuli.xyz · 13 hours ago

                  So you’re willing to do a lot of manual package managing, in general put a lot of work into optimizing your workflow, adjusting to different package availability, adjusting to different operating systems…

                  …but not writing two different configs?

                  That is your prerogative but you’re not convincing me. Though I don’t think I’ll be convincing you either.

                  I have separate configs/aliases/etc for most of my machines just because, well, they are different machines with different hardware, software, data, operating systems and purposes. Even for those (most) that I can easily install fish on.

      • communism@lemmy.ml · 1 day ago

        I know that. I just don’t have a use case for alternative shells. Zsh works fine for me and I know how it works. I don’t have problems that need fixing, so I don’t need to take the time to learn a new, incompatible shell.

      • elmicha@feddit.org · 1 day ago

        Some people work on machines where they are not allowed to install anything.

      • esa@discuss.tchncs.de · 1 day ago

        Yeah, there should be a clear separation between scripts, which should have a shebang, and interactive use.

        If a script starts acting oddly after someone does a chsh, then that script is broken. Hopefully people don’t actually distribute broken script files that have some implicit dependency on an unspecified interpreter in this day and age.
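        The point above in practice (file path here is just an example): a script with an explicit shebang pins its own interpreter, so a chsh can’t change how it runs.

        ```shell
        # Write a tiny script whose shebang pins the interpreter to /bin/sh...
        cat > /tmp/hello.sh <<'EOF'
        #!/bin/sh
        echo "running under: $0"
        EOF
        chmod +x /tmp/hello.sh

        # ...then no matter which interactive shell invokes it, the kernel
        # hands it to /bin/sh, not to whatever shell the caller happens to use:
        /tmp/hello.sh
        # -> running under: /tmp/hello.sh
        ```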

      • communism@lemmy.ml · 1 day ago

        They have #!/bin/sh shebangs. /bin/sh is a symlink, in my case to zsh. I like using one language.

    • phantomwise@lemmy.ml (OP) · 1 day ago

      I don’t really mind having a non-POSIX shell since it doesn’t prevent bash scripts from working, but I get that if you want portability bash is still best since it’ll work mostly anywhere.

      • 4am@lemmy.zip · 1 day ago

        If I can shebang nushell (assuming all the builtins from bash or even sh work) and pass a flag to remove all the fancy UI-for-humans formatting so that piped commands in the scripts work, then I think this is incredible.

        Yeah, having this installed alongside other more “standard” shells is fine I guess, but it looks like maybe it has some neat functionality that is more difficult in other shells? I’d need to read up on it more, but having a non-interactive mode for machines to read more easily would be a huge plus for it overall. I suppose that depends on what it offers/what it’s trying to accomplish.

  • Obin@feddit.org · 1 day ago

    I’m really curious, what’s your favorite shell?

    Emacs eshell+eat

    It essentially reverses the terminal/shell relationship. Here, it’s the shell that starts a terminal session for every command. Eshell is also tightly integrated with Emacs and has access to all the extended functionality. You can use Lisp in one-liners, you can pipe output directly to an emacs buffer, you can write custom commands as lisp functions, full shortcut customization not limited to terminal keys, history search via the completion framework (i.e. consult-history), easy prompt customization, etc.

    There’s also Tramp, which lets you transparently cd into remote hosts via ssh, docker containers, SMB/NFS-shares, archive files, and work with them as if they were normal directories (obviously with limited functionality in some cases, like archives).

    And probably a lot of stuff I’m missing right now.

  • Ferk@lemmy.ml · 2 days ago

    I prefer getting comfortable with bash, because it’s everywhere and I need it for work anyway (no fancy shells in remote VMs). But you can customize bash a lot to give more colored feedback or even customize the shortcuts with readline. Another one is pwsh (powershell) because it’s by default in Windows machines that (sadly) I sometimes have to use as VMs too. But you can also install it in linux since it’s now open source.

    But if I wanted to experiment personally I’d go for xonsh, it’s a python-based one. So you have all the tools and power of python with terminal convenience.

    • phantomwise@lemmy.ml (OP) · 1 day ago

      Yeah if you need to work on machines with bash it makes sense to stick with it. Sorry you have to work on Windows… how is powershell compared to bash?

      I don’t know python but xonsh seems really cool, especially since, like nushell, it works on both linux and windows, so you don’t have to bother about OS-specific syntax

      • Ferk@lemmy.ml · 1 day ago (edited)

        powershell, in concept, is pretty powerful since it’s integrated with C# and allows dealing with complex data structures as objects too, similar to nushell (though it does not “pretty-print” everything the way nushell does, at least by default).

        But in practice, since I don’t use it as much I never really get used to it and I’m constantly checking how to do things… I’m too used to posix tools and I often end up bringing over a portable subset of msys2, cygwin or similar whenever possible, just so I can use grep, sed, sort, uniq, curl, etc in Windows ^^U …however, for scripts where you have to deal with structured data it’s superior since it has builtin methods for that.

    • Overspark@piefed.social · 1 day ago

      Yeah, it has. I think they started out as loving the concepts of PowerShell but hating the implementation, combined with the fact that PowerShell is clearly a Windows-first shell and doesn’t work so well on other OSes (it surprised me a lot to find out that PowerShell even has support for linux).

      nu tries to implement these concepts in a way that’s more universal and can work equally well on Linux, macOS or Windows.

      • ReluctantMuskrat@lemmy.world · 1 day ago

        Powershell works really well on other OSes now. I use it on macOS and Linux daily. I might loathe MS but Powershell is a fantastic shell, and after working with an object-oriented shell I hate going back to anything else.

      • sunbeam60@lemmy.ml · 15 hours ago (edited)

        That was the foundational concept in powershell: everything is an object. They then went and ruined it with insane syntax and a somewhat logical but in practice entirely impractical verb-noun command structure.

        Nushell is powershell for humans. And helps that it runs across all systems. It’s one of the first things I install.

        • Ephera@lemmy.ml · 1 day ago

          somewhat logical, but entirely in practice verb-noun command structure.

          That’s supposed to be “impractical”, not “in practice”, for others reading along.

          For example, the “proper” command to list a directory is: Get-ChildItem
          The “proper” command to fetch a webpage is: Invoke-WebRequest https://example.com/

          In these particular cases, they do have aliases defined, so you can use ls, dir and curl instead, but …yeah, that’s still generally what the command names are like.

          It’s in places even more verbose than C#, which is one of the most verbose programming languages out there. I genuinely feel like this kind of defeats the point of having a scripting language in the first place, when it isn’t succinct.
          Like, you’re hardly going to use it interactively, because it is so verbose, so you won’t know the commands very well. Which means, if you go to write a script with Powershell, you’ll need to look up how to do everything just as much as with a full-fledged programming language. And I do typically prefer the better tooling of a full-fledged programming language…

  • bastion@feddit.nl · 1 day ago

    I like nushell, but I love xonsh. Xonsh is the bastard love child of Python and Bash.

    it can be thought of as:

    • try this statement in Python
    • if there’s an exception, try it in bash.

    Now, that’s not a very accurate description, because the reality is more nuanced, but it allows for things like:

    for file in !(find | grep -i '[.]mp3$'):
        file = Path(file.strip())
        if file != Path('.') and file != file.with_suffix('.mp3'):
            mv @(file) @(file.with_suffix('.mp3'))
    

    Now, there are things in there I wouldn’t bother with normally - like, rather than using mv, I’d just use file.rename(), but the snippet shows a couple of the tools for interaction between xonsh and sh.

    • !(foo) - if writing python, execute foo, and return lines
    • @(foo) - if writing sh, substitute with the value of the foo variable.

    But either a line is treated in a Pythony way or in a shelly way - and if a line is shelly, you can reference Python variables or expressions via @(), and if it’s Pythony, you can execute shell code with !() or $(), returning the lines or the exact value, respectively.

    Granted, I love python and like shell well enough, and chimeras are my jam, so go figure.

    • priapus@piefed.social · 1 day ago

      Xonsh is also a really cool option. If I used Python more regularly and was more comfortable using it without having to look stuff up, I’d probably use it over Nushell.

      • bastion@feddit.nl · 1 day ago

        It’s a superset of python, so valid python should run fine. Imports into your shell are doable, too – for example, I import path.Path in my xonshrc, so it’s always available when I hit the shell. I don’t often have to use Path, because regular shell commands are often more straightforward. But when I do, it’s nice to have it already loaded. Granted, that could get kooky, depending on what you import and execute.

        You can associate/shebang Xonsh with .xsh files, or run “xonsh foo.xsh” - and that works like “bash foo.sh” would, except using xonsh syntax, of course.

        It’s not Bash compatible - copypasta of scripts may not work out. But it’s a good shell with some typical shell semantics.

        there are some great plugins, too - like autovox, which allows you to create python venvs associated with specific subfolders. so, cd myproject does the equivalent of cd myproject; . path/to/venv/bin/activate.

        overall, there definitely is some jank, but it’s a great tool and I love it.

        • MonkCanatella@sh.itjust.works · 7 hours ago

          Hm. That sounds delightful. I do think once your script goes beyond one-liner complexity, python is a logical next step.

          Does it provide any useful stuff to Python itself? Would I like, derive any benefit to writing a script in xonsh over pure python?

  • Cat_Daddy [any, any]@hexbear.net · 1 day ago

    I’ve had nushell as my daily driver for a couple years now and I love it. “Made for actual humans to use” is exactly the description I’d give.

  • h4x0r@lemmy.dbzer0.com · 22 hours ago

    Formatting doesn’t like my input for some reason so I am just going to shorten it to ls -ltc | grep '>'.

  • DasFaultier@sh.itjust.works · 2 days ago

    (…) 'cause it was quarter past eleven

    on a Saturday in 1999

    🎶🎶

    To answer your questions, I work on the Bash, because it’s what’s largely used at work and I don’t have the nerve to constantly make the switch in my head. I have tried nushell for a few minutes a few months ago, and I think it might actually be great as a human interface, but maybe not so much for scripting, idk.

    • Overspark@piefed.social · 1 day ago (edited)

      It’s arguably better as a scripting language than as an interactive shell. There are a lot of shell scripts out there that also dabble in light data processing, and it’s not the easiest thing to achieve well or without corner cases. So nu scripts are great if all you need is shell scripts with some data processing.

      nu as an interactive shell is great for the use cases it shines at (like OP’s example), but a bit too non-POSIXy for a lot of people, especially since it’s not (yet) as well polished as something like fish is for example.

      Edit to add that nu’s main drawback for scripting currently is that the language isn’t entirely stable yet, so you better be prepared to change your scripts as required to keep up with newer nu versions (they’re at 0.107 for a reason).

    • Ŝan@piefed.zip · 2 days ago

      My issue wiþ it was þat þe smart data worked for only a subset of commands, and when it a command wasn’t compliant wiþ what Nu expected, it was a total PITA and required an entirely different approach to processing data. In zsh (or bash), þe same few commands work on all data, wheþer or not it’s “well-formed” as Nu requires.

      Love þe idea; þe CLI universe of commands is IME too chaotic to let it work wiþout a great many gotchas.

      • TrickDacy@lemmy.world · 1 day ago

        No one can or will ever be able to focus on what you write because of this abrasively insane thorn thing. Maybe find a better way of getting attention?

      • otp@sh.itjust.works · 1 day ago

        Love þe idea

        Wouldn’t that be a different character because it’s a voices th? Usually that character represents a voiceless th.

        • Ŝan@piefed.zip · 1 day ago

          In Icelandic, yes. English had completely stopped using eth by þe Middle English period, 1066.

          • Ferk@lemmy.ml · 1 day ago (edited)

            Didn’t they also stop using the þ in Modern English?

            Why use þ (Þ, thorn) but not ð (Ð, eth)? …and æ (Æ, ash) …might as well go all the way if you want to type like that.

      • Ferk@lemmy.ml · 1 day ago

        I agree completely with that sentiment; I had the same problem. The output of most commands was interpreted in a way that was not compatible with the way Nu structures data, and yet it still rendered as if it were a table with a single entry… it was a bit annoying.