• jkercher@programming.dev · 4 points · 3 hours ago

    Meh. I had a bash job for 6 years. I couldn’t forget it if I wanted to. I imagine most people don’t use it enough for it to stick. You get good enough at it, and there’s no need to reach for Python.

  • 6mementomorib@lemmy.blahaj.zone · 6 points · 5 hours ago

    I used PowerShell, and even after trying every other shell, as a die-hard Linux user I’ve considered going back to PowerShell, cause damn, man.

    • ronflex@lemmy.world · 2 points · 3 hours ago

      I am a huge fan of using PowerShell for scripting on Linux. I use it a ton on Windows already, and it allows me to write damn near cross-platform scripts with no extra effort. I still usually use a Bash or Fish shell, but for scripting I love being able to utilize PowerShell.

    • JackbyDev@programming.dev · 3 points · 4 hours ago

      Everything is text! And different programs output in different styles. And certain programs can only read certain styles. And certain programs can only convert from some into others. And don’t get me started on IFS.
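
      Not their exact example, but a quick bash sketch of what that means in practice: the same text gets treated differently depending on quoting and the current IFS.

          files="a.txt b c.txt"     # one string that happens to contain spaces
          printf '%s\n' $files      # unquoted: word-split into a.txt, b, c.txt
          printf '%s\n' "$files"    # quoted: passed through as a single word
          IFS=','                   # change the field separator
          items="red,green,blue"
          printf '%s\n' $items      # now the splitting happens on commas instead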

  • Pixelbeard@lemmy.ca · 1 point · 5 hours ago

    I understand this so much! I’m replying in French for my first reply on Lemmy just to see how it gets handled!

    • Pixelbeard@lemmy.ca · 2 points · 5 hours ago

      I understand this so much! Answering in French for my first Lemmy reply just to see how it’s handled.

      Realizing now that language selection is mainly for people filtering. It’d be cool if it auto-translated for people who need it.

  • Victor@lemmy.world · 28 points · 12 hours ago

    Ever since I switched to Fish Shell, I’ve had no issues remembering anything. Ported my entire catalogue of custom scripts over to fish and everything became much cleaner. More legible, and less code to accomplish the same things. Easier argument parsing, control structures, everything. Much less error prone IMO.

    Highly recommend it. It’s obviously not POSIX or anything, but I find that the cost of installing fish on every machine I own is lower than maintaining POSIX-compliant scripts.

    Enjoy your scripting!

    • HyperMegaNet@lemm.ee · 6 points · 7 hours ago

      Thank you for this. About a year ago I came across ShellCheck thanks to a comment just like this on Reddit. I also happened to be getting towards the end of a project which included hundreds of lines of shell scripts across dozens of files.

      It turns out that despite my workplace having done quite a bit of shell scripting for previous projects, no one had heard of ShellCheck. We had been using similar analysis tools for other languages but nothing for shell scripts. As you say, it turned up a huge number of errors, including some pretty spicy ones when we first started using it. It was genuinely surprising to see how many unique and terrible ways the scripts could have failed.
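
      For anyone who hasn’t run it yet, a hedged illustration (hypothetical lines, not from that project) of the sort of latent failure ShellCheck flags:

          #!/bin/bash
          build_dir=$1
          # SC2086: unquoted expansion. If $build_dir is empty this becomes `rm -rf /old`;
          # if it contains spaces, it deletes things you never intended to touch.
          rm -rf $build_dir/old
          # ShellCheck's fix is the quoting; the :? guard against an empty value is a bonus.
          rm -rf "${build_dir:?build_dir is required}/old"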

    • ethancedwards8@programming.dev · 6 points · 14 hours ago

      I wish it had a more comprehensive autocorrect feature. I maintain a huge bash repository and have tried to use it, and it commonly makes mistakes. None of us maintainers have time to rewrite the scripts to match standards.

      • I honestly think autocorrecting your scripts would do more harm than good. ShellCheck tells you about potential issues, but it’s up to you to determine the correct behavior.

        For example, how could it know whether cat $foo should be cat "$foo", or whether the script actually relies on word splitting? It’s possible that $foo intentionally contains multiple paths.

        Maybe there are autofixable errors I’m not thinking of.

        FYI, it’s possible to gradually adopt ShellCheck by setting --severity=error and working your way down to warnings and so on. Alternatively, you can add one-off # shellcheck disable=SC1234 comments before offending lines to silence warnings.
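
        To make that concrete, a small sketch (the variable names are made up) of the two situations plus the gradual-adoption flag:

            # Unintentional splitting: ShellCheck's suggested quoting is simply correct here.
            cat "$foo"

            # Intentional splitting: keep the behaviour and silence SC2086 explicitly.
            # shellcheck disable=SC2086
            cat $list_of_paths

            # Gradual adoption: start by surfacing only the worst findings.
            shellcheck --severity=error ./*.sh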

        • UndercoverUlrikHD@programming.dev · 2 points · 4 hours ago

          For example, how could it know whether cat $foo should be cat "$foo", or whether the script actually relies on word splitting? It’s possible that $foo intentionally contains multiple paths.

          Last time I used ShellCheck (yesterday, funnily enough) I had written ports+=($(get_elixir_ports)) to split the input, since get_elixir_ports returns a string of space-separated ports. It worked exactly as intended, but ShellCheck still recommended making the splitting explicit rather than implicit.

          The ShellCheck docs recommended

          # read the space-separated output into an array, then append it
          IFS=" " read -r -a elixir_ports <<< "$(get_elixir_ports)"
          ports+=("${elixir_ports[@]}")
          
      • stetech@lemmy.world · 3 points · 12 hours ago

        Then you’ll have to find the time later when this leads to bugs. If you write against bash while declaring it POSIX shell, but then a random system’s sh doesn’t implement a certain thing, you’ll be SOL. Or what exactly do you mean by “match standards”?
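
        A classic instance of what they’re describing, just as an illustration:

            #!/bin/sh
            # Declared as POSIX sh, but written against bash:
            name="world"
            if [[ $name == w* ]]; then   # [[ ]] and pattern matching are bashisms
                echo "hello $name"
            fi
            # dash (the /bin/sh on Debian/Ubuntu) aborts here with "[[: not found".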

    • cm0002@lemmy.world (OP) · 5 points · 12 hours ago

      For a de facto Windows admin, my PowerShell skills are… embarrassing lol, but I’m getting there!

  • Aceticon@lemmy.dbzer0.com · 8 points · 11 hours ago

    When I was finishing off my degree at uni, I actually spent a couple of months as an auxiliary teacher giving professional training in Unix, which included teaching people shell scripting.

    Nowadays (granted, almost 3 decades later), I remember almost nothing of shell scripting, even though I’ve stayed on the technical career track doing mostly programming since.

    So that joke is very much me irl.

  • Tungsten5@lemm.ee · 12 points · 14 hours ago

    And I thought I was the only one… For smaller bash scripts, ChatGPT/DeepSeek does a good enough job. Though I still haven’t tried VS Code’s Copilot on bash scripts. I have only tried it with C code and it kiiiinda did an ass job at helping…

    • cm0002@lemmy.world (OP) · 6 points · 13 hours ago

      AI does decently enough on scripting languages if you spell it out enough for it lol, but IMO it tends to not do so well when it comes to compiled languages

      I’ve tried Python with VS Code Copilot (Claude) and it did pretty well.

      • Tungsten5@lemm.ee · 1 point · 6 hours ago

        Yeah, I tried that: Claude with some C code. Unfortunately the AI only took me from point A to point A. And it only took a few hours :D

        • cm0002@lemmy.world (OP) · 6 points · 12 hours ago

          I was chalking it up to some scripting languages just tending to be more popular (like Python) and thus having more training data to draw from.

          But that’s a good point too lol

  • coldsideofyourpillow@lemmy.cafe · 19 up / 1 down · edited · 15 hours ago

    That’s why I use nushell. Very convenient for writing scripts that you can understand. Obviously, it cannot beat Python in terms of prototyping, but at least I don’t have to relearn it every time.

    • AnUnusualRelic@lemmy.world · 22 points · 16 hours ago

      So the alternative is:

      • either an obtuse script that works everywhere, or
      • a legible script that only works on your machine…
      • shortrounddev@lemmy.world · 5 points · 12 hours ago

        I am of the opinion that production software shouldn’t be written in shell languages. If it’s something which needs to be redistributed, I would write it in Python or something.

          • Hexarei@programming.dev · 2 points · 7 hours ago

          I tend to write anything for distribution in Rust or something that compiles to a standalone binary. Python does not an easily redistributable application make lol

            • shortrounddev@lemmy.world · 2 points · edited · 6 hours ago

            Yeah but then you either need to compile and redistribute binaries for several platforms, or make sure that each target user has rust/cargo installed. Plus some devs don’t trust compiled binaries in something like an npm package

          • AnUnusualRelic@lemmy.world · 3 points · 10 hours ago

          For a bit of glue, a shell script is fine. A start script, some small utility gadget…

          With Python, you’re not even sure that the right version is installed unless you ship it with the script.
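
          A rough sketch of the kind of check that concern turns into when a shell wrapper hands off to Python (the 3.10 minimum is an arbitrary example):

              #!/bin/sh
              # Fail fast if the interpreter this script hands off to is missing or too old.
              if ! command -v python3 >/dev/null 2>&1; then
                  echo "python3 not found" >&2
                  exit 1
              fi
              python3 -c 'import sys; sys.exit(0 if sys.version_info >= (3, 10) else 1)' || {
                  echo "python3 >= 3.10 required" >&2
                  exit 1
              }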

    • Akito@lemmy.zip · 11 points · 16 hours ago

      Nu is great. I’ve been using it for years. Clearly a superior shell. The only problem is that it constantly faces breaking changes, so you frequently need to update your modules.

        • Akito@lemmy.zip · 3 points · 11 hours ago

          Yesterday, I upgraded from 0.101.0 to 0.102.0, and date to-table was replaced (actually by something better) with into record; however, this was not documented well in the error message. I had to research for 5 to 10 minutes, which does not sound like much, but if you get this every second version, the time adds up quickly.

            • Akito@lemmy.zip · 1 point · 11 hours ago

              Yes, I switched to an older version and there was the warning. However, there was no warning on 0.101.0 whatsoever, so upgrading just one version broke my master module.

              Sometimes I skip some versions, so I am certain that I jumped from < 0.100.0 straight to 0.101.0, and here we are, without any deprecation warning.

        • barsoap@lemm.ee · 3 points · edited · 14 hours ago

          Not really. They’ve been on the stabilising path for about two years now, removing stuff like dataframes from the default feature set to be able to focus on stabilising the whole core language, but 1.0 isn’t out yet and the minor version just went three digits.

          And it’s good that way. The POSIX CLI is a clusterfuck because it got standardised before it got stabilised. dd’s syntax is just the tip of the iceberg there; you gotta take out the nail scissors and manicure the whole lawn before promising that things won’t change.

          Even in its current state it’s probably less work for many scripts, though. That is, updating things, especially if you version-lock (hello, nixos), will be less of a headache than writing sh could ever be. nushell is a really nice language, occasionally a bit verbose, but never in the boilerplate-for-boilerplate’s-sake way, rather in the “in two weeks I’ll be glad it’s not perl” way. Things like command-line parsing are ludicrously convenient (though please, nushell people, land collecting repeated arguments into lists).

          • Akito@lemmy.zip · 1 point · 11 hours ago

            Fully agree on this. I’m not saying it’s bad. I love innovation, and that is what I love about Nushell. I’m just saying that using it at work might not always be the best idea. ;)

    • expr@programming.dev · 2 points · 13 hours ago

      We have someone at work who uses it and he’s constantly having tooling issues due to compatibility problems, so… yeah.

      I’m sure it’s fine for sticking in the shebang and writing your own one-off personal scripts, but I would never actually main it. Too much ecosystem relies on bash/posix stuff.

    • _stranger_@lemmy.world · 23 points · 12 hours ago

      It’s more like bash did it one way and everyone who came after decided that was terrible and should be done a different way (for good reason).

      Looking right at you, -eq, and your weird-ass syntax:

      if [[ $x -eq $y ]]
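
      For anyone skimming, a short sketch of the overlapping comparison syntaxes that cause the confusion (values arbitrary):

          x=3; y=3
          [[ $x -eq $y ]] && echo "numeric test, old test(1)-style operator"
          (( x == y ))    && echo "numeric test, C-style arithmetic syntax"
          [[ $x == $y ]]  && echo "string comparison that merely looks numeric here"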