if you could pick a standard format for a purpose what would it be and why?

e.g. flac for lossless audio because…

(yes you can add new categories)

summary:

  1. photos .jxl
  2. open domain image data .exr
  3. videos .av1
  4. lossless audio .flac
  5. lossy audio .opus
  6. subtitles srt/ass
  7. fonts .otf
  8. container mkv (doesn't contain .jxl)
  9. plain text utf-8 (many also say markup but disagree on the implementation)
  10. documents .odt
  11. archive files (this one is causing a bloodbath so i picked randomly) .tar.zst
  12. configuration files toml
  13. typesetting typst
  14. interchange format .ora
  15. models .gltf / .glb
  16. daw session files .dawproject
  17. otdr measurement results .xml
  • @[email protected]
    link
    fedilink
    English
    472 years ago

    I don’t know what to pick, but something other than PDF for transferring documents between multiple systems. And yes, I know, PDF has its strengths and there’s a reason why it’s so widely used, but that doesn’t mean I have to like it.

    Additionally, all proprietary formats, especially ones that have gained enough users that they’re treated like a standard or a requirement if you want to work with X.

    • @[email protected]
      link
      fedilink
      102 years ago

      I would be fine with PDFs exactly the same except Adobe doesn’t exist and neither does Acrobat.

    • @[email protected]
      link
      fedilink
      52 years ago

      When PDF was introduced it made these things so much better than they were before that I’ll probably remain grateful for PDF forever and always forgive it all its flaws.

    • @[email protected]
      link
      fedilink
      32 years ago

      I would be fine with PDFs exactly the same except Adobe doesn’t exist and neither does Acrobat.

    • kkard2 · 17 points · 2 years ago

      oh it’s x, not x… i hate our timeline

  • @[email protected]
    link
    fedilink
    30
    edit-2
    2 years ago

    I’d set up a working group to invent something new. Many of our current formats are stuck in the past; e.g. PDF and ODF are still emulating paper, even though everybody reads them on a screen. What I want to see is a standard document format that is built for the modern Internet, with editing and publishing in mind. HTML ain’t it, as it can’t handle editing well or long-form documents; EPUB isn’t supported by browsers; Markdown lacks a lot of features; and so on. And then you have things like Google Docs, which are Internet-aware, editable and shareable, but also completely proprietary and lock you into the Google ecosystem.

    • @[email protected]
      link
      fedilink
      English
      142 years ago

      Epub isn’t supported by browsers

      So you want EPUB support in the browser, and then you have the ultimate document file format?

      • @[email protected]
        link
        fedilink
        132 years ago

        It would solve the long-form document problem. It wouldn’t help with the editing, however. The problem with HTML as it is today is that it has long left its document-markup roots and turned into an app-development platform, making it not really suitable for plain old documents. You’d need to cut it down to a subset of features that are necessary for documents (e.g. no JavaScript), similar to how PDF/A removes features from PDF to create a more reliable and future-proof format.

    • @[email protected]
      link
      fedilink
      72 years ago

      EPubs are just websites bound in XHTML or something. Could we not just make every browser also an epub reader? (I just like epubs.)

      • @[email protected]
        link
        fedilink
        72 years ago

        They’re basically zip files with a standardized metadata file to determine chapter order, index page, … and every chapter is an HTML file.
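
That layout is easy to see with Python's stdlib `zipfile`. A minimal sketch: the file names follow the EPUB container convention, but the content here is made up, and a real EPUB additionally needs a full OPF package document whose spine fixes the chapter order.

```python
import zipfile

# Build a minimal EPUB-shaped archive: a mimetype entry, a
# META-INF/container.xml pointing at the package document, and one
# XHTML chapter. (Toy content, not a spec-complete EPUB.)
with zipfile.ZipFile("book.epub", "w") as z:
    z.writestr("mimetype", "application/epub+zip")
    z.writestr("META-INF/container.xml",
               '<container><rootfiles><rootfile full-path="OEBPS/content.opf"/>'
               "</rootfiles></container>")
    z.writestr("OEBPS/chapter1.xhtml",
               "<html><body><h1>Chapter 1</h1></body></html>")

# Reading it back is plain zip access -- which is the point above.
with zipfile.ZipFile("book.epub") as z:
    names = z.namelist()
    chapter = z.read("OEBPS/chapter1.xhtml").decode()
```

Any zip tool can open an .epub the same way, which is why dedicated readers are mostly a rendering layer on top.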

      • @[email protected]
        link
        fedilink
        12 years ago

        That’s the idea, and while we’re at it, we could also make .zip files a proper Web technology with browser support. At the moment ePub exists in this weird twilight where it is built out of mostly Web technology, yet isn’t actually part of the Web. Everything being packed into .zip files also means that you can’t link directly to the individual pages within an ePub, as HTTP doesn’t know how to unpack them. It’s all weird and messy, and it’s surprising that nobody has cleaned it all up and integrated it into the Web properly.

        So far the original Microsoft Edge is the only browser I am aware of with native ePub support, but even that didn’t survive when they switched to Chromium’s Blink engine.

        • @[email protected]
          link
          fedilink
          1
          edit-2
          2 years ago

          Microsoft Edge’s ePub reader was so good! I would have used it all the time for reading if it hadn’t met its demise. Is there no equivalent fork or project out there? The existing ePub readers always have these quirks that annoy me, to the point where I’ll just use Calibre’s built-in reader, which works well enough.

  • darcy · 11 points · 2 years ago

    i hate to be that guy, but pick the right tool for the right job. use markdown for a readme and latex for a research paper. you don’t need to create ‘the ultimate file format’ that can do both, but worse and less compatibly

    • @[email protected]
      link
      fedilink
      32 years ago

      I agree with your assertion that there isn’t a perfect format. But the example you gave, markdown vs latex, has a counterexample: Org mode. It can be used for both purposes and a load of others. The Matroska container is similarly versatile. They are examples that carefully designed formats can reach a high level of versatility, even if they never become the perfect solution.

  • @[email protected]
    link
    fedilink
    English
    122 years ago

    Some sort of machine-readable format for invoices and documents with related purposes (offers, bills of delivery, receipts,…) would be useful to get rid of some more of the informal paper or PDF human-readable documents we still use a lot. Ideally something JSON-based, perhaps with a cryptographic signature and encryption layer around it.
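
A hedged sketch of what such a JSON envelope could look like, using only Python's stdlib. The field names and the shared-secret HMAC are placeholders for illustration; a real standard would define the schema and use proper public-key signatures and certificates rather than a shared key.

```python
import json, hmac, hashlib

# Hypothetical invoice record; field names are invented for illustration.
invoice = {"id": "INV-2024-001", "currency": "EUR", "total": "12.50"}

# Canonicalize (sorted keys, no whitespace) so sender and receiver
# hash the exact same bytes, then attach an authentication tag.
payload = json.dumps(invoice, sort_keys=True, separators=(",", ":")).encode()
tag = hmac.new(b"shared-secret", payload, hashlib.sha256).hexdigest()
envelope = {"invoice": invoice, "hmac_sha256": tag}

# The receiver recomputes the tag over the same canonical form to verify.
check = json.dumps(envelope["invoice"], sort_keys=True,
                   separators=(",", ":")).encode()
ok = hmac.compare_digest(
    hmac.new(b"shared-secret", check, hashlib.sha256).hexdigest(), tag)
```

The canonicalization step is the part most ad-hoc formats get wrong: without a fixed serialization, two semantically identical invoices can hash differently.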

    • @[email protected]
      link
      fedilink
      English
      132 years ago

      This one exists: SEPA, or ISO 20022. Encryption/signing isn’t included in the format; it’s managed at the transfer layer. But that’s pretty much the standard every business around here works with, and many don’t even accept PDFs or other human-readable documents anymore if you want to get your money.

      • @[email protected]
        link
        fedilink
        English
        32 years ago

        Well, okay, let me rephrase that. It would be nice if the B2C communication used something like that too.

        • @[email protected]
          link
          fedilink
          English
          12 years ago

          In Finland it kinda-sorta does, for some companies (mostly for things where you pay monthly). You can get your invoices directly to your banking account and even accept them automatically if you wish. It doesn’t cover anything other than invoices, though, so not exactly what you’re after. And I agree, that would be very nice.

          Some companies, like one of our major grocery chains, offer to store your receipts in their online service, but I think you can only get a copy of the receipt there, and it’s not machine-readable.

  • danielfgom · 16 points · 2 years ago

    Definitely FLAC for audio because it’s lossless, if you record from a high fidelity source…

    exFAT for external hard drives and SD cards because both Windows and Mac can read and write to it as well as Linux. And you don’t have the permission pain…

      • danielfgom · 4 points · 2 years ago

        If you were to format the drive with ext4 and then copy something to it from Linux, then when you try to open it on another Linux machine (e.g. if you distro-hop after this), it won’t open the file because you aren’t the owner.

        Then you have to jump through hoops trying to make yourself the owner just so you can open your own file.

        I learnt this the hard way so I just use exFAT and it all works.

        • @[email protected]
          link
          fedilink
          1
          edit-2
          2 years ago

          Then you have to jump through hoops trying to make yourself the owner just so you can open your own file.

          I mean, if you want to set permissions on a drive to a userid and groupid in /etc/passwd and /etc/group on the current machine:

          $ sudo chown -R username /mnt/olddrive
          $ sudo chgrp -R groupname /mnt/olddrive
          

          That’s not that painful, though I guess it could take a while to run on a drive with a lot of stuff.

  • Björn Tantau · 93 points · 2 years ago

    zip or 7z for compressed archives. I hate that for some reason rar has become the de facto standard for piracy. It’s just so bad.

    The other day I saw a tar.gz containing a multipart rar, which contained an iso, which contained a compressed bin file with an exe to decompress it. Soooo unnecessary.

    Edit: And the decompressed game of course has all of its compressed assets in renamed zip files.

    • @[email protected]
      link
      fedilink
      182 years ago

      .tar.zstd all the way IMO. I’ve almost entirely switched to archiving with zstd, it’s a fantastic format.

        • raubarno · 19 points · 2 years ago

          Gzip is slower and compresses worse. Zstandard, on the other hand, is dramatically faster than any of the existing standards in terms of compression speed; that is its killer feature. It also provides a somewhat better compression ratio than gzip citation_needed.

          • @[email protected]
            link
            fedilink
            72 years ago

            Yes: every gzip compression level has some zstd compression level that is both faster and better in compression ratio.

            Additionally, the highest compression levels of zstd are comparable in compression ratio to LZMA while also being slightly faster at compression and many, many times faster at decompression.
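
The ratio side of that comparison is easy to check for the formats that ship in Python's stdlib. zstd itself isn't available there before Python 3.14, so this sketch only contrasts gzip with LZMA, the high-ratio end of the discussion; the exact numbers depend entirely on the input.

```python
import gzip, lzma

# Compressible sample data; ratios on real files will differ.
data = b"if you could pick a standard format " * 4096

gz = gzip.compress(data, compresslevel=9)   # DEFLATE, 32 KiB window
xz = lzma.compress(data, preset=9)          # LZMA, much larger window

# LZMA's larger window and better entropy coding usually win on ratio,
# at the cost of far slower (de)compression than zstd achieves.
print(len(data), len(gz), len(xz))
```

On repetitive input like this, LZMA's output is a small fraction of gzip's, which is the gap zstd's highest levels close while decompressing much faster.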

        • @[email protected]
          link
          fedilink
          5
          edit-2
          2 years ago

          gzip is very slow compared to zstd at similar levels of compression.

          The zstd algorithm is a project by the same author as lz4. lz4 was designed for decompression speed; zstd was designed to balance resource utilization, speed and compression ratio, and it does a fantastic job of it.

      • Turun · 3 points · 2 years ago

        The only annoying thing is that the extension for zstd compression is .zst (no d). Tar does not recognize a .zstd extension; only .zst is automatically recognized and decompressed. Come on!

          • Turun · 1 point · 2 years ago

            Not sure what that does.

            Yes, you can use options to specify exactly what you want. But it should recognize .zstd as zstandard compression instead of going “I don’t know what this compression is”. I don’t want to have to specify the obvious extension just because I typed zstd instead of zst when creating the file.

        • @[email protected]
          link
          fedilink
          2
          edit-2
          2 years ago

          If we’re being entirely honest just about everything in the zstd ecosystem needs some basic UX love. Working with .tar.zst files in any GUI is an exercise in frustration as well.

          I think they recently implemented support for chunked decoding so reading files inside a zstd archive (like, say, seeking to read inside tar files) should start to improve sooner or later but some of the niceties we expect from compressed archives aren’t entirely there yet.

          Fantastic compression though!

    • @[email protected]
      link
      fedilink
      362 years ago

      It was originally rar because it’s so easy to separate into multiple files. Now you can do that in other formats, but the legacy has stuck.

  • @[email protected]
    link
    fedilink
    292 years ago

    Data output from manufacturing equipment. Just pick a standard. JSON works. TOML or YAML if you need to write as you go. Stop creating your own format that’s 80% JSON anyway.
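
On the "write as you go" point: appendability doesn't strictly require TOML or YAML. One JSON object per line (the JSON Lines convention) appends cleanly too. A small stdlib sketch, with invented field names:

```python
import json, io

# A single JSON array must be rewritten to stay valid after each append;
# one object per line can simply be appended to the end of the file.
log = io.StringIO()  # stands in for an open log file on the machine
for reading in ({"station": 1, "temp_c": 21.4},
                {"station": 2, "temp_c": 22.0}):
    log.write(json.dumps(reading) + "\n")

# Parsing back is one json.loads per line; a truncated last line
# (e.g. after a power cut) only loses that one record.
records = [json.loads(line) for line in log.getvalue().splitlines()]
```

That crash-tolerance is a big part of why line-delimited formats are popular for equipment and telemetry logs.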

    • @[email protected]
      link
      fedilink
      English
      32 years ago

      JSON is nicer for some things, and YAML is nicer for others. It’d be nice if more apps would let you use whichever you prefer. The data can be represented in either, so let me choose.

  • @[email protected]
    link
    fedilink
    26
    edit-2
    2 years ago

    I’d like an update to the EPUB ebook format that leverages zstd compression and JPEG XL. You’d see much better decompression performance (especially for very large books), smaller file sizes and/or better image quality. I’ve been toying with the idea of implementing this as a .zpub book format and a plugin for KOReader, but haven’t written any code for it yet.

  • Resume information. There have been several attempts, but none have become an accepted standard.

    When I was a consultant, this was the one standard I longed for the most. A data file where I could put all of my information, and then filter and format it for each application. But ultimately, I wanted to be able to submit the information in a standardised format - without having to re-enter it endlessly into crappy web forms.

    I think things have gotten better today, but at the cost of a reliance on a monopoly (LinkedIn). And I’m no longer in that sort of job market. But that desire was so strong I think it’ll last me until I’m in my grave.

  • @[email protected]
    link
    fedilink
    English
    26
    edit-2
    2 years ago

    XML for machine-readable data because I live to cause chaos

    Either markdown or Org for human-readable text-only documents. MS Office formats and the way they are handled have been a mess ever since the 2007 “-x” versions were introduced, and both those and the Open Document formats are way too bloated for when you only want to share a presentable text file.

    While we’re at it, standardize the fucking markdown syntax! I still have nightmares about Reddit’s degenerate four-space-indent code blocks.

    • Agentseed · 18 points · 2 years ago

      Man, I’d love it if markdown was more widely used; it’s pretty much the perfect format for everything I do

      • @[email protected]
        link
        fedilink
        42 years ago

        Markdown is missing checkboxes, especially in tables.

        But markdown is just good. It’s basically just writing text as normal.

      • @[email protected]
        link
        fedilink
        12 years ago

        You can convert Markdown to a number of formats with pandoc, if you want to author in Markdown and just distribute in some other format.

        Not going to work if you need to collaborate with other people, though.

      • raubarno · 6 points · 2 years ago

        Markdown, CommonMark and .rst are good formats for producing basic rich text for technical documentation and the like, when text styling is handled by an external application and you don’t care about reproducible layout.

        But when you also want custom styles (font size, text alignment, colours), page layout (paper format, margin size, etc.), a document that renders reproducibly across multiple processing applications without the layout breaking, authoring tools, maybe even some version control, that’s when it bites you.

  • u/lukmly013 💾 (lemmy.sdf.org) · 5 points · 2 years ago

    Something for I/Q recordings. But I don’t know what would do it. Currently the most supported format seems to be s16be WAV, but there are different formats, bit depths and encodings. I’ve seen .iq, .sdriq, .sdr, .raw and .wav. Then there are different bit depths and encodings: u8, s8, s16be, s16le, f32,… There are also different ways metadata like centre frequency is stored.

      • u/lukmly013 💾 (lemmy.sdf.org) · 2 points · 2 years ago

        God damnit. I wrote an answer and it disappeared a while after I pressed reply. I’m too lazy to rewrite it and my eyes are sore.

        Anyway, I am too dumb to actually understand I/Q samples. It stands for In-phase and Quadrature; the two components are 90° out of phase with each other, and together they can be used to reconstruct a signal. It’s used in different areas. For me it’s useful for recording raw RF signals from a software-defined radio (SDR).
        For example, with older, less secure systems, you could record signal from someone’s car keyfob, then use a Tx-capable SDR to replay it later. Ta-da! Replay attack. You unlocked someone’s car.
        In a better way, you could record raw signal from a satellite to later demodulate and decode it, if your computer isn’t powerful enough to do it in real-time.

        If you want an example, you can download DAB+ radio signal recording here: https://www.sigidwiki.com/wiki/DAB%2B and then replay it in Welle.io (available as Appimage) if it’s in compatible format. I haven’t tested it.
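
For anyone wondering what those raw files actually hold: the common interleaved layout is just alternating signed-integer I and Q values. A stdlib-only sketch (the sample rate and test tone are made up) that round-trips an s16le interleaved I/Q buffer, the same byte layout as a typical .iq/.raw capture:

```python
import struct, math, cmath

fs = 1_000_000           # sample rate in Hz (assumed for this sketch)
f_tone = 100_000         # a test tone within the baseband
n = 16                   # number of complex samples

# A complex baseband tone: I = real part, Q = imaginary part.
tone = [cmath.exp(2j * math.pi * f_tone * k / fs) for k in range(n)]

# Write as interleaved little-endian int16 pairs: I0 Q0 I1 Q1 ...
raw = b"".join(struct.pack("<hh",
                           round(c.real * 32767),
                           round(c.imag * 32767)) for c in tone)

# Reading such a file back: unpack int16 pairs into complex samples.
vals = struct.unpack("<%dh" % (2 * n), raw)
samples = [complex(vals[i] / 32767, vals[i + 1] / 32767)
           for i in range(0, 2 * n, 2)]
```

The ambiguity the comment above complains about is exactly here: nothing in the bytes says whether they are u8, s16be or f32, interleaved or planar, or what the centre frequency was, so every tool picks its own convention.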