• @[email protected]
    5
    2 months ago

    The bots scrape costly endpoints, like the entire edit history of every page on a wiki. You can’t always just cache every possible generated page at the same time.

    • @[email protected]
      1
      2 months ago

      Of course you can. This is why people use CDNs.

      Put the entire site behind a CDN with a 24-hour cache for unauthenticated users.
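
      A minimal sketch of what that looks like on the origin side, assuming a Flask app behind the CDN and a session cookie named "session" (both of those are assumptions, not anything from the thread); the CDN would be configured to honor the origin’s Cache-Control headers:

      ```python
      from flask import Flask, request

      app = Flask(__name__)

      @app.after_request
      def set_cache_headers(response):
          if "session" in request.cookies:
              # Logged-in users always get a fresh, uncached page.
              response.headers["Cache-Control"] = "private, no-store"
          else:
              # Anonymous traffic (including bots) can be served from the
              # CDN cache for up to 24 hours (86400 seconds).
              response.headers["Cache-Control"] = "public, max-age=86400"
          return response
      ```

      Caching only the anonymous responses keeps logged-in editing intact while scrapers hit the CDN instead of the wiki’s database.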