Unbelievable today, the push-back on filtering by what is, in essence, the entire key to Lemmy: the published date.

  • RoundSparrow @ BT@bulletintree.comOPM
    11 months ago

    I presented a pre-computed post_aggregates.inclusion column design as a temporary measure… I did all the leg work…

    Aging out the data is essential. There seems to be no realization that Lemmy works fine with 25,000 posts in it, but you have to tell PostgreSQL that you only really want what is fresh meat for end-users to read! The published date is fundamental to the entire Lemmy experience: NEW posts start with 1 vote, and popularity starts with FRESH postings. Why do I have to keep saying this out loud when the PostgreSQL design of Lemmy ignores it?
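The pre-computed inclusion-column idea mentioned above can be sketched roughly as follows, using SQLite so the example runs standalone (Lemmy itself is on PostgreSQL; apart from the post_aggregates.inclusion name, the schema, cutoff, and data here are my own illustration, not the actual proposal):

```python
import sqlite3
from datetime import datetime, timedelta, timezone

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE post_aggregates (
        post_id   INTEGER PRIMARY KEY,
        published TEXT NOT NULL,          -- ISO-8601 UTC timestamp
        inclusion INTEGER NOT NULL DEFAULT 1
    )
""")

now = datetime.now(timezone.utc)
rows = [
    (1, (now - timedelta(hours=2)).isoformat(), 1),   # fresh post
    (2, (now - timedelta(days=30)).isoformat(), 1),   # stale post
]
conn.executemany("INSERT INTO post_aggregates VALUES (?, ?, ?)", rows)

# Periodic batch job: age out old posts once, so every listing query
# does not have to re-evaluate the date math per row.
cutoff = (now - timedelta(days=7)).isoformat()
conn.execute(
    "UPDATE post_aggregates SET inclusion = 0 WHERE published < ?",
    (cutoff,),
)

# Listing queries then filter on the cheap pre-computed flag.
fresh = conn.execute(
    "SELECT post_id FROM post_aggregates WHERE inclusion = 1"
).fetchall()
print(fresh)  # [(1,)]
```

In PostgreSQL the same flag could back a partial index, so the hot listing path never touches aged-out rows at all.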
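The "popularity starts with FRESH postings" point amounts to a time-decayed ranking. A minimal sketch of the shape of such a formula (the coefficients below are illustrative, not Lemmy's actual hot_rank):

```python
import math
from datetime import datetime, timedelta, timezone

def hot_rank(score: int, published: datetime, now: datetime,
             gravity: float = 1.8) -> float:
    """Time-decayed rank: freshness dominates, votes refine.

    Mirrors the general Lemmy/Reddit 'hot' shape; exact constants
    here are made up for illustration.
    """
    hours = max((now - published).total_seconds() / 3600.0, 0.0)
    return math.log(max(score, 1) + 2) / (hours + 2) ** gravity

now = datetime.now(timezone.utc)
new_post = hot_rank(1, now - timedelta(hours=1), now)     # 1 vote, fresh
old_post = hot_rank(500, now - timedelta(days=14), now)   # popular, stale
print(new_post > old_post)  # True: a fresh 1-vote post outranks an old hit
```

This is exactly why the published date, not the raw score, is the column the database must be organized around.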

  • RoundSparrow @ BT@bulletintree.comOPM
    11 months ago

    And caching, from a May 17, 2010 write-up of Reddit's architecture: “Everything on Reddit is a listing: the front page, in box, comment pages. All are precomputed and dumped into the cache. When you get a listing it’s taken from the cache. Every link and every comment is probably stored in a 100 different versions. For example, a link with 2 votes that’s 30 seconds old is rendered and cached separately. When it hits 30 seconds it’s rendered again. And so on. Every little piece of HTML comes from cache so the CPU isn’t wasted on rendering. When things get slow just add more cache.”

    The avoidance of caching and pre-generation is beyond unbelievable: Lemmy servers crash constantly, servers can't cope with what I consider a small number of users… and still caching and pre-generating get avoided. Even for 30 seconds!! You could have front-end machines pre-generating those pages, a post-production layer to front the API.