• KISSmyOS@lemmy.world
    1 year ago

    Western culture is indigenous to the west.

    Europe’s culture developed from Christian Roman culture after the Roman Empire conquered most of the continent. The indigenous cultures and religions of mainland Europe were destroyed by the Romans and by the Roman-influenced kingdoms springing up after Rome’s demise.
    The culture resulting from that was exported to Britain with the Norman conquest, and that new British culture was exported to the US and other “Western” nations through colonialism.

    The indigenous “western” culture doesn’t exist anymore.

    • blindbunny@lemmy.ml
      1 year ago

      Huh, guess my culture doesn’t exist. I’ll be sure to tell everyone on the reservation.

    • jimbo@lemmy.world
      1 year ago

      The indigenous cultures and religions of mainland Europe were destroyed by the Romans and the Roman-influenced kingdoms springing up after Rome’s demise.

      Okay, by that reasoning the indigenous cultures of Australia, the Americas, the Caribbean, the Pacific islands, etc. were destroyed by European empires and colonization, and thus also don’t exist anymore.

    • barsoap@lemm.ee
      1 year ago

      You give the Romans way too much credit. They didn’t even manage to export their style of music, and had their own tradition supplanted by the rest of Europe’s: previously they were part of the Mediterranean tradition (which nowadays people recognise as oriental, definitely a misnomer); have a listen. Nowadays they have just as much of an allergy to flourishes as the rest of the continent (modulo Greece, but also Spain and at least parts of the Balkans).

      …and this is just to serve as an example.