Victory Day is a holiday that commemorates the Soviet victory over Nazi Germany in 1945. It was first inaugurated in the 15 republics of the Soviet Union following the signing of the German Instrument of Surrender late in the evening on 8 May 1945 (9 May Moscow Time). The Soviet government announced the victory early on 9 May after the signing ceremony in Berlin. Although the official inauguration occurred in 1945, the holiday became a non-labor day only in 1965, and only in certain Soviet republics.

The German Instrument of Surrender was signed twice. An initial document was signed in Reims on 7 May 1945 by Alfred Jodl (chief of staff of the German OKW) for Germany, Walter Bedell Smith, on behalf of the Supreme Commander of the Allied Expeditionary Force, and Ivan Susloparov, on behalf of the Soviet High Command, in the presence of French Major-General François Sevez as the official witness.

Since the Soviet High Command had not agreed to the text of the surrender, and because Susloparov, a relatively low-ranking officer, was not authorized to sign this document, the Soviet Union requested that a second, revised, instrument of surrender be signed in Berlin.

A second surrender ceremony was organized in a surviving manor in the outskirts of Berlin late on 8 May, when it was already 9 May in Moscow due to the difference in time zones.

During the Soviet Union’s existence, 9 May was celebrated throughout the country and across the Eastern Bloc. Though the holiday was introduced in many Soviet republics between 1946 and 1950, it became a non-working day only in the Ukrainian SSR in 1963 and the Russian SFSR in 1965.

The celebration of Victory Day continued during subsequent years. The war became a topic of great importance in cinema, literature, history lessons at school, the mass media, and the arts. The ritual of the celebration gradually obtained a distinctive character with a number of similar elements: ceremonial meetings, speeches, lectures, receptions and fireworks.

Victory Day in modern Russia has become a celebration in which popular culture plays a central role. The 60th and 70th anniversaries of Victory Day in Russia (2005 and 2015) became the largest popular holidays since the collapse of the Soviet Union.

Megathreads and spaces to hang out:

reminders:

  • 💚 You nerds can join specific comms to see posts about all sorts of topics
  • 💙 Hexbear’s algorithm prioritizes comments over upbears
  • 💜 Sorting by new you nerd
  • 🌈 If you ever want to make your own megathread, you can reserve a spot here nerd
  • 🐶 Join the unofficial Hexbear-adjacent Mastodon instance toots.matapacos.dog

Links To Resources (Aid and Theory):

Aid:

Theory:

  • hexaflexagonbear [he/him]@hexbear.net · 6 months ago

    There was a paper from a group of physicists recently which explored KANs (Kolmogorov-Arnold Networks) as an alternative to multilayer perceptrons for PDE solvers, and it performed surprisingly well. Though the paper definitely had issues with rigour, and I don’t think that’s something you can handwave away for PDE solvers, as even physical PDEs can have very pathological solutions.
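For context on the architecture the comment mentions: the core idea of a KAN layer is that each edge carries its own learnable univariate function, instead of fixed activations on nodes as in an MLP. A toy sketch, with a small polynomial basis per edge purely for illustration (real KAN implementations use B-splines, and all names here are invented):

```python
import numpy as np

rng = np.random.default_rng(0)

class KANLayer:
    """Minimal KAN-style layer: each edge (i, j) applies its own learnable
    univariate function phi_ij to input x_i, and output_j = sum_i phi_ij(x_i).
    phi is parameterized here as a low-degree polynomial for simplicity."""
    def __init__(self, n_in, n_out, degree=3):
        # coeffs[i, j, k] is the coefficient of x^k on edge (i, j)
        self.coeffs = rng.normal(0.0, 0.1, size=(n_in, n_out, degree + 1))

    def __call__(self, x):
        # x: (batch, n_in) -> powers: (batch, n_in, degree + 1)
        powers = x[..., None] ** np.arange(self.coeffs.shape[-1])
        # sum over input index i and basis index k
        return np.einsum('bik,ijk->bj', powers, self.coeffs)

layer = KANLayer(n_in=2, n_out=3)
y = layer(rng.normal(size=(4, 2)))  # output shape: (4, 3)
```

The learnable part lives entirely on the edges, which is what lets the paper in question swap KANs in where an MLP would otherwise approximate the PDE solution.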

    Broadly it’s definitely a cool application. I think my main objection to NN PDE solvers though would be that numerical analysis is quite mature, so the error bounds on solutions are likely tighter. Like I can’t imagine a generic function-approximating machine would perform better in terms of controlling error than actually specifying basis functions and exploiting the underlying structure of the differential equations.
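To make concrete what “specifying basis functions and exploiting structure” buys you: for a 1D Poisson problem, expanding in a sine basis diagonalizes the operator, so each coefficient is solved exactly and the only error is basis truncation, which comes with classical a priori bounds. A minimal sketch (function name and setup invented for illustration, uniform grid assumed):

```python
import numpy as np

def galerkin_poisson(f_vals, x, n_modes=20):
    """Spectral Galerkin solve of u'' = f on [0, 1] with u(0) = u(1) = 0.
    In the basis sin(n*pi*x), d^2/dx^2 is diagonal: u'' = f gives
    c_n = -f_n / (n*pi)^2, where f_n are the sine coefficients of f."""
    n = np.arange(1, n_modes + 1)
    basis = np.sin(np.pi * x[:, None] * n)          # (len(x), n_modes)
    dx = x[1] - x[0]                                # uniform grid assumed
    # sine coefficients: f_n = 2 * integral of f(x) sin(n*pi*x) dx
    f_n = 2.0 * (f_vals[:, None] * basis).sum(axis=0) * dx
    c_n = -f_n / (np.pi * n) ** 2
    return basis @ c_n

# f chosen so the exact solution is sin(pi*x)
x = np.linspace(0.0, 1.0, 201)
f_vals = -np.pi**2 * np.sin(np.pi * x)
u = galerkin_poisson(f_vals, x)
err = np.max(np.abs(u - np.sin(np.pi * x)))
```

Because the basis matches the operator’s eigenfunctions, the error here is tiny and, more importantly, predictable in advance — the kind of guarantee a generic NN approximator doesn’t come with.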

    Also, I was mostly annoyed because it compared the two “prediction” times without accounting for the training time of the NN.

    • dualmindblade [he/him]@hexbear.net · 6 months ago

      So this is way way outside my expertise, grain of salt and whatnot… Wouldn’t the error in most CFD simulations, regardless of technique, quickly explode to its maximum due to turbulence? Like if you’re designing a stirring rotor for a mixing vessel you’re optimizing for the state of the system at T+ [quite a bit of time], and I don’t believe hand-crafted approximations can give you any guarantees here. And I get the objection about training time, but I think the ultimate goal is to train a NN on a bunch of physical systems with different boundary conditions and fluid properties so you only need to train once and then you can just do inference forevermore.

      • hexaflexagonbear [he/him]@hexbear.net · 6 months ago

        And I get the objection about training time, but I think the ultimate goal is to train a NN on a bunch of physical systems with different boundary conditions and fluid properties so you only need to train once and then you can just do inference forevermore.

        But you wouldn’t have a priori error estimates for this. And I’m not sure how it would be trained, but I don’t think the training data could be anything that would let it perform as well as a PDE solver (unless our understanding of the underlying physics is fundamentally wrong). The training data would have to come either from a PDE solver or from direct experimental measurements, which makes the network at best as accurate as the source of its data.

        • dualmindblade [he/him]@hexbear.net · 6 months ago

          Okay, just thinking out loud here: everything I’ve seen so far works as you described, the training data is taken either from reality or generated by a traditional solver. I’m not sure this is a fundamental limitation though; you should be able to create a loss function that asks “how closely does the output satisfy the PDE?” rather than “how closely does the output match the data generated by my solver?”. But anyway, you wouldn’t need to improve on the accuracy of the most accurate methods to get something useful: if the NN is super fast and has acceptable accuracy, you can use it to do the bulk of your optimization, then use a regular simulation and/or reality to check the result and possibly do some fine-tuning.
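The loss function described here — penalizing violation of the equation itself rather than mismatch with solver-generated data — is essentially the physics-informed (PINN-style) training objective. A minimal sketch for the 1D equation u'' = f using finite differences on a grid (names invented for illustration; a real implementation would differentiate the network output with autodiff instead):

```python
import numpy as np

def pde_residual_loss(u, f, dx):
    """Measure how badly a candidate solution u violates u'' = f on a
    uniform grid. No reference solution from a classical solver is needed —
    only the equation itself, which is the physics-informed idea."""
    # second-difference approximation of u'' at interior grid points
    u_xx = (u[:-2] - 2.0 * u[1:-1] + u[2:]) / dx**2
    return np.mean((u_xx - f[1:-1]) ** 2)

x = np.linspace(0.0, 1.0, 101)
dx = x[1] - x[0]

# u(x) = x^2 satisfies u'' = 2 exactly, so its residual loss is ~0,
# while an unrelated candidate scores much worse
good = pde_residual_loss(x**2, np.full_like(x, 2.0), dx)
bad = pde_residual_loss(np.sin(3 * x), np.full_like(x, 2.0), dx)
```

Training against this residual (plus boundary-condition terms) would let the network learn directly from the PDE, sidestepping the “your data came from a solver, so you can’t beat the solver” objection — though, as noted upthread, it still doesn’t give you a priori error bounds.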