a/b is the unique solution x to a = bx, if a solution exists. This definition is used for the integers, rationals, reals, and complex numbers.
Defining a/b as a * (1/b) makes sense if you’re learning arithmetic, but logically it’s more contrived, as you then need to define 1/b as the unique solution x to bx = 1, if one exists, which is essentially the first definition.
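To make that concrete for the integers: a/b exists exactly when b divides a, and it is then the unique x with a = b·x. A rough sketch (exact_div is just a name I made up for illustration, not a standard function):

```python
def exact_div(a: int, b: int) -> int:
    """Return the unique integer x with a == b * x, if it exists."""
    if b == 0:
        # a = 0 * x has either no solution (a != 0) or infinitely many (a == 0),
        # so there is never a *unique* one.
        raise ZeroDivisionError("no unique solution when b == 0")
    x, r = divmod(a, b)
    if r != 0:
        raise ValueError(f"{b} does not divide {a}, so a = bx has no integer solution")
    return x

assert exact_div(12, -3) == -4
```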
That’s me, a degree-holding, full-time computer scientist, just learning arithmetic I guess.
Bonus question: what even is subtraction? I’m 99% sure it doesn’t exist, since I’ve never used it; I only ever use addition.
Addition of the additive inverse.
Now you just replaced one incalculable thing with a different incalculable thing.
Eh?
Computers don’t subtract, and you can’t just add a negative: a computer can’t interpret a negative number on its own, it can only store a flag that the number is negative. You need a couple of addition tricks to subtract two numbers so that the computer only ever has to add. It’s addition all the way down.
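For instance, one common trick is two’s complement: within a fixed word size, a - b equals a + (~b + 1), so the hardware only needs an adder and a bit flip. A minimal sketch (the 8-bit width and the names here are my own choices, not any particular machine’s):

```python
BITS = 8
MASK = (1 << BITS) - 1  # 0xFF: keep results within an 8-bit word

def sub_via_add(a: int, b: int) -> int:
    """Compute (a - b) mod 2**BITS using only bit flips and addition."""
    neg_b = ((b ^ MASK) + 1) & MASK  # two's complement of b: flip the bits, add 1
    return (a + neg_b) & MASK

assert sub_via_add(7, 3) == 4
# 3 - 7 wraps to 252 == 0b11111100, which reads as -4 in 8-bit two's complement
assert sub_via_add(3, 7) == 0b11111100
```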
What does this have to do with computers?
It’s just addition wearing a trench coat, a fake beard, and glasses.
The example was just to illustrate the idea, not to define division exactly like that.