# Math as a universal language.

#### Patriotic Voter

##### Get a vaccine, not COVID-19!
DP Veteran
Right, so once you run out of single digits, you have to start over with two digits, putting a 1 in the tens place to signify that you have one set of 10 and a 0 in the ones place to signify that you have no extras beyond that set of ten. Then 11 means you have 1 set of 10 and 1 extra, 12 means you have 1 set of 10 and 2 extras, etc.

Now imagine that you ran out of single digits a little bit earlier. No one ever invented 9, so you run out of digits once you get to 8. How would you keep counting after 8 if you didn't have any more single digits?
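That counting scheme is easy to mechanize. A small Python sketch (the helper name `to_base9` is my own) that writes numbers using only the digits 0 through 8:

```python
def to_base9(n: int) -> str:
    """Represent a non-negative integer using only the digits 0-8 (base 9)."""
    if n == 0:
        return "0"
    digits = []
    while n:
        n, r = divmod(n, 9)        # peel off one base-9 digit at a time
        digits.append(str(r))
    return "".join(reversed(digits))

# Counting past 8 without a 9: after 8 comes 10, 11, 12, ...
print([to_base9(n) for n in range(7, 12)])  # ['7', '8', '10', '11', '12']
```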

This is complete nonsense. The reason we have rounds of 10 is humans have 10 fingers.

#### AConcernedCitizen

DP Veteran
This is complete nonsense. The reason we have rounds of 10 is humans have 10 fingers.

That is actually quite silly. If we were going by fingers, we should be using base 11. If you start counting from zero fingers, you don't need to start over until you get to 11.

Regardless, however silly you may imagine it to be, binary is actually quite useful because of the way semiconductors work, and hexadecimal is useful because 16 is a power of 2, so each hex digit maps onto exactly four binary digits. Just because you don't personally understand them doesn't make them complete nonsense. You rely on them every day; you couldn't have posted your outrage about what nonsense they are without them.
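That hex/binary connection is easy to see in Python (a minimal sketch of my own): since 16 = 2^4, each hex digit corresponds to exactly one 4-bit group.

```python
n = 0b11100011              # the same number, three notations
print(hex(n), bin(n), n)    # 0xe3 0b11100011 227

# Splitting the binary into 4-bit nibbles reproduces the hex digits:
nibbles = [0b1110, 0b0011]
print([hex(x)[2:] for x in nibbles])  # ['e', '3']
```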

#### RAMOSS

DP Veteran
Did you miss the part that 2+2=10 in TRINARY? Are you under the impression that base 10 is the only numeric system in the world?

And, there are 10 kinds of people. Those who understand binary, and those who don't.

#### AConcernedCitizen

DP Veteran
And, there are 10 kinds of people. Those who understand binary, and those who don't.

That was almost as funny as when I said it in post #55.

#### Mithros

DP Veteran
Then you have no comprehension of number systems. Computers use both binary and hexadecimal number systems, i.e. base 2 and base 16. In binary there are only two digits, 0 and 1; thus 1+1=10 in binary, and 9+5=E in hexadecimal. Trinary would have only three digits, and base 4 only four. Counting in base 4 goes 0, 1, 2, 3, 10, 11, 12, 13, 20, etc. So 2+2=10 in base 4 (in trinary, 2+2=11).
You make good points, and you're about to start running into some of the higher aspects of mathematics. First we need to differentiate between the representation of a concept and the concept itself.

Hex and octal are convenient ways to represent binary. Since base 16 = 2^4 and base 8 = 2^3, they are trivially convertible: E3 (hex) = 1110 0011 (bin) = 011 100 011 (bin, regrouped in threes) = 343 (oct) = 14*16^1 + 3*16^0 = 227 (dec). It's also CCXXVII in Roman numerals. All of these are forms of representing a number. And when we get to computers it's even more varied. What about negative numbers? We could stick a bit on the end for the sign, or we could use two's-complement notation. Now what about fractions? Floating-point formats use a signed exponent with a signed significand. And different hardware and operating systems can represent these concepts in different ways, e.g. big-endian vs. little-endian.
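These conversions can be checked directly in Python (a sketch of my own; note E3 hex works out to 14*16 + 3 = 227 decimal):

```python
# Hex -> decimal/binary/octal with Python's built-in base handling.
n = int("E3", 16)
print(n, bin(n), oct(n))  # 227 0b11100011 0o343

# Two's complement: the same 8-bit pattern 1110 0011 read as a signed byte.
signed = n - 256 if n >= 128 else n
print(signed)  # -29
```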

Now let's put 227 marbles in a bag. You may say there are CCXXVII marbles; someone else might say there are E3 marbles, etc. The representation might change, but the concept it represents is immutable.

When people say that math is a universal language, they're not talking about the symbols used to represent the math; they're talking about the underlying concepts. Take your example of order of operations: we could require you to specify the order exactly, or we can allow the notational shorthand of multiplication/division before addition/subtraction.

What we think of as math is built from a core set of unprovable, self-evident axioms which define four operators (+, -, *, /) according to a few properties (commutative, associative, identity, inverse, distributive) plus comparators (=, !=, <, >):

Addition/Subtraction
a + b = b + a (commutative)
a + (b + c) = (a + b) + c (associative)
a + 0 = a (additive identity)
a + (-a) = 0, i.e. a - a = 0 (additive inverse)

Multiplication/Division
a * b = b * a (commutative)
a * (b * c) = (a * b) * c (associative)
a * 1 = a (multiplicative identity)
a * (1/a) = 1 for a != 0 (multiplicative inverse)

a * (b + c) = a*b + a*c (distributive)
0 != 1 (nontriviality)
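As a sanity check (not a proof), these axioms can be spot-checked on ordinary rational arithmetic; the sampling below is my own illustration:

```python
from fractions import Fraction
from itertools import product

samples = [Fraction(-2), Fraction(0), Fraction(1), Fraction(3, 7)]
for a, b, c in product(samples, repeat=3):
    assert a + b == b + a                   # commutative (+)
    assert a + (b + c) == (a + b) + c       # associative (+)
    assert a * b == b * a                   # commutative (*)
    assert a * (b * c) == (a * b) * c       # associative (*)
    assert a * (b + c) == a * b + a * c     # distributive
    assert a + (-a) == 0 and a + 0 == a     # additive inverse/identity
    if a != 0:
        assert a * (1 / a) == 1             # multiplicative inverse
print("all axiom checks passed")
```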

Essentially all of math can be derived and represented from those concepts. The idea of math as a universal language is that math starts from these self-evident axioms; an alien civilization would start with the same concepts. Since essentially all math can be derived from those universally self-evident concepts, math itself is a universal language, and with enough effort it's possible to understand anyone else's mathematics.

Well, at least until you get to Gödel's incompleteness theorems. But if we've gotten that far, we're not having this conversation.

#### AConcernedCitizen

DP Veteran
You make good points, and you're about to start running into some of the higher aspects of mathematics. First we need to differentiate between the representation of a concept and the concept itself.

Hex and octal are convenient ways to represent binary. Since base 16 = 2^4 and base 8 = 2^3, they are trivially convertible: E3 (hex) = 1110 0011 (bin) = 011 100 011 (bin, regrouped in threes) = 343 (oct) = 14*16^1 + 3*16^0 = 227 (dec). It's also CCXXVII in Roman numerals. All of these are forms of representing a number. And when we get to computers it's even more varied. What about negative numbers? We could stick a bit on the end for the sign, or we could use two's-complement notation. Now what about fractions? Floating-point formats use a signed exponent with a signed significand. And different hardware and operating systems can represent these concepts in different ways, e.g. big-endian vs. little-endian.

Now let's put 227 marbles in a bag. You may say there are CCXXVII marbles; someone else might say there are E3 marbles, etc. The representation might change, but the concept it represents is immutable.

When people say that math is a universal language, they're not talking about the symbols used to represent the math; they're talking about the underlying concepts. Take your example of order of operations: we could require you to specify the order exactly, or we can allow the notational shorthand of multiplication/division before addition/subtraction.

What we think of as math is built from a core set of unprovable, self-evident axioms which define four operators (+, -, *, /) according to a few properties (commutative, associative, identity, inverse, distributive) plus comparators (=, !=, <, >):

Addition/Subtraction
a + b = b + a (commutative)
a + (b + c) = (a + b) + c (associative)
a + 0 = a (additive identity)
a + (-a) = 0, i.e. a - a = 0 (additive inverse)

Multiplication/Division
a * b = b * a (commutative)
a * (b * c) = (a * b) * c (associative)
a * 1 = a (multiplicative identity)
a * (1/a) = 1 for a != 0 (multiplicative inverse)

a * (b + c) = a*b + a*c (distributive)
0 != 1 (nontriviality)

Essentially all of math can be derived and represented from those concepts. The idea of math as a universal language is that math starts from these self-evident axioms; an alien civilization would start with the same concepts. Since essentially all math can be derived from those universally self-evident concepts, math itself is a universal language, and with enough effort it's possible to understand anyone else's mathematics.

Well, at least until you get to Gödel's incompleteness theorems. But if we've gotten that far, we're not having this conversation.
I believe the word you were looking for was commutative, rather than communicative.

Also, just to be a pedant, I will point out that multiplication isn't always commutative.

For example, when multiplying basis elements for quaternions, ij != ji.
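This is easy to verify numerically. A small sketch (the `(w, x, y, z)` tuple layout and the helper name `qmul` are my own choices) showing ij = k but ji = -k:

```python
def qmul(p, q):
    """Hamilton product of two quaternions given as (w, x, y, z) tuples."""
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return (pw*qw - px*qx - py*qy - pz*qz,
            pw*qx + px*qw + py*qz - pz*qy,
            pw*qy - px*qz + py*qw + pz*qx,
            pw*qz + px*qy - py*qx + pz*qw)

i = (0, 1, 0, 0)
j = (0, 0, 1, 0)
print(qmul(i, j))  # (0, 0, 0, 1)  = k
print(qmul(j, i))  # (0, 0, 0, -1) = -k
```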

#### ttwtt78640

##### Sometimes wrong
DP Veteran
Math is also racist. The "Woke" crybabies said so.

Yep, it is too objective and thus not open to different cultural interpretations.

#### Mithros

DP Veteran
I believe the word you were looking for was commutative, rather than communicative.

Also, just to be a pedant, I will point out that multiplication isn't always commutative.

For example, when multiplying basis elements for quaternions, ij != ji.
Spelling is obviously not my forte.

And if we're being pedantic, * is an operator. Operators have definitions, and when you define an operator you also define its properties. Obviously order matters for quaternions or any other vector/matrix math, but matrix multiplication is a different operator than scalar multiplication: it's defined in terms of scalar multiplication and addition. Even basic positional number representation is defined by scalar multiplication and addition.
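A quick illustration of that point (a plain-list sketch with my own helper name): matrix multiplication is built from scalar * and +, and order visibly matters.

```python
def matmul(A, B):
    """Multiply matrices given as nested lists, using only scalar * and +."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[1, 2], [0, 1]]
B = [[1, 0], [3, 1]]
print(matmul(A, B))  # [[7, 2], [3, 1]]
print(matmul(B, A))  # [[1, 2], [3, 7]]  -- AB != BA
```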

Of course we can define other mathematical systems from different axioms, which then likely have the problem of not being internally consistent. But that's getting pretty far off topic, and brings us back to Gödel.

But going back to math as a universal language, it's said to be so because essentially all of math is derived from a small set of self-evident axioms, and we have yet to find any evidence that an alternative consistent but incompatible system is possible. Thus any sufficiently advanced civilization **must** share equivalent mathematical concepts, even if the symbols and representations are totally different.