Then you have no comprehension of number systems. Computers use both binary and hexadecimal number systems, or base 2 and base 16. In binary there are only two digits, 0 and 1, so 1+1=10 in binary. Likewise, 9+5=E in hexadecimal. Ternary has only three digits, and base 4 has only four. Counting in base 4 goes 0, 1, 2, 3, 10, 11, 12, 13, 20, etc. So 2+2=10 in base 4.
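To make that concrete, here's a quick Python sketch (the `to_base` helper is mine, just for illustration) showing that the arithmetic is the same in every base; only the digit strings change:

```python
# int(s, base) parses a digit string in the given base.
assert int("1", 2) + int("1", 2) == int("10", 2)   # 1+1 = 10 in binary
assert 0x9 + 0x5 == 0xE                            # 9+5 = E in hex

def to_base(n, base):
    """Render a non-negative integer n as a digit string in bases 2..16."""
    digits = "0123456789abcdef"
    if n == 0:
        return "0"
    out = ""
    while n:
        out = digits[n % base] + out
        n //= base
    return out

# Counting in base 4: 0, 1, 2, 3, 10, 11, 12, 13, 20, ...
print([to_base(i, 4) for i in range(9)])
assert to_base(2 + 2, 4) == "10"                   # 2+2 = 10 in base 4
```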
You make good points, and you're about to start running into some of the higher aspects of mathematics. First we need to differentiate between the representation of a concept and the concept itself.
Hex and octal are convenient ways to represent binary. Since base 16 = 2^4 and base 8 = 2^3, they are trivially convertible. Thus F3 (hex) = 1111 0011 (bin) = 011 110 011 (bin) = 363 (oct) = (15*16^1 + 3*16^0) = 243 dec. It's also CCXLIII in Roman numerals. All of these are ways of representing the same number. And when we get to computers it's even more varied. What about negative numbers? We could stick a bit on the end for the sign, or we could use two's complement notation. Now what about fractions? Floating point uses a signed exponent with a signed significand. And different hardware / OSes can represent these concepts in different ways, e.g. big endian vs little endian.
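A quick Python sketch of those two machine-level points: the same bit pattern reads differently under unsigned vs two's complement interpretation, and the same value is laid out in memory in different byte orders:

```python
import struct

# Two's complement: the same 8 bits mean different numbers
# depending on whether we read them as unsigned or signed.
bits = 0b11100011
unsigned = bits                                  # 227
signed = bits - 256 if bits & 0x80 else bits     # sign bit set, so wrap
assert unsigned == 227
assert signed == -29

# Endianness: the same 16-bit value, packed as bytes two ways.
value = 0xF3A1
assert struct.pack(">H", value) == b"\xf3\xa1"   # big endian
assert struct.pack("<H", value) == b"\xa1\xf3"   # little endian
```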
Now let's put 243 marbles in a bag. You may say there are CCXLIII marbles, someone else might say there are F3 marbles, etc. The representation might change, but the concept it represents is immutable.
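Python makes the one-quantity-many-notations point directly (Roman numerals aside, which the stdlib doesn't do):

```python
marbles = 243
# The count is fixed; only the notation varies.
assert format(marbles, "X") == "F3"        # hexadecimal
assert format(marbles, "o") == "363"       # octal
assert format(marbles, "b") == "11110011"  # binary

# Every representation parses back to the same immutable quantity.
assert int("F3", 16) == int("363", 8) == int("11110011", 2) == 243
```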
When people say that math is a universal language, they're not talking about the symbols used to represent the math. They're talking about the underlying concepts. Take your example of order of operations: we could require you to specify the order exactly, or we can allow the notational shorthand of mult/div before add/sub.
What we think of as math is built from a core set of unprovable, self-evident axioms which define four operators (*, /, +, -) according to a handful of properties (commutative, associative, identity, inverse, distributive) and comparators (=, !=, <, >):
Addition/Subtraction
a+b = b+a (commutative)
a+(b+c) = (a+b) + c (associative)
a+0 = a (identity)
a+(-a) = 0 && a-a = 0 (inverse)
Multiplication/Division
a*b = b*a (commutative)
a*(b*c) = (a*b)*c (associative)
a*1 = a (identity)
a*0 = 0 (zero property)
a*(1/a) = 1, for a != 0 (inverse)
Additional
a*(b+c) = a*b + a*c (distributive)
0 != 1 (distinct identities)
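You can spot-check all of these properties on random rationals in a few lines of Python. This is a sanity check, not a proof; the axioms are the starting point, not something you verify:

```python
from fractions import Fraction
import random

random.seed(1)

def rand_q():
    """A random rational with small numerator and denominator."""
    return Fraction(random.randint(-9, 9), random.randint(1, 9))

for _ in range(100):
    a, b, c = rand_q(), rand_q(), rand_q()
    assert a + b == b + a                  # commutative (+)
    assert a + (b + c) == (a + b) + c      # associative (+)
    assert a + 0 == a and a + (-a) == 0    # identity / inverse (+)
    assert a * b == b * a                  # commutative (*)
    assert a * (b * c) == (a * b) * c      # associative (*)
    assert a * 1 == a and a * 0 == 0       # identity / zero property (*)
    if a != 0:
        assert a * (1 / a) == 1            # inverse (*)
    assert a * (b + c) == a * b + a * c    # distributive
```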
Essentially all math can be derived and represented from those concepts. The idea of math as a universal language is that math starts out with these self-evident axioms. An alien civilization would start from the same concepts. Since (essentially) all math can be derived from those universally self-evident concepts, math itself is a universal language. And thus, with enough effort, it's possible to understand anyone else's mathematics.
Well, at least until you get to Gödel's incompleteness theorems. But if we've gotten that far, we're not having this conversation.