Many comments mention that Roman numerals were not used for arithmetic. I cannot comment on that from a historical point of view, but I can say that addition and subtraction are easier with Roman numerals than with decimal notation (for those numbers representable in Roman numerals, of course). For instance, let’s do CCXXXVII + CLVIII. You just throw all the symbols together in descending order and then reduce:
CCCLXXXVVIIIII -> CCCLXXX(VV)(IIIII) = CCCLXXXXV = CCCXCV
Things like IX would have to be treated as single symbols in this process, or they can be expanded to VIIII before combining.
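The concatenate-and-reduce method above can be sketched in Python. This is a minimal illustration, not a historical tool: the symbol tables, the reduction rules, and the function names are my own, and it assumes standard-form inputs (each operand is a valid Roman numeral).

```python
# Sketch of "concatenate and reduce" Roman-numeral addition.
# Subtractive forms (IV, IX, ...) are expanded first, as the text suggests.

VALUES = {"I": 1, "V": 5, "X": 10, "L": 50, "C": 100, "D": 500, "M": 1000}
EXPAND = {"IV": "IIII", "IX": "VIIII", "XL": "XXXX", "XC": "LXXXX",
          "CD": "CCCC", "CM": "DCCCC"}
# Five of a unit becomes the next "five" symbol; two "fives" become a unit.
REDUCE = [("IIIII", "V"), ("VV", "X"), ("XXXXX", "L"), ("LL", "C"),
          ("CCCCC", "D"), ("DD", "M")]
# Restore subtractive notation at the end (longest patterns first).
COMPRESS = [("DCCCC", "CM"), ("CCCC", "CD"), ("LXXXX", "XC"),
            ("XXXX", "XL"), ("VIIII", "IX"), ("IIII", "IV")]

def expand(roman):
    for pat, rep in EXPAND.items():
        roman = roman.replace(pat, rep)
    return roman

def add(a, b):
    # Throw everything together, sort from largest symbol to smallest,
    # then reduce runs, re-sorting after each carry.
    merged = "".join(sorted(expand(a) + expand(b),
                            key=lambda s: -VALUES[s]))
    for run, sym in REDUCE:
        merged = merged.replace(run, sym)
        merged = "".join(sorted(merged, key=lambda s: -VALUES[s]))
    for run, sym in COMPRESS:
        merged = merged.replace(run, sym)
    return merged

print(add("CCXXXVII", "CLVIII"))  # CCCXCV
```

Running the worked example from the text, CCXXXVII + CLVIII, reproduces CCCXCV (237 + 158 = 395).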
Subtraction is equally easy. To do CCXXXVII − CLVIII we remove C, leaving CXXXVII; then L (first breaking the C into LL), leaving LXXXVII; then V, leaving LXXXII; and finally three I. Removing the first two gives LXXX, and for the last one we break an X into VV and a V into IIIII, giving LXXVIIII, i.e. the final result LXXIX.
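The cancellation-with-borrowing idea can be sketched the same way. Again this is my own illustration, assuming the minuend is at least as large as the subtrahend and both are valid Roman numerals; "borrowing" means breaking a larger symbol into smaller ones until a match exists.

```python
# Sketch of Roman-numeral subtraction by cancelling matching symbols,
# borrowing (breaking a bigger symbol apart) when no match is available.

VALUES = {"I": 1, "V": 5, "X": 10, "L": 50, "C": 100, "D": 500, "M": 1000}
BREAK = {"V": "IIIII", "X": "VV", "L": "XXXXX",
         "C": "LL", "D": "CCCCC", "M": "DD"}
EXPAND = {"IV": "IIII", "IX": "VIIII", "XL": "XXXX", "XC": "LXXXX",
          "CD": "CCCC", "CM": "DCCCC"}
COMPRESS = [("DCCCC", "CM"), ("CCCC", "CD"), ("LXXXX", "XC"),
            ("XXXX", "XL"), ("VIIII", "IX"), ("IIII", "IV")]

def expand(roman):
    for pat, rep in EXPAND.items():
        roman = roman.replace(pat, rep)
    return roman

def subtract(a, b):
    """Compute a - b, assuming a >= b."""
    symbols = list(expand(a))
    for s in expand(b):
        while s not in symbols:
            # Borrow: break the smallest available symbol larger than s.
            bigger = min((t for t in symbols if VALUES[t] > VALUES[s]),
                         key=lambda t: VALUES[t])
            symbols.remove(bigger)
            symbols.extend(BREAK[bigger])
        symbols.remove(s)
    result = "".join(sorted(symbols, key=lambda t: -VALUES[t]))
    for run, sym in COMPRESS:
        result = result.replace(run, sym)
    return result

print(subtract("CCXXXVII", "CLVIII"))  # LXXIX
```

On the worked example, removing C, then L (borrowed from a C), then V, then three I reproduces LXXIX (237 − 158 = 79).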
Well, basically, they didn’t. Most Roman maths was addition and subtraction, and they used tables and the abacus for multiplying. I don’t remember how they did division, but it must have been painful.
On top of that, most of this was integers with a very limited range: no zero, and no negative numbers in the modern sense. For some uses they had a kind of fraction system based on twelfths.
So, for the Romans, “26 divided by 4 is 6, remainder 2” was *advanced* math. And “26 divided by 4 is 6 and a semis” (a *semis* = 6/12, i.e. one half) was *very advanced* math.
Forget things like calculus, that was 1500 years later.
There was no need for advanced mathematics in daily life; almost all math was “how many goods?” or “how far is it?”
Eratosthenes and Pythagoras are famous for understanding “how far is it?” really well.
Even if Eratosthenes or Pythagoras understood advanced mathematics, the majority wouldn’t have understood anything they were trying to tell them.
They didn’t. They never made it past natural and rational numbers. That’s whole positive numbers and fractions without zero.
The mathematical proof that the square root of two is not rational was, according to legend, suppressed by Pythagorean scholars. If you believe the hearsay, [it got buried along with the body of the guy who first proved it.](https://www.youtube.com/watch?v=nFiv00Msw5k)
The concept of zero, which came to Europe from India by way of the Arab world, was resisted in Europe for centuries. It was only widely accepted around the year 1200.
A lot of modern math is less than 500 years old. Analytic geometry was developed in the early 1600s, and calculus just before 1700 by Newton and Leibniz.
They didn’t, depending on your definition of advanced mathematics. My husband took a history of mathematics course and had to do math the way the Greeks, Romans, and Egyptians did, and it was absolutely tedious. One of the key points the professor made during the class is that the choice of mathematical notation can be a hindrance or an asset to the advancement of further mathematics. The Arabic numerals we use today have nice properties that have led to their continued use.
That’s the best part – they didn’t. Some Euclidean geometry was the most complicated math they got up to, but they didn’t really have a concept of analytical geometry, so they didn’t calculate any of it. It was just straightedge, compass, and what you could construct using those – no numerical evaluation.