Here is a question about this article: As in the Scheiner system, speeds were expressed in 'degrees'. Originally the sensitivity was written as a fraction with 'tenths' (for example "18/10° DIN"), where the resultant value 1.8 represented the relative base 10 logarithm of the speed. 'Tenths' were later abandoned with DIN 4512:1957-11, and the example above would be written as "18° DIN". The degree symbol was finally dropped with DIN 4512:1961-10. This revision also saw significant changes in the definition of film speeds in order to accommodate then-recent changes in the American ASA PH2.5-1960 standard, so that film speeds of black-and-white negative film would effectively be doubled; that is, a film previously marked as "18° DIN" would now be labeled as "21 DIN" without any change to the emulsion.
What is the answer to this question: How was sensitivity expressed at first in the DIN system?
as a fraction with 'tenths' (for example "18/10° DIN")
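The 18° → 21 relabeling follows directly from the logarithmic scale described in the passage. A minimal sketch (the helper name is hypothetical, not from the article): since a DIN degree is ten times the base-10 logarithm of the relative speed, doubling the speed adds 10·log₁₀(2) ≈ 3 degrees.

```python
import math

def din_from_speed(relative_speed):
    """DIN degrees: ten times the base-10 logarithm of the relative speed.

    Hypothetical helper illustrating the scale described in the passage;
    e.g. a relative speed of 10^1.8 corresponds to 18 DIN ("18/10 deg DIN"
    in the original fractional notation).
    """
    return 10 * math.log10(relative_speed)

# Doubling the speed adds 10*log10(2) ~= 3.01 degrees, so a film
# previously rated 18 DIN becomes 21 DIN after the 1961 redefinition
# doubled black-and-white negative speeds without emulsion changes.
old_din = din_from_speed(10 ** 1.8)          # 18.0
new_din = old_din + 10 * math.log10(2)       # ~21.01
print(round(old_din), round(new_din))        # → 18 21
```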