
# A caesium clock is capable of measuring time with an accuracy of $1$ part in $10^{11}$. If two such clocks run for $100$ years, find the difference in the times shown by the clocks.

$(a)\;0.032s \quad (b)\;0.01 s \quad (c)\;0.0032 s\quad (d)\;0.001s$

Possible fractional error $=\large\frac{1}{10^{11}}$
$100\; years=100 \times 365 \times 86400$ seconds
$\qquad\qquad \approx 3.2 \times 10^9\;s$
Difference in time $=\large\frac{1}{10^{11}}$$\times 3.2 \times 10^9$
$=0.032s$
Hence (a) is the correct answer.
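The arithmetic above can be checked with a short script (variable names are illustrative, not from the original):

```python
# Accumulated error of a clock with fractional accuracy 1 part in 10^11
fractional_error = 1 / 10**11
seconds_per_year = 365 * 86400            # ~3.15e7 s per year
run_time = 100 * seconds_per_year         # ~3.15e9 s in 100 years
difference = fractional_error * run_time  # maximum drift between the two clocks
print(round(difference, 3))               # prints 0.032
```

Note that $3.2 \times 10^9$ in the solution is a rounded value; the unrounded product gives $\approx 0.0315\;s$, which still rounds to option (a).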

edited Jan 9, 2014 by meena.p