A Brief History of Timekeeping

Humanity has always sought to measure the passage of time in an accurate and sensible way. Seasons, days, and nights used to be good enough. But now...

By Jasmine Noel, 22 September 2016

In the beginning… well, in the beginning it was hard to know when anything began, as there weren’t any clocks around to measure beginnings or endings. We humans have always felt an innate need to measure and understand the world around us, and we have spent a few millennia honing our timekeeping techniques.

The journey of the sun across the sky was our first way of keeping time, but soon we noticed that sunlight gave us more information than we thought, and so in places like Egypt sundials appeared, using the slow creep of shadows from a fixed object to mark the progress of the day. It may surprise you to learn that the Ancient Egyptian obelisks were, in fact, clocks that told time by their shadows.

Water clocks, which drained at a constant rate to mark the passage of time, came later, and somewhere along the line the concept of dividing the day into 12 units arose. Originally, these 12 units divided the light hours and the dark hours separately, so their length varied with the gradually shifting sunrise and sunset. Eventually, the entire day was divided into a fixed 24-hour period, no longer tied to the light and dark parts of the day.

Many of the first mechanical clocks did not even have minutes demarcated on them: dividing an hour into quarters, or at most twelfths, was considered accurate enough. Fast forward a few hundred years and we have learned neat tricks with vibrating quartz and caesium atoms, and solved science-fiction-level problems like “distributed cavity phase and microwave lensing frequency shifts.” We have built machines capable of measuring time in increments so tiny they defy the imagination.

Time is what we use to make sense of events and the order in which they occur. But resolution matters: for a human, things happening within the same second are often considered simultaneous. To the human eye, it’s hard to tell whether the first or third picture in an online album loaded first, so we simply treat them as loading “at the same time.” In other words, our human timestamp resolution is about one second.

But machines can take in information, process it, make a decision, and act in a fraction of the time it takes for humans to even realize they’ve seen something. Now we must be able to tell time, and record it, at the level of the machines that run some of our most important tasks, not the least of which is moving trillions of dollars through the financial markets each day.

We need a machine-time watch for a machine-time world, one that sees and records in nanoseconds. This soon-to-be universal reality is already becoming a regulatory reality in finance, the fastest of industries: MiFID II mandates that all business clocks involved in high-speed trading be synchronized to within 100 microseconds of Coordinated Universal Time (UTC), based on an atomic clock, with timestamp resolution of a microsecond or better. Even this may not be enough to say reliably what happened first in the eyes of the trading machines conducting transactions on behalf of us humans.
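To make the resolution gap concrete, here is a minimal Python sketch (an illustration only, not any particular trading or timekeeping system) that reads the operating system clock in nanoseconds and shows how two events that look simultaneous at human, one-second resolution are unambiguously ordered at machine resolution. Actual MiFID II compliance also requires clock synchronization to UTC (for example via PTP and atomic references), which this sketch does not attempt.

```python
import time

# Read the wall clock twice, in integer nanoseconds since the epoch.
t1 = time.time_ns()
t2 = time.time_ns()

elapsed_ns = t2 - t1
print(f"Two back-to-back reads differ by {elapsed_ns} ns "
      f"({elapsed_ns / 1_000:.3f} microseconds)")

# At one-second "human" resolution the two reads collapse into the same
# instant; at nanosecond resolution their ordering is still visible.
same_human_second = (t1 // 1_000_000_000) == (t2 // 1_000_000_000)
print(f"Indistinguishable to a human observer: {same_human_second}")
print(f"Machine ordering preserved: t1 < t2 is {t1 < t2}")
```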

If only we were still in a world where the difference between day and night was the biggest distinction we needed to spot. But the reality is that when machines are involved, the units of time we require to understand what happens, and why, are on the scale of micro- or nanoseconds. Can you be sure, as sure as the sun rises and sets, which causes produced which effects if you can’t see which event happened first?


Jasmine Noel, Product Marketing & Sales Enablement, Corvil
Corvil safeguards business in a machine world. We see a future where all businesses trust digital machines to algorithmically conduct transactions on their behalf. For some businesses, this future is now.
@corvilinc
