You don't have to look far to find the time. Even if you're not wearing a watch, there's probably a clock on the wall. Or else it's pulsing away on your mobile phone or the computer screen or the car dashboard.
We've never had so many opportunities to know so accurately how late we're running.
The measuring of time is central to how we structure each day. The alarm clock wakes us up, there's a timetable rationing out the school day, when we get home we check the television schedule. Whether it's catching a flight or going to see a movie, it all depends on a mutual understanding of the time.
But how do we all agree on the time? Where did our measurement of time begin? And how is the precise length of a day determined?
Looking for the origin of our modern timekeeping is like looking for the origin of a language. It doesn't have a single source, but has roots in different cultures, stretching back thousands of years. For instance, we divide an hour into 60 minutes and a minute into 60 seconds because the ancient Babylonians used 60 as the basic unit of their counting system. And their division of hours and minutes into sixtieths was adopted by later civilisations.
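That Babylonian base-60 inheritance is still visible whenever a raw count of seconds is split into hours, minutes and seconds: two successive divisions by 60. A minimal sketch in Python (the function name is my own):

```python
def to_hms(total_seconds: int) -> tuple[int, int, int]:
    """Split a count of seconds into (hours, minutes, seconds)
    using the Babylonian base-60 divisions."""
    minutes, seconds = divmod(total_seconds, 60)  # first division by 60
    hours, minutes = divmod(minutes, 60)          # second division by 60
    return hours, minutes, seconds

print(to_hms(7384))  # 7,384 seconds = 2 hours, 3 minutes, 4 seconds
```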
Like a language, the measurement of time is always changing, influenced by political and economic demands as well as scientific ones. The Romans changed the names of months to flatter emperors, and in our era we change the clocks twice a year to accommodate winter and summer.
There is nothing inevitable about the current western calendar. The Aztec year had 13 months and was based on a 260-day cycle. If the Babylonians had had a decimal system, maybe our wristwatches would show 100 minutes to the hour.
But whenever our earliest human ancestors decided to take an interest in the passing of time, the first benchmarks would have been from observing nature in the lengthening and shortening of the day, the turning of the seasons and the changes in the night sky.
People from the earliest times watched the Moon and followed its month-long cycle. There are scratches in bone carved by European hunters 20,000 years ago which are thought to count the days of the phases of the Moon. By 3000 BC the Sumerians were using a 30-day month.
The other great natural timepiece is the Sun. The ancient Egyptian astronomers had established by about 4000 BC that there was a solar cycle which lasted about 365 days, which became their measurement for a year.
This time-keeping wasn't just a matter of scientific curiosity. The setting of a calendar went hand in hand with the development of agriculture and towns. The cycle of sowing and harvesting, paying wages and collecting taxes all benefited from a reliable calendar.
The Romans, who borrowed much from earlier timekeeping systems, took the process of codification a step further, imposing a fixed calendar across their empire. The year, consisting of 12 months, was to begin on January 1 and would have 365 days, except for a leap year when another day would be added. We still use the names of their months, such as July and August, and these reforms instituted by Julius Caesar were used in Britain until the 18th century.
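Caesar's rule, a fixed extra day every fourth year, can be written in a single line; the modern Gregorian calendar that replaced it in 18th-century Britain only refines the rule for century years. A hedged sketch (function names are my own):

```python
def is_julian_leap(year: int) -> bool:
    # Julian rule: every fourth year gets the extra day.
    return year % 4 == 0

def is_gregorian_leap(year: int) -> bool:
    # Gregorian refinement: century years are leap years
    # only when divisible by 400.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# The two rules diverge at century years such as 1700, 1800 and 1900:
print(is_julian_leap(1900), is_gregorian_leap(1900))  # True False
```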
They also came to use a seven-day week. And considering that there is no natural cycle behind it, the week has proved a great survivor of timekeeping and has fended off repeated attempts to displace it.
Among the last attempts to scrap the seven-day week was the Soviet Union's in 1929. In an attempt to increase industrial output and remove the religious associations of Sunday, the government imposed a five-day week and a six-week month.
The system was based on a complicated division of the workforce into continuous shift patterns. But this new system buckled under two pressures.
First, the loss of the weekend meant there was no common time when workers could socialise. And second, the peasants continued to base their market days and farming on the old seven-day patterns. The revolutionary month was dropped in favour of the seven-day week in 1940.
However the months are counted, a more practical, everyday question has been how to tell the time during the day. And the earliest methods of measuring the hour also depended on using nature, in the form of sun and water.
The Egyptians were using sun-dials in 3500 BC, with the movement of a shadow on a dial measuring the passing day. By about 1500 BC there were portable versions. But the length of an "hour" shown on these dials, as a proportion of the day, would vary depending on the time of year. When there was not enough sun, the Egyptians used water clocks, where the dripping of a fixed amount of water through a small hole measured the passing of time.
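Because the daylight itself was divided into twelve parts, one of these "seasonal hours" is simply the day's sunlight divided by 12 — longer in summer, shorter in winter. A small illustration (the daylight figures are round examples of my own, not measurements):

```python
def seasonal_hour_minutes(daylight_hours: float) -> float:
    """One Egyptian 'hour' = the day's sunlight divided into 12 parts,
    expressed here in modern minutes."""
    return daylight_hours * 60 / 12

# Illustrative daylight lengths for a mid-latitude site (assumed values):
print(seasonal_hour_minutes(14))  # long summer day: 70.0 minutes per "hour"
print(seasonal_hour_minutes(10))  # short winter day: 50.0 minutes per "hour"
```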
The Romans and Greeks further refined the concept of marking out intervals of time, developing mechanical devices that counted out the hours. These could be elaborate public monuments, such as the Tower of the Winds in Athens, built in the 1st century BC, which combined a water clock and sun-dials.
After the break-up of the Roman empire, the technology of timekeeping slipped into abeyance, and it must be assumed that for most people telling the time was a rough-and-ready process, perhaps relying simply on landmarks in the day, such as sunrise, noon or the moment the sun moved below a certain point in the landscape.
The ancestors of modern clocks and watches did not appear in western Europe until the 11th century, when monasteries began to develop primitive alarm clocks to get monks out of bed for their prayers. The monastic day was based around a timetable of prayer and clocks could ring the hours.
By the 14th century, these had evolved into more sophisticated timepieces which could be placed in churches or in the homes of the wealthy. They were not particularly accurate but offered a public and more objective version of time.
These early clocks, right through to the 17th century, were likely to lose or gain 15 minutes a day, and this wouldn't have mattered much. In a pre-industrial society, with no one clocking on or off, there was little need to worry about the minute hand, and in many ways this more haphazard approach reflected nature. But this looser interpretation of time drew to a close when the addition of a pendulum by the Dutch scientist Christiaan Huygens in 1657 brought clocks into a new era of accuracy, with the daily fluctuation measured in seconds rather than minutes.
And the prospect of a very accurate timekeeper had a great significance to a maritime economy such as Britain's. The navigation of ships depended on knowing exactly how far they had travelled westwards or eastwards in an ocean without any visible landmarks. And the key to establishing this longitude was to find a clock which could measure time in a way that was constant, regardless of the climate or sea conditions.
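The arithmetic behind the longitude problem is simple once a reliable clock is on board: the Earth turns 360 degrees in 24 hours, so each hour of difference between local solar time and the chronometer's home-port time corresponds to 15 degrees of longitude. A sketch of that calculation (function and sample figures are my own):

```python
def longitude_from_clocks(local_solar_hour: float, reference_hour: float) -> float:
    """Degrees of longitude from the gap between local solar time and the
    reference time (e.g. Greenwich) kept by a chronometer.
    The Earth rotates 15 degrees per hour; positive = east, negative = west."""
    return (local_solar_hour - reference_hour) * 15.0

# A ship observes local noon while the chronometer reads 15:00 Greenwich time:
print(longitude_from_clocks(12.0, 15.0))  # -45.0, i.e. 45 degrees west
```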
A prize of £20,000 was offered by the British government in 1714 to build this "marine chronometer". A self-taught clockmaker, John Harrison, obsessively worked on this project for 42 years, refining his efforts from a clock the size of a cupboard to something that would fit inside a big pocket. These chronometers are on show at the Old Royal Observatory in Greenwich.
In the process John Harrison set new standards of design and accuracy, producing a clock that had a margin of error of only a third of a second a day.
The connection of astronomy with time, which could be traced back to the Babylonians, was broken only in the second half of the last century when time became a scientific calculation, made without reference to the stars.
Modern time is calibrated by atomic clocks, which rely on the extremely regular oscillations of atoms. The first example was developed in the United States in 1949, when the National Bureau of Standards began measuring the vibrations of an ammonia molecule.
Six years later, the National Physical Laboratory in Britain began measuring time using an atomic clock based on the caesium atom. Caesium-based atomic clocks have set the accepted standard for time and are accurate to one second in 20 million years.
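"One second in 20 million years" can be turned into a fractional error for comparing clocks: divide one second by the number of seconds in 20 million years. A back-of-the-envelope check (using 365.25-day years):

```python
# Seconds in one year of 365.25 days:
seconds_per_year = 365.25 * 24 * 3600           # about 3.156e7 seconds

# One lost second over 20 million such years:
error = 1 / (20_000_000 * seconds_per_year)

print(f"fractional error = {error:.1e}")        # about 1.6e-15
```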