Did you know that at work you probably should never type into your computer’s search engine the words “biggest clock” because Auto-Correct can do strange things to get you into trouble with IT?
Instead, I decided to see if I could find anything about Japanese clocks - on the home computer. After spending hours with Alice down the rabbit hole, I decided I should look up stuff about those timepieces.
Granted, this isn’t the latest news around—it’s from 2015… but if Albert Einstein has taught me anything, it is that time is relative to the observer.
That’s why it can be midnight where I am, and 5PM the next day where you are.
Let's say it's 5PM right now (It's 5PM right now) (Thank you).
If you look at the sun right now (provided it's not cloudy where you are), you are seeing light that is about eight minutes old. Light that leaves the sun at 5PM doesn’t actually hit your face until about 5:08PM. That means that if the sun, in its own observable time, blew up a millisecond past 5:01PM, you wouldn’t know about it until just past 5:09PM.
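Just for fun, here's a back-of-the-envelope check on that eight-minute figure in Python (the constants are the standard textbook values, not anything from the article):

```python
# Light covers the Sun-Earth distance in roughly eight minutes.
SPEED_OF_LIGHT_M_S = 299_792_458          # metres per second
SUN_EARTH_DISTANCE_M = 149_597_870_700    # one astronomical unit, in metres

delay_s = SUN_EARTH_DISTANCE_M / SPEED_OF_LIGHT_M_S
minutes, seconds = divmod(delay_s, 60)
print(f"Light delay: {int(minutes)} min {seconds:.0f} s")  # about 8 min 19 s
```

So "eight minutes" is really closer to eight minutes and nineteen seconds, but who's counting.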
Time is a man-made concept anyways… so whatever. Einstein... I understand.
Which is a perfect segue into how a team from the University of Tokyo, led by professor Katori Hidetoshi (surname first), has built clocks so accurate that they will only lose about one second every 16 billion years.
These are cryogenic optical lattice clocks… and apparently they are so precise that current technology can’t even measure their error.
Which leads me to wonder then: So how the fug do you know it loses one second of time every 16 billion years?
How do you measure something like that while saying it is impossible to measure?
Guess what… that figure of one second per 16 billion years is, in reality, an estimate made by the researchers.
|The two cryogenic optical lattice clocks made by the University of Tokyo.|
Okay… I did my best, but I don’t understand what that means without a better frame of reference.
Let’s see… the current best way we have of measuring an exact time (again, it’s a made-up concept, ergo any measurement of it is both correct and incorrect at the same time) is via atomic clocks.
Apparently the faster a clock ticks, the more precise it is.
In an atomic clock, the “pendulum” is the radiation which excites the transition between two atomic states of different energy.
Nope… that didn’t really clear things up. Sorry… I tried.
The Tokyo clock is a very delicate system that, like in the old days of computer technology when a computer was as large as a house but had less computing power than a standard iPhone, only operates optimally under special conditions: around -180C (-292F)… which means it’s going to be a real bugger of a time to determine anything, except that it’s time to stop being so anal about time measurement.
The cold temperature is required to reduce the impact of the surrounding electromagnetic waves, which helps the clock’s accuracy.
“Hey John… what time does your new Japanese Seiko watch say it is?
“John… John… John?
“Crap. He’s dead. Time of death at… I can’t read it because the watch glass has iced over.”
It was only after the University of Tokyo researchers connected the two cryogenic optical lattice clocks and let them run together for a month that they could say anything at all. Since the difference between the two clocks was sooooooo tiny, and actually impossible to measure by today’s equipment, the researchers extrapolated that the two clocks would drift apart by one second after 16 billion years.
That’s right… too tiny to measure, so the Japanese scientists GUESSED the time lag, and guessed what it would be over 16 billion years.
A guess… that is as inaccurate as knowing what time it is.
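For the curious, here's roughly how that kind of extrapolation works arithmetically. A clock's drift is its fractional frequency offset multiplied by the elapsed time; the offset of about 2×10⁻¹⁸ below is my assumption, reverse-engineered from the one-second-per-16-billion-years claim, since the articles don't give the raw number:

```python
# Working backwards from "one second per 16 billion years":
# drift (in seconds) = fractional frequency offset * elapsed seconds.
SECONDS_PER_YEAR = 365.25 * 24 * 3600     # about 3.16e7 seconds

fractional_offset = 2e-18                 # assumed, to match the claim
years = 16e9
drift_s = fractional_offset * years * SECONDS_PER_YEAR
print(f"Drift over {years:.0e} years: {drift_s:.2f} s")  # about 1 second
```

In other words, nobody waits 16 billion years: you measure (or bound) a tiny frequency difference over a month, then multiply.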
Here’s what else the articles I read failed to provide: an EXPLANATION as to why one of the clocks would develop a lag at all.
Shouldn’t BOTH clocks have the same rate of time decay if all things are equal? And they should be equal, meaning all of the cryogenic optical lattice clocks are built exactly the same… ergo the cryogenic optical lattice clocks are the most accurate clocks.
But if the cryogenic optical lattice clocks have different rates of decay, then every single cryogenic optical lattice clock would have a different time loss over similar periods of time.
Measurable, and therefore more “real” in telling time, is the old-school cesium atomic clock, which is used to define “one second” and which can develop a one-second error every 30 million years. Apparently that is something human beings can actually measure.
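To put the two clocks on the same scale, you can convert "one second of error per N years" into a fractional error rate. A quick sketch (the 30-million-year and 16-billion-year figures are from the article; everything else is plain arithmetic):

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600

def fractional_error(years_per_second_lost):
    """One second of error per N years, expressed as a fractional rate."""
    return 1.0 / (years_per_second_lost * SECONDS_PER_YEAR)

cesium = fractional_error(30e6)    # roughly 1e-15
lattice = fractional_error(16e9)   # roughly 2e-18
print(f"cesium ~ {cesium:.1e}, lattice ~ {lattice:.1e}")
print(f"the lattice clock is ~{cesium / lattice:.0f}x tighter")
```

By this arithmetic the lattice clocks are about 500 times more accurate than the cesium standard, which is why the redefinition talk below isn't entirely crazy.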
Now… although I can’t see a practical time-measuring use for the cryogenic optical lattice clocks, those one-percenters who know believe that the technology could be applied to satellite-based global positioning systems and communications networks, while also serving as a foundation for various precision technologies.
That’s great… but what are we using now for these things? Something, right? So will using cryogenic optical lattice clock technology make things like GPS more accurate?
The University of Tokyo research team, who have created the world’s most accurate clocks based on a guesstimate, hope that the 47 people on this planet who are really into such things (instead of more interesting things like practicing procreation) will examine their research, saying: “Through improved precision, we hold high hopes for accelerated discussions on redefinition of the 'second’.”
Now just hold on a minute! Just how inaccurate are such things as our GPS and wristwatch now?
Like Albert Einstein said:
Time is an illusion.
And I am out of it.
PS: per the image above of Groucho Marx, Marxism is the worst thing I ever heard of.