Doug Peterson, Serkan Bolat, Mar 11, 2013
Commentary by Stephen Downes

Doug Peterson asks, "Just how do computers store time anyway? Does it think in days, hours, and minutes?" I actually have something like an answer to that question. Computers think in seconds (or, if they're stuffy, milliseconds). For a computer, any given time is a certain number of seconds after an arbitrary start date, known as the epoch. For Unix computers (and therefore Linux and, these days, Apple) the epoch started at 12 a.m. January 1, 1970 (GMT). Right now it's about 1.36 billion (I remember when it turned a billion - that was a crazy day). Windows has several epochs, including NT system time (January 1, 1601) and NTP (January 1, 1900). Meanwhile, .Net shares January 1, 1 as its epoch with the Dershowitz and Reingold source code (where it is known as Rata Die). For accuracy, systems like Windows can run a service that references an atomic clock server, such as the one run by the National Institute of Standards and Technology (NIST) in the United States. Everything else to do with time is just interface - structures of dates, hours, and time zones generated by algorithms from epoch time. So you see - for me (and for computers) time is very straightforward, but humans deal with it very poorly. Dealing with scheduling and calendars is a huge task, one far harder than seemingly complex domains like calculus and quantum physics. (Photo: John Kellden)
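The idea above - a bare count of seconds, with dates and times generated from it as "interface" - can be sketched in Python. This is just an illustration of the general principle, not anything specific to the systems named above:

```python
import time
from datetime import datetime, timezone

# A computer's "now" is just a count of seconds since the epoch
# (midnight, January 1, 1970, UTC on Unix-style systems).
now = time.time()
print(now)  # e.g. 1362960000.0 back in March 2013

# Everything else is interface: a human-readable date is
# generated by an algorithm from the epoch count.
print(datetime.fromtimestamp(now, tz=timezone.utc))

# The day the counter "turned a billion":
billion = datetime.fromtimestamp(1_000_000_000, tz=timezone.utc)
print(billion.isoformat())  # 2001-09-09T01:46:40+00:00
```

Running this shows that the billionth second fell on September 9, 2001 (UTC) - the "crazy day" mentioned above.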


Copyright 2015 Stephen Downes ~ Contact: stephen@downes.ca
Last Updated: Oct 17, 2017 11:53 p.m.