
What the heck is with MS and weird datetime precision? I figured out that a bug with a third-party API was due to SQL Server using 1/300-second ticks. Who would even think to check for that if you're not using their products?
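For anyone hitting the same thing: the legacy `datetime` type stores the time-of-day as 1/300-second ticks, which is why millisecond values snap to the .000 / .003 / .007 pattern. A rough Python sketch of that quantization (the helper name is mine, not SQL Server's actual rounding code):

    def to_sql_datetime_ms(ms: int) -> float:
        """Quantize a millisecond offset to the nearest 1/300-second tick."""
        ticks = round(ms * 300 / 1000)   # legacy datetime keeps 300 ticks per second
        return ticks * 1000 / 300        # back to milliseconds

    for ms in range(11):
        # SQL Server displays these stored values as .000 / .003 / .007
        print(f"{ms:2d} ms is stored as ~{to_sql_datetime_ms(ms):6.3f} ms")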



1/300 seconds is an odd one, for sure. In the case of DateTime, however, I'd say it is designed to hit as many use cases as possible with a 64-bit data structure. Using a 1970 epoch, as is (logically) done for Unix system times, naturally misses even basic use cases like capturing a birthdate.

It is actually quite hard to disagree with the 100 ns tick size they used. 1 microsecond may also have been reasonable, as it would have provided a larger range, but there are not many use cases for microsecond-accurate times very far in the past or future. Similarly, 1 nanosecond would have increased the applicability to higher-precision use cases but would have reduced the range to roughly 150 years. Alternatively, they could have used a 128-bit structure providing picosecond precision from the Big Bang to the solar system's flame-out, with the resulting size/performance implications.
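Back-of-the-envelope arithmetic for those ranges, assuming roughly 62 usable bits of ticks (DateTime reserves two of its 64 bits for the Kind flag):

    SECONDS_PER_YEAR = 365.2425 * 24 * 3600   # average Gregorian year

    for name, tick in [("100 ns", 100e-9), ("1 us", 1e-6), ("1 ns", 1e-9)]:
        years = (2**62 * tick) / SECONDS_PER_YEAR
        print(f"{name:>6} ticks: ~{years:,.0f} years of range")
    # -> roughly 14,600 years at 100 ns, 146,000 years at 1 us, 146 years at 1 ns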


Godspeed trying to parse dates from Excel. They have bugs _intentionally built in_ (the 1900 leap-year bug, kept for Lotus 1-2-3 compatibility, for one).


Who would even think to check if the system is counting from 1970/01/01 if you're not using their products?


If they're using ISO format, I don't really care what they're counting from. But I do care when some ISO dates are preserved exactly and some are rounded to another value… especially when that rounding is non-obvious. It took me months just to identify the pattern clearly enough to find an explanation. Up to that point we simply had an arbitrary allowance that the value might vary by some unknown amount, and we looked for the closest one.
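To make the workaround concrete, this is roughly what the "closest within some fudge factor" matching looked like (a sketch with made-up names, not our actual code):

    from datetime import datetime, timedelta

    def closest_match(target, candidates, tolerance=timedelta(milliseconds=4)):
        """Return the candidate datetime nearest to `target`, if within `tolerance`."""
        best = min(candidates, key=lambda c: abs(c - target), default=None)
        if best is not None and abs(best - target) <= tolerance:
            return best
        return None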


This is an established standard.

It's really not any stranger than starting dates at 1 CE, or array indexes starting at 0.


> This is an established standard.

Established doesn't mean it is understandable without documentation. Anyone who isn't familiar with it doesn't know why it starts in 1970 and counts seconds. You actually need to open the documentation to learn that, and its name (be it Unix time, epoch, epoch time, or whatever) doesn't help in understanding what it is or what unit it uses.
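For reference, though, the entire definition fits in a few lines of Python: Unix time is just "seconds since 1970-01-01T00:00:00 UTC", ignoring leap seconds.

    from datetime import datetime, timezone

    epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
    now = datetime.now(timezone.utc)
    unix_seconds = (now - epoch).total_seconds()   # same as now.timestamp()
    print(unix_seconds, datetime.fromtimestamp(unix_seconds, tz=timezone.utc))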


The metric system is also not understandable without documentation, but you don't need to explain it every time, because every living person should have had that documentation drilled into them by age ten.

UNIX time is an easy international standard; everybody with computer experience knows what it is and how to work with it.


> everybody with computer experience knows what it is and how to work with it.

Thanks for the laugh.

The only ones who need to know about unixtime are:

- developers, when they write something that takes it or produces it

- *nix sysadmins

Everyone else with "computer experience" could live their whole lives without ever needing to know what unixtime is.


Yeah, that's what I meant. People who program or do sysadmin work, i.e. anybody who will ever need to call a sleep function, should know what unixtime is.


> anybody who will ever need to call a sleep function

Bullshit.

I never needed to know what unixtime is when I wrote anything with sleep().

All I needed to know was how many seconds I wanted execution to pause for. Not once did I need to manually convert something to or from unixtime, even when working with datetime types.
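For illustration, the only number sleep() ever sees is a relative duration:

    import time

    time.sleep(2.5)   # pause for 2.5 seconds from "now"; no epoch involved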


No, but as a person who has needed sleep() before, you are also the type of person who could be expected to know what Unix time is.

Nobody is saying the two things need to be connected; the point is that it can be name-dropped in a type definition and you would know what it means.


https://devblogs.microsoft.com/oldnewthing/20090306-00/?p=18...

Windows uses January 1, 1601, the start of a 400-year Gregorian calendar cycle, as its epoch.
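FILETIME counts 100 ns intervals from that 1601 epoch, so converting to Unix time is just an offset and a scale. A small sketch (the epoch offset is a well-known constant; the helper name is mine):

    FILETIME_TICKS_PER_SECOND = 10_000_000      # 100 ns ticks
    EPOCH_DIFFERENCE_SECONDS = 11_644_473_600   # 1601-01-01 to 1970-01-01

    def filetime_to_unix(filetime: int) -> float:
        """Convert a Windows FILETIME value to Unix seconds."""
        return filetime / FILETIME_TICKS_PER_SECOND - EPOCH_DIFFERENCE_SECONDS

    print(filetime_to_unix(116_444_736_000_000_000))   # -> 0.0, i.e. 1970-01-01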



