Unix Timestamp Converter: Seconds Since 1970
Unix timestamps count seconds elapsed since January 1, 1970 UTC — the 'epoch.' They're how computers store time internally, and converting them to readable dates (and back) is a daily task for developers.
The Unix epoch is January 1, 1970 at 00:00:00 UTC. Every Unix timestamp is the number of seconds since that moment. As of 2026, current timestamps are around 1,778,000,000 (about 56 years × 31.6 million seconds/year).
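A minimal sketch of converting between a timestamp in seconds and a readable UTC date, in plain JavaScript (the function names are illustrative, not a standard API):

```javascript
// Convert a Unix timestamp (seconds) to an ISO 8601 UTC string.
// JavaScript's Date constructor expects milliseconds, so multiply by 1000.
function timestampToISO(seconds) {
  return new Date(seconds * 1000).toISOString();
}

// And back: parse a date string to a Unix timestamp in seconds.
function isoToTimestamp(iso) {
  return Math.floor(Date.parse(iso) / 1000);
}

console.log(timestampToISO(0));                      // "1970-01-01T00:00:00.000Z"
console.log(isoToTimestamp("2026-01-01T00:00:00Z")); // 1767225600
```

Note that `toISOString()` always renders UTC, which sidesteps local-time-zone surprises during debugging.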
Many languages use milliseconds (the seconds value × 1000) instead of seconds. JavaScript, for example, defaults to milliseconds. Always check which unit you're working with before converting.
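The unit difference shows up immediately in JavaScript, whose Date APIs work in milliseconds. A short sketch:

```javascript
// Date.now() returns milliseconds since the epoch.
const ms = Date.now();

// Divide by 1000 and truncate to get a Unix timestamp in seconds,
// which is what most HTTP APIs and Unix tools expect.
const seconds = Math.floor(ms / 1000);

// Going the other way: seconds from an API back to a JS Date.
const date = new Date(seconds * 1000);

console.log(ms, seconds, date.toISOString());
```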
Common timestamp gotchas
- Seconds vs milliseconds — JavaScript's Date.now() returns milliseconds; most APIs use seconds
- Time zones — timestamps are always UTC; displaying them in local time requires conversion
- 32-bit overflow — the 'Year 2038 problem.' Older systems that store timestamps as 32-bit signed integers can't represent times past January 19, 2038
- Leap seconds — Unix timestamps don't count them, so the real elapsed time between two timestamps can be off by a second across a leap-second boundary
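To make the 32-bit overflow gotcha concrete, here is the exact boundary, computed rather than memorized (a sketch, using only standard Date behavior):

```javascript
// Largest value a 32-bit signed integer can hold: 2^31 - 1 seconds.
const INT32_MAX = 2 ** 31 - 1; // 2147483647

// The moment a 32-bit signed time_t overflows, as a UTC date.
const overflow = new Date(INT32_MAX * 1000).toISOString();
console.log(overflow); // "2038-01-19T03:14:07.000Z"

// One second later wraps a 32-bit counter to a large negative number,
// which naive code would interpret as a date in 1901.
```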
Extended FAQ
Why January 1, 1970?
It was an arbitrary but convenient round date chosen by Unix's designers in the early 1970s. The earliest Unix systems stored time as seconds since this date for engineering simplicity.
What if I see a 13-digit timestamp?
It's milliseconds. Divide by 1000 to get seconds.
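A common normalization heuristic is to treat 13-digit values as milliseconds. This is an assumption, not a standard: it misreads millisecond timestamps from before 2001 (12 digits or fewer) and would misread second timestamps after the year 33658, but it works for present-day values:

```javascript
// Heuristic: current timestamps in seconds are 10 digits; 13 digits
// means milliseconds. (Assumption — see the caveats above.)
function toSeconds(ts) {
  return String(Math.trunc(ts)).length >= 13 ? Math.floor(ts / 1000) : ts;
}

console.log(toSeconds(1778000000000)); // 1778000000 (was milliseconds)
console.log(toSeconds(1778000000));    // 1778000000 (already seconds)
```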
Are my timestamps stored?
No — runs entirely in your browser.
