What Is a Unix Timestamp?
A Unix timestamp (also called POSIX time or epoch time) is the number of seconds that have elapsed since the Unix epoch: January 1, 1970, 00:00:00 UTC. It is the most common way to represent a point in time in programming because it is a simple integer, timezone-independent, and trivial to compare or do arithmetic on.
For example, the timestamp 1714060800 represents April 25, 2024, 16:00:00 UTC.
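The example above can be checked directly in JavaScript. `Date` expects milliseconds, so the seconds value is multiplied by 1000 first:

```javascript
// Convert the example timestamp (seconds) to a human-readable UTC date.
const ts = 1714060800;
const date = new Date(ts * 1000); // Date takes milliseconds
console.log(date.toISOString()); // "2024-04-25T16:00:00.000Z"
```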
Seconds vs Milliseconds
Most Unix timestamps are in seconds and are 10-digit numbers (e.g., 1714060800). However, JavaScript's Date.now() returns milliseconds — a 13-digit number (e.g., 1714060800000). When pasting a timestamp, check whether it has 10 or 13 digits to determine the unit.
Where Are Unix Timestamps Used?
- Databases: MySQL, PostgreSQL, and SQLite provide functions for converting between dates and Unix epoch integers, and MongoDB's BSON date type counts milliseconds since the epoch.
- APIs and logs: REST APIs commonly return created_at and updated_at as Unix timestamps.
- JWT tokens: The exp, iat, and nbf claims in JWTs are Unix timestamps in seconds.
- File systems: File modification times (mtime) are stored as Unix timestamps.
- Scheduling: Cron jobs, webhooks, and event systems often use timestamps to define trigger times.
Timezones and Timestamps
Unix timestamps are always in UTC. When you display one to a user, you convert it to their local time zone; when you accept a date from a user, you convert it from their local time zone to UTC before storing. This is why timestamps are so useful: they remove all ambiguity about time zones.
Frequently Asked Questions
What is the Unix epoch?
The Unix epoch is the reference point: January 1, 1970, 00:00:00 UTC. Unix timestamps count seconds from this moment. The choice of 1970-01-01 was arbitrary — it was simply a round date chosen by the original Unix developers.
What is the maximum Unix timestamp?
A 32-bit signed Unix timestamp overflows on January 19, 2038, at 03:14:07 UTC (the "Year 2038 problem"). Modern systems use 64-bit integers, which extend the range to roughly 292 billion years in either direction.
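The overflow moment is easy to verify, since JavaScript's Date is not limited to 32 bits:

```javascript
// The largest value a signed 32-bit timestamp can hold, and what it represents.
const max32 = 2 ** 31 - 1; // 2147483647
console.log(new Date(max32 * 1000).toISOString()); // "2038-01-19T03:14:07.000Z"
```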
How do I get the current timestamp in JavaScript?
Use Date.now() for milliseconds, or Math.floor(Date.now() / 1000) for seconds. new Date().getTime() returns the same value as Date.now() and works in any JavaScript environment.
My timestamp has 13 digits — is that wrong?
No — it is a millisecond timestamp. JavaScript uses milliseconds by default. Divide by 1000 to convert to seconds, or select "Milliseconds" in the unit dropdown on this tool.
Can I convert a negative timestamp?
Yes. Negative timestamps represent dates before January 1, 1970. For example, -86400 is December 31, 1969, 00:00:00 UTC.
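This works with ordinary Date arithmetic; no special handling is needed for negative values:

```javascript
// Negative timestamps count backwards from the epoch: -86400 is one day before.
console.log(new Date(-86400 * 1000).toISOString()); // "1969-12-31T00:00:00.000Z"
```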