Unix Timestamp Converter

Convert between Unix timestamps and human-readable dates.


Unix timestamps in one paragraph

A Unix timestamp is the number of seconds elapsed since midnight UTC on January 1, 1970 — an arbitrary moment chosen by the early Unix designers as a convenient zero. It's how almost every computer in the world stores a moment in time internally, because it's a single number, monotonically increasing (mostly), and free of the entire mess of time zones, calendars, and locale-specific date formats. When you see 1746532800 in a log file, that's the same instant for everyone who reads it.
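As a quick sketch of that conversion in JavaScript — note that the built-in Date constructor takes milliseconds, so a seconds-based timestamp needs multiplying by 1000:

```javascript
// A seconds-based Unix timestamp, converted to a readable UTC instant.
const ts = 1746532800;            // seconds since the Unix epoch
const date = new Date(ts * 1000); // Date expects milliseconds
console.log(date.toISOString());  // "2025-05-06T12:00:00.000Z"
```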

Seconds vs. milliseconds

The original Unix timestamp is in seconds. JavaScript's Date.now() returns milliseconds. Most modern APIs use one or the other, and confusing them is a frequent source of bugs — a millisecond timestamp interpreted as seconds becomes a date in the year 57000-something. The rule of thumb: if your number has 10 digits, it's seconds; 13 digits, milliseconds. This tool detects which you've pasted and converts accordingly.

Time zones, the part that actually trips people up

A Unix timestamp does not have a time zone. It represents a single instant. The time zone enters only when you display it — and that's where things go sideways. The same timestamp 1746532800 is "May 6, 2025 12:00 PM UTC", "May 6, 2025 8:00 AM in New York", and "May 6, 2025 9:00 PM in Tokyo" simultaneously. None of those is more correct; they're three views of the same moment.

This tool shows both the UTC and the local-browser interpretation, so you can sanity-check the conversion either way. If you're debugging a report that "the timestamp is off by 5 hours," it's almost always a time-zone display issue, not a stored-value bug.


Edge cases worth knowing

The Year 2038 problem

Systems that store Unix timestamps in a signed 32-bit integer overflow at 2147483647 — January 19, 2038, 03:14:07 UTC. After that, the value wraps around to a negative number representing 1901. Most modern systems have moved to 64-bit timestamps, but plenty of embedded devices, older databases, and legacy file formats haven't. If you maintain anything that stores timestamps as INT instead of BIGINT or TIMESTAMP, plan to fix it.

Leap seconds

Unix time pretends leap seconds don't exist. When one is inserted, most systems either repeat a second or smear the clock around it. For application-level work this rarely matters; for very precise scientific or financial timing it occasionally does.

Negative timestamps

Dates before 1970 are represented as negative numbers. Most language libraries handle them correctly; some database drivers don't. If your domain involves historical dates, test those paths explicitly.
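Negative timestamps work the same way in either direction, as this JavaScript sketch shows:

```javascript
// One second before the epoch:
console.log(new Date(-1 * 1000).toISOString()); // "1969-12-31T23:59:59.000Z"

// A pre-epoch date expressed as a (negative) second count:
const ts = Date.UTC(1960, 0, 1) / 1000; // Jan 1, 1960, 00:00 UTC
console.log(ts); // -315619200
```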