Time Format Converter
Convert between Unix timestamp (seconds and milliseconds), ISO 8601, RFC 2822, HTTP date, local time, and UTC. Live conversion in any direction.
How to Use
- Type into any row — every other format updates instantly to match.
- Click 'Now' to populate every field with the current time.
- Unix (seconds) is the standard 10-digit timestamp used in most APIs and databases.
- Unix (ms) is the 13-digit JavaScript timestamp (milliseconds since 1970-01-01 UTC).
- ISO 8601 is the international standard format and is what most modern APIs return.
- Local shows your browser's time zone; UTC always shows Coordinated Universal Time.
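The seconds-versus-milliseconds distinction above can be sketched in a few lines of JavaScript (an illustration of the concept, not the converter's source):

```javascript
// Unix time in milliseconds (JavaScript's native unit) and in
// seconds (what most APIs and databases expect).
const nowMs = Date.now();                 // 13-digit millisecond timestamp
const nowSec = Math.floor(nowMs / 1000);  // 10-digit second timestamp

// Going back: the Date constructor takes milliseconds.
const d = new Date(nowSec * 1000);
console.log(d.toISOString()); // the same moment at second precision
```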
Time Formats
A Brief History of Computer Time
The Unix epoch, January 1, 1970, was chosen for a practical reason: it was a recent round date when Unix was being designed. The earliest versions of Unix (circa 1971) measured time as a 32-bit count of 60ths of a second, which overflowed after roughly 2.5 years. The switch to whole seconds since the epoch came in the early 1970s, and signed 32-bit time_t became the universal standard until 64-bit machines arrived.
ISO 8601 was first published in 1988, codifying a date-and-time format that had been converging across national and international standards through the 1970s and 80s. The key innovations: years before months before days (sortable as strings), 24-hour times, explicit time-zone offsets, and a strict separator scheme. Earlier email-era dates (RFC 822 in 1982, RFC 2822 in 2001) used the older English-language "Tue, 28 Apr 2026" format that still appears in HTTP headers and email today, but ISO 8601 has won every other domain.
The Y2K crisis of 1999–2000 dramatically raised awareness of time-related bugs and accelerated adoption of 4-digit years and standardized formats. The Y2K38 problem looms next: 32-bit signed Unix timestamps overflow on January 19, 2038. Most active code is already 64-bit-clean, but legacy embedded systems, certain databases, and dormant code may still need attention as that date approaches.
About This Converter
This converter uses the browser's native Date object as the universal pivot. Type into any field; the value is parsed (with format detection), converted to a Date instance, and re-emitted in every other format. Local time always uses your browser's configured time zone; UTC is always offset zero. The "Now" button populates every field with the current moment.
Everything runs entirely in your browser; no times, time zones, or other data are transmitted or stored. Conversion is round-trip safe: passing a value through any format and back yields the same instant (up to the precision difference between seconds and milliseconds).
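The pivot idea can be sketched as a table of parse/format pairs keyed by format name; the names below (`formats`, `convertAll`) are illustrative, not the converter's actual code:

```javascript
// Every format is a (parse: string -> Date, format: Date -> string) pair.
// The Date instance is the pivot: parse once, re-emit in every format.
const formats = {
  unixSec: {
    parse: (s) => new Date(Number(s) * 1000),
    format: (d) => String(Math.floor(d.getTime() / 1000)),
  },
  unixMs: {
    parse: (s) => new Date(Number(s)),
    format: (d) => String(d.getTime()),
  },
  iso: {
    parse: (s) => new Date(s),
    format: (d) => d.toISOString(),
  },
};

// Typing into one field updates all the others.
function convertAll(sourceFormat, value) {
  const pivot = formats[sourceFormat].parse(value);
  const out = {};
  for (const [name, fmt] of Object.entries(formats)) {
    out[name] = fmt.format(pivot);
  }
  return out;
}
```

For example, `convertAll("unixSec", "1700000000")` yields the same instant as a millisecond timestamp and as ISO 8601.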
Frequently Asked Questions
What is a Unix timestamp?
The number of seconds elapsed since 1970-01-01 00:00:00 UTC, ignoring leap seconds. It's the universal "machine time" of the Unix world: small, language-neutral, time-zone-free, and easy to compare. JavaScript and many modern systems use milliseconds (×1000) for higher resolution but the underlying concept is the same.
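Because a timestamp is just a number, comparison and arithmetic need no time-zone handling at all. A small sketch (the two dates are arbitrary examples):

```javascript
// Timestamps compare directly as numbers: no zones, no parsing.
// Date.UTC returns milliseconds since the epoch (month is 0-based).
const launch = Date.UTC(2026, 3, 28, 15, 30, 0);  // 2026-04-28 15:30 UTC
const deadline = Date.UTC(2026, 3, 28, 16, 0, 0); // 2026-04-28 16:00 UTC

const secondsLeft = (deadline - launch) / 1000;   // 1800
```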
Why does my timestamp look different than expected?
Three common gotchas: (1) seconds vs milliseconds — a 13-digit number is ms, a 10-digit number is seconds; (2) time-zone confusion — Unix is always UTC, but your local-time display shifts by your offset; (3) negative or huge values — anything before 1970 is negative; anything past year 2038 may overflow 32-bit signed integers in legacy code (the 'Y2K38' problem).
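The seconds-vs-milliseconds gotcha is usually handled with a digit-count heuristic like the following sketch (`parseUnix` is an illustrative name, not this tool's API):

```javascript
// Treat 13+ digit inputs as milliseconds, shorter ones as seconds.
function parseUnix(input) {
  const n = Number(input);
  if (!Number.isFinite(n)) throw new Error("not a numeric timestamp");
  const digits = String(Math.abs(Math.trunc(n))).length;
  return digits >= 13 ? new Date(n) : new Date(n * 1000);
}

parseUnix("1700000000").getTime();    // 1700000000000 (treated as seconds)
parseUnix("1700000000000").getTime(); // 1700000000000 (already milliseconds)
```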
What's the difference between ISO 8601 and RFC 2822?
ISO 8601 (e.g., 2026-04-28T15:30:00Z) is the modern international standard: sortable as a string, unambiguous, and used by virtually every modern API. RFC 2822 (e.g., Tue, 28 Apr 2026 15:30:00 +0000) is the older email format, human-readable but harder to parse and sort. Use ISO 8601 for everything new; reserve RFC 2822 for interoperating with email or older HTTP.
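JavaScript's Date can emit both styles. Note that Date has no built-in RFC 2822 formatter; `toUTCString()` produces the closely related HTTP-date form ("GMT" in place of "+0000"):

```javascript
const d = new Date("2026-04-28T15:30:00Z");

console.log(d.toISOString()); // "2026-04-28T15:30:00.000Z"  (ISO 8601)
console.log(d.toUTCString()); // "Tue, 28 Apr 2026 15:30:00 GMT" (HTTP-date)
```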
What does the Z at the end of an ISO timestamp mean?
Z stands for 'Zulu' (military shorthand for UTC offset 0). 2026-04-28T15:30:00Z and 2026-04-28T15:30:00+00:00 denote the identical instant. Other offsets are written as ±HH:MM, e.g., −05:00 for U.S. Eastern Standard Time. ISO 8601 permits timestamps without any offset, but they then denote unqualified local time; always include Z or an explicit offset so the instant is unambiguous.
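The equivalence is easy to check by parsing both spellings, plus a nonzero offset for contrast:

```javascript
// "Z" and "+00:00" parse to the same instant.
const zulu = new Date("2026-04-28T15:30:00Z");
const plusZero = new Date("2026-04-28T15:30:00+00:00");
const eastern = new Date("2026-04-28T15:30:00-05:00"); // same wall clock, EST

console.log(zulu.getTime() === plusZero.getTime());  // true
console.log(eastern.getTime() - zulu.getTime());     // 18000000 (5 hours in ms)
```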
How do I avoid time zone bugs?
Three rules. (1) Store timestamps as UTC, always. (2) Convert to local time only at the moment of display. (3) When comparing, parsing, or transmitting, use ISO 8601 with explicit offset. The classic mistake is storing 'naive' local times in databases — they're impossible to compare correctly across users in different zones.
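The three rules map directly onto the standard Date methods; a minimal sketch:

```javascript
// Rule 1: store UTC (toISOString always emits a trailing Z).
const stored = new Date().toISOString();

// Rule 2: convert to local time only at the moment of display.
const display = new Date(stored).toLocaleString(); // uses the viewer's zone

// Rule 3: transmit with an explicit offset; the stored ISO string
// round-trips to the same instant on any machine, in any zone.
const roundTrip = new Date(stored).toISOString();
```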
What's the Y2K38 problem?
32-bit signed integer Unix timestamps overflow one second after 03:14:07 UTC on January 19, 2038. Modern systems use 64-bit timestamps (good for roughly the next 292 billion years), but legacy code, embedded systems, and older databases may still hit the limit. Most actively maintained software has already migrated; the bug persists mainly in dormant codebases.
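The rollover can be simulated in JavaScript, whose `|0` operator truncates to a signed 32-bit integer:

```javascript
// Simulating the 32-bit signed rollover with JavaScript's |0 truncation.
const max32 = 2147483647; // largest 32-bit signed value
console.log(new Date(max32 * 1000).toISOString());
// "2038-01-19T03:14:07.000Z" - the last representable second

const wrapped = (max32 + 1) | 0; // wraps to -2147483648
console.log(new Date(wrapped * 1000).toISOString());
// "1901-12-13T20:45:52.000Z" - where an overflowed clock lands
```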
Common Use Cases
Debugging API responses
Translate a Unix timestamp from a JSON payload to a human-readable date.
Reading log files
Convert log line timestamps between Unix, ISO, and local time when correlating events across services.
Database query construction
Generate ISO timestamps to plug into WHERE clauses without time-zone confusion.
Cross-team coordination
Confirm what 14:30 UTC means in your local time, or what 2pm Pacific looks like to a teammate in Berlin.
Generating test fixtures
Find the Unix timestamp for a specific date and time to seed integration tests.
Old-format conversion
Translate timestamps from RFC 2822-formatted email headers or HTTP-date headers into modern ISO format.
Last updated: