epochkit

Unix Timestamp Converter

Convert between Unix timestamps and human-readable dates. Seconds or milliseconds, with code snippets in 9 languages.

Unix (seconds): 1778464819 (10-digit, seconds since epoch)
Unix (milliseconds): 1778464819000 (13-digit, milliseconds since epoch)
ISO 8601: 2026-05-11T02:00:19+00:00 (with UTC offset)
Relative: less than a minute ago (from now)

Press 1–4 to copy a format • N for now • T to toggle dark/light theme

Code snippets

JavaScript
// Convert a Unix timestamp in seconds to an ISO 8601 string (Date expects milliseconds)
const date = new Date(1778464819 * 1000);
console.log(date.toISOString()); // "2026-05-11T02:00:19.000Z"

How to use

1. Enter a Unix timestamp
Type a Unix timestamp in the input field (seconds or milliseconds). The date and time fields update automatically.

2. Or pick a date and time
Use the date, time, and timezone inputs. The Unix timestamp updates in real time as you change any field; a sketch of that conversion follows these steps.

3. Copy the format you need
Click any output card to copy the value, or switch to a language tab to copy a ready-to-run code snippet.
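The conversion behind step 2 amounts to treating the date and time fields as a wall-clock value at the chosen offset, then shifting to UTC. A minimal sketch in JavaScript (the function and field names are illustrative, not this page's actual code):

// Minimal sketch: build a Unix timestamp (seconds) from date/time fields plus a UTC offset.
// The input shape below is hypothetical, not tied to this page's implementation.
function toUnixSeconds({ year, month, day, hour, minute, second, offsetMinutes }) {
  // Date.UTC interprets its arguments as UTC; month is zero-based.
  const utcMillis = Date.UTC(year, month - 1, day, hour, minute, second);
  // Subtracting the offset converts wall-clock time at that offset to UTC.
  return Math.floor(utcMillis / 1000) - offsetMinutes * 60;
}

console.log(toUnixSeconds({
  year: 2026, month: 5, day: 11, hour: 2, minute: 0, second: 19, offsetMinutes: 0,
})); // 1778464819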

What is a Unix timestamp?

A Unix timestamp is a single integer representing the number of seconds (or milliseconds) elapsed since January 1, 1970, 00:00:00 UTC — the Unix epoch. Every operating system, programming language, and database understands this format, making it the universal currency of time in software.

The key difference between a Unix timestamp and a human-readable date is that a Unix timestamp has no timezone. It always refers to the same absolute moment in time. When you display it to users, you apply a timezone offset to convert it to their local time. This is why Unix timestamps are the preferred format for storing and transmitting time across systems: you store one number, and each client renders it in the correct local format.
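To make "store one number, render it locally" concrete, here is a minimal sketch (the timezone names are arbitrary examples) that renders the same timestamp for three audiences:

// One absolute moment, three local renderings of it.
const timestamp = 1778464819; // seconds since the Unix epoch
const moment = new Date(timestamp * 1000);

for (const timeZone of ["UTC", "America/New_York", "Asia/Tokyo"]) {
  console.log(timeZone, moment.toLocaleString("en-US", { timeZone }));
}
// The stored value never changes; only the presentation does.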

Seconds vs milliseconds — the original Unix specification uses seconds. JavaScript's Date.now() and many web APIs use milliseconds. A 10-digit timestamp is seconds; a 13-digit timestamp is milliseconds. Divide by 1,000 to convert. This page auto-detects which unit you enter.
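One simple way to implement that auto-detection is a digit-count heuristic, sketched below (this page's actual logic may differ):

// Heuristic: 13-digit inputs are treated as milliseconds, 10-digit (or shorter) as seconds.
function normalizeToSeconds(input) {
  const value = Number(input);
  const digits = String(Math.trunc(Math.abs(value))).length;
  return digits >= 13 ? Math.floor(value / 1000) : Math.floor(value);
}

console.log(normalizeToSeconds("1778464819"));    // 1778464819 (already seconds)
console.log(normalizeToSeconds("1778464819000")); // 1778464819 (milliseconds, divided by 1,000)

Digit counting is unambiguous for present-day dates but not for values far in the past or future, so treat it as a convenience rather than a guarantee.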

The Year 2038 problem — systems that store timestamps as a 32-bit signed integer (common in older C code and databases) can only represent timestamps up to 2,147,483,647 — January 19, 2038 at 03:14:07 UTC. Modern 64-bit systems handle timestamps well beyond the year 292,277,026,596. If you work with legacy systems, this converter will highlight timestamps in the danger zone.
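To illustrate the danger zone, a check against the 32-bit limit might look like this (a sketch, not this converter's actual code):

// Largest timestamp a 32-bit signed integer can represent: 2^31 - 1 seconds.
const INT32_MAX = 2147483647;

function isYear2038Safe(unixSeconds) {
  return unixSeconds <= INT32_MAX;
}

console.log(isYear2038Safe(1778464819)); // true  (2026 is fine)
console.log(isYear2038Safe(2147483648)); // false (one second past the 32-bit limit)
console.log(new Date(INT32_MAX * 1000).toISOString()); // "2038-01-19T03:14:07.000Z"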

Frequently asked questions

What is a Unix timestamp?
A Unix timestamp is the number of seconds (or milliseconds) that have elapsed since January 1, 1970 at 00:00:00 UTC — a moment known as the Unix epoch. It is the universal time representation used by operating systems, databases, and APIs across every programming language and platform.
How is Unix time different from UTC?
UTC is a human-readable time standard; local timezones are defined as offsets from it. Unix time is a single integer that always counts from the same epoch in the same unit, with no timezone attached. Converting between them is straightforward: divide by 1,000 if the value is in milliseconds to get seconds, then apply a timezone offset to display a human-readable date and time.
Why does my timestamp have 13 digits?
A 13-digit timestamp is in milliseconds. JavaScript's Date.now() and many APIs return milliseconds by default. A 10-digit timestamp is in seconds, which is what Unix systems traditionally use. Divide by 1000 to convert milliseconds to seconds.
What is the Year 2038 problem?
Many older systems store Unix timestamps as a 32-bit signed integer, which can hold values up to 2,147,483,647 — corresponding to January 19, 2038 at 03:14:07 UTC. After that point, the value overflows to a large negative number, causing date calculations to fail. Modern 64-bit systems are not affected.
How do I get the current Unix timestamp in my language?
Use the code snippet section on this page. In JavaScript: Math.floor(Date.now() / 1000). In Python: import time; int(time.time()). In Go: time.Now().Unix(). Every major language has a built-in way to get the current Unix timestamp in one or two lines.
