What is Unix Epoch Time? The Complete Developer's Guide
Unix time (also known as epoch time or POSIX time) is the number of seconds that have elapsed since January 1, 1970, at 00:00:00 UTC. This single integer makes it incredibly easy to store, compare, and calculate time differences across any programming language or system.
If you need to quickly convert a Unix timestamp to a readable date (or vice versa), try our Unix timestamp converter tool. But first, let's understand how this system works and why it's so widely used.
Why January 1, 1970? The History of Epoch Time
The Unix epoch date wasn't chosen for any cosmic significance. When Ken Thompson and Dennis Ritchie were developing Unix at Bell Labs in the late 1960s, they needed a starting point for their timekeeping system.
January 1, 1970, was simply a convenient, round date that was recent enough to be practical. The original Unix systems used a 32-bit signed integer, which could count about 68 years in either direction from the epoch. Starting in 1970 meant the system could handle dates from 1901 to 2038.
The choice proved so practical that virtually every modern operating system and programming language adopted it, making Unix timestamps the lingua franca of computer time.
How Unix Timestamps Work
The concept is beautifully simple: count seconds from a fixed starting point. Here's how different timestamps translate:
| Unix Timestamp | Human-Readable Date (UTC) |
|---|---|
| 0 | January 1, 1970, 00:00:00 |
| 86400 | January 2, 1970, 00:00:00 |
| 1000000000 | September 9, 2001, 01:46:40 |
| 1735689600 | January 1, 2025, 00:00:00 |
Key insight: There are 86,400 seconds in a day (60 × 60 × 24). This makes date math trivial—to add one day, just add 86400 to your timestamp.
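For example, in Python (a minimal sketch, using the January 1, 2025 timestamp from the table above):

```python
ONE_DAY = 86400  # 60 * 60 * 24 seconds

jan_1_2025 = 1735689600
jan_2_2025 = jan_1_2025 + ONE_DAY

print(jan_2_2025)  # 1735776000 -> January 2, 2025, 00:00:00 UTC
```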
Converting Unix Timestamps in Every Language
Every major programming language has built-in support for Unix timestamps. Here's how to convert them:
JavaScript
```javascript
// Current Unix timestamp (seconds)
const timestamp = Math.floor(Date.now() / 1000);

// Convert timestamp to Date
const date = new Date(timestamp * 1000);

// Convert Date to timestamp
const ts = Math.floor(date.getTime() / 1000);
```

Note: JavaScript's Date.now() returns milliseconds, not seconds. Divide by 1000 for Unix timestamps.
Python
```python
import time
from datetime import datetime

# Current Unix timestamp
timestamp = int(time.time())

# Convert timestamp to datetime
dt = datetime.fromtimestamp(timestamp)

# Convert datetime to timestamp
ts = int(dt.timestamp())
```
PHP
```php
<?php
// Current Unix timestamp
$timestamp = time();

// Convert timestamp to date string
$date = date('Y-m-d H:i:s', $timestamp);

// Convert date string to timestamp
$ts = strtotime('2025-01-01 00:00:00');
```
Java
```java
import java.time.Instant;

// Current Unix timestamp
long timestamp = Instant.now().getEpochSecond();

// Convert timestamp to Instant
Instant instant = Instant.ofEpochSecond(timestamp);

// Convert Instant to timestamp
long ts = instant.getEpochSecond();
```
SQL (MySQL)
```sql
-- Current Unix timestamp
SELECT UNIX_TIMESTAMP();

-- Convert timestamp to datetime
SELECT FROM_UNIXTIME(1735689600);

-- Convert datetime to timestamp
SELECT UNIX_TIMESTAMP('2025-01-01 00:00:00');
```
Go
```go
package main

import "time"

func main() {
	// Current Unix timestamp
	timestamp := time.Now().Unix()

	// Convert timestamp to Time
	t := time.Unix(timestamp, 0)

	// Convert Time to timestamp
	ts := t.Unix()
	_ = ts // keep the compiler happy about the unused variable
}
```
Common Timestamp Pitfalls
Even experienced developers run into these issues. Here's what to watch for:
Seconds vs. Milliseconds
This is the most common mistake. Unix timestamps are traditionally in seconds, but JavaScript and Java often use milliseconds. A 13-digit number is milliseconds; a 10-digit number is seconds.
- 1735689600 = seconds (correct Unix timestamp)
- 1735689600000 = milliseconds (JavaScript style)
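If you receive timestamps of unknown origin, a common workaround is a magnitude check. Here's a heuristic sketch in Python; the cutoff assumes no second-precision dates past roughly the year 33600 and no millisecond-precision dates before late 2001:

```python
def normalize_to_seconds(ts: int) -> int:
    """Heuristic: values above ~1e12 are almost certainly milliseconds."""
    if ts > 1_000_000_000_000:
        return ts // 1000
    return ts

print(normalize_to_seconds(1735689600))     # 1735689600 (already seconds)
print(normalize_to_seconds(1735689600000))  # 1735689600 (converted from ms)
```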
Timezone Handling
Unix timestamps are always UTC. When you convert to a local time, you need to account for timezone offset. Many bugs come from assuming timestamps are in local time.
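In Python, for instance, the difference is easy to see (a short sketch; `datetime.fromtimestamp` without a `tz` argument silently applies the machine's local timezone):

```python
from datetime import datetime, timezone

ts = 1735689600

utc_dt = datetime.fromtimestamp(ts, tz=timezone.utc)  # explicit UTC
local_dt = datetime.fromtimestamp(ts)                 # naive, local timezone

print(utc_dt)    # 2025-01-01 00:00:00+00:00
print(local_dt)  # same instant, rendered in local wall-clock time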
Negative Timestamps
Dates before 1970 are represented as negative numbers. -86400 represents December 31, 1969. Not all systems handle negative timestamps correctly.
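For example, in Python (a sketch; note that CPython on Windows has historically raised OSError for negative timestamps, which is exactly the kind of inconsistency to watch for):

```python
from datetime import datetime, timezone

# One day before the epoch
dt = datetime.fromtimestamp(-86400, tz=timezone.utc)
print(dt)  # 1969-12-31 00:00:00+00:00
```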
The Year 2038 Problem: What Developers Need to Know
On January 19, 2038, at 03:14:07 UTC, 32-bit Unix timestamps will overflow. The maximum value of a signed 32-bit integer is 2,147,483,647, and that's exactly how many seconds will have passed since 1970.
After this moment, 32-bit systems will wrap around to negative numbers, potentially interpreting the date as December 13, 1901. This is similar to the Y2K bug but affects the underlying timestamp representation.
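You can simulate the wraparound by forcing a value through a signed 32-bit representation (a sketch using Python's struct module; a real 32-bit time_t does this implicitly):

```python
import struct
from datetime import datetime, timezone

MAX_INT32 = 2**31 - 1  # 2147483647 -> 2038-01-19 03:14:07 UTC

# Reinterpret (max + 1) as a signed 32-bit integer, mimicking the overflow
wrapped, = struct.unpack('<i', struct.pack('<I', MAX_INT32 + 1))

print(wrapped)  # -2147483648
print(datetime.fromtimestamp(wrapped, tz=timezone.utc))  # 1901-12-13 20:45:52+00:00
```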
The solution: Use 64-bit integers for timestamps. Most modern systems already do this. A 64-bit timestamp can represent dates billions of years into the future—long after our sun has burned out.
If you're maintaining legacy systems, audit your code for 32-bit timestamp storage, especially in databases and file formats.
Frequently Asked Questions
What is epoch time?
Epoch time (also called Unix time or POSIX time) is a system for tracking time as a running total of seconds. It counts the number of seconds that have elapsed since January 1, 1970, at 00:00:00 UTC, not counting leap seconds.
Why does Unix time start in 1970?
Unix time starts on January 1, 1970, because that was approximately when the Unix operating system was being developed at Bell Labs. The date was chosen as a convenient, round number that was recent enough to be useful for the computing needs of that era.
What's the difference between Unix timestamp and ISO 8601?
Unix timestamp is a single integer representing seconds since 1970, while ISO 8601 is a human-readable string format like "2025-12-31T12:00:00Z". Unix timestamps are compact and easy to compare mathematically, while ISO 8601 is more readable and includes timezone information explicitly.
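Converting between the two is straightforward; here's a Python sketch (note that `datetime.fromisoformat` only accepts a trailing "Z" from Python 3.11 onward, so the explicit "+00:00" offset form is used here):

```python
from datetime import datetime, timezone

# Unix timestamp -> ISO 8601
print(datetime.fromtimestamp(1767182400, tz=timezone.utc).isoformat())
# 2025-12-31T12:00:00+00:00

# ISO 8601 -> Unix timestamp
print(int(datetime.fromisoformat("2025-12-31T12:00:00+00:00").timestamp()))
# 1767182400
```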
Convert Timestamps Instantly
Understanding Unix timestamps is essential for any developer working with dates and times. Whether you're debugging an API response, storing dates in a database, or calculating time differences, Unix time provides a universal, language-agnostic solution.
Ready to convert a timestamp? Use our Unix Timestamp Converter to instantly translate between Unix time and human-readable dates.