CSV to JSON Converter

Convert CSV files to JSON instantly. Auto-detects numbers, booleans, and null values, handles quoted fields with commas, and outputs either an array of objects or a keyed object.

What is a CSV to JSON Converter?

A CSV to JSON converter transforms tabular comma-separated values into structured JSON objects that can be consumed by APIs, JavaScript applications, and modern data pipelines. CSV (Comma-Separated Values) is the most common format for exporting data from spreadsheets and databases, but APIs and frontend frameworks almost universally expect JSON. This free online tool parses CSV instantly in the browser — no upload, no server, no privacy risk — and gives you fine-grained control over delimiters, type coercion, and output structure.

How type detection works

When "Auto-detect types" is enabled, every cell value is tested against a set of patterns before being written to JSON. Integers (42), floats (3.14), booleans (true/false), and empty cells (null) are all coerced to their native JSON types. This means your downstream consumers receive proper numbers and booleans rather than strings, avoiding silent type bugs in JavaScript comparisons and PHP strict-mode validations. Disable the option if you want all values preserved as strings — useful when leading zeros matter (e.g. zip codes like 07030).
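As a concrete illustration (assuming default settings with auto-detect enabled), a CSV row like this:

```
id,name,active,score
1,Alice,true,
```

would convert to native JSON types rather than strings:

```json
[{"id": 1, "name": "Alice", "active": true, "score": null}]
```

With auto-detect disabled, the same row would instead produce {"id": "1", "name": "Alice", "active": "true", "score": ""}.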

Frequently Asked Questions

What is CSV and when should I use it?

CSV (Comma-Separated Values) is a plain-text format where each line represents a row and values within a row are separated by a delimiter (usually a comma). It is supported by every spreadsheet application (Excel, Google Sheets, LibreOffice Calc) and most relational databases via EXPORT or COPY commands. Use CSV when sharing tabular data between systems that don't share a common API, or when human-editability and file size matter more than type fidelity.

What is RFC 4180 and does my CSV follow it?

RFC 4180 is the de facto standard that defines CSV formatting rules: fields containing commas, double quotes, or newlines must be enclosed in double quotes; a double quote inside a quoted field must be escaped as two consecutive double quotes (""); lines end with CRLF. This tool implements a fully RFC-4180-compliant parser, so quoted fields with embedded newlines and escaped quotes are handled correctly. Many real-world CSV files deviate slightly (using LF instead of CRLF, or omitting quotes around fields with spaces), and this parser tolerates those common deviations.
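The quoting rules above can be sketched as a small character-by-character parser. This is a simplified illustration of the RFC 4180 state machine, not the tool's actual implementation; the function name parseCSV is hypothetical:

```javascript
// Minimal RFC 4180-style parser sketch: handles quoted fields, escaped
// quotes (""), embedded newlines inside quotes, and both LF and CRLF.
function parseCSV(text) {
  const rows = [];
  let row = [];
  let field = "";
  let inQuotes = false;
  for (let i = 0; i < text.length; i++) {
    const ch = text[i];
    if (inQuotes) {
      if (ch === '"') {
        if (text[i + 1] === '"') { field += '"'; i++; } // "" -> literal quote
        else { inQuotes = false; }                      // closing quote
      } else {
        field += ch; // commas and newlines are literal inside quotes
      }
    } else if (ch === '"') {
      inQuotes = true; // opening quote
    } else if (ch === ',') {
      row.push(field); field = ""; // field separator
    } else if (ch === '\n' || ch === '\r') {
      if (ch === '\r' && text[i + 1] === '\n') i++; // tolerate CRLF and LF
      row.push(field); field = "";
      rows.push(row); row = [];
    } else {
      field += ch;
    }
  }
  if (field !== "" || row.length > 0) { row.push(field); rows.push(row); }
  return rows;
}
```

Note how the parser never splits on a comma or newline while inQuotes is true; that single flag is what makes embedded delimiters safe.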

How does auto-detect types work?

When enabled, each cell value is tested in order: empty string becomes null, values matching /^-?\d+$/ become integers, values matching /^-?\d+\.\d+$/ become floats, and case-insensitive true/false become JSON booleans. Everything else remains a string. Disable this option when string preservation is required — for example, phone numbers with leading plus signs, version strings like 1.0 that should stay as strings, or zip codes where 07030 would incorrectly become the integer 7030.
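The coercion order described above can be sketched as a single function. This is an illustrative rendering of those rules, not the tool's internal code; the name coerce is hypothetical:

```javascript
// Type coercion sketch, applied in the documented order.
function coerce(value) {
  if (value === "") return null;                            // empty cell -> null
  if (/^-?\d+$/.test(value)) return parseInt(value, 10);    // integer
  if (/^-?\d+\.\d+$/.test(value)) return parseFloat(value); // float
  if (/^true$/i.test(value)) return true;                   // boolean, case-insensitive
  if (/^false$/i.test(value)) return false;
  return value;                                             // everything else stays a string
}
```

Note that coerce("07030") returns the integer 7030, which is exactly the zip-code pitfall described above; with the option disabled, every value would pass through as a string.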

What is the difference between array output and keyed object output?

By default the tool outputs a JSON array of objects — each row becomes one element: [{"id":1,"name":"Alice"}, ...]. Enabling "Keyed by first column" produces a JSON object where the first column's value is used as the key: {"1":{"id":1,"name":"Alice"}, ...}. Use the keyed format when you need O(1) lookups by a unique identifier (e.g. keying a product catalog by SKU, or a user list by ID) rather than iterating an array to find a record.
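The re-keying step can be sketched in a few lines. The helper name keyBy is illustrative, not the tool's API:

```javascript
// Convert an array of row objects into an object keyed by one field,
// enabling O(1) lookups instead of an array scan.
function keyBy(rows, keyField) {
  const out = {};
  for (const row of rows) {
    out[String(row[keyField])] = row; // JSON object keys are always strings
  }
  return out;
}

const rows = [{ id: 1, name: "Alice" }, { id: 2, name: "Bob" }];
const keyed = keyBy(rows, "id");
```

One caveat worth noting: if the key column contains duplicate values, later rows silently overwrite earlier ones, so the keyed format assumes the first column is unique.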

How do I handle CSV with special characters, quotes, or embedded newlines?

Enclose any field that contains the delimiter, double quotes, or newlines in double quotes as per RFC 4180. To include a literal double quote inside a quoted field, double it: "He said ""hello""" produces the string He said "hello". If your CSV uses a different delimiter (tab, semicolon, pipe), select it from the Delimiter dropdown before converting — the parser will then treat only that character as a field separator, so commas inside fields are safe without quoting.
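The escaping rule above can be sketched as a small helper for the reverse direction (writing CSV). The name escapeField is hypothetical:

```javascript
// Quote a field per RFC 4180 only when it contains the delimiter,
// a double quote, or a newline; embedded quotes are doubled.
function escapeField(value, delimiter = ",") {
  if (value.includes(delimiter) || value.includes('"') || /[\r\n]/.test(value)) {
    return '"' + value.replace(/"/g, '""') + '"';
  }
  return value; // plain fields need no quoting
}
```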

Related tools