
JSON to CSV: Exporting API Data for Excel and Spreadsheets

At Šikulovi s.r.o., someone asks me to export API data to Excel almost every week. Marketing wants campaign data, finance wants transactions. Here's how I handle the conversion - including the nested object problem that took me forever to solve properly.

2026-01-26 · 7 min read
Related tool: JSON ↔ CSV Converter

Use the tool alongside this guide for hands-on practice.

The weekly Excel export request

It happens every week at Šikulovi s.r.o. The marketing team needs campaign data, the finance team needs transaction records, the support team needs user lists. They all want Excel files. And our APIs return JSON. So I convert JSON to CSV, which Excel opens perfectly.

At first, I wrote custom export scripts for each request - a different one every time. Then I got tired of that (around request number fifty, I think) and built a general-purpose converter. Now any JSON array becomes a CSV file in seconds. It handles the edge cases I kept running into: nested objects, arrays in fields, special characters, and different date formats.

Why CSV for data exchange

I've tried other formats - XLSX, ODS, even custom XML exports. But CSV remains the universal data format that never fails. Every spreadsheet program opens it. Every database imports it. Every programming language has CSV libraries. When you need to move data between systems, CSV is the path of least resistance. I've never had a client unable to open a CSV.

  • Universal compatibility: Excel, Google Sheets, Numbers, LibreOffice all read CSV
  • Simple format: Just text with commas, no complex parsing required
  • Human readable: You can open it in any text editor
  • Version control friendly: Diffs are meaningful, unlike binary Excel files
  • Database import: Most databases have CSV import built in
  • Small file size: No formatting overhead like XLSX

The nested object problem

This is the problem that drove me crazy for months. JSON loves nesting. A user object might have an address object with city, state, and zip fields. A product might have a nested category with name and id. CSV is flat - just rows and columns. How do you flatten nested data without losing your mind?

After trying several approaches, I settled on dot notation for nested fields. user.address.city becomes a column header, and the value goes in that column. It's not perfect - you lose some structure - but it works well enough for most exports. The alternative is multiple CSV files with foreign keys, which is overkill for the kind of quick exports I usually need.

  • Dot notation: user.address.city flattens to a single column
  • Arrays: Convert to comma-separated or JSON string in the cell
  • Null values: Empty cell or literal "null" depending on use case
  • Objects in arrays: Often need multiple CSVs or denormalization
  • Deep nesting: Flatten only what you need, ignore the rest
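The dot-notation flattening above can be sketched in a few lines of Python. This is my own minimal version, not the converter's actual code; the `flatten` helper and sample data are illustrative:

```python
import json

def flatten(obj, prefix=""):
    """Recursively flatten a nested dict into dot-notation keys.

    Arrays become JSON strings so they fit in a single cell;
    None stays None so the CSV writer emits an empty cell.
    """
    flat = {}
    for key, value in obj.items():
        path = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            flat.update(flatten(value, path))
        elif isinstance(value, list):
            flat[path] = json.dumps(value)
        else:
            flat[path] = value
    return flat

users = [{"name": "Ana", "address": {"city": "Brno", "zip": "60200"}, "tags": ["vip"]}]
rows = [flatten(u) for u in users]
# rows[0] == {"name": "Ana", "address.city": "Brno", "address.zip": "60200", "tags": '["vip"]'}
```

The union of all row keys then becomes the CSV header, so objects with missing branches still line up column by column.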

Handling special characters and encoding

CSV looks simple until you hit edge cases - I learned this the hard way when a client's product descriptions broke our entire export. What if a field contains a comma? What about newlines inside a field? Quotes? Different languages?

Once I found RFC 4180, everything made sense. Fields with special characters get wrapped in double quotes. Double quotes inside fields get escaped by doubling them. Use UTF-8 encoding for international characters. Excel sometimes needs a BOM (byte order mark) to recognize UTF-8 properly - this one still trips me up occasionally with European clients.

  • Commas in fields: Wrap the field in double quotes
  • Quotes in fields: Escape as double-double quotes ""
  • Newlines in fields: Wrap in quotes, the newline stays literal
  • UTF-8: Include BOM for Excel compatibility (EF BB BF bytes)
  • Semicolons: Some European locales use ; as delimiter instead of comma
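You rarely need to implement these rules by hand. In Python, for example, the standard `csv` module already quotes and escapes per RFC 4180, and the `utf-8-sig` encoding prepends the BOM Excel wants. A small sketch (file name and sample rows are made up):

```python
import csv

rows = [
    {"product": "Kettle, 1.7 l", "note": 'Says "fast boil"\non the box'},
]

# "utf-8-sig" writes the EF BB BF byte order mark so Excel detects UTF-8.
# newline="" lets the csv module control line endings itself.
with open("export.csv", "w", newline="", encoding="utf-8-sig") as f:
    writer = csv.DictWriter(f, fieldnames=["product", "note"])
    writer.writeheader()
    # Commas, embedded quotes, and newlines are quoted/escaped automatically.
    writer.writerows(rows)
```

For semicolon-delimited output, pass `delimiter=";"` to `DictWriter`.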

Converting CSV back to JSON

The reverse conversion is just as common at Šikulovi s.r.o. Someone gives me a spreadsheet of data to import into our system - often from a client who prefers Excel over any other format. I need to turn those rows and columns back into JSON objects for the API.

The first row becomes property names. Each subsequent row becomes an object. The tricky part is data types - CSV is all strings, but JSON has numbers, booleans, and nulls. I usually auto-detect: if it looks like a number, parse it as a number. Empty cells become null. "true" and "false" become booleans.

  • Headers become keys: First row defines the object property names
  • Type inference: Numbers, booleans, and nulls detected automatically
  • Empty cells: Can be null, empty string, or omitted depending on preference
  • Date parsing: Tricky - Excel dates are numbers, ISO strings are better
  • Nested reconstruction: Dot notation in headers can rebuild nested objects
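The headers-to-keys and type-inference steps above can be sketched like this. The `infer` helper is my own naming, and a real converter would likely add date handling on top:

```python
import csv
import io

def infer(value):
    """Best-effort type inference for a CSV cell (cells always arrive as strings)."""
    if value == "":
        return None                      # empty cell -> null
    if value.lower() in ("true", "false"):
        return value.lower() == "true"   # booleans
    try:
        return int(value)
    except ValueError:
        pass
    try:
        return float(value)
    except ValueError:
        return value                     # fall back to string

raw = "name,age,active,score\nAna,34,true,\n"
records = [
    {key: infer(cell) for key, cell in row.items()}
    for row in csv.DictReader(io.StringIO(raw))
]
# records == [{"name": "Ana", "age": 34, "active": True, "score": None}]
```

Note the footgun: this would also turn a phone number or a zip code with a leading zero into a number, which is why an override for string preservation matters.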

Real-world data migration example

Last month at Šikulovi s.r.o., I migrated 50,000 products from an old e-commerce system to a new one. The old system only exported CSV. The new system wanted JSON for its API. The products had categories, variants, and pricing tiers - all nested in the new schema.

I wrote a mapping configuration: source CSV column to target JSON path. The converter handled the transformation, building nested objects from flat data. What would have been a week of manual work took an afternoon of configuring the mapping and running the conversion.
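A stripped-down sketch of that mapping approach, with hypothetical column names (the real migration config and target schema aren't shown here):

```python
def apply_mapping(row, mapping):
    """Build a nested dict from a flat CSV row using a column -> JSON-path map."""
    target = {}
    for column, path in mapping.items():
        node = target
        *parents, leaf = path.split(".")
        for part in parents:
            # Walk/create intermediate objects along the dot path.
            node = node.setdefault(part, {})
        node[leaf] = row[column]
    return target

# Illustrative mapping: CSV column name -> dot path in the target JSON.
mapping = {"sku": "id", "cat_name": "category.name", "cat_id": "category.id"}
row = {"sku": "P-100", "cat_name": "Kettles", "cat_id": "42"}
product = apply_mapping(row, mapping)
# product == {"id": "P-100", "category": {"name": "Kettles", "id": "42"}}
```

The same loop runs once per CSV row, so 50,000 products is just 50,000 calls plus whatever type coercion the target API needs.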

Using the JSON CSV Converter

The converter handles both directions - I built it because I needed something quick that didn't require running scripts. Paste JSON and get CSV, or paste CSV and get JSON. For JSON to CSV, it automatically detects the array-of-objects pattern and creates appropriate headers. For nested objects, it flattens with dot notation.

I added options for the tricky stuff: delimiter choice (comma or semicolon), quote style, handling of nested arrays, and whether to include a header row. Most of the time the defaults work, but when you are dealing with European Excel or unusual data structures, those options help.

FAQ

Why does Excel show my CSV data in one column?

This frustrated me for years until I understood the issue. Excel might be using a different delimiter based on your locale. In some regions, Excel expects semicolons instead of commas. You can either change the delimiter in the CSV or use Excel's Text Import Wizard (Data > From Text) to specify the delimiter.

How do I handle arrays inside JSON objects?

You have a few options: convert the array to a JSON string in the cell, join array elements with a separator like | or ;, or create multiple rows (one per array element). The best choice depends on what you'll do with the CSV.

What about dates - JSON vs Excel format?

This is a pain point I deal with constantly. JSON typically uses ISO 8601 strings (2024-01-15T10:30:00Z). Excel stores dates as serial numbers internally. When converting, I recommend keeping ISO strings - Excel usually parses them correctly, and they're unambiguous across locales.

Can I convert nested JSON with multiple levels?

Yes, but there are limits. Two or three levels deep flattens well with dot notation. Beyond that, you might want to extract just the parts you need or use multiple CSV files with relationships. Deeply nested data often means the CSV format isn't the right choice.

How do I preserve data types when converting CSV to JSON?

The converter auto-detects types, but you can override this. Numbers stay as numbers, 'true'/'false' become booleans, empty cells become null. If you need a number as a string, prefix it with a quote in the CSV or use the string preservation option.

What is the maximum file size for conversion?

The browser-based converter handles files up to about 50MB comfortably. Beyond that, you might hit memory limits. For very large files, I use command-line tools like jq for JSON processing or csvkit for CSV manipulation.

Martin Šikula

Founder of CodeUtil. Web developer building tools I actually use. When I'm not coding, I experiment with productivity techniques (with mixed success).
