I convert CSV to JSON probably weekly - data imports, API migrations, analytics exports. Here is everything I have learned about doing it right, including the edge cases that used to break my scripts.
At Šikulovi s.r.o., I deal with data from everywhere - client exports from Excel, database dumps, analytics CSVs, third-party APIs. CSV is the universal exchange format that everyone can produce, but JSON is what my applications actually need.
The conversion sounds simple until you hit the edge cases: European number formats, quoted strings containing commas, mixed data types. I built this tool after one too many broken imports.
CSV looks simple - just values separated by commas, right? But there is a lot of hidden complexity. The first row is usually headers, which become your JSON keys. And the delimiter is not always a comma.
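The quoting rules are where hand-rolled parsers usually break. Here is a sketch of a single-line field splitter (my illustration, not the tool's actual code) that handles quoted fields and a configurable delimiter:

```javascript
// Split one CSV line into fields. Handles quoted fields that contain the
// delimiter, and the doubled-quote ("") escape for a literal quote.
// Illustrative helper only - real CSV also allows newlines inside quotes.
function parseCsvLine(line, delimiter = ",") {
  const fields = [];
  let current = "";
  let inQuotes = false;
  for (let i = 0; i < line.length; i++) {
    const ch = line[i];
    if (inQuotes) {
      if (ch === '"') {
        if (line[i + 1] === '"') { current += '"'; i++; } // "" -> literal quote
        else { inQuotes = false; }                        // closing quote
      } else {
        current += ch;
      }
    } else if (ch === '"') {
      inQuotes = true;
    } else if (ch === delimiter) {
      fields.push(current);
      current = "";
    } else {
      current += ch;
    }
  }
  fields.push(current);
  return fields;
}
```

A naive split(",") gets every one of these cases wrong - `a,"b,c",d` is three fields, not four.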
In raw CSV, everything is a string. The number 42 is just the characters "42". Type inference detects numbers, booleans, and null values and converts them to proper JSON types automatically.
I enable this by default because it saves me from writing parseInt() and JSON.parse() calls everywhere in my code. But I can disable it when I need strings to stay strings.
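Type inference of this kind can be sketched in a few lines (my own illustration of the idea, not the tool's exact rules):

```javascript
// Convert a raw CSV cell to a typed JSON value where it is unambiguous.
// Note the number regex only accepts dot decimals, so a European-style
// "1,5" deliberately stays a string rather than being misread.
function inferType(raw) {
  const s = raw.trim();
  if (s === "") return "";            // keep empty cells as empty strings
  if (s === "null") return null;
  if (s === "true") return true;
  if (s === "false") return false;
  if (/^-?\d+(\.\d+)?$/.test(s)) return Number(s);
  return raw;
}
```

This is also why the off switch matters: a value like "007" would become the number 7, which is exactly the situation where I need strings to stay strings.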
Array of Objects is what I use 90% of the time - each row becomes an object with header names as keys. Array of Arrays is for when I need the raw tabular structure, usually for re-exporting or when headers are unreliable.
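The difference between the two shapes, assuming the rows are already split into fields (helper name is mine):

```javascript
// Array of Objects: zip each row with the header names.
function toObjects(headers, rows) {
  return rows.map(row =>
    Object.fromEntries(headers.map((h, i) => [h, row[i]]))
  );
}

// toObjects(["name", "age"], [["Ada", "36"]])
// gives [{ name: "Ada", age: "36" }]
// Array of Arrays is simply [headers, ...rows], with no keys attached.
```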
Sometimes I need to export JSON data for clients who want spreadsheets. The tool extracts object keys as headers and values as rows. Fair warning: nested objects do not translate cleanly to flat CSV - I usually flatten my data first.
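The reverse direction looks roughly like this (a simplified sketch of the idea - it skips quoting, so fields that contain commas would still need escaping):

```javascript
// Turn an array of flat objects into CSV text: keys of the first record
// become the header row, values become the data rows.
function jsonToCsv(records) {
  const headers = Object.keys(records[0]);
  const lines = [headers.join(",")];
  for (const rec of records) {
    lines.push(headers.map(h => String(rec[h] ?? "")).join(","));
  }
  return lines.join("\n");
}
```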
After breaking imports countless times, I now validate after every conversion, because assumptions about data are often wrong.
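The kind of post-conversion check I mean can be very small (hypothetical helper, assuming array-of-objects output): did the row count survive, and does every record have the same keys?

```javascript
// Sanity-check a conversion result: no rows lost, no ragged records.
function validateConversion(records, expectedRowCount) {
  if (records.length !== expectedRowCount) return false;
  const keys = JSON.stringify(Object.keys(records[0] ?? {}));
  return records.every(r => JSON.stringify(Object.keys(r)) === keys);
}
```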
Is my data safe?
Absolutely. This was a priority when I built the tool - all processing happens in your browser, and your CSV never leaves your machine. I would never upload client data to random servers myself.
My CSV uses semicolons instead of commas. What do I do?
Just select semicolon from the delimiter dropdown. I added support for comma, semicolon, tab, and pipe because I have encountered all of them in real data.
What if my CSV has no header row?
Uncheck "First row is header" and the output will use numeric indices (0, 1, 2) as keys. I hit this with machine-generated data that skips the header row.
Can I convert nested JSON to CSV?
Flat JSON arrays work great. Deeply nested structures are tricky because CSV is inherently flat. I usually flatten my data first with a script, or just export the top-level properties.
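That flattening script can be a one-pager. A sketch using dotted-path keys (helper name is mine):

```javascript
// Flatten nested objects into a single level with dotted-path keys,
// e.g. { user: { name: "Ada" } } becomes { "user.name": "Ada" }.
// Arrays are left as-is here - they need a separate decision.
function flatten(obj, prefix = "") {
  const out = {};
  for (const [key, value] of Object.entries(obj)) {
    const path = prefix ? `${prefix}.${key}` : key;
    if (value !== null && typeof value === "object" && !Array.isArray(value)) {
      Object.assign(out, flatten(value, path));
    } else {
      out[path] = value;
    }
  }
  return out;
}
```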
Founder of CodeUtil. Web developer building tools I actually use. When I'm not coding, I experiment with productivity techniques (with mixed success).