Citrus Surf 🏄
JSON to SQL Converter
Convert JSON arrays of objects into SQL VALUES clauses, UPDATE loops, INSERT statements, or UPSERT operations ready to run against your database.
About JSON to SQL Converter
Transform JSON data into SQL statements for database operations

What is a JSON to SQL Converter?

A JSON to SQL converter transforms JSON data (arrays of objects) into SQL statements that can be executed in a database. This tool is perfect for converting API responses, exported data, or any structured JSON into database-ready SQL queries, including INSERT statements, UPDATE loops, and VALUES clauses.

Common Use Cases

  • API Data Import: Convert JSON responses from APIs directly into SQL for database storage.
  • Data Migration: Transform JSON exports from NoSQL databases or applications into SQL for relational databases.
  • Bulk Operations: Generate UPDATE or INSERT statements for multiple records from JSON data.
  • Testing & Development: Create SQL statements from JSON test data or mock API responses.
  • Data Synchronization: Convert JSON data from external systems into SQL for database updates.

How to Use This Tool

  1. Paste your JSON array of objects into the input field.
  2. Choose your desired output format (VALUES, UPDATE Loop, INSERT Loop, or UPSERT Loop).
  3. For UPDATE/INSERT/UPSERT loops, enter your table name and configure column mappings.
  4. Specify type casting for columns that need it (e.g., ::uuid, ::timestamp).
  5. Click "Convert to SQL" to generate the SQL statements.
  6. Copy the generated SQL and execute it in your database.
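To make the conversion in step 5 concrete, here is a minimal sketch of how a JSON array can be turned into a VALUES clause. This is an illustration only, not the tool's actual implementation; the function and variable names are invented:

```python
import json

def to_values_clause(json_text: str) -> str:
    """Convert a JSON array of objects into a SQL VALUES clause (sketch)."""
    rows = json.loads(json_text)
    columns = list(rows[0].keys())

    def quote(value):
        # Escape single quotes by doubling them, per SQL string-literal rules.
        if value is None:
            return "NULL"
        return "'" + str(value).replace("'", "''") + "'"

    lines = [
        "(" + ", ".join(quote(row[col]) for col in columns) + ")"
        for row in rows
    ]
    return "VALUES\n  " + ",\n  ".join(lines)

sql = to_values_clause('[{"name": "John Doe", "age": 30}]')
# → VALUES
#     ('John Doe', '30')
```

Note that without type casting every value is rendered as a text literal, which is why the casting options described below matter.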

Output Formats

  • VALUES: Generates a simple VALUES clause that can be used in INSERT statements or CTEs.
  • UPDATE Loop: Creates a PostgreSQL DO block that updates existing records based on a WHERE condition.
  • INSERT Loop: Generates a PostgreSQL DO block that inserts new records into the specified table.
  • UPSERT Loop: Creates a PostgreSQL DO block that uses INSERT with ON CONFLICT to either insert new records or update existing ones based on a conflict column (unique constraint or index). Perfect for data synchronization when you want to handle both inserts and updates in a single operation.

JSON Structure Requirements

Your JSON should be an array of objects where each object represents a row of data:

[
  {
    "id": "123e4567-e89b-12d3-a456-426614174000",
    "name": "John Doe",
    "email": "john@example.com",
    "age": 30
  },
  {
    "id": "987fcdeb-51a2-43d1-9f12-345678901234",
    "name": "Jane Smith",
    "email": "jane@example.com",
    "age": 25
  }
]
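A quick way to check that your input meets this requirement is to verify that every element is an object and that all objects share the same keys. The sketch below is a hypothetical pre-flight check, not part of the tool:

```python
import json

def validate_rows(json_text: str) -> list:
    """Parse JSON and confirm it is a non-empty array of same-shaped objects (sketch)."""
    data = json.loads(json_text)
    if not isinstance(data, list) or not data:
        raise ValueError("Input must be a non-empty JSON array")
    if not all(isinstance(row, dict) for row in data):
        raise ValueError("Every element must be an object")
    first_keys = set(data[0].keys())
    for i, row in enumerate(data):
        if set(row.keys()) != first_keys:
            raise ValueError(f"Row {i} has different keys than row 0")
    return data
```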

Type Casting

The tool supports PostgreSQL type casting to ensure your data is properly typed in the database. Without casting, all values are inserted as text strings. Casting tells PostgreSQL to interpret and validate values as the correct type, catching errors early and ensuring data integrity. Common casts include:

  • ::uuid - For UUID fields (validates format)
  • ::timestamp / ::timestamptz - For datetime fields
  • ::integer / ::numeric - For whole or decimal numbers
  • ::boolean - For true/false values
  • ::jsonb / ::json - For JSON data (see below)
  • Custom types - Any PostgreSQL type or enum
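Mechanically, a cast is just a ::type suffix appended to the quoted literal. A minimal sketch (invented helper, not the tool's code):

```python
def render_value(value, cast=None) -> str:
    """Render a value as a SQL literal, with an optional ::type cast appended (sketch)."""
    if value is None:
        literal = "NULL"
    else:
        # Escape single quotes by doubling them.
        literal = "'" + str(value).replace("'", "''") + "'"
    return literal + (cast or "")

render_value("123e4567-e89b-12d3-a456-426614174000", "::uuid")
# → '123e4567-e89b-12d3-a456-426614174000'::uuid
```

PostgreSQL then validates the literal against the target type at execution time, so a malformed UUID or timestamp fails immediately rather than being silently stored as text.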

JSONB & JSON Columns

When you apply ::jsonb or ::json casting, the tool does something special: instead of inserting a raw JSON string like '{"city":"Portland"}'::jsonb, it generates PostgreSQL builder expressions:

-- Object values become:
jsonb_build_object('city', 'Portland', 'state', 'OR')

-- Array values become:
jsonb_build_array(1, 2, 3)

-- Nested structures are handled recursively:
jsonb_build_object('address', jsonb_build_object('city', 'Portland'))

This approach is more robust than string literals because PostgreSQL validates each key and value individually, making it easier to spot errors. It also handles quoting and escaping automatically, so you don't have to worry about special characters breaking your SQL.
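The recursive expansion described above can be sketched as follows. This is an illustrative re-implementation in Python, not the tool's source; it assumes keys contain no quotes:

```python
def to_jsonb_expr(value) -> str:
    """Render a value as a PostgreSQL jsonb_build_* expression, recursively (sketch)."""
    if isinstance(value, dict):
        # Keys are assumed simple (no quotes) for this sketch.
        args = ", ".join(f"'{k}', {to_jsonb_expr(v)}" for k, v in value.items())
        return f"jsonb_build_object({args})"
    if isinstance(value, list):
        return "jsonb_build_array(" + ", ".join(to_jsonb_expr(v) for v in value) + ")"
    if isinstance(value, bool):  # must precede the int check: bool is a subclass of int
        return "true" if value else "false"
    if isinstance(value, (int, float)):
        return str(value)
    if value is None:
        return "NULL"
    return "'" + str(value).replace("'", "''") + "'"

to_jsonb_expr({"address": {"city": "Portland"}})
# → jsonb_build_object('address', jsonb_build_object('city', 'Portland'))
```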

Tips for Best Results

  • Consistent Structure: Ensure all objects in your JSON array have the same property structure.
  • Data Types: Use appropriate type casting for UUIDs, timestamps, and other special data types.
  • Test First: Always test generated SQL on a small dataset before running on production data.
  • Backup: Create database backups before executing large update operations.
  • Validation: Validate your JSON format before conversion to avoid errors.

UPSERT Loop Details

The UPSERT Loop uses PostgreSQL's native INSERT ... ON CONFLICT ... DO UPDATE syntax to handle both inserts and updates automatically. Key points:

  • Conflict Column(s): Specify the column(s) that have a unique constraint or index. For composite keys, use comma-separated column names (e.g., "col1, col2"). When a conflict occurs, the row is updated instead of inserted.
  • Automatic Updates: All columns except the conflict column(s) are automatically updated when a conflict is detected.
  • Type Safety: Type casting is applied to ensure data consistency between insert and update operations.
  • Database Constraint Required: Ensure your table has a unique constraint or index on the conflict column(s) for this pattern to work correctly.
  • Composite Keys: Perfect for tables with composite unique constraints like @@unique([col1, col2]) in Prisma models.
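Putting the points above together, each row produces one INSERT ... ON CONFLICT ... DO UPDATE statement where every non-conflict column is reassigned from EXCLUDED. The sketch below shows the shape of such a statement; the function, table, and column names are hypothetical:

```python
def upsert_statement(table: str, row: dict, conflict_cols: list) -> str:
    """Build one INSERT ... ON CONFLICT ... DO UPDATE statement for a row (sketch)."""
    cols = list(row.keys())
    values = ", ".join(
        "NULL" if v is None else "'" + str(v).replace("'", "''") + "'"
        for v in row.values()
    )
    # Every column except the conflict column(s) gets updated on conflict.
    update_cols = [c for c in cols if c not in conflict_cols]
    set_clause = ", ".join(f"{c} = EXCLUDED.{c}" for c in update_cols)
    return (
        f"INSERT INTO {table} ({', '.join(cols)}) VALUES ({values}) "
        f"ON CONFLICT ({', '.join(conflict_cols)}) DO UPDATE SET {set_clause};"
    )

upsert_statement("users", {"id": "1", "name": "John Doe"}, ["id"])
```

For a composite key, passing ["col1", "col2"] as the conflict columns yields ON CONFLICT (col1, col2), matching the comma-separated input described above.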

Security & Privacy

All JSON processing happens locally in your browser. Your data never leaves your device, ensuring complete privacy and security. However, always verify generated SQL before executing it on sensitive databases.