Delimited accelerates application development by letting developers configure and embed a pre-built widget that lets their users upload CSV files into the application, eliminating the time otherwise spent writing upload, validation, mapping, UI, testing, and maintenance code.

When you, as the developer, configure Delimited, you need to provide a destination for the user's data.

One destination that Delimited supports is an HTTP POST integration: Delimited can deliver the data to an existing API that you have developed.

After a user uploads a file and Delimited cleans, validates, and maps the data, it will send a POST request to your server's endpoint with a JSON representation of the user's data.

Configuring The Integration

While configuring your schema, there is a field where you can paste your server's URL.

Security

For enhanced security, we recommend that you only use HTTPS endpoints and include an unguessable token as part of your API URL. Your server can then validate that the token matches its pre-configured value. For example, construct a URL like so:

https://api.example.com/incoming/delimited/kpqhvduguoknqxkpdwilaq...

-or-

https://api.example.com/incoming/delimited?key=kpqhvduguoknqxkpdwilaq...

Then, on your server, verify that the provided URL key matches your pre-configured value. Delimited never exposes this URL publicly or to your users; it is used only for communication from Delimited's servers to your API.
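The token check above can be sketched as follows. This is a minimal illustration, not Delimited-provided code; the token value is hypothetical, and you would extract the real token from the request path or query string in your own web framework.

```python
import hmac

# Hypothetical pre-configured token; in practice, load this from
# configuration or a secrets store, not from source code.
EXPECTED_TOKEN = "kpqhvduguoknqxkpdwilaq-example"

def is_authorized(provided_token: str) -> bool:
    # compare_digest performs a constant-time comparison, which avoids
    # leaking information about the token through timing differences
    return hmac.compare_digest(provided_token, EXPECTED_TOKEN)
```

Requests whose token fails this check should be rejected (e.g. with an HTTP 403) before any payload processing occurs.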

Testing the Integration

For testing the integration, we recommend generating a one-time testing "web hook" URL from a site like https://webhook.site and using that to preview how uploads are sent to the server. Free testing sites like Webhook.site often have low payload size limits, so if you choose to test with one of these sites, utilize small test files for the best experience (see Limitations, below).

Limitations

  • Size. Servers often have request payload size limits. While Delimited is capable of sending extremely large files to your server in a streaming fashion, most server software cannot accept large payloads. We recommend that you use this method of integration only if you are expecting users to upload relatively small files (think hundreds or potentially thousands of rows, but probably not millions).
  • Efficiency. By its very nature, a POSTed JSON document has built-in inefficiencies, including data duplication (the header names are repeated for each row) and serialization/deserialization overhead. If you are dealing with high-volume, low-latency situations, a more direct option, such as a direct database integration, is more efficient.

Payload Reference

The payload is JSON structured in the following way (without the explanatory comments, of course):

{
    // The unique identifier for this uploaded data instance
    "uploadId": "<upload-id>",

    // The unique identifier for the schema this upload corresponds to
    "schemaId": "<schema-id>",

    // The UTC ISO 8601 date/time of when the data was uploaded
    "uploadedAt": "2020-05-01T01:02:30.1861391+00:00",

    // The result of the upload. Could be "success" or "error"
    "status": "success",

    // Some statistics about the upload itself
    "statistics": {
        // The number of records processed from the file
        numProcessed: 3,
        // The number of records found to be invalid (did not pass validation)
        numInvalid: 1,
        // The number of records found to be valid
        numValid: 2,
        // The number of bytes processed of the input file. This is the 
        // number that counts against your monthly data limits
        bytes: 7076
    },

    // The validated, mapped, cleaned data from the user
    "data": [

        // Each row in the data file becomes a new object here with
        // property names representing the header names in the file

        {
            "Header1": "Value1"
            // etc...
        }
    ],

    // The version of this JSON delivery structure
    "version": "0.0.1"
}
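As an illustration of consuming this structure, the sketch below parses a payload shaped like the reference above and iterates over the rows. The payload values (IDs, header names, cell values) are hypothetical placeholders; your real handler would read the request body instead of a string literal.

```python
import json

# Hypothetical example payload, abbreviated to the fields shown
# in the reference above
payload = json.loads("""
{
    "uploadId": "upload-123",
    "schemaId": "schema-456",
    "uploadedAt": "2020-05-01T01:02:30.1861391+00:00",
    "status": "success",
    "statistics": {"numProcessed": 3, "numInvalid": 1, "numValid": 2, "bytes": 7076},
    "data": [
        {"Header1": "Value1"},
        {"Header1": "Value2"}
    ],
    "version": "0.0.1"
}
""")

if payload["status"] == "success":
    for row in payload["data"]:
        # Each row is an object mapping header names to cell values
        print(row["Header1"])
```

Checking `status` before touching `data` ensures that error deliveries are handled separately rather than processed as rows.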

The simplest way to get started with Delimited is to create your very own schema. Give it a try with a free trial!