Automated upload via Data Collector API


The Data Collector API operates on behalf of a particular user, identified by their API token.

Required roles:

  • Database creator — allows the user to upload data and create databases.

  • Data collector — enables this API and related functionality.

If you are not the owner of the folders or databases you plan to load data into, please ensure you have 'Edit' permissions for the target entities.

OpenAPI Specification:

Acquiring an API Token

The API token is required to authenticate API calls.

  1. Go to the Profile page and scroll down to the Token section.

  2. Generate a new token if required.
    Note! By generating a new token, you automatically invalidate previous tokens.

  3. Copy the Token and employ it in your data push script.

Automated Data Processing

Data is uploaded as a standard HTTP POST of multipart/form-data.

  1. If the POST request is authenticated successfully, it is added to the processing queue. In response, the server provides the ID of the processing job. Use this ID to poll the status of the job.

  2. The server may reject the request due to quota limitations. Quotas vary between servers.

  3. The DATATILE server notifies the authenticated user about each status transition by email. The server also pushes corresponding events to the callback URL if the latter is provided.

You can also check the journal of all processing jobs in the Drive. Log in to the DataTile server and proceed to the "Processing log" section.
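The polling step above can be sketched as follows. Note that the status-endpoint call and the terminal status names used here are assumptions; check the OpenAPI specification for your server's actual endpoints and status values.

```python
import time

def wait_for_job(get_status, job_id, poll_interval=5.0, timeout=600.0):
    """Poll a processing job until it reaches a terminal state.

    `get_status` is a caller-supplied function that queries the job's
    current status string (e.g. via the status endpoint, authenticated
    with your API token). The terminal status names checked below
    ("done", "failed", "rejected") are assumptions, not confirmed values.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = get_status(job_id)
        if status in ("done", "failed", "rejected"):
            return status
        time.sleep(poll_interval)  # wait before polling again
    raise TimeoutError(f"job {job_id} did not finish within {timeout} seconds")
```

In practice, you may prefer to rely on the email notifications or the callback URL rather than polling, especially for long-running jobs.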

Upload Mode

The API is a programmatic proxy for the same operations that a user can perform via the interface. Three upload modes are available:

  • Create — the target can be any folder except the root. DataTile will create a new database in the target folder.

  • Replace — full reload of the target database. Existing data will be replaced by the data in the uploaded file.

  • Append — appends to the target database. New data will be added to the existing database.

The user should be the owner or have sufficient operational permissions on the target entities (folders or databases).

Code snippet in Python

The script below creates a new database in the folder with the provided hash.

from requests import post

token = "xxx"   # security token from your Profile page
action = "create"
hash = "yyy"    # hash of the target folder where the database will be created
# Note: prepend your DataTile server's Data Collector endpoint to form the full URL
url = f"{action}/{hash}"

res = post(url,
           headers={"token": token},
           files={"dataset": open("laptops.zsav", "rb")})

print(f"RESPONSE: {res.status_code}, {res.text}")

API Quotas

  1. DataTile rejects subsequent requests to the same destination while the previous one is still in progress.

  2. The uploaded file's size is limited in both the API and the user interface.

  3. The rate of uploads is limited globally on the server level. Contact your Administrator if you believe current quotas are insufficient for your operations.

Quotas can vary to ensure a better experience and sustainability of your DataTile instance.
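Because a request to a busy destination is rejected rather than queued, an automated upload script should be prepared to retry. The sketch below assumes the rejection surfaces as HTTP 429; check your server's actual response codes. The `do_post` callable stands in for one upload attempt (e.g. the `post(...)` call from the snippet above) and is a hypothetical parameter introduced for illustration.

```python
import time

def upload_with_retry(do_post, retries=5, backoff=60.0):
    """Retry an upload while the server rejects it because a previous
    job to the same destination is still running.

    `do_post` performs one upload attempt and returns the HTTP status
    code. Rejection is assumed to surface as HTTP 429; verify the
    actual code your DataTile server returns.
    """
    for attempt in range(retries):
        status = do_post()
        if status != 429:
            return status
        time.sleep(backoff * (attempt + 1))  # linear backoff between attempts
    raise RuntimeError(f"upload still rejected after {retries} attempts")
```

A fixed or linear backoff is usually enough here, since processing jobs finish on the order of minutes rather than milliseconds.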
