Hardware recommendations for running a DataTile instance

DataTile Technology Stack

A DataTile instance consists of the following main components:

  • Java Web Application

  • PostgreSQL server

  • Nginx

  • Redis

  • Application monitoring agent

  • Update management agent

Requirements

DataTile is a containerized application capable of running on a Kubernetes cluster, virtual environment, or physical hardware.

Server Configuration

Recommended platform and configuration for DataTile applications (a version-check sketch follows the list):

  • Preferred OS: Ubuntu 20.04 LTS / 22.04 LTS

  • Docker: 25+

  • Docker Compose v2: 2.24+

  • Open ports: 80, 443

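The version requirements above can be verified before installation. Below is a minimal preflight sketch: the version thresholds come from this article, while the script itself (file name, parsing) is illustrative and not part of DataTile.

    # preflight.py - check Docker and Docker Compose versions (illustrative sketch).
    import re
    import shutil
    import subprocess

    def version_tuple(output):
        """Extract the first dotted version number from CLI output."""
        match = re.search(r"(\d+)\.(\d+)", output)
        if not match:
            raise RuntimeError(f"no version found in: {output!r}")
        return tuple(int(part) for part in match.groups())

    if shutil.which("docker") is None:
        raise SystemExit("docker binary not found on PATH")

    docker_out = subprocess.run(["docker", "--version"],
                                capture_output=True, text=True, check=True).stdout
    compose_out = subprocess.run(["docker", "compose", "version"],
                                 capture_output=True, text=True, check=True).stdout

    assert version_tuple(docker_out) >= (25, 0), f"Docker 25+ required: {docker_out.strip()}"
    assert version_tuple(compose_out) >= (2, 24), f"Compose 2.24+ required: {compose_out.strip()}"
    print("Docker and Compose versions meet the recommendations.")
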
Minimal Hardware Requirements

Minimal hardware requirements for running a DataTile instance on a single physical server or in a virtual environment are as follows (a host-check sketch follows the list):

  • CPU: 4 cores

  • RAM: 4 GB

  • Disk: 40 GB (incl. OS)

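To confirm a host meets these minimums, a stdlib-only check like the following can be run on the target machine. This is a sketch for Linux: the thresholds are the figures above, everything else is illustrative.

    # hostcheck.py - compare a Linux host against the minimal figures above.
    import os
    import shutil

    MIN_CORES, MIN_RAM_GB, MIN_DISK_GB = 4, 4, 40

    def ram_gb():
        """Total RAM in GB, read from /proc/meminfo (Linux only)."""
        with open("/proc/meminfo") as fh:
            for line in fh:
                if line.startswith("MemTotal:"):
                    return int(line.split()[1]) / 1024 ** 2  # kB -> GB
        raise RuntimeError("MemTotal not found in /proc/meminfo")

    cores = os.cpu_count() or 0
    disk_gb = shutil.disk_usage("/").total / 1024 ** 3

    print(f"cores={cores}  ram={ram_gb():.1f} GB  disk={disk_gb:.0f} GB")
    if cores < MIN_CORES or ram_gb() < MIN_RAM_GB or disk_gb < MIN_DISK_GB:
        raise SystemExit("host is below the minimal DataTile requirements")
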
Recommended Provision

For large research data warehouses (tens of millions of interviews), we recommend allocating the following server resources. In production environments, resources can be scaled up gradually as the actual load grows; a sketch for verifying the disk layout follows the list.

  • CPU: 8 cores

  • RAM: 24 GB

  • System Disk: 50 GB

  • Database Disk: 100 GB NVMe

  • Application Data: 100 GB NVMe

  • Backup Disk: 400 GB or scalable S3 mount

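Because the database, application data, and backups are listed as separate disks, it is worth confirming that each path actually resides on its own block device. A small sketch follows; the mount points are hypothetical placeholders, so substitute the paths used in your deployment.

    # disklayout.py - warn if the storage roles share one block device.
    import os

    MOUNTS = {
        "system": "/",
        "database": "/mnt/datatile-db",      # hypothetical mount point
        "app_data": "/mnt/datatile-data",    # hypothetical mount point
        "backup": "/mnt/datatile-backup",    # hypothetical mount point
    }

    devices = {}
    for name, path in MOUNTS.items():
        if not os.path.exists(path):
            print(f"{name}: {path} not mounted, skipping")
            continue
        devices[name] = os.stat(path).st_dev  # device id of the filesystem
        print(f"{name}: {path} -> device {devices[name]}")

    # Each role should map to its own device so database I/O, application
    # data, and backup traffic do not compete for the same disk.
    if len(set(devices.values())) < len(devices):
        print("warning: some roles share a device; consider separate volumes")
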
General guidance on performance

DataTile's performance is influenced by the following factors, listed in descending order of priority (a simple disk-throughput probe follows the list):

  • I/O Speed: A likely constraint on calculation speed, as DataTile stores research data in local disk files. For instance, an SSD can deliver 3-10 times the performance of an HDD.

  • RAM: An essential element for performance. DataTile maps frequently accessed data files into memory to minimize latency. Multi-level in-memory calculation caching enhances overall throughput for concurrent sessions, bulk operations, and automated reporting.

  • CPU Cores: Additional cores decrease latency for parallel sessions and enhance performance on large data sets by distributing calculations across cores. Consider increasing the core count for hosting extensive tracking or syndicated studies, generating substantial reports, or serving many users. Maintain a ratio of at least 2 GB of RAM per core.

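As a quick way to compare storage options, a sequential-read probe like the one below gives a rough throughput figure. This is a Linux-only sketch; the scratch path and file size are arbitrary choices, not DataTile parameters.

    # ioprobe.py - rough sequential-read throughput of the disk behind PATH.
    import os
    import time

    PATH = "/tmp/datatile_io_probe.bin"  # assumed scratch location
    SIZE_MB = 256
    CHUNK = 1024 * 1024  # 1 MB

    # Write a scratch file of random data and flush it to disk.
    with open(PATH, "wb") as fh:
        for _ in range(SIZE_MB):
            fh.write(os.urandom(CHUNK))
        fh.flush()
        os.fsync(fh.fileno())  # ensure the data is on disk, not just cached

    # Ask the kernel to drop the freshly written pages from the page cache,
    # so the timed pass below reads from disk instead of RAM (Linux only).
    with open(PATH, "rb") as fh:
        os.posix_fadvise(fh.fileno(), 0, 0, os.POSIX_FADV_DONTNEED)
        start = time.perf_counter()
        while fh.read(CHUNK):
            pass
        elapsed = time.perf_counter() - start

    os.remove(PATH)
    print(f"sequential read: {SIZE_MB / elapsed:.0f} MB/s")
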
After analyzing the usage profile and gathering APM data, DataTile engineers can provide customized recommendations and conduct performance tuning tailored to your needs.
