Data Handling
Last updated: September 2024
This page describes how InfoTiles processes utility data in the course of providing services for commercial engagements. It is meant to provide a general overview; if you have technical questions, please book a call and we will connect you to the relevant expert to answer your queries.
General Data Lifecycle
Depending on the products utilised, some or all of the following steps may be applied to your data.
- Ingestion is the step of acquiring data from our customers and making it available to the InfoTiles Platform.
- Storage ensures that your data is available for visualisation, further processing, or export.
- Processing occurs when our proprietary machine learning and AI algorithms enrich your data and otherwise analyse it to provide actionable insights.
- Visualisation describes the process of displaying dashboards, charts and other visuals based on your data and the enhancements made to it.
- Export is the process of extracting utility data and the enrichments from the InfoTiles platform to make it available for other workloads.
- Destruction of utility data involves purging the data from InfoTiles systems.
Encryption
In Transit
All protocols used by InfoTiles encrypt traffic while it is in transit over the public internet. Additional security measures, such as file-level encryption, can be accommodated provided they are well documented.
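For illustration only, the following minimal Python sketch shows what file-level encryption of a payload before transfer might look like, assuming a symmetric key exchanged out of band; the file names and key handling are placeholders and do not represent a prescribed InfoTiles workflow.

```python
# Illustrative sketch: encrypting a file before transfer with a pre-shared
# symmetric key. File names and key handling are placeholders only.
from cryptography.fernet import Fernet

# In practice the key would be generated once (Fernet.generate_key())
# and exchanged securely out of band.
with open("transfer.key", "rb") as key_file:
    key = key_file.read()

fernet = Fernet(key)

with open("meter_readings.csv", "rb") as plaintext:
    encrypted = fernet.encrypt(plaintext.read())

with open("meter_readings.csv.enc", "wb") as ciphertext:
    ciphertext.write(encrypted)
```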
At Rest
InfoTiles uses Microsoft Azure to store all customer data. Whether it is stored via Platform as a Service (PaaS) or Infrastructure as a Service (IaaS) offerings, the data is encrypted at the disk level, protecting it in the unlikely event that disk drives are physically obtained. InfoTiles uses Azure-managed encryption keys.
Ingestion
Data can be ingested from different channels. All of them are secured by encryption while in transit from the customer to our Azure endpoints (e.g. SSL / TLS).
- Email file transfer
- Secure FTP (SFTP)
- Secure upload through InfoTiles Applications*
- API (Push or Pull)
- Azure Data Factory
- Azure Blob Storage
As a general rule, InfoTiles staff will not download data from customer systems that require a login. This means that InfoTiles employees do not need to be granted access to customer systems.
* When data is uploaded to InfoTiles via our application, we use a service provided by ComDocks GmbH, based in Germany. Customer data is handled in the browser and does not pass through their servers. You can view their privacy policy here.
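To illustrate one of the channels above, the sketch below uploads a file to an Azure Blob Storage container over HTTPS using the official Python SDK; the storage account URL, container name and SAS token are placeholders that would be agreed during onboarding, not real endpoints.

```python
# Illustrative sketch: pushing a data file to an Azure Blob Storage container
# over TLS. The account URL, container and SAS token are placeholders
# supplied during onboarding, not real endpoints.
from azure.storage.blob import BlobServiceClient

ACCOUNT_URL = "https://<storage-account>.blob.core.windows.net"
SAS_TOKEN = "<sas-token-provided-by-infotiles>"

service = BlobServiceClient(account_url=ACCOUNT_URL, credential=SAS_TOKEN)
blob = service.get_blob_client(container="ingest", blob="meter_readings.csv")

# Traffic to *.blob.core.windows.net is encrypted in transit (HTTPS/TLS).
with open("meter_readings.csv", "rb") as data:
    blob.upload_blob(data, overwrite=True)
```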
Storage
Customer data is stored exclusively in Microsoft Azure. Depending on the customer’s location, different data centres are used. As of 2024, InfoTiles houses data in Microsoft Azure Data Centres in:
- Norway (norwayeast)
- Netherlands (westeurope)
- Ireland (northeurope)
- London (uksouth)
Data is encrypted at rest using Microsoft-managed encryption keys. Customer data is stored and accessed in separate indexes, dedicated to each data type for each individual customer.
For certain customers using only PipeFusion, we utilise a cloud service from Elasticsearch B.V. While data is still stored within the above Azure Data Centres, the storage account is managed by Elasticsearch B.V. You can view their trust statement here. Customer data held in the Elasticsearch Cloud service is stored separately, in deployments dedicated to a single customer. Customers requiring a different chain of custody may request that data be restricted to InfoTiles' own (or another nominated) Azure subscription for an additional fee.
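As an illustration of this separation, the sketch below writes a record to an index dedicated to a single customer and data type using the official Elasticsearch Python client; the endpoint, credentials, index naming convention and field names are assumptions for illustration and do not reflect InfoTiles' actual schema.

```python
# Illustrative sketch: writing a record to an index dedicated to one customer
# and one data type. The endpoint, API key, index name and document fields
# are placeholders, not InfoTiles' actual schema.
from datetime import datetime, timezone
from elasticsearch import Elasticsearch

es = Elasticsearch(
    "https://<deployment>.es.westeurope.azure.elastic-cloud.com:9243",
    api_key="<api-key>",
)

# Hypothetical per-customer, per-data-type index.
index_name = "customer-acme-flow-telemetry"

es.index(
    index=index_name,
    document={
        "sensor_id": "pump-station-07",
        "flow_l_per_s": 42.3,
        "@timestamp": datetime.now(timezone.utc).isoformat(),
    },
)
```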
Processing
For production data, processing takes place exclusively within InfoTiles' Microsoft Azure subscriptions. Depending on the workload, InfoTiles uses Azure Service Bus and Infrastructure as a Service products.
AI & Machine Learning algorithms are developed by InfoTiles Data Scientists and Software Engineers according to our Application Lifecycle Management Policies, which include:
- Cybersecurity Policy
- Access Control Policy
- Secure Development Policy
- Secure System Engineering Principles
- Data Destruction Policy
Processing itself takes place on managed infrastructure hosted in Microsoft Azure. We strictly control access to this infrastructure using Microsoft Entra with multi-factor authentication and retain access logs. AI/ML workloads are containerised and keep customer data isolated. Utility data is not permanently stored on these infrastructure hosts; rather, data is retrieved, processed and then written back to the dedicated storage systems.
Data is not transferred outside of our internal networks during processing, and connections to data stores are encrypted.
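The retrieve, process and write-back pattern described above can be sketched as follows using the Azure Service Bus Python SDK; the queue name, message shape and enrichment step are illustrative placeholders rather than InfoTiles' actual implementation.

```python
# Illustrative sketch of the retrieve -> process -> write back pattern.
# Queue name, connection string and the enrichment step are placeholders.
import json
from azure.servicebus import ServiceBusClient

CONNECTION_STR = "<service-bus-connection-string>"
QUEUE_NAME = "enrichment-jobs"

def enrich(reading: dict) -> dict:
    # Placeholder for a containerised AI/ML enrichment step.
    reading["anomaly_score"] = 0.0
    return reading

with ServiceBusClient.from_connection_string(CONNECTION_STR) as client:
    with client.get_queue_receiver(queue_name=QUEUE_NAME) as receiver:
        for message in receiver.receive_messages(max_message_count=10, max_wait_time=5):
            reading = json.loads(str(message))
            enriched = enrich(reading)
            # In practice the enriched result is written back to the
            # customer's dedicated storage; that write call would go here.
            receiver.complete_message(message)
```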
Visualisation
Visualisation happens using 3rd party tools hosted in Azure Data Centres. As of 2024, InfoTiles processes visualisation of customer data in Microsoft Azure Data Centres in:
- Norway (norwayeast)
- Netherlands (westeurope)
- Ireland (northeurope)
- London (uksouth)
Depending on the nature of each customer's engagement, the following may apply; your customer success manager will be able to confirm the exact arrangement for your engagement.
- Smaller engagements will typically be visualised using Kibana Cloud hosted in Azure Data Centres on servers within Elasticsearch B.V.'s subscription and deployed by InfoTiles.
- Larger engagements will typically be visualised using Kibana hosted in Azure Data Centres on servers within InfoTiles’ Subscription.
- Legacy engagements (pre-July 2024) are visualised using OpenSearch Dashboards hosted in Azure Data Centres on servers within InfoTiles' Subscription.
A benefit of utilising Elasticsearch and OpenSearch for visualisation is that any visuals created during an InfoTiles subscription period can be exported, along with the underlying enriched data, and hosted by customers for no licence fee, reducing the risks associated with vendor lock-in.
Users connect to our visualisation applications from their browsers using encrypted connections (HTTPS). User authentication and authorisation to view data is controlled by the customer's IT department via Microsoft Entra unless otherwise requested.
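Because visualisations are built on standard saved objects, a customer could, for example, export dashboards using Kibana's saved-objects API for re-hosting after a subscription ends; the sketch below shows the general shape of such a call, with the host and credentials as placeholders.

```python
# Illustrative sketch: exporting Kibana dashboards as NDJSON via the standard
# saved-objects API so they can be re-hosted later. Host and credentials are
# placeholders only.
import requests

KIBANA_URL = "https://<kibana-host>:5601"

response = requests.post(
    f"{KIBANA_URL}/api/saved_objects/_export",
    headers={"kbn-xsrf": "true", "Content-Type": "application/json"},
    json={"type": "dashboard", "includeReferencesDeep": True},
    auth=("<username>", "<password>"),
    timeout=30,
)
response.raise_for_status()

with open("dashboards.ndjson", "wb") as out:
    out.write(response.content)
```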
Export
Export can take the form of:
- Download from the Visualisation Web Application
- API data requests
- Email notifications
- Bulk file transfer
All of the above exports are conducted over encrypted connections, and additional protections such as file-level encryption can be added to bulk file transfers.
Download and API exports take place over encrypted SSL/TLS connections and respect the authentication and authorisation in place for the underlying data.
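As an illustration of an authenticated API export, the sketch below acquires a token from Microsoft Entra using MSAL and calls a hypothetical export endpoint over HTTPS; the tenant, client, scope and endpoint URL are placeholders, not a published InfoTiles API.

```python
# Illustrative sketch: an authenticated API export over HTTPS. The tenant,
# client, scope and endpoint URL are hypothetical placeholders, not a
# published InfoTiles API.
import msal
import requests

app = msal.ConfidentialClientApplication(
    client_id="<client-id>",
    authority="https://login.microsoftonline.com/<tenant-id>",
    client_credential="<client-secret>",
)

token = app.acquire_token_for_client(scopes=["<api-scope>/.default"])

response = requests.get(
    "https://<infotiles-endpoint>/export/readings",  # hypothetical endpoint
    headers={"Authorization": f"Bearer {token['access_token']}"},
    params={"from": "2024-09-01", "to": "2024-09-30"},
    timeout=60,
)
response.raise_for_status()
```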
Email notifications are classified as a form of 'export' because they involve summarised data leaving InfoTiles-controlled systems. These services are developed in response to customer requests.
Bulk file transfers are available to customers who wish to export the entirety of their data from InfoTiles for use in other systems. They are arranged upon a duly authorised customer request.
Destruction
Destruction of data may be requested at the conclusion of a project or subscription. Destruction is carried out in accordance with our Data Destruction Policy, which is available upon request. As a general rule we do not purge backups, but we provide written confirmation of when they will be purged according to schedule, which is typically less than 60 days after the removal of data from InfoTiles systems.
Partners
InfoTiles collaborates with Arup on projects, combining products and functionality to solve unique challenges. These engagements require the sharing of operational telemetry and may involve co-created AI/ML algorithms being deployed as part of the Processing stage.
Data Breaches
InfoTiles has a public Data Breach Policy and an internal Incident Response Plan. The Data Breach Policy outlines how we will respond in the event of a data breach.
We also maintain a Security Reporting system through which security vulnerabilities can be reported to us.
As of September 2024, there have been no data breaches.