Data Flow - Introduction
The Data Flow Module lets you import, contextualize, and transform large data sets in bulk; together, these steps make up the data fusion process.
- Data Import: import from an external cloud to Aether.
- Data Contextualization: automatically linking a data collection with one or more asset identifiers (e.g., a portion of an electric network).
- Data Transformation: transforming, enriching, and consolidating a data collection (e.g., point cloud classification).
The contextualization and transformation steps are optional.
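The three-step process above can be sketched as a simple pipeline in which the last two steps are optional. This is an illustrative sketch only; the function and parameter names are assumptions, not the actual Aether API:

```python
# Minimal sketch of the data fusion pipeline: import, then optional
# contextualization and transformation. All names here are hypothetical.

def import_data(source_uri: str) -> dict:
    """Import a data collection from an external object store (illustrative)."""
    return {"source": source_uri, "records": []}

def contextualize(collection: dict, asset_ids: list) -> dict:
    """Link the collection with one or more asset identifiers (optional step)."""
    collection["assets"] = asset_ids
    return collection

def transform(collection: dict, operation: str) -> dict:
    """Apply a transformation such as point cloud classification (optional step)."""
    collection.setdefault("transformations", []).append(operation)
    return collection

def run_data_fusion(source_uri, asset_ids=None, operation=None):
    collection = import_data(source_uri)
    if asset_ids:        # contextualization is optional
        collection = contextualize(collection, asset_ids)
    if operation:        # transformation is optional
        collection = transform(collection, operation)
    return collection
```

Calling `run_data_fusion("s3://bucket/scan.las")` alone performs a bare import, while passing `asset_ids` and `operation` exercises all three steps.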
The data fusion steps are driven by a “Data Stream”, which is created at the beginning of the process.
- A Data Stream is a consistent set of measurements and the associated data fusion parameters used to update the asset state. Example: a collection of lidar data in .LAS format from feeder “1111”, collected in July 2022 at 70 m AGL.
- A Data Stream is specific to a use case: for each use case, a Data Stream template has to be configured by a configurator (currently the Alteia team).
- A Data Stream template describes the import (such as data source type), contextualization, and transformation parameters to be applied to the data.
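Conceptually, such a template could look like the configuration below. This is a hypothetical sketch to show the three parameter groups; every field name is an assumption, not the actual template schema:

```yaml
# Hypothetical Data Stream template (field names are illustrative only)
template: vegetation-encroachment
import:
  source_type: s3            # Amazon S3 or Azure Blob Storage
  file_format: las
contextualization:
  asset_type: feeder         # link each point cloud to a feeder id
transformation:
  operations:
    - point-cloud-classification
```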
Today, the Data Flow Module supports the following use cases:
Vegetation encroachment analysis:
- Import of lidar (.las) data from object storage (Amazon or Azure) or satellite images from SpatioTemporal Asset Catalogs (STAC)
- Contextualization: linking the lidar data (point cloud) with its feeder id
- Transformation: point cloud classification
- After processing, the data stream can be used within the Analysis Module (Analysis - Introduction) and the Asset Viewer Module (Insight - Introduction)
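The contextualization step for this use case links each point cloud to a feeder id. The platform does this automatically; as a rough illustration only, the sketch below derives a feeder id from an invented filename convention such as `feeder-1111_2022-07.las` (the convention and function name are assumptions, not platform behavior):

```python
import re

def feeder_id_from_filename(name):
    """Extract a feeder id from a filename following the hypothetical
    convention 'feeder-<id>_<date>.las'. Returns None if no match."""
    match = re.match(r"feeder-(\d+)_", name)
    return match.group(1) if match else None
```

For example, `feeder_id_from_filename("feeder-1111_2022-07.las")` yields `"1111"`.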
Bulk import of data to Your Sites Module:
- Import of raster (e.g., satellite tiles), vector, mesh, and point cloud files into a specific Site of the Data Studio Module from object storage (Amazon or Azure)
- No automatic contextualization or transformation
- After the data import, the data stream data can be visualized and used as input to analytics within the Data Studio Module.
Compatible data sources:
- Amazon S3 Object storage
- Azure Blob Storage
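A source location on either backend can be recognized from its URI. The sketch below classifies a URI as Amazon S3 or Azure Blob Storage; the function name and the `s3://` / `*.blob.core.windows.net` conventions are standard for those services, but this helper is an illustration, not part of the platform:

```python
from urllib.parse import urlparse

def storage_backend(uri):
    """Classify a source URI as Amazon S3 or Azure Blob Storage (sketch)."""
    parsed = urlparse(uri)
    if parsed.scheme == "s3":
        return "amazon-s3"
    if parsed.hostname and parsed.hostname.endswith(".blob.core.windows.net"):
        return "azure-blob"
    raise ValueError(f"Unsupported data source: {uri}")
```

For instance, `s3://my-bucket/scan.las` resolves to Amazon S3, while `https://myaccount.blob.core.windows.net/tiles/t1.tif` resolves to Azure Blob Storage.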
Compatible file types:
A specific user role is required to access the Module.
Data Stream templates must be configured for the account.
4. General Overview
To access the Module, click the icon in the left menu.
5. Related articles
Data stream management:
- Data Flow - Data Stream Creation
- Data Flow - Data Stream Completion
- Data Flow - Cancel a Data Stream
Data stream monitoring: