Introduction
MindBridge’s integration with Azure Data Factory (ADF) enables customers to automate and orchestrate the flow of financial data into MindBridge using cloud-native pipelines. This powerful combination allows organizations to connect to one or more source systems, transform data, and send it to MindBridge for analysis—on their own schedule and terms.
This guide shows how to integrate MindBridge with ADF and provides implementation guidance for data engineering and technical teams at MindBridge customer organizations.
How MindBridge’s Azure Data Factory integration works
MindBridge offers a REST API endpoint (POST /v1/json-tables) that enables structured financial datasets to be uploaded directly in JSON format. ADF serves as the orchestration layer, managing the data extraction, transformation, and delivery to MindBridge. This allows customers to work within their existing architecture.
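As a rough illustration of what a pipeline ultimately sends, the sketch below assembles a tabular dataset as a JSON-serializable structure. The field names ("name", "columns", "rows") are illustrative assumptions for this sketch, not MindBridge's documented schema; consult the MindBridge API reference for the actual payload format.

```python
import json

def build_json_table(name, columns, rows):
    """Assemble a simple tabular payload as a JSON-serializable dict.

    The shape used here (name/columns/rows) is a hypothetical example,
    not the documented /v1/json-tables schema.
    """
    return {
        "name": name,
        "columns": columns,
        # Pair each row's values with its column names.
        "rows": [dict(zip(columns, row)) for row in rows],
    }

table = build_json_table(
    "general_ledger",
    ["transaction_id", "date", "amount"],
    [["TX-001", "2024-01-31", 125.50]],
)
payload = json.dumps(table)  # body for the POST request
```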
Key aspects of the integration
- Customer-controlled orchestration: Customers retain full control over when and how data is sent to MindBridge.
- Multiple system support: ADF supports connections to a wide range of ERP, cloud, and on-premises systems.
- Minimal technical overhead: ADF pipelines work directly with MindBridge’s API—no additional tools are required.
- Scalable architecture: Works for both small teams and large enterprises, supporting one-time uploads or recurring, automated jobs.
Getting started
To connect Azure Data Factory with MindBridge:
- Extract: Connect ADF to data sources such as ERP systems, databases, or data lakes.
- Transform: Use ADF Mapping Data Flows to reshape data into the JSON format required by MindBridge.
- Ingest: Set up a Web Activity to send the transformed data to the /v1/json-tables endpoint via a POST request.
- Monitor: Use ADF’s monitoring tools to validate success, handle retries, and track pipeline health.
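The Ingest and Monitor steps above can be sketched in plain Python. In ADF itself, the Web Activity performs the HTTP call and the retry policy is configured on the activity; here, `send` is a stand-in for that transport, and the retry count and back-off are assumptions for illustration, not MindBridge or ADF defaults.

```python
import time

ENDPOINT = "/v1/json-tables"  # endpoint named in this guide

def post_with_retries(send, payload, attempts=3, delay=0.0):
    """Call `send(endpoint, payload)` until it reports success (HTTP 2xx).

    `send` stands in for the actual HTTP transport; returns True on success.
    """
    for _ in range(attempts):
        status = send(ENDPOINT, payload)
        if 200 <= status < 300:
            return True
        time.sleep(delay)  # back off before retrying a transient failure
    return False

# Stub transport that fails once with 503, then succeeds with 201:
calls = []
def flaky_send(endpoint, payload):
    calls.append(endpoint)
    return 503 if len(calls) == 1 else 201

ok = post_with_retries(flaky_send, {"rows": []})
```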
Example use case
Scenario
A multinational enterprise operates across multiple ERP systems, including Oracle NetSuite and Microsoft Dynamics 365. They want to centralize and automate general ledger ingestion into MindBridge without altering their existing technical architecture.
Azure Data Factory gives the team full control over when and how data is sent to MindBridge. By orchestrating the integration with ADF, they can:
- Ingest data from multiple ERP systems on a scheduled cadence
- Transform data in-flight to meet MindBridge’s requirements
- Avoid manual file uploads and maintain existing security boundaries
Example workflow
Data extraction
ADF connects to ERP systems using native connectors, ODBC/JDBC interfaces, REST/SOAP APIs, and other methods.
Transformation
ADF Mapping Data Flows apply business rules, perform data quality checks, and reshape the data to match MindBridge’s ingestion format. This includes joining data from multiple sources, standardizing fields, applying conditional logic, and ensuring consistency across records.
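To make this concrete, the sketch below expresses the kind of reshaping a Mapping Data Flow performs—standardizing field names across two ERP sources and applying conditional logic—in plain Python. The source and target field names are hypothetical examples, not MindBridge's required format.

```python
def standardize_record(raw):
    """Map an ERP-specific row onto one common set of field names.

    Handles two hypothetical source layouts: one using "TxnID",
    the other using "journal_id".
    """
    return {
        "transaction_id": str(raw.get("TxnID") or raw.get("journal_id")),
        "amount": round(float(raw["Amount"]), 2),  # normalize to 2 decimals
        # Conditional logic: classify debits vs credits by sign.
        "side": "debit" if float(raw["Amount"]) >= 0 else "credit",
    }

rows = [
    {"TxnID": 101, "Amount": "250.456"},      # e.g. a NetSuite-style row
    {"journal_id": "J-7", "Amount": "-99.9"}, # e.g. a Dynamics-style row
]
standardized = [standardize_record(r) for r in rows]
```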
Upload to MindBridge
A Web Activity in ADF sends a POST request to /v1/json-tables, including:
- Authorization token (Bearer)
- Organization, engagement, and analysis IDs
- JSON body with transformed financial data
(Optional) Trigger an analysis run
A follow-up Web Activity can automatically trigger an analysis run via the MindBridge API after data is uploaded successfully.
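Chaining the two Web Activities could be sketched as below: trigger the analysis run only when the upload succeeds. The analysis-run endpoint path and its parameters here are hypothetical placeholders for illustration; only /v1/json-tables is named in this guide, so consult the MindBridge API reference for the real endpoint.

```python
def run_pipeline(send, payload, analysis_id):
    """Upload data, then trigger an analysis run only if the upload succeeds.

    `send` stands in for the HTTP transport used by each Web Activity.
    """
    upload_status = send("/v1/json-tables", payload)
    if not (200 <= upload_status < 300):
        return "upload_failed"
    # HYPOTHETICAL follow-up endpoint, for illustration only:
    run_status = send(f"/v1/analyses/{analysis_id}/run", {})
    return "started" if 200 <= run_status < 300 else "run_failed"

# Stub transport that records each call and always succeeds:
log = []
def fake_send(endpoint, body):
    log.append(endpoint)
    return 200

result = run_pipeline(fake_send, {"rows": []}, "abc123")
```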
Monitoring and logging
ADF’s built-in tools provide full visibility into pipeline execution and health, enabling teams to troubleshoot issues quickly and ensure timely delivery.
Benefits
- Operational efficiency: Automates manual upload and validation tasks.
- Scalable integration: Handles recurring ingestion across business units or entities.
- System compatibility: Supports data flows to MindBridge from multiple ERPs with minimal architectural changes.
- Secure and auditable: Keeps customers in control of when and how their data flows to MindBridge.