The IT world is constantly evolving, and companies run and manage data in a variety of ways. Data integration is no longer done through individual tools optimized for a specific type of data; organizations now use tools from a variety of vendors to access and retrieve business intelligence scattered across many applications and databases. Because data integration plays such an important role in business processes, companies need a concrete solution to manage their data integration needs.
Data integration offers solutions for transforming, moving, and consolidating information from many parts of an enterprise so that it can be cleansed, manipulated, standardized, de-duplicated, and synchronized across sources. Businesses address their customer data integration needs by employing specific data integration tools. As enterprises adopt different integration solutions for different needs, the number of tools in use grows; the larger the company, the larger the number of tools implemented. This creates problems, as data integration becomes fragmented and complicated, lacking consistency throughout the enterprise.
MuleSoft provides an ESB that offers a solution to help businesses with their integration needs. Mule ESB is part of Anypoint Platform. It helps businesses connect SaaS, cloud, mobile, and on-premises applications as well as data sources. Anypoint Platform is a package of components that delivers a robust integration solution for business:
For businesses aiming to connect mainly in the cloud, MuleSoft offers CloudHub, a cloud-based integration platform as a service (iPaaS).
Let’s see Mule ESB’s data integration features in action. We will extract and process records from files placed on an FTP server using the FTP Connector. Once the data is retrieved, we will transform it into Plain Old Java Objects (POJOs) using the Transform Message component, so that a Java-based connector can use the data, and then process these objects through the batch processing feature.
We will use a batch job to set processing rules for records in Anypoint Studio. Adding data aggregation to the batch process is also supported, making it easy to enable bulk uploads at a customizable record size.
We have selected the FTP connector’s On New or Updated File operation, which watches for files; we enabled watermarking and set the operation to run every minute.
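In the Mule configuration XML, this polling source is the first element of the flow. A minimal sketch of what Anypoint Studio generates (the flow name, connector config name, and directory are illustrative assumptions; namespace declarations are omitted):

```xml
<!-- Flow source: polls the FTP directory every minute for new or updated files.
     "FTP_Config" and "/orders" are example names, not from the original article. -->
<flow name="ftp-to-salesforce-flow">
  <ftp:listener doc:name="On New or Updated File"
                config-ref="FTP_Config"
                directory="/orders"
                watermarkEnabled="true">
    <scheduling-strategy>
      <fixed-frequency frequency="1" timeUnit="MINUTES"/>
    </scheduling-strategy>
  </ftp:listener>
  <!-- transform and batch job follow here -->
</flow>
```

With `watermarkEnabled="true"`, the listener only picks up files whose timestamp is newer than the last poll, so the same file is not reprocessed on every run.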
Since the data is in CSV format, we need to convert it into Plain Old Java Objects (POJOs) so that a Java-based connector can use it; for this we will use the Transform Message component.
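The Transform Message component holds a DataWeave script that maps each CSV row to a Java object. A sketch, assuming hypothetical CSV column names (`first_name`, `last_name`, `email`) that would need to match the actual file header:

```xml
<ee:transform doc:name="Transform Message">
  <ee:message>
    <ee:set-payload><![CDATA[%dw 2.0
output application/java
---
// Map every CSV row to a Java map; field names are illustrative
payload map (row) -> {
    FirstName: row.first_name,
    LastName:  row.last_name,
    Email:     row.email
}]]></ee:set-payload>
  </ee:message>
</ee:transform>
```

The `output application/java` directive is what turns the parsed CSV into Java objects that downstream Java-based connectors can consume.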
With the batch job we have the option of configuring the scheduling strategy and the number of records to process.
Here we will set
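Both options live on the `batch:job` element itself. A skeleton with an illustrative block size of 100 (the job name and value are assumptions, not from the original):

```xml
<!-- blockSize controls how many records each thread takes from the queue;
     the scheduling strategy can also be set on this element -->
<batch:job jobName="csvToSalesforceBatch" blockSize="100">
  <batch:process-records>
    <!-- batch steps go here -->
  </batch:process-records>
</batch:job>
```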
In order to send a bulk set of records to Salesforce, we need to aggregate a subset of the batch records in the queue.
The aggregator works on a subset of the batch block; because of this constraint, its size should be set equal to or less than the batch block size. In this use case we are setting it to
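The aggregator is declared inside a batch step; a sketch with an illustrative size of 100 (which, per the constraint above, must not exceed the batch block size):

```xml
<!-- Accumulates processed records; the operation placed inside
     receives an array of up to 100 records per invocation -->
<batch:aggregator doc:name="Batch Aggregator" size="100">
  <!-- bulk operation on the aggregated records goes here -->
</batch:aggregator>
```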
To avoid the lengthy process of inserting millions of records one at a time, Salesforce provides bulk operations, which we use here to load the data into Salesforce.
During this batch step, each record from the batch job is processed. In this scenario we have only one batch step, in which we transform the CSV data into Salesforce objects.
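Putting the pieces together, the single batch step applies the per-record transformation and then hands the aggregated array to the Salesforce connector. A sketch, assuming a hypothetical Contact sObject and the connector's Create operation standing in for the bulk load described above (step name, field names, and config name are all illustrative):

```xml
<batch:step name="transformAndLoadStep">
  <!-- Per-record processing: map one CSV row to a Salesforce object.
     Field and column names are examples only. -->
  <ee:transform doc:name="Row to Salesforce object">
    <ee:message>
      <ee:set-payload><![CDATA[%dw 2.0
output application/java
---
{ LastName: payload.last_name, Email: payload.email }]]></ee:set-payload>
    </ee:message>
  </ee:transform>
  <!-- Aggregate records and load them into Salesforce in one call -->
  <batch:aggregator doc:name="Batch Aggregator" size="100">
    <salesforce:create config-ref="Salesforce_Config" type="Contact"/>
  </batch:aggregator>
</batch:step>
```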
Batch processing with Mule ESB can be done with ease. For businesses wanting to overcome the challenges of integration, our MuleSoft experts can help build repeatable integration applications. To learn more about Anypoint Platform, email us at firstname.lastname@example.org or visit www.royalcyber.com.