Mendix supports multiple options for importing data into an application, and each proves useful depending on the type of source.
In this blog, I’ll focus on one such method: importing and exporting CSV-formatted data using the CSV module offered by Mendix. When we have a lot of data to import and the source is Excel, CSV is a great option to consider.
The CSV module offers some cool advantages, as mentioned in the Marketplace:
- This module offers a connector-style usage of CSV for imports and exports.
- Advantages of connector style:
  - Model reflects CSV structure.
  - Microflows can be optimized for importing/exporting (e.g. by caching).
  - Suitable for high volumes (from 100K’s to M’s).
  - Streaming approach for higher performance and less memory consumption.
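To make the streaming point concrete, here is a minimal sketch in plain Python (illustrative only, not Mendix code) of how a streaming reader keeps just one row in memory at a time, which is why memory use stays flat even for files with millions of rows:

```python
import csv
import io

def stream_rows(file_obj, separator=","):
    """Yield CSV rows one at a time instead of loading the whole file."""
    reader = csv.reader(file_obj, delimiter=separator)
    for row in reader:  # only the current row lives in memory
        yield row

# Small demo file; a real file could have millions of lines.
data = io.StringIO("1,Alice\n2,Bob\n3,Carol\n")
count = sum(1 for _ in stream_rows(data))
```

The same idea underlies the CSV module's 'Read next line' / 'Write next line' actions: each call handles one row, so volume is bounded by row size, not file size.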
Whoa! Sounds great already, right? Well, what are we waiting for then? Let’s get started.
In our Mendix project, we first need to download the CSV module and the module below (compatible with our Studio Pro version) from the Marketplace:
- OQL Module: https://marketplace.mendix.com/link/component/66876, since the CSV module offers actions supported by the OQL module.
Importing CSV Data
The approach we’ll follow for the import is to upload a CSV-formatted file into our app and use Java actions from the CSV module to import the data into our application.
We need to create an entity of type FileDocument that will store our CSV file, a main entity that will store the processed data (here, the Employee entity), and a non-persistable entity.
The non-persistable entity is needed for two reasons:
- The CSV module supports incoming data of type String only
- The data is mapped to attributes after they are rearranged in alphabetical order
Example: if we map data directly to the Employee entity, the CSV module’s actions would rearrange its attributes alphabetically and map the input ‘2,Jorg,TechAdmin’ as ‘Designation: 2, EmpId: Jorg, Name: TechAdmin’, i.e. wrongly; in addition, ‘EmpId’ is an Integer, whereas CSV supports only strings.
But the choice is really yours: if your entity’s attributes are already in alphabetical order and all of type String, you can skip creating the non-persistable entity.
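The alphabetical re-ordering pitfall can be sketched in a few lines of plain Python (illustrative only, not Mendix code; ‘EmpId’, ‘Name’, and ‘Designation’ are the attributes from the example above):

```python
# Column order as it appears in the CSV file: EmpId, Name, Designation
attributes = ["EmpId", "Name", "Designation"]
row = ["2", "Jorg", "TechAdmin"]

# Mapping in file order gives the intended result.
correct = dict(zip(attributes, row))
# {'EmpId': '2', 'Name': 'Jorg', 'Designation': 'TechAdmin'}

# Sorting the attribute names first (as the CSV module does) shifts
# every value onto the wrong attribute.
wrong = dict(zip(sorted(attributes), row))
# {'Designation': '2', 'EmpId': 'Jorg', 'Name': 'TechAdmin'}
```

A non-persistable entity whose attributes are named in alphabetical order sidesteps this, since sorting then leaves the mapping unchanged.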
Microflow and Pages
Firstly, we need to create a page to accept the file upload and trigger our import action. (This page is called via a microflow that creates a new object of the CSVFile entity and passes it to the page as a parameter.)
On clicking the ‘Import’ button, we call our microflow to perform the import. Let’s have a look at it:
This microflow takes the uploaded CSV file as an input parameter; the main work done here is calling the ‘Import CSV’ action from the CSV module and configuring it. The parameters taken by this action are:
- The CSV file from which data will be read (in this case, we pass our input parameter CSVFile)
- The separator value, which must be specified and not left empty, or we’ll run into a null pointer error
- The number of lines to skip, as an Integer value
- The highlight: the microflow to be invoked for the import, which does the main processing. The microflow selected in the screenshot is an example provided by the CSV module for reuse; we can modify it as per our requirements. Let’s look at the modifications I have made.
I have created a list of Employee objects, then called the Java action ‘Read next line’, which reads a line of data from the CSV file and maps it to the non-persistable entity ‘Employee_CSV’.
For each line read, we create an Employee object and map the attributes of the object returned by the ‘Read next line’ action to it.
Once done, the list of Employee objects is committed, we return to our main microflow, and the import is complete.
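The import flow just described (skip header lines, read line by line into an all-String temporary record, convert types, commit the list at the end) can be sketched conceptually in plain Python. This is an illustration of the logic only, not actual Mendix microflow code; the attribute names come from the Employee example:

```python
import csv
import io

def import_employees(file_obj, separator=",", skip_lines=1):
    """Conceptual stand-in for the import microflow."""
    reader = csv.reader(file_obj, delimiter=separator)
    for _ in range(skip_lines):          # the 'Skip lines' parameter
        next(reader, None)
    employees = []
    for row in reader:                   # one 'Read next line' per iteration
        # Temporary all-String record, like the Employee_CSV entity
        emp_csv = {"EmpId": row[0], "Name": row[1], "Designation": row[2]}
        # Map to the real Employee record, converting String -> Integer
        employees.append({
            "EmpId": int(emp_csv["EmpId"]),
            "Name": emp_csv["Name"],
            "Designation": emp_csv["Designation"],
        })
    return employees                     # committed in one go at the end

data = io.StringIO("EmpId,Name,Designation\n2,Jorg,TechAdmin\n")
result = import_employees(data)
```

Collecting the objects and committing once at the end (rather than committing inside the loop) is also what keeps the Mendix import fast for large files.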
Exporting CSV Data
Exporting data to a CSV-formatted file is as easy as importing and is done in a similar way.
Let’s say we want to export the Employee data; we need to create a microflow as shown below:
Considering the same domain model as used for the import: create a CSVFile entity object and call the ‘Export CSV’ Java action from the CSV module. It takes almost the same parameters as the ‘Import CSV’ action, except that the microflow to be invoked needs to be created by ourselves; at the end, a download activity provides the user with the file.
Export CSV action Parameters
- The newly created CSVFile object
- The separator value as needed; in this case it is ‘;’
- The return type of this action is Boolean, which we’ll use to decide whether the export succeeded
- The microflow to be invoked, which in this case is one we create ourselves, as shown below
In this microflow, we retrieve the Employee data and iterate over it. Each object is first mapped to the temporary (non-persistable) entity to convert its values to String and to avoid re-arranging of columns; the next Java action, ‘Write next line’, then writes the data to our CSV file row by row, with the columns in the correct order.
Once processing is done, we return to our main microflow; based on the return value of the ‘Export CSV’ action, the CSV file is downloaded for the user.
(Note: again, mapping data into a temporary entity during export is your choice; here it is done to avoid re-arranging by the ‘Write next line’ action.)
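Conceptually, the export flow (map each Employee to an all-String temporary record, write it line by line with the ‘;’ separator, and return a Boolean to signal success) looks like this in plain Python; again, this is an illustration of the logic, not actual Mendix code:

```python
import csv
import io

def export_employees(employees, file_obj, separator=";"):
    """Conceptual stand-in for the export microflow."""
    writer = csv.writer(file_obj, delimiter=separator)
    for emp in employees:                 # iterate over the retrieved list
        # All-String temporary record, in the desired column order,
        # like the temporary entity used before 'Write next line'
        temp = [str(emp["EmpId"]), emp["Name"], emp["Designation"]]
        writer.writerow(temp)             # one 'Write next line' per row
    return True                           # Boolean result, as with 'Export CSV'

out = io.StringIO()
ok = export_employees(
    [{"EmpId": 2, "Name": "Jorg", "Designation": "TechAdmin"}], out)
```

Because the temporary record is built in the desired column order with String values only, each written line comes out exactly as it should appear in the file.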
The CSV module is an efficient module offered by Mendix and has many more features, such as ‘ExportOQLtoCSV’, ‘ExportSQLtoCSV’, and ‘CountNumberofLines’, which can be explored and used to import/export large volumes of data with high performance and low memory consumption.