Write huge CSV File

franciscoduarte
Champ in-the-making

Hello,

I have to write a huge CSV file from information coming from an AFTS query. The query is already working, and I get the correct information to create the CSV file.

My problem is creating the CSV file with a lot of information (around 300,000 rows).

This is my method to query and write the results:

private void writeInitialReport(String pathResolved, FileInfo csvFileInfo) throws IOException {
    writeHeader(csvFileInfo);

    int skipCount = 0;
    boolean hasMore;
    do {
        SearchParameters searchParameters = buildQuery(pathResolved);
        searchParameters.setSkipCount(skipCount); // was count += INTERVAL, which skipped the first page
        searchParameters.setMaxItems(INTERVAL);
        ResultSet resultSet = serviceRegistry.getSearchService().query(searchParameters);

        if (resultSet != null) {
            StringBuilder sb = new StringBuilder();
            for (NodeRef nodeRef : resultSet.getNodeRefs()) {
                buildRow(sb, nodeRef);
            }
            hasMore = resultSet.hasMore();
            resultSet.close();
            skipCount += INTERVAL;

            // Re-read the whole file so far and append the new rows in memory
            ContentReader contentReader = serviceRegistry.getContentService().getReader(csvFileInfo.getNodeRef(), ContentModel.PROP_CONTENT);
            ByteArrayOutputStream content = new ByteArrayOutputStream();
            content.write(contentReader.getContentInputStream().readAllBytes());
            content.write(sb.toString().getBytes("UTF-8"));

            // putContent replaces the binary, so the full file is rewritten every iteration
            ContentWriter contentWriter = serviceRegistry.getContentService().getWriter(csvFileInfo.getNodeRef(), ContentModel.PROP_CONTENT, true);
            contentWriter.setMimetype(MimetypeMap.MIMETYPE_TEXT_CSV);
            contentWriter.setEncoding("UTF-8");
            contentWriter.putContent(new ByteArrayInputStream(content.toByteArray()));
        } else {
            hasMore = false;
        }
    } while (hasMore);
}

I'm using Docker. I don't know why, but before I execute this, my Docker volume holds 1.7 GB; after the execution it grows to 7 GB. The created CSV file is only around 20 MB.

So, why is Alfresco creating so much data?
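One plausible explanation: every putContent call stores a brand-new binary in the content store, and the replaced binary only goes away later, when the orphaned-content cleaner runs. Because each iteration rewrites the entire file so far, the bytes written grow quadratically with the number of batches. A stand-alone sketch (plain JDK, no Alfresco classes; the batch count of 600 and ~35 KB per batch are made-up numbers, roughly 300,000 rows paged 500 at a time) shows the effect:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;

public class RewriteCostDemo {

    /** Bytes the store receives if every batch re-reads and rewrites the whole file. */
    static long totalBytesWritten(int batches, int batchBytes) throws IOException {
        byte[] stored = new byte[0]; // stands in for the node's current binary
        long total = 0;
        for (int i = 0; i < batches; i++) {
            ByteArrayOutputStream content = new ByteArrayOutputStream();
            content.write(stored);               // re-read all the old data
            content.write(new byte[batchBytes]); // append the new rows
            stored = content.toByteArray();      // putContent stores a whole new copy...
            total += stored.length;              // ...so the full size counts every time
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        long total = totalBytesWritten(600, 35_000);
        System.out.println("final file: ~" + (600L * 35_000) / 1_000_000 + " MB");
        System.out.println("cumulatively written: ~" + total / 1_000_000 + " MB");
    }
}
```

With these assumed numbers, a file that ends up at about 21 MB costs roughly 6 GB of cumulative writes, which is in the same ballpark as the 1.7 GB to 7 GB volume growth described above.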

Could it be because of the way I'm writing to the file? I can't build all the data in one variable because I get an OutOfMemoryError.

And the putContent method of ContentWriter seems to always replace the existing data with the new data. So I'm reading the old data, appending the new data, and then writing everything back to the CSV.
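If the repeated read-and-rewrite is the cause, one way around it is to open a single output stream once and push every batch through it, so nothing is ever re-read and only one binary is produced. A minimal stand-alone sketch of that shape, using a plain OutputStream as a stand-in for what Alfresco's ContentWriter exposes via getContentOutputStream() (the header line and the batch strings here are made up for illustration):

```java
import java.io.BufferedWriter;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.io.OutputStreamWriter;
import java.io.Writer;
import java.nio.charset.StandardCharsets;
import java.util.List;

public class StreamingCsvWriter {

    /** Writes every batch through one stream; nothing is re-read or rewritten. */
    static void writeAllBatches(OutputStream rawOut, Iterable<String> batches) throws IOException {
        try (Writer out = new BufferedWriter(new OutputStreamWriter(rawOut, StandardCharsets.UTF_8))) {
            out.write("id,name\n");        // header once, up front
            for (String batch : batches) { // each batch = rows already built, buildRow-style
                out.write(batch);
            }
        }                                  // closing the stream finalizes the single binary
    }

    public static void main(String[] args) throws IOException {
        // ByteArrayOutputStream stands in for contentWriter.getContentOutputStream()
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        writeAllBatches(bos, List.of("1,foo\n", "2,bar\n"));
        System.out.print(bos.toString(StandardCharsets.UTF_8));
    }
}
```

The query loop would stay the same; only the row output moves inside the single try-with-resources block, and memory use stays bounded by one batch.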

Can you advise on this?

Best regards,

Francisco Duarte

3 Replies

cesarista
World-Class Innovator

Hi:

Are you creating many versions of the CSV file?

Regards.

--C.

Hello,

I don't know if writing this way creates new versions. In any case, when I open the document it says version 1.0.

Best regards,

Francisco Duarte
