
File IO - Read a large CSV file in Java

I want to read a huge CSV file containing around 500,000 rows. I am using the OpenCSV library, and my code looks like this:

    CsvToBean<User> csvConvertor = new CsvToBean<User>();
    List<User> list = null;
    try {
        // parse() builds the complete list of beans in memory
        list = csvConvertor.parse(strategy, new BufferedReader(new FileReader(filepath)));
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    }

Up to 200,000 records, the data is read into a list of User bean objects, but for anything larger than that I get

java.lang.OutOfMemoryError: Java heap space

I have these memory settings in the "eclipse.ini" file:

-Xms256m
-Xmx1024m

I am considering splitting the huge file into several smaller files and reading those one by one, but that seems like a lengthy solution.

Is there any other way to avoid the OutOfMemoryError?


1 Reply


Read the file line by line and process each record as you go, instead of collecting everything into a list first.

Something like this:

    // CSVReader streams one record at a time, so only the current row is held in memory
    try (CSVReader reader = new CSVReader(new FileReader("yourfile.csv"))) {
        String[] nextLine;
        while ((nextLine = reader.readNext()) != null) {
            // nextLine is an array of the values from one line
            System.out.println(nextLine[0] + nextLine[1] + "etc...");
        }
    }
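If you need User beans rather than raw string arrays, newer versions of OpenCSV (the com.opencsv packages, 4.x and later) can hand the beans back lazily instead of building the whole list at once. A minimal sketch, assuming a User class bound with OpenCSV annotations such as @CsvBindByName; the process(user) call is a hypothetical placeholder for whatever you do with each record:

    import com.opencsv.bean.CsvToBean;
    import com.opencsv.bean.CsvToBeanBuilder;
    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.Reader;

    try (Reader reader = new BufferedReader(new FileReader(filepath))) {
        CsvToBean<User> csvToBean = new CsvToBeanBuilder<User>(reader)
                .withType(User.class)    // assumes User fields carry @CsvBindByName annotations
                .build();
        // Iterating (instead of calling parse()) converts one row at a time,
        // so the full 500,000-row list never sits on the heap
        for (User user : csvToBean) {
            process(user);               // hypothetical handler: write to DB, aggregate, etc.
        }
    }

Either way, the point is the same: handle each record inside the loop (write it to a database, aggregate it, whatever you need) instead of accumulating everything in a List, so heap usage stays roughly constant no matter how large the file is.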

