Out of memory error in my program
I wrote a program that performs data processing on a list of objects (maximum size 800). The work done on this list mainly consists of the following:
- Many SQL queries
- Processing of the queried data
- Grouping and matching
- Writing the results to CSV files
All of this worked perfectly, but the data-processing part and the size of the SQL data grew day by day, and the program started running out of memory and crashing frequently. To avoid this, I decided to cut the large list into a few smaller chunks and do the same work on those smaller lists (clearing and nulling the current small list before moving on to the next one), hoping this would solve the problem. But it did not help at all and the program still runs out of memory.
The program does not run out of memory in the first iteration of the for loop, but in the second, third, or a later one. Am I correctly clearing and nulling all the lists and objects in the for loop so that memory is freed for the next iteration?
How can I solve this problem? I have put my code below.
Any suggestions/solutions will be greatly appreciated.
Thanks in advance. Cheers!
List<someObject> unchoppedList = new ArrayList<someObject>();
for (String pb : listOfNames) {
someObject tccw = new someObject(...);
unchoppedList.add(tccw);
}
Collections.shuffle(unchoppedList);
List<List<someObject>> master = null;
if (unchoppedList.size() > 0 && unchoppedList.size() <= 175) {
master = chopped(unchoppedList, 1);
} else if (unchoppedList.size() > 175 && unchoppedList.size() <= 355) {
master = chopped(unchoppedList, 2);
} else if (unchoppedList.size() > 355 && unchoppedList.size() <= 535) {
master = chopped(unchoppedList, 3);
} else if (unchoppedList.size() > 535 && unchoppedList.size() <= 800) {
master = chopped(unchoppedList, 4);
}
for (int i = 0 ; i < master.size() ; i++) {
List<someObject> m = master.get(i);
System.gc(); // I inserted this statement to force GC
executor1 = Executors.newFixedThreadPool(Configuration.getNumberOfProcessors());
generalList = new ArrayList<ProductBean>();
try {
m.parallelStream().forEach(work -> {
try {
generalList.addAll(executor1.submit(work).get());
work = null;
} catch (Exception e) {
logError(e);
}
});
} catch (Exception e) {
logError(e);
}
executor1.shutdown();
executor1.awaitTermination(30, TimeUnit.SECONDS);
m.clear();
m = null;
executor1 = null;
//once the general list is produced the program randomly matches some "good" products to highly similar "not-so-good" products
List<ProductBean> controlList = new ArrayList<ProductBean>();
List<ProductBean> tempKaseList = new ArrayList<ProductBean>();
for (ProductBean kase : generalList) {
if (kase.getGoodStatus() == 0 && kase.getBadStatus() == 1) {
controlList.add(kase);
} else if (kase.getGoodStatus() == 1 && kase.getBadStatus() == 0) {
tempKaseList.add(kase);
}
}
generalList = new ArrayList<ProductBean>(tempKaseList);
tempKaseList.clear();
tempKaseList = null;
Collections.shuffle(generalList);
Collections.shuffle(controlList);
final List<List<ProductBean>> compliCases = chopped(generalList, 3);
final List<List<ProductBean>> compliControls = chopped(controlList, 3);
generalList.clear();
controlList.clear();
generalList = null;
controlList = null;
final List<ProductBean> remainingCases = Collections.synchronizedList(new ArrayList<ProductBean>());
IntStream.range(0, compliCases.size()).parallel().forEach(i -> {
compliCases.get(i).forEach(c -> {
TheRandomMatchWorker tRMW = new TheRandomMatchWorker(compliControls.get(i), c);
List<String[]> reportData = tRMW.generateReport();
writeToCSVFile(reportData);
// if the program cannot find the required number of products to match, the case is added to a new list to look for matching candidates elsewhere
if (!tRMW.getTheKase().isEverythingMathced) {
remainingCases.add(tRMW.getTheKase());
}
compliControls.get(i).removeAll(tRMW.getTheMatchedControls());
tRMW = null;
});
});
controlList = new ArrayList<ProductBean>();
for (List<ProductBean> c10 : compliControls) {
controlList.addAll(c10);
}
compliCases.clear();
compliControls.clear();
//last sweep where the program for last time tries to match some "good" products to highly similar "not-so-good" products
try {
for (ProductBean kase : remainingCases) {
if (kase.getNoOfContrls() < ccv.getNoofctrl()) {
TheRandomMatchWorker tRMW = new TheRandomMatchWorker(controlList, kase);
List<String[]> reportData = tRMW.generateReport();
writeToCSVFile(reportData);
if (!tRMW.getTheKase().isEverythingMathced) {
remainingCases.add(tRMW.getTheKase());
}
controlList.removeAll(tRMW.getTheMatchedControls());
tRMW = null;
}
}
} catch (Exception e) {
logError(e);
}
remainingCases.clear();
controlList.clear();
controlList = null;
master.get(i).clear();
master.set(i, null);
System.gc();
}
master.clear();
master = null;
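One thing worth noting about the loop above: the parallelStream lambda calls generalList.addAll(...) from multiple threads, but ArrayList is not thread-safe, so results can be lost or the list corrupted. A minimal sketch of a safer shape (hypothetical types, not the actual someObject/ProductBean classes): submit the whole chunk with invokeAll and accumulate the results on the submitting thread only.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;

public class ChunkRunner {
    // Submits every task in the chunk, then collects the results on the
    // submitting thread only, so no unsynchronized list is shared across threads.
    static <T> List<T> runChunk(List<? extends Callable<List<T>>> chunk, int threads)
            throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        try {
            List<T> results = new ArrayList<>();
            for (Future<List<T>> f : pool.invokeAll(chunk)) {
                results.addAll(f.get()); // single-threaded accumulation
            }
            return results;
        } finally {
            pool.shutdown();
            pool.awaitTermination(30, TimeUnit.SECONDS);
        }
    }

    public static void main(String[] args) throws Exception {
        // Dummy tasks standing in for the real workers; each returns two values.
        List<Callable<List<Integer>>> work = new ArrayList<>();
        for (int i = 0; i < 4; i++) {
            final int n = i;
            work.add(() -> List.of(n, n * 10));
        }
        List<Integer> out = runChunk(work, 2);
        System.out.println(out.size()); // all 4 tasks' results are kept: prints 8
    }
}
```

This also removes the odd combination of parallelStream plus executor1.submit(work).get(), which blocks fork-join worker threads while the fixed pool does the actual work.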
Here is the chopped method
static <T> List<List<T>> chopped(List<T> list, final int L) {
List<List<T>> parts = new ArrayList<List<T>>();
final int N = list.size();
int y = N/L, m = 0, c = y;
int r = c * L;
for (int i = 1; i <= L; i++) {
if (i == L) {
c += (N - r);
}
parts.add(new ArrayList<T>(list.subList(m, c)));
m = c;
c += y;
}
return parts;
}
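For reference, the same L-way split (equal parts of size N/L, with the last part absorbing the remainder) can be written more directly with subList. This is only an equivalent sketch of the method above, not a behavior change:

```java
import java.util.ArrayList;
import java.util.List;

public class Chopper {
    // Splits list into L parts of size N/L; the last part absorbs the remainder.
    static <T> List<List<T>> chopped(List<T> list, int L) {
        List<List<T>> parts = new ArrayList<>();
        int n = list.size(), size = n / L;
        for (int i = 0; i < L; i++) {
            int from = i * size;
            int to = (i == L - 1) ? n : from + size; // last part takes the rest
            parts.add(new ArrayList<>(list.subList(from, to)));
        }
        return parts;
    }

    public static void main(String[] args) {
        List<Integer> data = new ArrayList<>();
        for (int i = 0; i < 10; i++) data.add(i);
        // 10 elements into 3 parts: sizes 3, 3, 4
        System.out.println(chopped(data, 3)); // [[0, 1, 2], [3, 4, 5], [6, 7, 8, 9]]
    }
}
```

Note that each part is copied into a fresh ArrayList; returning the subList views directly would keep the whole backing list reachable, which matters when trying to free memory per chunk.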
Here is the stack trace, as requested
java.util.concurrent.ExecutionException: java.lang.OutOfMemoryError: GC overhead limit exceeded
at java.util.concurrent.FutureTask.report(FutureTask.java:122)
at java.util.concurrent.FutureTask.get(FutureTask.java:192)
at Controller.MasterStudyController.lambda$1(MasterStudyController.java:212)
at java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:184)
at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1374)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
at java.util.stream.ForEachOps$ForEachTask.compute(ForEachOps.java:291)
at java.util.concurrent.CountedCompleter.exec(CountedCompleter.java:731)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.execLocalTasks(ForkJoinPool.java:1040)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1058)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)
Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded
at org.postgresql.core.Encoding.decode(Encoding.java:204)
at org.postgresql.core.Encoding.decode(Encoding.java:215)
at org.postgresql.jdbc.PgResultSet.getString(PgResultSet.java:1913)
at org.postgresql.jdbc.PgResultSet.getString(PgResultSet.java:2484)
at Controller.someObject.findControls(someObject.java:214)
at Controller.someObject.call(someObject.java:81)
at Controller.someObject.call(someObject.java:1)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
[19:13:35][ERROR] Jarvis: Exception:
java.util.concurrent.ExecutionException: java.lang.AssertionError: Failed generating bytecode for <eval>:-1
at java.util.concurrent.FutureTask.report(FutureTask.java:122)
at java.util.concurrent.FutureTask.get(FutureTask.java:192)
at Controller.MasterStudyController.lambda$1(MasterStudyController.java:212)
at java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:184)
at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1374)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
at java.util.stream.ForEachOps$ForEachTask.compute(ForEachOps.java:291)
at java.util.concurrent.CountedCompleter.exec(CountedCompleter.java:731)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.execLocalTasks(ForkJoinPool.java:1040)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1058)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)
Caused by: java.lang.AssertionError: Failed generating bytecode for <eval>:-1
at jdk.nashorn.internal.codegen.CompilationPhase$BytecodeGenerationPhase.transform(CompilationPhase.java:431)
at jdk.nashorn.internal.codegen.CompilationPhase.apply(CompilationPhase.java:624)
at jdk.nashorn.internal.codegen.Compiler.compile(Compiler.java:655)
at jdk.nashorn.internal.runtime.Context.compile(Context.java:1317)
at jdk.nashorn.internal.runtime.Context.compileScript(Context.java:1251)
at jdk.nashorn.internal.runtime.Context.compileScript(Context.java:627)
at jdk.nashorn.api.scripting.NashornScriptEngine.compileImpl(NashornScriptEngine.java:535)
at jdk.nashorn.api.scripting.NashornScriptEngine.compileImpl(NashornScriptEngine.java:524)
at jdk.nashorn.api.scripting.NashornScriptEngine.evalImpl(NashornScriptEngine.java:402)
at jdk.nashorn.api.scripting.NashornScriptEngine.eval(NashornScriptEngine.java:155)
at javax.script.AbstractScriptEngine.eval(AbstractScriptEngine.java:264)
at Controller.someObject.findCases(someObject.java:108)
at Controller.someObject.call(someObject.java:72)
at Controller.someObject.call(someObject.java:1)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded
[19:13:52][ERROR] Jarvis: Exception:
[19:51:41][ERROR] Jarvis: Exception:
org.postgresql.util.PSQLException: Ran out of memory retrieving query results.
at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2157)
at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:300)
at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:428)
at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:354)
at org.postgresql.jdbc.PgPreparedStatement.executeWithFlags(PgPreparedStatement.java:169)
at org.postgresql.jdbc.PgPreparedStatement.executeQuery(PgPreparedStatement.java:117)
at Controller.someObject.lookForSomething(someObject.java:763)
at Controller.someObject.call(someObject.java:70)
at Controller.someObject.call(someObject.java:1)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded
Can you provide the stack trace? –
How much memory did you give the JVM? Have you tried profiling memory usage with VisualVM? – brain99
48 GB is the amount I allocate before starting this program. Yes, I used VisualVM, but I could not identify anything from it. – krisGoks