Java memory management
I'm a C++ programmer, and I'm playing around with Java after finding JPA to be a godsend for a few of my current applications. I haven't touched Java since university, and I'm having a problem with running out of heap space. I'm using the code below as the main part of a not-very-serious test of JDBC/JPA/Lucene, and I keep on getting random OutOfMemory exceptions.
EntityManager em = emf.createEntityManager();
Query q = em.createQuery("select p from Product p where p.productId = :productId");

Connection con = DriverManager.getConnection("connection string");
Statement st = con.createStatement();

IndexWriter writer = new IndexWriter("c:\\temp\\lucene",
                                     new StandardAnalyzer(),
                                     IndexWriter.MaxFieldLength.LIMITED);

// Walk every product id via plain JDBC, load each entity through JPA,
// and index the resulting Lucene document.
ResultSet rs = st.executeQuery("select productId from product order by productId");
while (rs.next()) {
    int productId = rs.getInt("productId");
    q.setParameter("productId", productId);
    Product p = (Product) q.getSingleResult();
    writer.addDocument(createDocument(p));
}

writer.commit();
writer.optimize();
writer.close();

st.close();
con.close();
I won't post all of createDocument; it just instantiates a new org.apache.lucene.document.Document and adds fields via add(new Field(...)) etc. There are 50 fields in total, and they're all short strings (<32 characters) in length.
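For reference, a minimal sketch of what such a method might look like - the real one isn't shown above, and the Product getters and field names here are made up:

private Document createDocument(Product p) {
    Document doc = new Document();
    // hypothetical fields; the real method adds roughly 50 short string fields
    doc.add(new Field("productId", String.valueOf(p.getProductId()),
                      Field.Store.YES, Field.Index.NOT_ANALYZED));
    doc.add(new Field("name", p.getName(),
                      Field.Store.YES, Field.Index.ANALYZED));
    return doc;
}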
In my newbie-ness, is there something stupid I'm doing (or not doing) that would cause things not to be GC'd?
Are there any best practices regarding Java memory management and tickling the GC?
I don't see anything out of place. If you're working with a large database, you could try increasing the heap size using the -Xmx<n>
option in the JVM invocation. That's not the best solution in general - only do it when you know your working set is genuinely bigger than the default heap size.
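For example, the invocation might look something like this (the 512 MB figure and the class name are just placeholders):

java -Xmx512m com.example.ProductIndexer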
are using complex data structures? if have circular references between objects, might preventing garbage collector cleaning unreachable objects. if have hand-written data structures, make sure explicitly null out references objects removed instead of doing decrementing size variable.