Posted by: akolk | January 28, 2009

XMLDOM PLSQL performance tip

I used XMLDOM in PL/SQL for the first time last year, for a customer who needed to generate XML files with product data for loading into FREDHOPPER (a search engine). There were close to 300K products to export, and each XML file could hold only 1000 products (a limitation of FREDHOPPER, not Oracle), so on average I would create about 300 XML files. I noticed that the PL/SQL procedure slowed down and took more PGA the longer it was running (it went as far as eating up 50+ percent of the memory on a 12GB system). So there was a memory leak. I checked my code but didn't find a problem. The only thing I kept doing was opening and closing these DOMs in memory (PGA), so I started to suspect that.

Then I remembered a little PL/SQL procedure that frees unused memory still allocated to your heaps: dbms_session.free_unused_user_memory. Now, just before I call xmldom.newDocument, I call dbms_session.free_unused_user_memory, and the performance improved: the heap no longer grew and stayed steady at around 10 percent of memory. So I was able to improve the performance of XMLDOM dramatically just by releasing the unused memory.
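A minimal sketch of the pattern described above (the cursor and file-writing details are omitted, and the loop bound and variable names here are illustrative, not from the original code):

```sql
DECLARE
  l_doc xmldom.DOMDocument;
BEGIN
  FOR i IN 1 .. 300 LOOP  -- roughly one iteration per output file
    -- release heap memory that earlier DOM work left allocated
    dbms_session.free_unused_user_memory;

    l_doc := xmldom.newDocument;
    -- ... append up to 1000 product elements and write out the file ...
    xmldom.freeDocument(l_doc);
  END LOOP;
END;
/
```

Without the free_unused_user_memory call, each iteration leaves a little more heap behind and the session's PGA grows steadily.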

Two weeks ago I got a phone call from a consultant who had a problem with XMLDOM on a Windows machine (32-bit). The memory problems are even worse there: he had a small number of processes eat up all the available memory and error out with ORA-4030. He asked if anything could be done about this, and I remembered my little XMLDOM performance issue, so I told him to call dbms_session.free_unused_user_memory every so often. And again, the PL/SQL program no longer runs into ORA-4030 errors.

This is of course an Oracle problem, but at least we have a way to make XMLDOM perform better and use less memory.


  1. Cool workaround.

    Due to the very fast improvement cycles and new feature implementations of the XML DB development team, I would be very interested in which database version (the exact number) this workaround was applied on…?

  2. The version of the database was

  3. Gesundheit, by the way.


  4. I ran into the same problem (memory leak in XMLDOM) at an old gig on Solaris. Wish I had known this fix then! We ended up just restarting the sessions that did XML processing every so often… inexact, but it worked.

    Thanks for the blog!


  5. I'd bet on the version — at least on some versions, the following script leaks memory like a sieve:

    DECLARE
      PROCEDURE p_inner_proc IS
        l_xml_msg dbms_xmldom.domdocument;
      BEGIN
        -- a new document is created but never freed: this is the leak
        l_xml_msg := dbms_xmldom.newdomdocument;
      END p_inner_proc;
    BEGIN
      DBMS_APPLICATION_INFO.SET_MODULE(module_name => 'domdoc_mem_leak',
                                       action_name => 'PROCESSED: 0');
      FOR i IN 1 .. 200000 LOOP
        p_inner_proc;
        IF MOD(i, 10000) = 0 THEN
          DBMS_APPLICATION_INFO.SET_ACTION(action_name => 'PROCESSED: ' || i);
        END IF;
      END LOOP;
      DBMS_APPLICATION_INFO.SET_MODULE(module_name => NULL,
                                       action_name => NULL);
    END;
    /

    Depending on your DB version and OS, memory will be released either when the script finishes (on Red Hat Linux) or only when you close the session (on Solaris). There is a patch for this from Oracle, but I can't find the associated bug number since our customer opened the SR (it was a production issue). I'm pretty sure the above script is in the bug text Oracle has on this issue (with the module name having 5 additional leading characters).

  6. Hi, I have a similar situation where we are trying to write about 170K records into an XML file using XMLDOM. All the records are appended to the XML document and then written out to the file all at once. This is taking a huge amount of time. Any suggestions as to how this can be improved? Your help will be greatly appreciated. Thanks.

  7. Senas,

    Not sure how much internal memory you have or how large the records are, but I think the memory overhead of XMLDOM is substantial. I wouldn't be surprised if you are swapping because of this.

    One suggestion would be to create a larger number of smaller files (say, 10 files of 17K records each) and then combine them at the OS level.
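    A rough sketch of the chunking idea, writing the records directly with UTL_FILE instead of building one huge DOM (the table, columns, record format, and the XML_DIR directory object are all hypothetical, invented for illustration):

```sql
DECLARE
  l_file    UTL_FILE.FILE_TYPE;
  l_file_no PLS_INTEGER := 0;
  l_count   PLS_INTEGER := 0;
BEGIN
  FOR rec IN (SELECT product_id, product_name FROM products) LOOP
    -- start a new output file every 17000 records
    IF MOD(l_count, 17000) = 0 THEN
      IF UTL_FILE.IS_OPEN(l_file) THEN
        UTL_FILE.FCLOSE(l_file);
      END IF;
      l_file_no := l_file_no + 1;
      l_file := UTL_FILE.FOPEN('XML_DIR',
                               'products_' || l_file_no || '.xml', 'w');
    END IF;
    UTL_FILE.PUT_LINE(l_file,
      '<product id="' || rec.product_id || '">' ||
      rec.product_name || '</product>');
    l_count := l_count + 1;
  END LOOP;
  IF UTL_FILE.IS_OPEN(l_file) THEN
    UTL_FILE.FCLOSE(l_file);
  END IF;
END;
/
```

    The resulting files can then be concatenated (and wrapped in a root element) at the OS level, as suggested above.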


  8. Anjo, thanks for your quick reply. Would it also help to write to the XML file one record at a time, within the cursor loop, as opposed to writing all the records at once? Will this improve the performance? Thanks.

  9. "Performance" is dictated by doing it smarter and/or doing it with less data.

    – Doing it smarter would be to NOT use dbms_xmldom (which is very memory- and CPU-intensive). The XMLTYPE operators like EXTRACTVALUE, EXTRACT, etc. are a better way of doing it programmatically. If appropriate and usable, go with a SAX parser.

    – AFAIK a loop will not clean up your memory structures. Anyway, although it sounds "strange" (aka stupid…), I know the following will/should work: create an intermediate table, create the needed XML content and store it in that table (use an XMLType based on CLOB storage), then save it to disk via one of the methods described here.

    (Forget the remark and/or post from Nigel – his remarks about "whitespace" and performance are not valid.)
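    A rough sketch of that intermediate-table approach, generating the XML set-wise with SQL/XML functions instead of dbms_xmldom (the products table, its columns, and the XML_DIR directory object are hypothetical, made up for illustration):

```sql
-- intermediate table with CLOB-based XMLType storage
CREATE TABLE xml_stage (doc XMLTYPE)
  XMLTYPE COLUMN doc STORE AS CLOB;

-- build the document in one SQL statement, no DOM calls
INSERT INTO xml_stage (doc)
SELECT XMLELEMENT("products",
         XMLAGG(XMLELEMENT("product",
                  XMLATTRIBUTES(p.product_id AS "id"),
                  p.product_name)))
FROM products p;

-- then write the CLOB to disk in one shot
DECLARE
  l_clob CLOB;
BEGIN
  SELECT t.doc.getClobVal() INTO l_clob FROM xml_stage t;
  DBMS_XSLPROCESSOR.CLOB2FILE(l_clob, 'XML_DIR', 'products.xml');
END;
/
```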

  10. Hi,
    interesting topic!
    I hit the same memory leak problem yesterday, and I'm even getting ORA-07445 errors and core dumps in the alert log on Oracle XE for Linux.
    I guess the problem is more with DBMS_XMLDOM.FREEDOCUMENT than with NEWDOMDOCUMENT.
    At any rate, I'm curious to try out your workaround.

