System.OutOfMemoryException processing a big file

Sep 12, 2010 at 1:36 AM

Hi,

I am trying to process a big file (1.5 GB) using the latest version, v2.0.2, but PAL exits with an OutOfMemory message when it tries to generate the charts. I have a virtual machine with 2 GB of physical memory plus a 4 GB page file on disk.

PAL v1.3.6 can process the file fine, using much less memory, though it is slower.


Coordinator
Sep 14, 2010 at 6:08 PM

Processes have no idea how much physical RAM is in the computer; the kernel manages the physical RAM. Windows therefore "fakes out" processes by making each one think it has its own virtual address space, while the kernel manages all of the real physical memory usage of the process under the covers. When a process is out of memory, it has run out of its own virtual address space. On 32-bit computers, each process has 2 GB of usable virtual address space; on x64, each process has 8 TB. In other words, the process ran out of virtual address space regardless of the amount of physical RAM you have. I assume you are on 32-bit, because the virtual address space on x64 is unimaginably large, so to fix this, simply run it on 64-bit (x64, which is really 48-bit).
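A quick back-of-the-envelope sketch of the numbers above (the 2 GB and 8 TB figures are the standard Windows user-mode limits the post quotes; the 1.5 GB log size comes from the original question):

```python
# Per-process user-mode virtual address space on Windows, per the post above.
GB = 2 ** 30
TB = 2 ** 40

# 32-bit: 4 GB of addresses total, split 2 GB user / 2 GB kernel by default.
vas_32bit_user = 2 ** 32 // 2

# x64 on Windows: user mode gets 8 TB, well below the full 48-bit canonical
# address range the hardware supports.
vas_x64_user = 8 * TB

print(vas_32bit_user // GB, "GB usable on 32-bit")
print(vas_x64_user // TB, "TB usable on x64")

# A 1.5 GB counter log loaded wholly into memory, plus the parsed data
# structures built from it, easily exhausts a 2 GB address space --
# regardless of how much physical RAM the machine has.
log_size = int(1.5 * GB)
print("log alone uses", round(100 * log_size / vas_32bit_user), "% of 32-bit VAS")
```

This is why adding RAM (or page file) does not help here: the bottleneck is the 32-bit address space, not physical memory.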

With all of that said, I didn't really help you... In PAL v2.0, I load the entire counter log into memory to speed up access to the counters. I tried several other ways to get the counter data, but this one seemed the best even though it is memory hungry. You can work around this by using a PAL threshold file that does not use the Process(*) object; currently, the only one is the Quick System Overview. In PAL v2.1, I plan to add the ability to choose not to have the Process(*) counter object analyzed. For example, in the current release of PAL v2.0, if you pick Exchange 2007, it inherits from System Overview, which processes all of the Process counters, making it a very resource-intensive analysis. In PAL v2.1, you will have the option of picking Exchange 2007 but having it inherit from Quick System Overview instead, which doesn't process every instance of Process(*), making processing a *lot* faster.
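The mitigation described above boils down to excluding per-instance Process(*) counters before anything is loaded into memory. PAL v2.0 doesn't expose such a hook, so this is only a hypothetical sketch of the idea; the counter-path format, \\Machine\Object(Instance)\Counter, is the standard perfmon one, but the function name and sample paths are made up for illustration:

```python
# Hypothetical sketch: filter out the memory-hungry Process(*) counters
# from a list of perfmon counter paths before analysis.
import re

def drop_process_counters(counter_paths):
    """Return only counter paths that do not belong to the Process object."""
    # Match "\Process(" literally; "\Processor(" does not match because
    # the "(" must follow "Process" immediately.
    pattern = re.compile(r"\\Process\(", re.IGNORECASE)
    return [p for p in counter_paths if not pattern.search(p)]

# Illustrative paths (machine and instance names are invented).
paths = [
    r"\\SERVER01\Processor(_Total)\% Processor Time",
    r"\\SERVER01\Process(store)\Private Bytes",
    r"\\SERVER01\Process(w3wp#1)\Working Set",
    r"\\SERVER01\Memory\Available MBytes",
]

print(drop_process_counters(paths))
```

With dozens of process instances on a busy Exchange server, each carrying several counters per sample, dropping the Process object removes the bulk of the data set, which is why Quick System Overview is so much lighter.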

Nov 17, 2010 at 10:03 PM

Ahhhhh! That explains a lot. I just upgraded from 1.4.x, and for my first analysis on 2.0.5 I was working on an Exchange 2007 server, merging about 8 perfmon .blg files. My machine has been mostly unusable for about 36 hours now. Very frustrating. And it is still chunking...

I have 4 GB of RAM, and I can see in Resource Monitor that it is using every available inch of it.

I will post a follow-up question in the forum about this and whether there is anything I can do to mitigate it other than going back to a pre-2.x release...