Friday, March 30, 2012

Memory allocation error while processing cubes

I get this error while processing my cubes:
Error: Memory error: Allocation failure : Not enough storage is available to process this command.
There is plenty of space left on the hard disks on this machine, and I can't find a remedy for this problem on Google either. Any help would be appreciated.

Thanks in advance.

Preston, which version of SQL Server Analysis Services are you running? Did you install the latest service pack?

--Artur

|||I am using Analysis Services 2005 Enterprise Edition with (to my knowledge) the latest service pack, version 9.00.2047.00.
|||

The error you are getting means Analysis Server estimated the amount of memory it would need to perform the processing operation and decided there is not enough memory available.

What you can do about this:

1. Force Analysis Server to disregard the estimate and proceed with the processing operation. You might still get a memory allocation error later down the road during processing. To do that, change the MemoryLimitErrorEnabled server property (see the msmdsrv.ini sketch below this list). For a description of server properties see http://www.microsoft.com/technet/prodtechnol/sql/2005/ssasproperties.mspx

2. Take a look at the processing operation and see if you can process fewer things at a time. For that you can change CoordinatorExecutionMode, or cap parallelism in the processing batch itself (see the XMLA sketch below). Take a look at the processing architecture whitepaper http://msdn2.microsoft.com/en-us/library/ms345142.aspx
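For example, option 1 corresponds to this fragment of msmdsrv.ini. Just a sketch, not tested against your build: back the file up first, verify the exact element nesting against your own copy, and note that edits to the file only take effect after a service restart.

<ConfigurationSettings>
  ...
  <OLAP>
    <ProcessPlan>
      <!-- 1 (default): raise an error when the estimated memory exceeds the limit;
           0: disregard the estimate and attempt the processing operation anyway -->
      <MemoryLimitErrorEnabled>0</MemoryLimitErrorEnabled>
    </ProcessPlan>
  </OLAP>
</ConfigurationSettings>

And for option 2, instead of changing CoordinatorExecutionMode you can also cap parallelism in the processing batch you submit. Again a sketch, with placeholder object IDs:

<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <!-- Process at most 2 objects concurrently instead of letting the server decide -->
  <Parallel MaxParallel="2">
    <Process>
      <Object>
        <DatabaseID>YourDatabase</DatabaseID>
        <CubeID>YourCube</CubeID>
        <MeasureGroupID>YourMeasureGroup</MeasureGroupID>
        <PartitionID>YourPartition1</PartitionID>
      </Object>
      <Type>ProcessFull</Type>
    </Process>
  </Parallel>
</Batch>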

Edward.
--
This posting is provided "AS IS" with no warranties, and confers no rights.

|||

Hi Edward,

what does your experience say about changing CoordinatorExecutionMode?

Is it worth setting it to 0?

|||

I am seeing the same disk space issue as described in the original post.

There are 14 partitions in total; each partition is about 8 GB and roughly 50M rows.

The cube has been processed before, so it already exists, with about 110 GB of total used space.

Free space was about 200 GB.

When re-processing the same cube, the free space slowly disappears: after processing only 2 partitions, free space went from 200 GB to 40 GB, even though total used space in the cube directory was only 120 GB. The used space itself makes sense (110 GB + 8 GB for the second copy of the partition being processed), but that means roughly 160 GB of free space vanished while the data folder grew by only about 10 GB.

The free space gone to neverland is what doesn't make sense!

I am processing the partitions from BIDS (with all partitions highlighted), processing each partition separately, each in its own transaction.
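For reference, the XMLA that BIDS sends should be roughly equivalent to this per partition (a sketch with made-up IDs, one batch per partition so each commits on its own):

<Batch Transaction="true" xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <!-- One partition per batch; the old copy of the partition files is only
       dropped when this transaction commits -->
  <Process>
    <Object>
      <DatabaseID>MyDatabase</DatabaseID>
      <CubeID>MyCube</CubeID>
      <MeasureGroupID>MyMeasureGroup</MeasureGroupID>
      <PartitionID>Partition01</PartitionID>
    </Object>
    <Type>ProcessFull</Type>
  </Process>
</Batch>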

I've seen the same type of disk space stealing by the SSAS server when, for example, deleting a cube. Somehow, when I delete a cube, the disk space does not show as free even though the files are gone, until I restart the SSAS server.

Can anyone shed some light on this?

(Windows 2003 Enterprise, SQL 2005 Enterprise SP2 final, 32-bit, 16 GB RAM, 4 CPUs)

|||

.NET pukes when serializing large data files, Reporting Services chokes on large data files, and now SSAS can't allocate memory.

Does anyone see a pattern here?

|||

Does SSAS actually use the disk as a temporary cache for data during aggregation?

Can anyone from the MS SSAS team clarify?

If that's true, which directory is it, and can I change it to a different disk? I happen to have more free space available on a different drive (not the OLAP data drive).
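(In case it helps anyone answer: I can see a TempDir property in my msmdsrv.ini, which I am guessing is the one; the path below is just an example, and I assume a service restart would be needed to move it:)

<ConfigurationSettings>
  ...
  <!-- guessing this is the folder used for temporary processing files -->
  <TempDir>D:\OLAP\Temp</TempDir>
  ...
</ConfigurationSettings>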

any ideas?

|||

Hi

I am experiencing the same memory allocation error whenever I try to deploy a cube, without even trying to process it. The only way I have been able to resolve it is to get the administrators to restart the AS service on the server. I did not have this issue before installing SQL Server SP2. Can someone please advise whether they have found the same issue, and whether perhaps one of the fixes in SP2 broke deployment? I have tried setting ProcessPlan/MemoryLimitErrorEnabled to false and it still gives me the same issue.

Someone please assist.

|||

I am having the same problem trying to load my cube. I have tried all the little tricks listed in the various posts, to no avail.

Please help!!!!!

|||

Edward, can you unmark this as answered, so that it gets proper attention?

|||

Hi,

I'm getting the same issue while processing data. The memory limit error is turned off, and from looking at the server's performance it is not going anywhere near the memory limit.

Should it page to disk if it starts to run out of memory? I thought that is what Analysis Services 2000 did.

Cheers

M

|||I've just encountered the same issue (right after upgrading to SP2). My last three loads have all failed with this same error...never saw it prior to SP2.|||

Yes, it means you actually have to know what you are doing when working with large datasets, like when to put something on the stack vs. the heap. Other people make these tools work; the pattern seems to be your competence as a professional.
