Friday, March 30, 2012

Memory allocation in MS-SQL 2000 instances

Hi,
From a previous sys-admin I inherited an MS-SQL 2000 machine with 3
instances. It is a nice machine with 4 GB of memory, but the memory allocation
is very weird:
Instance A: 1400 MB
Instance B: 1000 MB
Instance C: 80 MB (!)
Instance C is performing badly under even a bit of pressure, which is not
surprising given these allocations.
With that in mind, is there a way to check and re-allocate memory? I'd like to
see if the instances really need these amounts of memory and, if not, move
some over to the other instances.
Thanks!
Dries Bessels

dries wrote:
> [quoted text snipped]
Could it be that there simply isn't any more memory left for the C instance?
You don't specify which SQL and OS versions you are running, but there
could be settings that leave e.g. 1 GB for the OS, and then you haven't
got any more left for SQL. SQL Server will grab memory but doesn't
release it again. This means that if the first two instances have already
grabbed the memory, there is nothing left for instance C.
If you have the chance, you could try to reboot the server and then
check whether the memory consumption becomes more equal between the three
instances.
Regards
Steen

dries (dries@.bessels.stop-spam.org) writes:
> [quoted text snipped]
You cannot reallocate memory between instances per se, but you can
configure, per instance, the maximum amount of memory it may use.
SQL Server's general strategy on memory is to grab as much as it needs
and as is available. This memory is used for cache: the more data
that is in cache, the faster data can be accessed. When there is competition
for memory, SQL Server will yield. In this case it appears that the three
instances are competing.
Three instances on 4 GB may be a tad low. Are you running with the /3GB
switch in Windows?
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/prodtechnol/sql/2005/downloads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodinfo/previousversions/books.mspx
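Erland's advice (check what each instance is using, then cap it per instance) can be sketched in T-SQL, run against each instance in turn. This is only a sketch; the instance name and the 1200 MB cap are purely illustrative values, not recommendations.

```sql
-- Run against each instance in turn (e.g. SERVER\InstanceC, a hypothetical name).
-- 'show advanced options' must be on to see the memory options.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- Current settings (values are in MB):
EXEC sp_configure 'min server memory (MB)';
EXEC sp_configure 'max server memory (MB)';

-- Cap this instance so the others have room to grow.
-- 1200 MB is only an example figure.
EXEC sp_configure 'max server memory (MB)', 1200;
RECONFIGURE;
```

'max server memory' is dynamic in SQL Server 2000, so no restart is needed, though memory an instance has already grabbed is typically released only gradually.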


Memory Allocation for Jobs Running under SQL Agent

I use the job-processing functionality to do a variety of things by running
DTS packages. One of my DTS packages has been failing as it runs a .NET
application to convert/modify the data extracted from an Oracle database in a
previous step. I get an exception, "There is insufficient system memory to
run this query.", when the application runs under the Agent, but not when I run
the same package interactively. Is there some restriction on memory use for
jobs running under the SQL Agent? If so, is there some way to modify it?
Thanks for any assistance
Mike Mattix
CP Kelco, Inc
Okmulgee, OK
Hi
No. But how much RAM do you have, and how much is allocated to SQL Server?
Regards
Mike Epprecht, Microsoft SQL Server MVP
Zurich, Switzerland
IM: mike@.epprecht.net
MVP Program: http://www.microsoft.com/mvp
Blog: http://www.msmvps.com/epprecht/
"Mike Mattix" <MikeMattix@.discussions.microsoft.com> wrote in message
news:20484DEE-330A-4ABB-B4D1-29C6CE1AB488@.microsoft.com...
> [quoted text snipped]
Mike,
Thanks for the reply. The server has 3 GB and SQL is set up to dynamically
manage its memory: Min = 0, Max = 3 GB, Min Query = 1 GB.
Mike Mattix
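One thing worth double-checking in those numbers: in sp_configure, 'min memory per query' is specified in KB (the default is 1024 KB), while min/max server memory are in MB, so it is easy to set a far larger per-query minimum than intended. A sketch to read the settings back:

```sql
-- Read back the memory settings listed above. Note the units:
-- min/max server memory are in MB, but min memory per query is in KB
-- (default 1024 KB, i.e. 1 MB).
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'min server memory (MB)';
EXEC sp_configure 'max server memory (MB)';
EXEC sp_configure 'min memory per query (KB)';
```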
"Mike Epprecht (SQL MVP)" wrote:

> [quoted text snipped]


Memory allocation for complex cluster setup

Greetings. Having some difficulty consolidating on the appropriate memory
setup for the following configuration:
2 node cluster, Windows 2003 Enterprise Edition SP1 with SQL 2000 Enterprise
Edition SP4. There are 6 virtual SQL servers installed. Generally we run
one (primary) virtual server all by itself on the first node, and the other 5
(secondary) all bunched up on the second node. The primary virtual server
has recently begun having problems with performance. We are hung up on a
debate over the proper memory configuration. Both nodes have 10.0 GB of RAM.
We already have /PAE set on both nodes. We intentionally have not set /3GB
switch in the past because, in the event of a node failure, all 6 SQL
instances will end up on one node, possibly using 12.0 GB of RAM and causing
its own problems. The virtual SQL servers all actually max out at about 1.7
GB (at least I have never seen them any higher); I assume that is some SQL or
OS quirk that keeps them close to the 2.0 GB max.
Now we are at the point where the primary virtual server's performance, even
when it is sitting all by itself, is becoming an issue. The Microsoft
documentation says for 8GB to use /3GB and /PAE. For 16GB to use /PAE
without /3GB, because if you accidentally cut off OS RAM to 1GB, you will
cause problems for the /PAE when addressing 16GB. Also, I have seen some
Microsoft documents that say "Generally you will not need to use AWE and /3GB
at the same time." My understanding is that /3GB actually is sort of "false
advertising", in that it was intended for 4GB systems. It really just tells
the OS to only reserve 1GB for itself, and give the rest to applications
(generally 3GB).
So here are the actual questions:
If I use /PAE WITHOUT /3GB on a system with 10.0GB, and then turn on SQL AWE
and set the max SQL RAM to 8.0GB, is that any different than if I use /PAE
AND /3GB on the same system, turn on AWE and set the max SQL RAM to 8.0GB?
What if I do both scenarios with a max SQL RAM of 9.0GB? Since I only have
10.0GB, will SQL be refused the extra 1.0GB in the scenario without the /3GB
switch because 2.0GB are reserved for the OS?
And finally, in my real-world case, SQL is only ever using that 1.7 GB right
now... Should I start by just adding the /3GB switch, without turning on AWE
in SQL, to see if it can even find a use for the extra 1.0 GB available, before
going over the top and assigning an extra 6.0 GB or 7.0 GB? Will SQL actually
make use of all that RAM?
I know that is mostly theoretical. Hoping an expert can share some
understanding with me on what is otherwise guesswork on my part.
- Dan
dbanick@.axentis.com

The /3GB switch really has nothing to do with AWE, other than the fact that you should
not turn it on when you are using 12 GB or more. This is because the OS
needs more memory for PTEs and the like the more memory above
4 GB it has to manage. 16 GB is a hard cutoff, but 12 is my practical limit.
Your issue is not so much the /3GB as the AWE. If you turn on AWE, you will
have effectively choked out the other instances if they fail over to that
machine. AWE memory in SQL 2000 is not dynamic. That is one of the main
reasons why you need to set an upper limit with the max memory setting. You
have to be extremely careful when using AWE in a cluster because of that fact.
In SQL 2005, AWE can be dynamic and you don't have to worry as much about it.
--
Andrew J. Kelly SQL MVP
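Since AWE memory in SQL 2000 is static, the upper limit Andrew mentions has to be set explicitly whenever AWE is enabled. A sketch of what that looks like; the 8192 MB cap is just the example figure from the thread, not a recommendation:

```sql
-- SQL Server 2000 with /PAE set in Windows and AWE desired.
-- AWE-mapped memory is NOT released dynamically, so always pair
-- 'awe enabled' with an explicit 'max server memory' cap.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'awe enabled', 1;                -- takes effect after an instance restart
EXEC sp_configure 'max server memory (MB)', 8192;  -- example cap only
RECONFIGURE;
```

In a cluster, the caps across all instances should be chosen so that they still fit on one node after a failover.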
"Dan Banick" <Dan Banick@.discussions.microsoft.com> wrote in message
news:31AE3BA6-C00D-4435-A876-A2A4B3042ADB@.microsoft.com...
> [quoted text snipped]


Memory Allocation for 64 bit

I have a client experiencing problems with SQL Server 2005 64-bit
(SP1). The server has 4 GB of memory available; however, I cannot
increase the memory allocated via Management Studio past 2 GB. When
I look at the .exe in Task Manager, I see that the service is using about
3.5 GB. Can I just go into sp_configure and set it to what I need?
Not sure if this matters... the processors are made by AMD. Please
advise...
I think if you check that value in Management Studio you will see that it is
not 2 GB but a much higher number.
Andrew J. Kelly SQL MVP
<abraceyiii@.yahoo.com> wrote in message
news:1175211178.077029.26440@.p15g2000hsd.googlegroups.com...
>I have a client experiencing problems with SQL Server 2005 64 bit
> (sp1). The server has 4 gigs of memory available, however I can not
> increase the memory allocated via management studio past 2 gigs. When
> I look at the .exe in task manager, I see that service is using about
> 3.5 gigs. Can I just go into sp_configure and set it to what I need?
> Not sure if this matters...the processors are made by AMD. Please
> advise...
>
|||Hello,
SQL Server 2005 supports dynamic allocation of AWE-mapped memory on Windows
Server 2003. With SQL Server 2005 64-bit on Windows Server 2003, no changes
are needed to utilize more than 4 GB of RAM.
See the URL below for details:
http://msdn2.microsoft.com/en-us/library/ms190731.aspx
Thanks
Hari
<abraceyiii@.yahoo.com> wrote in message
news:1175211178.077029.26440@.p15g2000hsd.googlegro ups.com...

>I have a client experiencing problems with SQL Server 2005 64 bit
> (sp1). The server has 4 gigs of memory available, however I can not
> increase the memory allocated via management studio past 2 gigs. When
> I look at the .exe in task manager, I see that service is using about
> 3.5 gigs. Can I just go into sp_configure and set it to what I need?
> Not sure if this matters...the processors are made by AMD. Please
> advise...
>
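To answer the sp_configure part of the question: yes, 'max server memory' can be inspected and changed with sp_configure. A minimal sketch (the 3584 MB target below is purely illustrative, not a recommendation for this server):

```sql
-- Show advanced options first, or the memory settings are not visible.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- Check the current configured and running values (in MB).
EXEC sp_configure 'max server memory (MB)';

-- Raise the cap; 3584 is an illustrative value only.
EXEC sp_configure 'max server memory (MB)', 3584;
RECONFIGURE;
```

'max server memory' is a dynamic option, so no restart is needed, but SQL Server releases memory above a lowered cap only gradually.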
|||On 30.03.2007 01:32, abraceyiii@.yahoo.com wrote:
> I have a client experiencing problems with SQL Server 2005 64 bit
> (sp1). The server has 4 gigs of memory available, however I can not
> increase the memory allocated via management studio past 2 gigs. When
> I look at the .exe in task manager, I see that service is using about
> 3.5 gigs. Can I just go into sp_configure and set it to what I need?
> Not sure if this matters...the processors are made by AMD. Please
> advise...
Are you maybe trying to administer a 64-bit version with a 32-bit
client? Maybe that has some built-in limitations. Just a wild guess.
robert
|||On Apr 2, 2:05 am, Robert Klemme <shortcut...@.googlemail.com> wrote:
> On 30.03.2007 01:32, abracey...@.yahoo.com wrote:
>
> Are you maybe trying to administer a 64 bit version with a 32 bit
> client? Maybe that has some built in limitations. Just a wild guess.
> robert
thanks...i will look elsewhere

Memory Allocation Failure When Deploying AS2005 Project

Hi,

I have an AS2005 project that I am trying to deploy. It contains one fact table with approx 20 million rows. I have never had any problems deploying the project until recently. What is happening is that when between 3 and 5 million of the rows have been read, it falls over with the following error message:

"Memory error: Allocation failure : Not enough storage is available to process this command. . Errors in the OLAP storage engine:"

I have read about this occurring during the processing of a data mining model, and a hotfix exists to solve this. However, my project contains no data mining at all.

Can anyone shed any light on this?

Dave

Hard drive failure? Hard drive full/not enough storage space?

If you use incremental updating for your cube you will need additional storage space temporarely.

Regards

Thomas Ivarsson

|||

Hard drive is fine, and has around 200 GB free. I think this is a memory problem with the server. The server has 3 GB of RAM, and it has been able to process the cube fine for the last 2 months, so I cannot understand it.

What do you mean by incremental processing? I have 3 partitions set up, each with around 7 million rows. I have set the options to 'Sequential processing' and 'Separate transactions'. Is there anything else I can do?

Dave

|||

Looks like your machine doesn't have enough memory to execute a processing command. Analysis Server tries to estimate the memory needed to complete the processing command, and that is the error message you are seeing.

Try processing every dimension and partition separately. If you still see the error message during partition processing, try to reduce the size of the partitions by partitioning your data some more.

Edward Melomed.
--
This posting is provided "AS IS" with no warranties, and confers no rights.

|||

Edward,

I could understand this being the case if there had always been this problem. However, the fact table does not have any more rows now that it is erroring than it did when it was processing fine. In addition, the error occurs when only a small proportion of the rows have been read, so I cannot see how the machine could have run out of memory at this early stage of processing.

David

|||

This is not a situation where the machine runs out of memory. Analysis Server wouldn't even start processing if you didn't have enough memory available to complete the operation. What happens is: it estimates how much memory it needs to complete the processing, then asks the OS how much memory is available at the moment and compares the two. If there isn't enough, you see the error.

Now, in your case I am not clear what happened before.

Two things to check:

1. As I suggested in my previous reply: try to process fewer things in parallel, and partition some more. Anything that causes Analysis Server to lower its estimates.

2. Check the memory situation on your machine at the time of processing. Is something else consuming the memory you need for processing?

HTH.

Edward Melomed.
--
This posting is provided "AS IS" with no warranties, and confers no rights.

|||

I would recommend creating a job that does a full process of each dimension first and a full process of all the cubes last. Sorry if you have already done this.

You can set this up by choosing to process each dimension in Management Studio and scripting that action.

I have seen that processing with the option to process all related objects does not always work as expected.

Do you have any hints from setting up a trace in Profiler?

Regards

Thomas Ivarsson

|||

Thanks for the advice, guys. I did a couple of things you mentioned which I thought I had already tried, and the project processed OK.

First I processed each dimension individually. Then I processed the cube (which I have split into 3 partitions). I changed the processing settings to 'Sequential processing' and 'One transaction at a time'. It took about an hour and a half (it used to take one hour), but I can live with that.

I would like to write a script for this but am not sure what form it would take. Would it be in SQL, MDX or XMLA? Could you give me a brief outline of the syntax required to, say, process a single dimension? That would give me an idea of what I am aiming for.

Many thanks,

Dave

|||

Can anybody help with this?

Dave

|||

Right-click on a dimension in Management Studio and choose Process. In the next dialogue you will have the option to script that command to a new query window or to memory. There you can see the XMLA syntax for this.

Regards

Thomas Ivarsson
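For reference, the scripted command Thomas describes comes out as XMLA. A minimal ProcessFull of a single dimension looks roughly like this (the database and dimension IDs below are placeholders; your own IDs will appear when you script the action):

```xml
<!-- Sketch: full process of one dimension; IDs are placeholders -->
<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>MyOlapDatabase</DatabaseID>
    <DimensionID>Dim Customer</DimensionID>
  </Object>
  <Type>ProcessFull</Type>
</Process>
```

Such a script can then be run from a SQL Server Agent job using an Analysis Services Command job step.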


Memory allocation error while processing cubes

I get this error while processing my cubes:
Error: Memory error: Allocation failure : Not enough storage is available to process this command.
There is plenty of space left on the hard disks on this machine, and I can't find a remedy for this problem on Google either. Any help would be appreciated.

Thanks in advance.

Preston, which version of SQL Server Analysis Services are you running? Did you install the latest service pack?

--Artur

|||I am using Analysis Services 2005 Enterprise Edition with (to my knowledge) the latest service pack, version 9.00.2047.00.
|||

The error you are getting means Analysis Server estimated the amount of memory it would need to perform the processing operation, and there is not enough memory.

What you can do about this:

1. Force Analysis Server to disregard the estimate and proceed with the processing operation. You might still hit a memory allocation failure later down the road during processing. For that, change the MemoryLimitErrorEnabled server property. For a description of the server properties see http://www.microsoft.com/technet/prodtechnol/sql/2005/ssasproperties.mspx

2. Take a look at the processing operation and see if you can process fewer things at a time. For that you can change CoordinatorExecutionMode. Take a look at the processing architecture whitepaper: http://msdn2.microsoft.com/en-us/library/ms345142.aspx

Edward.
--
This posting is provided "AS IS" with no warranties, and confers no rights.
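As a side note on the server properties Edward links: the SSAS memory-limit properties (TotalMemoryLimit, LowMemoryLimit, and similar) are interpreted as a percentage of physical memory when set to a value up to 100, and as an absolute number of bytes when set higher. A small sketch of that arithmetic (the function name is mine, not part of any SSAS API):

```python
def ssas_memory_limit_bytes(setting: float, physical_bytes: int) -> float:
    """Interpret an SSAS memory-limit property value.

    Values between 0 and 100 mean a percentage of total physical
    memory; values above 100 are taken as an absolute byte count.
    """
    if setting <= 100:
        return physical_bytes * setting / 100.0
    return float(setting)


# A 16 GB machine with a percentage-style setting of 80:
print(ssas_memory_limit_bytes(80, 16 * 2**30))            # 80% of 16 GB
# The same machine with an absolute setting of 8 GB in bytes:
print(ssas_memory_limit_bytes(8 * 2**30, 16 * 2**30))     # 8 GB in bytes
```

This is why a setting of, say, "3221225472" and a setting of "20" can mean roughly the same thing on a 16 GB box.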

|||

Hi Edward,

What does your experience say about changing CoordinatorExecutionMode?

Is it worth setting it to 0?

|||

I am seeing the same disk space issue as described in the original post.

Each partition is about 8 GB, 14 partitions in total, each partition about 50M rows.

The cube was processed before, so it exists with about 110 GB of total used space.

Free space is about 200 GB.

When re-processing the same cube, the free space seems to slowly disappear; for example, after processing only 2 partitions, the free space went from 200 GB to 40 GB. This is while the total used space in this cube directory is 120 GB. The used space makes sense: 110 GB + 8 GB for the 2nd copy of the partition that is being processed.

The free space gone to neverland is what doesn't make sense!

I am processing partitions from BIDS (all partitions highlighted), processing each partition separately, each in its own transaction.

I've seen the same type of disk-space stealing by the SSAS server when, for example, deleting a cube. When I delete the cube, the disk space does not show as free even though the files are gone! That is, until you restart the SSAS server.

Can anyone shed any light on this?

(Win 2003 Ent, SQL 2005 Ent SP2 final, 32-bit, 16 GB RAM, 4 CPUs)

|||

.net pukes when serializing large data files, Reporting services chokes on large data files, and now SSAS can't allocate memory.

Does anyone see a pattern here?

|||

Does SSAS actually use the disk as a temporary cache for data during aggregation?

Can anyone from the MS SSAS team clarify?

If that's true, which directory is it, and can I change it to a different disk? I happen to have more free space available on a different drive (not the OLAP data drive).

Any ideas?

|||

Hi

I am experiencing the same memory allocation error whenever I try to deploy a cube, without even trying to process it. The only way I have been able to resolve it is to get administrators to restart the AS service on the server. I did not have this issue prior to loading SQL Server SP2. Can someone please advise whether they have found the same issue? Perhaps one of the fixes in SP2 broke deployment. I have tried setting ProcessPlan/MemoryLimitErrorEnabled to false and it still gives me the same issue.

Someone please assist.

|||

I am having the same problems trying to load my cube. I have tried all the little tricks listed in the various posts, to no avail.

Please help!!!!!

|||

Edward, can you unmark this as answered? - so that it gets proper attention?

|||

Hi,

I'm getting the same issue processing data. The error message is turned off, and from looking at the performance of the server it is not going anywhere near the memory limit.

Should it page to disk if it starts to run out of memory? I thought that is what 2000 did.

Cheers

M

|||I've just encountered the same issue (right after upgrading to SP2). My last three loads have all failed with this same error...never saw it prior to SP2.|||

Yes, it means you actually have to know what you are doing when working with large datasets, such as when to put something on the stack vs. the heap. Other people make these tools work; the pattern seems to be your competence as a professional.

Memory allocation error while processing cubes

I get this error while processing my cubes:
Error: Memory error: Allocation failure : Not enough storage is available to process this command.
There is plenty of space left on the hard disks on this machine, I can't find a remedy for this problem on Google either. Any help would be appreciated.

Thanks in advance.

Preston, which version of SQL Server Analysis Services are you running? Did you install the latest service pack?

--Artur

|||I am using Analysis Services 2005 Enterprise Edition with (to my knowledge) the latest service pack, version 9.00.2047.00.
|||

The error you getting means Analysis Server estimated amount of memory it would need to perform processing operation and there is not enought memory.

What you can do about this:

1. Force Analysis Server to disregard the estimate and proceed with processing operation. You might get a memory allocation later down the road during processing. For that change MemoryLimitErrorEnabled server property. For description of server properties see http://www.microsoft.com/technet/prodtechnol/sql/2005/ssasproperties.mspx

2. Take a look at processing operation, see if you can process fewer things at the time. For that you can change CoordinatorExecutionMode. Take a look at the processing architechture whitepaper http://msdn2.microsoft.com/en-us/library/ms345142.aspx

Edward.
--
This posting is provided "AS IS" with no warranties, and confers no rights.

|||

Hi Edward,

what says your experience about changing CoordinatorExecutionMode?

Is it worth to set it 0?

|||

I am seeing the same disk space issue as described in original post.

Each partition is about 8GB, total 14 partitions. each partition about 50M rows.

The cube was processed before so it exists with about 110GB total used space.

Free space about 200GB.

When re-processing the same cube, the free space seems to slowly disappear, for example only after processing 2 partitions, the free space went from 200GB to 40GB. This is while the total used space in this cube directory is 120GB. The used space makes sense - 110GB + 8G for 2nd copy of partition that's being processed.

The free space gone to neverland is what doesn't make sense!

I am processing partitions from BIDS (highlighted all partitions). Processing each partition separately, each with its own transaction.

I've seen the same type of Disk space stealing by SSAS server when for example deleting a cube. Somehow when I delete the cube, it does not show that disk space is free, while the files are gone!, - that is until you restart SSAS server.

can anyone shed a light on this?

(win 2003 ent, sql 2005 ent sp2 fin, 32 bit, 16gb ram, 4 cpu)

|||

.net pukes when serializing large data files, Reporting services chokes on large data files, and now SSAS can't allocate memory.

Does anyone see a pattern here?

|||

Does SSAS actually use the Disk for temporary cache of data during aggregation?

can anyone from MS SSAS team clarify?

if that's true, which "directory" is it, and can I change it to be a different disk? I so happen to have more free space available on a different drive (not OLAP Data drive).

any ideas?

|||

Hi

I am experiencing the same memory allocation error whenever I try to deploy a cube. This is without even trying to process it. The only way I have been able to resolve it is to get administrators to restart the AS service on the server. I did not have this issue prior to loading SQL Server SP2. Can someone please advise if they have found the same issue and perhaps one of the fixes in SP2 broke the deployment? I have tried making the ProcessPlan/MemoryLimitErrorEnabled to false and it still gives me the same issue.

Someone please assist.

|||

I am having the same problems trying to load my cube. I have tried all the little tricks listed in the various post to no avail.

Please help!!!!!

|||

Edward, can you unmark this as answered? - so that it gets proper attention?

|||

Hi,

I'm getting the same issue processing data. The error message is turned off and from looking at the performance of the server it is not going anywhere near the memory limit.

Should it page to disk if it starts to run out of memory? I thought this is what 2000 did.

Cheers

M

|||I've just encountered the same issue (right after upgrading to SP2). My last three loads have all failed with this same error...never saw it prior to SP2.|||

Yes it means you actually have to know what you are doing when working with large datasets, when to put something on the stack vs the heap. Other people make these tools work, the pattern seems to be your competance as a professional.

Memory allocation error while processing cubes

I get this error while processing my cubes:
Error: Memory error: Allocation failure : Not enough storage is available to process this command.
There is plenty of space left on the hard disks on this machine, I can't find a remedy for this problem on Google either. Any help would be appreciated.

Thanks in advance.

Preston, which version of SQL Server Analysis Services are you running? Did you install the latest service pack?

--Artur

|||I am using Analysis Services 2005 Enterprise Edition with (to my knowledge) the latest service pack, version 9.00.2047.00.
|||

The error you getting means Analysis Server estimated amount of memory it would need to perform processing operation and there is not enought memory.

What you can do about this:

1. Force Analysis Server to disregard the estimate and proceed with processing operation. You might get a memory allocation later down the road during processing. For that change MemoryLimitErrorEnabled server property. For description of server properties see http://www.microsoft.com/technet/prodtechnol/sql/2005/ssasproperties.mspx

2. Take a look at processing operation, see if you can process fewer things at the time. For that you can change CoordinatorExecutionMode. Take a look at the processing architechture whitepaper http://msdn2.microsoft.com/en-us/library/ms345142.aspx

Edward.
--
This posting is provided "AS IS" with no warranties, and confers no rights.

|||

Hi Edward,

what says your experience about changing CoordinatorExecutionMode?

Is it worth to set it 0?

|||

I am seeing the same disk space issue as described in original post.

Each partition is about 8GB, total 14 partitions. each partition about 50M rows.

The cube was processed before so it exists with about 110GB total used space.

Free space about 200GB.

When re-processing the same cube, the free space seems to slowly disappear, for example only after processing 2 partitions, the free space went from 200GB to 40GB. This is while the total used space in this cube directory is 120GB. The used space makes sense - 110GB + 8G for 2nd copy of partition that's being processed.

The free space gone to neverland is what doesn't make sense!

I am processing partitions from BIDS (highlighted all partitions). Processing each partition separately, each with its own transaction.

I've seen the same type of Disk space stealing by SSAS server when for example deleting a cube. Somehow when I delete the cube, it does not show that disk space is free, while the files are gone!, - that is until you restart SSAS server.

can anyone shed a light on this?

(win 2003 ent, sql 2005 ent sp2 fin, 32 bit, 16gb ram, 4 cpu)

|||

.net pukes when serializing large data files, Reporting services chokes on large data files, and now SSAS can't allocate memory.

Does anyone see a pattern here?

|||

Does SSAS actually use the Disk for temporary cache of data during aggregation?

can anyone from MS SSAS team clarify?

if that's true, which "directory" is it, and can I change it to be a different disk? I so happen to have more free space available on a different drive (not OLAP Data drive).

any ideas?

|||

Hi

I am experiencing the same memory allocation error whenever I try to deploy a cube. This is without even trying to process it. The only way I have been able to resolve it is to get administrators to restart the AS service on the server. I did not have this issue prior to loading SQL Server SP2. Can someone please advise if they have found the same issue and perhaps one of the fixes in SP2 broke the deployment? I have tried making the ProcessPlan/MemoryLimitErrorEnabled to false and it still gives me the same issue.

Someone please assist.

|||

I am having the same problems trying to load my cube. I have tried all the little tricks listed in the various post to no avail.

Please help!!!!!

|||

Edward, can you unmark this as answered? - so that it gets proper attention?

|||

Hi,

I'm getting the same issue processing data. The error message is turned off and from looking at the performance of the server it is not going anywhere near the memory limit.

Should it page to disk if it starts to run out of memory? I thought this is what 2000 did.

Cheers

M

|||I've just encountered the same issue (right after upgrading to SP2). My last three loads have all failed with this same error...never saw it prior to SP2.|||

Yes it means you actually have to know what you are doing when working with large datasets, when to put something on the stack vs the heap. Other people make these tools work, the pattern seems to be your competance as a professional.

sql

Memory allocation error while processing cubes

I get this error while processing my cubes:
Error: Memory error: Allocation failure : Not enough storage is available to process this command.
There is plenty of space left on the hard disks on this machine, I can't find a remedy for this problem on Google either. Any help would be appreciated.

Thanks in advance.

Preston, which version of SQL Server Analysis Services are you running? Did you install the latest service pack?

--Artur

|||I am using Analysis Services 2005 Enterprise Edition with (to my knowledge) the latest service pack, version 9.00.2047.00.|||

The error you are getting means Analysis Server estimated the amount of memory it would need to perform the processing operation, and there is not enough memory available.

What you can do about this:

1. Force Analysis Server to disregard the estimate and proceed with the processing operation. You might still get a memory allocation error later during processing. For that, change the MemoryLimitErrorEnabled server property. For a description of server properties, see http://www.microsoft.com/technet/prodtechnol/sql/2005/ssasproperties.mspx

2. Take a look at the processing operation and see if you can process fewer things at a time. For that you can change CoordinatorExecutionMode. Take a look at the processing architecture whitepaper: http://msdn2.microsoft.com/en-us/library/ms345142.aspx

Edward.
--
This posting is provided "AS IS" with no warranties, and confers no rights.
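As a sketch of what Edward's point 1 looks like in practice: these server properties can be changed from the server Properties dialog, or directly in msmdsrv.ini on the Analysis Services machine. The fragment below is a hypothetical excerpt only; the property nesting follows the ProcessPlan path mentioned later in this thread, and the exact layout should be verified against your own msmdsrv.ini (back the file up first).

```xml
<!-- Hypothetical msmdsrv.ini excerpt; verify property paths against your install -->
<ConfigurationSettings>
  <!-- 0 = let the server decide how much to process in parallel -->
  <CoordinatorExecutionMode>0</CoordinatorExecutionMode>
  <OLAP>
    <ProcessPlan>
      <!-- 0 disables the up-front memory-estimate check Edward describes -->
      <MemoryLimitErrorEnabled>0</MemoryLimitErrorEnabled>
    </ProcessPlan>
  </OLAP>
</ConfigurationSettings>
```

The service must be restarted for msmdsrv.ini edits to take effect.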

|||

Hi Edward,

What does your experience say about changing CoordinatorExecutionMode?

Is it worth setting it to 0?

|||

I am seeing the same disk space issue as described in the original post.

Each partition is about 8GB, 14 partitions in total, each partition about 50M rows.

The cube was processed before, so it exists with about 110GB total used space.

Free space is about 200GB.

When re-processing the same cube, the free space seems to slowly disappear; for example, after processing only 2 partitions, free space went from 200GB to 40GB, while the total used space in the cube directory is 120GB. The used space makes sense: 110GB plus 8GB for the second copy of the partition being processed.

The free space gone to neverland is what doesn't make sense!

I am processing partitions from BIDS (with all partitions highlighted), processing each partition separately, each in its own transaction.

I've seen the same kind of disk space stealing by the SSAS server when, for example, deleting a cube. When I delete the cube, the disk space does not show as freed even though the files are gone, until I restart the SSAS server.

Can anyone shed some light on this?

(win 2003 ent, sql 2005 ent sp2 fin, 32 bit, 16gb ram, 4 cpu)
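A quick way to see where the space goes while a partition processes is to poll free space on the drives involved. A small standard-library sketch (the path is a placeholder; point it at your OLAP data drive and any temp drive):

```python
import shutil
import time

def free_gb(path):
    """Free space on the drive containing `path`, in GB."""
    return shutil.disk_usage(path).free / 1024 ** 3

def monitor(paths, interval_sec=60, rounds=5):
    """Print free space for each path, `rounds` times, pausing between rounds."""
    for i in range(rounds):
        if i:
            time.sleep(interval_sec)
        for p in paths:
            print(f"{p}: {free_gb(p):.1f} GB free")

if __name__ == "__main__":
    # "." is a placeholder; on the server this would be e.g. r"E:\\" for the OLAP Data drive.
    monitor(["."], rounds=1)
```

Logging this once a minute during processing would show whether the space vanishes steadily (temporary files) or in large steps (shadow copies of partitions).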

|||

.net pukes when serializing large data files, Reporting services chokes on large data files, and now SSAS can't allocate memory.

Does anyone see a pattern here?

|||

Does SSAS actually use the disk as a temporary cache for data during aggregation?

Can anyone from the MS SSAS team clarify?

If that's true, which "directory" is it, and can I change it to be on a different disk? I happen to have more free space available on a different drive (not the OLAP Data drive).

any ideas?

|||

Hi

I am experiencing the same memory allocation error whenever I try to deploy a cube, without even trying to process it. The only way I have been able to resolve it is to get the administrators to restart the AS service on the server. I did not have this issue prior to loading SQL Server SP2. Can someone please advise if they have seen the same issue; perhaps one of the fixes in SP2 broke deployment? I have tried setting ProcessPlan/MemoryLimitErrorEnabled to false and it still gives me the same issue.

Someone please assist.

|||

I am having the same problem trying to load my cube. I have tried all the little tricks listed in the various posts, to no avail.

Please help!!!!!

|||

Edward, can you unmark this as answered, so that it gets proper attention?

|||

Hi,

I'm getting the same issue processing data. The error message is turned off, and from looking at the server's performance it is not going anywhere near the memory limit.

Shouldn't it page to disk if it starts to run out of memory? I thought that is what 2000 did.

Cheers

M

|||I've just encountered the same issue (right after upgrading to SP2). My last three loads have all failed with this same error...never saw it prior to SP2.|||

Yes, it means you actually have to know what you are doing when working with large datasets: when to put something on the stack vs. the heap. Other people make these tools work; the pattern seems to be your competence as a professional.


memory allocation error

Using sp_configure, we inadvertently set the network packet size too big and
didn't get a warning. When we restarted the MSSQL service we found that the
SQL Server instance wouldn't come up. We were able to go into the Tools >>
Options >> Advanced menu and reset the packet size to 4096, which did allow us
to bring up the instance, but when we try to bring up Query Analyzer to
change the sp_configure parameter we get 'ODBC msg 0, level 16, state 1, SQL
Server driver (memory allocation failure)'. We suspect that even though we
changed the packet size to 4096 in the Tools >> Options >> Advanced window, it
doesn't change the sp_configure value. How can we do this without Query
Analyzer?
Don
Something is not right here. First of all, whatever value you set for
'network packet size' using sp_configure should not prevent the SQL instance
from coming online. There is a max value, which is 32767; if you try to set
a bigger value, SQL Server will reject it.
When you use Tools -> Options -> Advanced to change the packet size, you are
not changing the option on the server side. Rather, you are changing a client
setting.
When your SQL instance did not come online, what error message did you get?
Linchi
"don" wrote:

> using sp_configure we inadvertanly set the network packet size too big, an
> didn't get a warning. when we restarted the mssql service we found that the
> sqlserv instance wouldn't come up.. we were able to go into the tools >>
> options >>
> advanced menu and reset the packet size to 4096, which did allow us to
> bring up the sqlserv instance, but when we try to bring up sql analyzer to
> change the sp_configure paramenter we get 'odbc msg 0 level 16 state1 sql
> serve driver (memor allocation failure). We suspect that even though we
> change the packet size to 4096 in the tools >> options >> advanced window it
> doesnt change the sp_configure value. How can we do this without sql
> analyzer ?
> --
> Don
|||We get 'ODBC msg 0, level 16, state 1, SQL Server driver (memory allocation
failure)' trying to connect with Query Analyzer, which means I can't run any
SQL commands to reset sp_configure, unless you know of some other method of
running that procedure. Also, when we try to start the SQL Server Agent we get
this error: 165 ODBC error 0, memory allocation failure [SQLSTATE HY001]
Don
"Linchi Shea" wrote:
[vbcol=seagreen]
> Something is not right here. First of all, whatever the value you set for
> 'network packet size' usig sp_configure should not prevent the SQL instance
> from coming up online. There is a max value which is 32767. If you try to set
> a bigger value, SQL Server will reject it.
> When you use Tools-> Options-->Advaced to change packet size, you are not
> changing the option on the server side. Rather, you are changing a client
> setting.
> When your SDQL instance did not come online, what error message did you get?
> Linchi
> "don" wrote:
|||We had to install SQL Server Management Studio (part of the client tools)
for MSSQL 2005. This allowed us to reset the sp_configure value to an
acceptable value using Object Explorer.
Don
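For the record, the usual way out of a bad server-side configuration value when clients can't connect is to start the instance in minimal configuration mode and reset the option from a command-line session. A sketch of that standard approach (the service name assumes a default instance; adjust for named instances):

```bat
REM Stop the instance, then start it in minimal configuration (-f) single-user (-m) mode
net stop MSSQLSERVER
sqlservr.exe -c -f -m

REM From a second console, connect and reset the option
osql -E -Q "EXEC sp_configure 'show advanced options', 1; RECONFIGURE; EXEC sp_configure 'network packet size', 4096; RECONFIGURE WITH OVERRIDE"

REM Stop the minimal-configuration session and restart the service normally
net start MSSQLSERVER
```

This avoids needing Query Analyzer or Management Studio at all, since osql ships with the server tools.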
"Linchi Shea" wrote:
[vbcol=seagreen]
> Something is not right here. First of all, whatever the value you set for
> 'network packet size' usig sp_configure should not prevent the SQL instance
> from coming up online. There is a max value which is 32767. If you try to set
> a bigger value, SQL Server will reject it.
> When you use Tools-> Options-->Advaced to change packet size, you are not
> changing the option on the server side. Rather, you are changing a client
> setting.
> When your SDQL instance did not come online, what error message did you get?
> Linchi
> "don" wrote:


Memory Allocation 8192

I have Windows 2003 with 4 CPUs and 32GB of memory, running multiple
instances of SQL. The server sees the full 32GB of memory without any problems
with the /PAE option enabled. The problem is that when an instance is
configured for the exact size of 8192MB, the instance has a hard time
starting: it fails multiple times, and what is normally a 30-second startup
can now take 10 minutes or more. The instance goes back to normal when the
memory allocation is changed to any number other than 8192. Just curious
if anyone else has seen this problem with this "magic" number.
We have seen this problem, and I believe that a fix is being considered for SQL 2000 SP4.
Thanks!
Ryan Stonecipher
Microsoft SQL Server Storage Engine
"Rich" <Rich@.discussions.microsoft.com> wrote in message news:C90B5CAD-773E-4A75-8A99-06DBFC8A3C6C@.microsoft.com...
I have windows 2003 with 4 CPU's and 32GB of memory running multiple
instances of SQL. Server sees the full 32GB of memory without any problems
with the /PAE option enabled. Problem is when a instance is configured for
the exact size of 8192mb the instance has a hard time starting. It fails
multiple times and what when a normal startup takes 30 seconds can now 10
minutes or more to get going. Any instance goes back to normal by changing
to a different number other than 8192 as the memory allocation. Just curious
if any else has seen this problem with this "magic" number.
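A speculative aside on why 8192 might be "magic" (the actual cause of the bug fixed in SP4 is not documented in this thread): 8192 MB is an exact power of two, the kind of boundary where fixed-width sizing code can overflow. The arithmetic:

```python
MB = 1024 * 1024
PAGE_BYTES = 8 * 1024            # SQL Server uses 8 KB pages

mem_bytes = 8192 * MB            # the configured allocation from the post
pages = mem_bytes // PAGE_BYTES

print(mem_bytes == 2 ** 33)      # True: 8192 MB is exactly 2^33 bytes
print(pages == 2 ** 20)          # True: exactly 2^20 buffer pages
print(mem_bytes > 2 ** 32 - 1)   # True: too big for an unsigned 32-bit byte count
```

That the problem disappears at any neighboring value is consistent with an exact-boundary condition rather than genuine memory pressure.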
|||Ryan, were you not assisting on replication a year or 2 ago?
"Ryan Stonecipher [MSFT]" <ryanston@.online.microsoft.com> wrote in message
news:u3CXMbKhEHA.1656@.TK2MSFTNGP09.phx.gbl...
We have seen this problem, and I believe that a fix is being considered for
SQL 2000 SP4.
Thanks!
Ryan Stonecipher
Microsoft SQL Server Storage Engine
"Rich" <Rich@.discussions.microsoft.com> wrote in message
news:C90B5CAD-773E-4A75-8A99-06DBFC8A3C6C@.microsoft.com...
I have windows 2003 with 4 CPU's and 32GB of memory running multiple
instances of SQL. Server sees the full 32GB of memory without any problems
with the /PAE option enabled. Problem is when a instance is configured for
the exact size of 8192mb the instance has a hard time starting. It fails
multiple times and what when a normal startup takes 30 seconds can now 10
minutes or more to get going. Any instance goes back to normal by changing
to a different number other than 8192 as the memory allocation. Just
curious
if any else has seen this problem with this "magic" number.

Memory allocation ?

I am running SQL Server 2000 Enterprise Edition on Windows Server 2003
Enterprise. Do I need to do anything special to utilize more than 2GB of RAM?
We have 4GB, but I don't think we are using very much of it. Is there an
easy way I can find out what we are currently limited to?
I read somewhere on the groups that if "SQLServer:Buffer Manager - Page life
expectancy" in perfmon dips below 300, it could be a sign of too little RAM.
Thanks!

Jesse,
Have you tried the /3GB switch in boot.ini? With it, SQL Server will be able
to use up to 3GB of user address space, leaving 1GB for the OS.
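On the "is there an easy way to find out what we are currently limited to" question, the configured limits and actual usage can be checked from a query window. A T-SQL sketch (the sysperfinfo counter name is as exposed on SQL Server 2000; verify it on your build):

```sql
-- Make the advanced memory options visible
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- run_value shows what is currently in effect (values are in MB)
EXEC sp_configure 'min server memory';
EXEC sp_configure 'max server memory';

-- How much memory the server has actually grabbed, in KB
SELECT cntr_value AS total_server_memory_kb
FROM master..sysperfinfo
WHERE counter_name = 'Total Server Memory (KB)';
```

If max server memory shows its default (effectively unlimited) but total server memory stays near 1.7GB, the instance is hitting the 2GB user address space ceiling and the /3GB switch is what lifts it.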
"Jesse Gardner" <ih8spam@.badcrc.com> wrote in message
news:2sdv2bF1j6t65U1@.uni-berlin.de...
> I am running SQL Server 2000 Enterprise edition on Windows Server 2003
> Enterprise, do I need to do anything special to utilize >2GB of Ram?
> We have 4GB, but I don't think we are using very much of it. Is there a
> easy way I can find out what we are currently limited to?
> I read somewhere on groups that if "SQLServer:Buffer Manager - Page life
> expectancy" in perfmon dips below 300 it could be a sign of too little
ram.
>
> Thanks!
