
Max server memory recommendations are just a suggestion


Some of you might know me because of the Max Server Memory Matrix, a chart I created based on a decade-old algorithm developed by Jonathan Kehayias. The chart has lived at bornsql.ca/memory for a while. There’s even a handy script on GitHub you can run. Check it out. I’ll wait.

When you install SQL Server on Windows these days, the Setup tool offers a recommendation based on the amount of server memory available at that moment in time. This calculation isn’t documented, and I know of several people who have asked for it to be made public.

There are a few things to consider about this request, and if you don’t want to read all the way to the end, the answer is “no, it probably won’t be documented.”

Suggestions and assumptions

The Server memory configuration options article on Microsoft Docs says that the calculation is “based on a percentage of available system memory at the time of installation.” That’s what we call a loaded statement.

The recommendation is just a suggestion. A guideline. And because this is SQL “It Depends” Server we’re talking about, any Max Server Memory calculation is based on several assumptions:

SQL Server is the only thing running on the machine

No, it isn’t. You know this. Even a dedicated machine needs headroom for — at the very least — the operating system and device drivers. The more RAM you have, the more you must leave available to Windows. This often takes people by surprise, especially on servers with more than 512 GB of RAM.
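The decade-old rule of thumb behind the Max Server Memory Matrix mentioned above can be sketched in a few lines. This is my reading of the commonly cited Kehayias guideline — reserve 1 GB for the OS, another 1 GB for every 4 GB of RAM between 4 GB and 16 GB, and 1 GB for every 8 GB above 16 GB — not an exact reproduction of the published chart, so treat it as a starting point only:

```python
def os_reservation_gb(total_ram_gb: int) -> int:
    """OS memory to leave free, per the rule of thumb attributed
    to Jonathan Kehayias: 1 GB base, plus 1 GB per 4 GB of RAM
    from 4-16 GB, plus 1 GB per 8 GB of RAM above 16 GB."""
    reserved = 1
    if total_ram_gb > 4:
        reserved += (min(total_ram_gb, 16) - 4) // 4
    if total_ram_gb > 16:
        reserved += (total_ram_gb - 16) // 8
    return reserved

def max_server_memory_gb(total_ram_gb: int) -> int:
    """Suggested starting value for Max Server Memory, in GB."""
    return total_ram_gb - os_reservation_gb(total_ram_gb)

# A 64 GB server reserves 1 + 3 + 6 = 10 GB for Windows,
# leaving 54 GB as a starting point for SQL Server.
print(max_server_memory_gb(64))  # 54
```

Note how the reservation keeps growing with installed RAM — which is exactly why very large servers surprise people who assume a flat few gigabytes is enough for the OS.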

You’re not using columnstore or in-memory OLTP

On Standard Edition you shouldn’t use them. But if you do, there are some considerations. While it is true that the Max Server Memory setting only applies to the buffer pool — which is artificially capped at 128 GB on Standard Edition — in-memory OLTP and columnstore have their own memory allowances outside that cap. Erik Darling has recommended that you increase Max Server Memory to account for this.

You use the “Lock pages in memory” policy

Many words have been written about this setting and whether you should use it. If you have a physical machine (or a dedicated virtual machine with correctly provisioned resources), turn it on. Be aware, though, that if Windows needs some of the RAM that has been given to SQL Server, you may experience issues featuring the colour blue.

You aren’t going to change the memory allocation after installation

Perhaps you set up SQL Server on a large VM, then scale it down during the quieter periods of the month. You’ll need to modify the Max Server Memory setting every time you resize the machine.

Correct memory allocation

Every server is different. You need to monitor your server’s memory usage and adjust it accordingly. Microsoft Docs has what I think is a pretty good explanation on how to configure Max Server Memory (yes, I did help a bit with the wording), but it’s not a simple calculation. It’s not meant to be a simple calculation.

For an initial configuration or when there was no opportunity to collect SQL Server process memory usage over time, use the following generalized best practice approach to configure max server memory (MB) for a single instance:

  • From the total OS memory, subtract the equivalent of potential SQL Server thread memory allocations outside max server memory (MB) control, which is the stack size multiplied by calculated max worker threads.
  • Then subtract 25% for other memory allocations outside max server memory (MB) control, such as backup buffers, extended stored procedure DLLs, objects that are created by using Automation procedures (sp_OA calls), and allocations from linked server providers. This is a generic approximation, and your mileage may vary.
  • What remains should be the max server memory (MB) setting for a single instance setup.
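Those three steps translate directly into arithmetic. Here is a minimal sketch, assuming a 64-bit system (2 MB thread stack) and the documented max worker threads formula for a machine with no more than 64 logical processors (512 threads for up to 4 processors, otherwise 512 plus 16 per additional processor); newer versions and larger machines use different thread counts, so the numbers below are illustrative, not prescriptive:

```python
def docs_max_server_memory_mb(total_os_memory_mb: int,
                              logical_cpus: int,
                              thread_stack_mb: int = 2) -> int:
    """Generalized starting point for 'max server memory (MB)',
    following the three steps from the Docs article."""
    # Calculated max worker threads (64-bit, <= 64 logical CPUs):
    # 512 when there are up to 4 CPUs, else 512 + (CPUs - 4) * 16.
    if logical_cpus <= 4:
        max_worker_threads = 512
    else:
        max_worker_threads = 512 + (logical_cpus - 4) * 16

    # Step 1: subtract potential thread stack allocations,
    # which live outside max server memory control.
    remaining = total_os_memory_mb - max_worker_threads * thread_stack_mb

    # Step 2: subtract 25% for other external allocations
    # (backup buffers, extended stored procedure DLLs, sp_OA
    # objects, linked server providers).
    remaining -= remaining * 0.25

    # Step 3: what remains is the starting value.
    return int(remaining)

# A 64 GB (65,536 MB) server with 8 logical CPUs:
# 576 worker threads * 2 MB = 1,152 MB of stacks,
# then 25% off the remainder.
print(docs_max_server_memory_mb(65536, 8))  # 48288
```

As the article says, the 25% figure is a generic approximation — the point of the exercise is to produce a defensible starting value that you then refine by monitoring, not a number you set once and forget.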

Conclusion

I don’t think the SQL Server product group will describe how the SQL Server Setup tool calculates its recommendation, because it’s just a suggestion. It might work for 80% of customers who have accepted the assumptions and never look at this setting again, and we can agree that it’s a great deal better than the default of “all available memory.”

What we don’t want is another magic number, like the ill-fated index maintenance suggestions of 5% and 30%. If you’re a database administrator and you’re wondering how this number is being calculated, your time is better spent on testing, measuring, monitoring, and then implementing an optimal setting for Max Server Memory on your particular server.

For that matter, the same goes for Cost Threshold for Parallelism, and Optimize for Ad Hoc Workloads. Test, measure, monitor, implement.

Share your thoughts in the comments below.

Photo by Fredy Jacob on Unsplash.