Hopefully you already know how important it is to set a sensible value for max server memory on your SQL Server. By default it is 2147483647 (MB), which is not a random number but the 8th Mersenne prime! It is 2^31 - 1, the maximum positive value for a 32-bit signed binary integer, and in practical terms that means unlimited.
Anyway, getting back to the point of this post, I personally believe you should be changing this default value. But be sensible about it, because you do not want to see messages like “There is insufficient system memory in resource pool ‘internal’ to run this query”.
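For reference, the change itself is just a couple of lines of T-SQL via sp_configure. The 8192 here is only a placeholder value, pick something sized for your own server:

```sql
-- Change max server memory (an advanced option, so expose those first).
-- 8192 MB is a placeholder; size it for your own server.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 8192;
RECONFIGURE;
```

The change takes effect immediately after RECONFIGURE, no restart needed.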
Out of the box, SQL Server will look like this.
Or you could use TSQL to check the details.
SELECT name, value, value_in_use, [description]
FROM sys.configurations
WHERE name LIKE '%server memory%'
ORDER BY name;
Like I said before, change it, but with some common sense in mind: if you set it too low, this could happen.
Naturally you might try restarting SQL Server and then changing the max server memory setting, right?
It might not work.
The error log will show a stack dump too.
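If you do lock yourself out like this, one way back in is to start SQL Server with the -f (minimal configuration) startup parameter, connect with sqlcmd, and fix the setting. This is a sketch only, assuming a default Windows instance named MSSQLSERVER:

```
:: Sketch: default instance MSSQLSERVER assumed.
:: Start SQL Server in minimal configuration, limiting connections to sqlcmd.
net stop MSSQLSERVER
net start MSSQLSERVER /f /m"SQLCMD"

:: Then connect (sqlcmd -S . -E) and put max server memory back to something sane:
::   EXEC sp_configure 'max server memory (MB)', 4096;
::   RECONFIGURE;
:: Finally, restart the service normally without the startup parameters.
```

Minimal configuration mode ignores the configured memory settings, which is exactly what you need when the configured value is the problem.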
How much is enough? I suggest reading this blog post: https://www.sqlskills.com/blogs/jonathan/how-much-memory-does-my-sql-server-actually-need/
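The rough rule of thumb from that post (a starting point, not gospel) can even be computed in T-SQL. This sketch assumes SQL Server 2012 or later, where sys.dm_os_sys_info exposes physical_memory_kb:

```sql
-- Rule-of-thumb starting point: reserve 1 GB for the OS,
-- plus 1 GB per 4 GB of RAM up to 16 GB,
-- plus 1 GB per 8 GB of RAM above 16 GB.
DECLARE @total_gb int =
    (SELECT physical_memory_kb / 1048576 FROM sys.dm_os_sys_info);
DECLARE @reserved_gb int = 1 +
    CASE WHEN @total_gb <= 16 THEN @total_gb / 4
         ELSE 4 + (@total_gb - 16) / 8
    END;
SELECT @total_gb AS total_ram_gb,
       (@total_gb - @reserved_gb) * 1024 AS suggested_max_memory_mb;
```

Treat the output as a first guess and then watch the memory counters on your own workload.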
By the way, I found some cool behaviour which I can’t really show via screenshots, so please see the quick video. Basically, I log on to the server via Management Studio, change max memory to a really silly setting like 50MB, and look what happens!
Management Studio will override your silly setting with 128MB, giving you a little more room for manoeuvre. Did you know that? I didn’t!
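You can see where that floor comes from: sys.configurations exposes the allowed minimum for the setting (128 on the versions I have tried; older versions may differ):

```sql
-- Check the allowed range for max server memory.
SELECT name, minimum, maximum, value_in_use
FROM sys.configurations
WHERE name = N'max server memory (MB)';
```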