I was analysing a plan that had a +600 MB memory grant.
Among its warnings, it also had implicit conversions (CONVERT_IMPLICIT) of some columns to nvarchar(max).
Note
The query has a CTE with 4 queries joined by 3 UNION ALLs.
The implicit conversion warning comes from 2 of the UNION branches having a column typed as nvarchar(max).
By removing those columns, the memory grant went down to under 5 MB.
No warning is raised for this type of CONVERT_IMPLICIT.
There is no indication that we may have big columns that should be checked and tested by removing them.
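To illustrate the shape of the problem, here is a minimal sketch (table and column names are made up) of how one UNION ALL branch typed nvarchar(max) forces a CONVERT_IMPLICIT on the other branches and inflates the grant estimate:

```sql
-- Hypothetical repro: dbo.BigNotes.Notes is nvarchar(max),
-- dbo.SmallNotes.ShortNotes is nvarchar(200).
-- The UNION ALL output column takes the widest type, nvarchar(max),
-- so ShortNotes gets a CONVERT_IMPLICIT to nvarchar(max) and the
-- optimizer sizes the memory grant for the max-length column.
WITH src AS
(
    SELECT Id, Notes AS Comments        -- already nvarchar(max)
    FROM dbo.BigNotes
    UNION ALL
    SELECT Id, ShortNotes AS Comments   -- CONVERT_IMPLICIT to nvarchar(max)
    FROM dbo.SmallNotes
)
SELECT Id, Comments
FROM src
ORDER BY Id;                            -- sort requires a memory grant
```

Dropping (or CASTing down) the max-typed column in the wide branches is what brought the grant from +600 MB to under 5 MB in the plan above.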
Proposed Solution
Can we check whether an implicit conversion exists and, when it targets a max data type (e.g. nvarchar(max)), issue a warning for that too?
Can we add a note in the Human/Robot Advice telling the user to check whether big columns are being requested and to double-check whether those are really necessary? Something like: Check whether big text columns are being selected and whether they are really necessary.
Would this be better under [Warning] Excessive Memory Grant or [Warning] Large Memory Grant?
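For the detection side, a rough sketch of what the check could look like, using XQuery over the showplan XML. The `Convert` element attributes (`Implicit`, `DataType`, `Length="-1"` for max types) are my assumption from plans I have looked at and would need verifying against the showplan schema:

```sql
-- Hypothetical sketch only: flag cached plans containing an implicit
-- conversion to nvarchar(max). Attribute names/values are assumptions.
WITH XMLNAMESPACES
    (DEFAULT 'http://schemas.microsoft.com/sqlserver/2004/07/showplan')
SELECT qs.query_hash,
       qp.query_plan
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) AS qp
WHERE qp.query_plan.exist(
    '//Convert[@Implicit="1"]
              [@DataType="nvarchar"]
              [@Length="-1"]') = 1;   -- Length="-1" assumed to mean (max)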
Use Case
Remind the user that the query may be requesting huge (text) columns that may not be needed, or that could be fetched in a separate query instead.