I have a table in a SQL Server database with a year column of type smallint and a qtr column of type tinyint. When this table is imported with Entity Framework, the types of the corresponding properties are short and byte as expected. Now when I try to run a query like this:
query = myTable.Where(a => a.year > qyear || (a.year == year && a.qtr >= qqtr));
where qyear and qqtr are again short and byte respectively. If I look at the SQL generated (using myContext.Database.Log = s => System.Diagnostics.Debug.WriteLine(s);), I see this:
SELECT
[Extent1].[year] as [year],
[Extent1].[qtr] as [qtr],
/* a bunch of other fields */
FROM [dbo].[myTable] as [Extent1]
WHERE (( CAST( [Extent1].[year] AS int) > @p__linq__0) OR
(((( CAST( [Extent1].[year] AS int) = @p__linq__1) AND
( NOT (( CAST( [Extent1].[year] AS int) IS NULL) OR
(@p__linq__1 IS NULL)))) OR (( CAST( [Extent1].[year] AS int) IS NULL)
AND (@p__linq__1 IS NULL))) AND ( CAST( [Extent1].[qtr] AS int) >= @p__linq__2)))
Why is everything getting cast to an int for comparisons?
Because that's what the C# language defines. It doesn't define any operators on byte, short etc. - they're all promoted to int before anything happens. That gets propagated to expression trees too. As an example:
using System;
using System.Linq.Expressions;

public class Test
{
    public static void Main()
    {
        Expression<Func<byte, byte, bool>> func = (x, y) => x > y;
        Console.WriteLine(func);
    }
}
That prints
(x, y) => (Convert(x) > Convert(y))
and if you look at the IL, you'll see that the Convert calls are effectively Convert(xParameterExpression, typeof(int)) and Convert(yParameterExpression, typeof(int)).
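For illustration, the same promotion can be seen without reading any IL, by examining the tree's nodes directly - a minimal sketch (not part of the original answer), using only System.Linq.Expressions:

using System;
using System.Linq.Expressions;

public class InspectTest
{
    public static void Main()
    {
        Expression<Func<byte, byte, bool>> func = (x, y) => x > y;

        // The body is a GreaterThan node; both operands are Convert nodes to int.
        var body = (BinaryExpression)func.Body;
        var left = (UnaryExpression)body.Left;
        var right = (UnaryExpression)body.Right;

        Console.WriteLine(left.NodeType);  // Convert
        Console.WriteLine(left.Type);      // System.Int32
        Console.WriteLine(right.Type);     // System.Int32
    }
}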
So the SQL is representing exactly what your C# represents... and I'd expect it to behave exactly the same way too. I would be surprised if this had any cost - any decent query optimizer is going to realize that it can just use the values as they are, without really converting each one in turn.
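As a side note, the same promotion shows up in ordinary C# outside expression trees, since byte and short have no arithmetic or comparison operators of their own - a small sketch (again not part of the original answer):

using System;

public class PromotionDemo
{
    public static void Main()
    {
        byte x = 1, y = 2;

        var sum = x + y;                    // byte + byte uses the int operator, so sum is an int
        Console.WriteLine(sum.GetType());   // System.Int32

        bool greater = x > y;               // the comparison also promotes both operands to int
        Console.WriteLine(greater);         // False
    }
}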
See more on this question at Stackoverflow