I'm using Entity Framework (code first) and finding that the order in which I specify clauses in my LINQ queries has a huge performance impact, for example:
using (var db = new MyDbContext())
{
var mySize = "medium";
var myColour = "vermilion";
var list1 = db.Widgets.Where(x => x.Colour == myColour && x.Size == mySize).ToList();
var list2 = db.Widgets.Where(x => x.Size == mySize && x.Colour == myColour).ToList();
}
When the (rare) colour clause precedes the (common) size clause it's fast, but the other way round it's orders of magnitude slower. The table has a couple of million rows and the two fields in question are nvarchar(50), so not normalised, but each is indexed. The fields are specified in a code first fashion as follows:
[StringLength(50)]
public string Colour { get; set; }
[StringLength(50)]
public string Size { get; set; }
Am I really supposed to have to worry about such things in my LINQ queries? I thought that was the database's job.
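(For what it's worth, you can inspect the SQL EF will generate for each clause order without running the queries. In EF6, calling ToString() on a DbQuery returns the store command; I'm assuming that also holds in beta1, so treat this as a sketch:)

```csharp
// Sketch: dump the SQL EF generates for each clause ordering.
// Assumes EF6's DbQuery.ToString() returns the store SQL (unverified on beta1).
using (var db = new MyDbContext())
{
    var mySize = "medium";
    var myColour = "vermilion";

    var q1 = db.Widgets.Where(x => x.Colour == myColour && x.Size == mySize);
    var q2 = db.Widgets.Where(x => x.Size == mySize && x.Colour == myColour);

    Console.WriteLine(q1.ToString());   // predicates in Colour-then-Size order
    Console.WriteLine(q2.ToString());   // predicates in Size-then-Colour order
}
```

Comparing the two dumps side by side makes it easy to see that the only difference is the order of the generated predicates.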
System specs are:
- Visual Studio 2010
- .NET 4
- EntityFramework 6.0.0-beta1
- SQL Server 2008 R2 Web (64 bit)
Update:
Right, for any gluttons for punishment, the effect can be replicated as below. The issue seems to be tremendously sensitive to a number of factors, so please bear with the contrived nature of some of this:
Install EntityFramework 6.0.0-beta1 via NuGet, then define the model code first style with:
public class Widget
{
[Key]
public int WidgetId { get; set; }
[StringLength(50)]
public string Size { get; set; }
[StringLength(50)]
public string Colour { get; set; }
}
public class MyDbContext : DbContext
{
public MyDbContext()
: base("DefaultConnection")
{
}
public DbSet<Widget> Widgets { get; set; }
}
Generate the dummy data with the following SQL:
insert into Widget (Size, Colour)
select RND1 + ' is the name is this size' as Size,
RND2 + ' is the name of this colour' as Colour
from (Select top 1000000
CAST(abs(Checksum(NewId())) % 100 as varchar) As RND1,
CAST(abs(Checksum(NewId())) % 10000 as varchar) As RND2
from master..spt_values t1 cross join master..spt_values t2) t3
Add one index each for Colour and Size, then query with:
string mySize = "99 is the name is this size";
string myColour = "9999 is the name of this colour";
using (var db = new MyDbContext())
{
var list1 = db.Widgets.Where(x => x.Colour == myColour && x.Size == mySize).ToList();
}
using (var db = new WebDbContext())
{
var list2 = db.Widgets.Where(x => x.Size == mySize && x.Colour == myColour).ToList();
}
The issue seems to be connected to the obtuse collection of NULL comparisons in the generated SQL, as below.
exec sp_executesql N'SELECT
[Extent1].[WidgetId] AS [WidgetId],
[Extent1].[Size] AS [Size],
[Extent1].[Colour] AS [Colour]
FROM [dbo].[Widget] AS [Extent1]
WHERE ((([Extent1].[Size] = @p__linq__0)
AND ( NOT ([Extent1].[Size] IS NULL OR @p__linq__0 IS NULL)))
OR (([Extent1].[Size] IS NULL) AND (@p__linq__0 IS NULL)))
AND ((([Extent1].[Colour] = @p__linq__1) AND ( NOT ([Extent1].[Colour] IS NULL
OR @p__linq__1 IS NULL))) OR (([Extent1].[Colour] IS NULL)
AND (@p__linq__1 IS NULL)))',N'@p__linq__0 nvarchar(4000),@p__linq__1 nvarchar(4000)',
@p__linq__0=N'99 is the name is this size',
@p__linq__1=N'9999 is the name of this colour'
go
Changing the equality operator in the LINQ to StartsWith() makes the problem go away, as does making either one of the two fields non-nullable at the database.
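(Sketching the non-nullable workaround in code first terms, plus a second possibility I haven't verified against beta1, so both are assumptions rather than a confirmed fix:)

```csharp
// Sketch only: two ways to avoid the generated null-comparison clutter.

// (1) Make the columns non-nullable, so EF has no NULL cases to cover in the
//     predicate. This matches the observation above that making either field
//     non-nullable at the database makes the problem go away.
public class Widget
{
    [Key]
    public int WidgetId { get; set; }

    [Required]              // maps Size as nvarchar(50) NOT NULL
    [StringLength(50)]
    public string Size { get; set; }

    [Required]              // maps Colour as nvarchar(50) NOT NULL
    [StringLength(50)]
    public string Colour { get; set; }
}

// (2) Ask EF for database (three-valued) null semantics instead of C# null
//     semantics, so it emits a plain equality predicate. UseDatabaseNullSemantics
//     exists in released EF6 builds; I haven't confirmed it is present in
//     6.0.0-beta1, so treat this as an assumption.
public class MyDbContext : DbContext
{
    public MyDbContext()
        : base("DefaultConnection")
    {
        this.Configuration.UseDatabaseNullSemantics = true;
    }

    public DbSet<Widget> Widgets { get; set; }
}
```

Note that option (1) changes the schema (existing NULLs would have to be cleaned up first), whereas option (2) changes query semantics: a C# comparison of two nulls is true, while under database semantics NULL = NULL is not.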
I despair!
Update 2:
Some assistance for any bounty hunters: the issue can be reproduced on SQL Server 2008 R2 Web (64 bit) in a clean database, as follows:
CREATE TABLE [dbo].[Widget](
[WidgetId] [int] IDENTITY(1,1) NOT NULL,
[Size] [nvarchar](50) NULL,
[Colour] [nvarchar](50) NULL,
CONSTRAINT [PK_dbo.Widget] PRIMARY KEY CLUSTERED
(
[WidgetId] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO
CREATE NONCLUSTERED INDEX IX_Widget_Size ON dbo.Widget
(
Size
) WITH( STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
GO
CREATE NONCLUSTERED INDEX IX_Widget_Colour ON dbo.Widget
(
Colour
) WITH( STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
GO
insert into Widget (Size, Colour)
select RND1 + ' is the name is this size' as Size,
RND2 + ' is the name of this colour' as Colour
from (Select top 1000000
CAST(abs(Checksum(NewId())) % 100 as varchar) As RND1,
CAST(abs(Checksum(NewId())) % 10000 as varchar) As RND2
from master..spt_values t1 cross join master..spt_values t2) t3
GO
and then compare the relative performance of the following two queries (you may need to adjust the parameter test values to get a query that returns a couple of rows in order to observe the effect, i.e. the second query is much slower).
exec sp_executesql N'SELECT
[Extent1].[WidgetId] AS [WidgetId],
[Extent1].[Size] AS [Size],
[Extent1].[Colour] AS [Colour]
FROM [dbo].[Widget] AS [Extent1]
WHERE ((([Extent1].[Colour] = @p__linq__0)
AND ( NOT ([Extent1].[Colour] IS NULL
OR @p__linq__0 IS NULL)))
OR (([Extent1].[Colour] IS NULL)
AND (@p__linq__0 IS NULL)))
AND ((([Extent1].[Size] = @p__linq__1)
AND ( NOT ([Extent1].[Size] IS NULL
OR @p__linq__1 IS NULL)))
OR (([Extent1].[Size] IS NULL) AND (@p__linq__1 IS NULL)))',
N'@p__linq__0 nvarchar(4000),@p__linq__1 nvarchar(4000)',
@p__linq__0=N'9999 is the name of this colour',
@p__linq__1=N'99 is the name is this size'
go
exec sp_executesql N'SELECT
[Extent1].[WidgetId] AS [WidgetId],
[Extent1].[Size] AS [Size],
[Extent1].[Colour] AS [Colour]
FROM [dbo].[Widget] AS [Extent1]
WHERE ((([Extent1].[Size] = @p__linq__0)
AND ( NOT ([Extent1].[Size] IS NULL
OR @p__linq__0 IS NULL)))
OR (([Extent1].[Size] IS NULL)
AND (@p__linq__0 IS NULL)))
AND ((([Extent1].[Colour] = @p__linq__1)
AND ( NOT ([Extent1].[Colour] IS NULL
OR @p__linq__1 IS NULL)))
OR (([Extent1].[Colour] IS NULL)
AND (@p__linq__1 IS NULL)))',
N'@p__linq__0 nvarchar(4000),@p__linq__1 nvarchar(4000)',
@p__linq__0=N'99 is the name is this size',
@p__linq__1=N'9999 is the name of this colour'
You may also find, as I do, that if you rerun the dummy data insert so that there are now two million rows, the problem goes away.