Help us improve SQL Server

While working with SQL Server, we have identified a few shortcomings that we have reported to Microsoft. To raise awareness of these feature requests, we would appreciate your help in voting for their importance. To do so, follow the links below, log in with your Microsoft ID, and click the green box with an up arrow.

Introduce Substitute Primary Keys in the Optimization Phase

When part of a primary key is used in a calculation, the result of that calculation may be used in place of that column and still fulfill all the requirements of being a primary key. For example, if the primary key of a table is (id, stamp) and a view is created as select id, stamp, row_number() over (partition by id order by stamp desc) as ver… then (id, ver) can serve as a substitute primary key for (id, stamp). The query optimizer could use this information to perform table (join) elimination when the view is joined on (id, ver) instead of (id, stamp).
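A minimal sketch of such a view, using hypothetical table and view names. Since row_number() partitioned by id and ordered by stamp yields a distinct ver for every stamp within an id, (id, ver) identifies rows exactly as well as the declared key (id, stamp):

```sql
-- Hypothetical table with a composite primary key (id, stamp).
CREATE TABLE dbo.Posit (
    id    int      NOT NULL,
    stamp datetime NOT NULL,
    val   int      NOT NULL,
    CONSTRAINT pk_Posit PRIMARY KEY (id, stamp)
);
GO
-- ver is unique per id, so (id, ver) is a substitute primary key
-- for (id, stamp), although the optimizer does not currently
-- recognize it as such.
CREATE VIEW dbo.Posit_Versioned AS
SELECT
    id,
    stamp,
    val,
    ROW_NUMBER() OVER (PARTITION BY id ORDER BY stamp DESC) AS ver
FROM dbo.Posit;
```

If the optimizer tracked (id, ver) as a key, a join against dbo.Posit_Versioned on (id, ver) that selects no other columns from the view could be eliminated from the plan entirely.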

Consolidate Execution Plan Paths

When the execution plan contains multiple paths with identical sections, these could be consolidated and only executed once, instead of once per path. See the example below where three paths could be consolidated, with a substantial gain in performance.
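As a hedged illustration of the kind of duplication meant here (hypothetical table name): SQL Server inlines common table expressions rather than materializing them, so in the query below the same aggregation appears as three identical sections in the plan, once per branch of the UNION ALL, and is executed three times:

```sql
-- The CTE is inlined into each branch, so the expensive GROUP BY
-- runs three times; consolidating the identical plan sections
-- would let it run once.
WITH Expensive AS (
    SELECT id, SUM(val) AS total
    FROM dbo.BigTable
    GROUP BY id
)
SELECT 'low' AS band, COUNT(*) AS rows FROM Expensive WHERE total < 100
UNION ALL
SELECT 'mid', COUNT(*) FROM Expensive WHERE total BETWEEN 100 AND 999
UNION ALL
SELECT 'high', COUNT(*) FROM Expensive WHERE total >= 1000;
```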

Elimination of Multistatement Table-valued Functions from Execution Plan

In certain circumstances, when a primary key is defined in the table returned from a Multistatement Table-valued Function, the table does not have to be materialized at all, since it does not take part in the execution. This is similar to the already existing table elimination done between tables that have primary keys (and foreign keys) defined. Please see the attached script for a scenario in which the table is “unnecessarily” materialized.
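A sketch of the scenario, with hypothetical object names. The return table of the function declares a primary key, so a query that only probes that key need not materialize the table at all:

```sql
-- Hypothetical multistatement table-valued function whose return
-- table declares a primary key on id.
CREATE FUNCTION dbo.KeyedRows()
RETURNS @result TABLE (
    id  int NOT NULL PRIMARY KEY,
    val int NOT NULL
)
AS
BEGIN
    INSERT INTO @result (id, val)
    SELECT id, MAX(val)
    FROM dbo.SomeTable
    GROUP BY id;
    RETURN;
END;
GO
-- If every id in dbo.SomeTable is known to appear in @result and no
-- column from @result is selected, the join below contributes
-- nothing, yet @result is still materialized:
SELECT t.id, t.val
FROM dbo.SomeTable AS t
JOIN dbo.KeyedRows() AS k ON k.id = t.id;
```

This mirrors the table elimination already performed between permanent tables related by primary and foreign keys; the request is to extend it to keyed function results.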

Creating/altering triggers uses huge amounts of RAM

There seems to be an issue where creating or altering triggers consumes massive amounts of RAM. The following large DDL script, taken from a data warehouse used in production, uses more than 20GB of RAM when executed and takes minutes to run, even though it contains only DDL statements. If you run it on a server with too little RAM, it results in an out-of-memory error. The expected behavior is that it would run in a very short time without consuming huge amounts of RAM. The script contains other DDL statements as well, but the issue clearly lies with the triggers: if they are omitted, the problems disappear.

Example here (note that the file is 3.5MB):

Also, if you can figure out workarounds, let us know. Thanks!

Published by

Lars Rönnbäck

Co-developer of the Anchor Modeling technique. Programmer of the online modeling tool. Site maintainer. Presenter and trainer.
