Before you try to fix performance problems in your applications, make sure you profile them to find out where the time is actually going. Key performance counters (such as the one that indicates the percentage of time spent performing garbage collections) are also very useful for finding out where applications are spending the majority of their time. Yet the places where time is spent are often quite unintuitive.
There are two types of performance improvements described in this article: large optimizations, such as using the ASP.NET Cache, and tiny optimizations that repeat themselves. These tiny optimizations are sometimes the most interesting. You make a small change to code that gets called thousands and thousands of times. With a big optimization, you might see overall performance take a large jump. With a small one, you might shave a few milliseconds on a given request, but when compounded across the total requests per day, it can result in an enormous improvement.
When it comes to performance-tuning an application, there is a single litmus test you can use to prioritize work: does the code access the database? If so, how often? Note that the same test could be applied for code that uses Web services or remoting, too, but I'm not covering those in this article.
If you have a database request required in a particular code path and you see other areas such as string manipulations that you want to optimize first, stop and perform your litmus test. Unless you have an egregious performance problem, your time would be better utilized trying to optimize the time spent in and connected to the database, the amount of data returned, and how often you make round-trips to and from the database.
With that general information established, let's look at ten tips that can help your application perform better. I'll begin with the changes that can make the biggest difference.
Tip 1 Return Multiple Resultsets
Review your database code to see if you have request paths that go to the database more than once. Each of those round-trips decreases the number of requests per second your application can serve. By returning multiple resultsets in a single database request, you can cut the total time spent communicating with the database. You'll be making your system more scalable, too, as you'll cut down on the work the database server is doing managing requests.
While you can return multiple resultsets using dynamic SQL, I prefer to use stored procedures. It's arguable whether business logic should reside in a stored procedure, but I think that if logic in a stored procedure can constrain the data returned (reducing the size of the dataset and the time spent on the network, and avoiding having to filter the data in the logic tier), it's a good thing.
Using a SqlCommand instance and its ExecuteReader method to populate strongly typed business classes, you can move the resultset pointer forward by calling NextResult. Figure 1 shows a sample conversation populating several ArrayLists with typed classes. Returning only the data you need from the database will additionally decrease memory allocations on your server.
Figure 1 Extracting Multiple Resultsets from a DataReader
// read the first resultset
reader = command.ExecuteReader();

// read the data from that resultset
while (reader.Read()) {
    suppliers.Add(PopulateSupplierFromIDataReader( reader ));
}

// read the next resultset
reader.NextResult();

// read the data from that second resultset
while (reader.Read()) {
    products.Add(PopulateProductFromIDataReader( reader ));
}
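For context, here is a minimal sketch of the setup Figure 1 assumes. The connection string, the stored procedure name, and the Populate helpers are placeholders of my own, not part of the original sample; the pattern is what matters.

// Sketch only: "getSuppliersAndProducts", connectionString, and the
// Populate* helpers are hypothetical placeholders.
// Requires System.Data, System.Data.SqlClient, and System.Collections.
using (SqlConnection connection = new SqlConnection(connectionString))
using (SqlCommand command =
    new SqlCommand("getSuppliersAndProducts", connection)) {
    command.CommandType = CommandType.StoredProcedure;
    connection.Open();
    using (SqlDataReader reader = command.ExecuteReader()) {
        ArrayList suppliers = new ArrayList();
        ArrayList products = new ArrayList();
        // first resultset: suppliers
        while (reader.Read()) {
            suppliers.Add(PopulateSupplierFromIDataReader( reader ));
        }
        // advance to the second resultset: products
        reader.NextResult();
        while (reader.Read()) {
            products.Add(PopulateProductFromIDataReader( reader ));
        }
    }
}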
Tip 2 Paged Data Access
The ASP.NET DataGrid exposes a wonderful capability: data paging support. When paging is enabled in the DataGrid, a fixed number of records is shown at a time. Additionally, paging UI is also shown at the bottom of the DataGrid for navigating through the records. The paging UI allows you to navigate backwards and forwards through displayed data, displaying a fixed number of records at a time.
There's one slight wrinkle. Paging with the DataGrid requires all of the data to be bound to the grid. For example, your data layer will need to return all of the data and then the DataGrid will filter all the displayed records based on the current page. If 100,000 records are returned when you're paging through the DataGrid, 99,975 records would be discarded on each request (assuming a page size of 25). As the number of records grows, the performance of the application will suffer as more and more data must be sent on each request.
One good approach to writing better paging code is to use stored procedures. Figure 2 shows a sample stored procedure that pages through the Orders table in the Northwind database. In a nutshell, all you're doing here is passing in the page index and the page size. The appropriate resultset is calculated and then returned.
Figure 2 Paging Through the Orders Table
CREATE PROCEDURE northwind_OrdersPaged
(
    @PageIndex int,
    @PageSize int
)
AS
BEGIN
    DECLARE @PageLowerBound int
    DECLARE @PageUpperBound int
    DECLARE @RowsToReturn int

    -- First set the rowcount
    SET @RowsToReturn = @PageSize * (@PageIndex + 1)
    SET ROWCOUNT @RowsToReturn

    -- Set the page bounds
    SET @PageLowerBound = @PageSize * @PageIndex
    SET @PageUpperBound = @PageLowerBound + @PageSize + 1

    -- Create a temp table to store the select results
    CREATE TABLE #PageIndex
    (
        IndexId int IDENTITY (1, 1) NOT NULL,
        OrderID int
    )

    -- Insert into the temp table
    INSERT INTO #PageIndex (OrderID)
    SELECT OrderID
    FROM Orders
    ORDER BY OrderID DESC

    -- Return total count
    SELECT COUNT(OrderID) FROM Orders

    -- Return paged results
    SELECT O.*
    FROM
        Orders O,
        #PageIndex PageIndex
    WHERE
        O.OrderID = PageIndex.OrderID AND
        PageIndex.IndexID > @PageLowerBound AND
        PageIndex.IndexID < @PageUpperBound
    ORDER BY PageIndex.IndexID
END
In Community Server, we wrote a paging server control to do all the data paging. You'll see that I am using the ideas discussed in Tip 1, returning two resultsets from one stored procedure: the total number of records and the requested data.
The total number of records returned can vary depending on the query being executed. For example, a WHERE clause can be used to constrain the data returned. The total number of records to be returned must be known in order to calculate the total pages to be displayed in the paging UI. For example, if there are 1,000,000 total records and a WHERE clause is used that filters this to 1,000 records, the paging logic needs to be aware of the total number of records to properly render the paging UI.
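Tying this back to Tip 1, a sketch of the calling code might look like the following. The parameter names match Figure 2; connectionString, pageIndex, pageSize, the orders collection, and the PopulateOrderFromIDataReader helper are placeholders of my own.

// Sketch: call the paging procedure and read both resultsets.
using (SqlConnection connection = new SqlConnection(connectionString))
using (SqlCommand command =
    new SqlCommand("northwind_OrdersPaged", connection)) {
    command.CommandType = CommandType.StoredProcedure;
    command.Parameters.Add("@PageIndex", SqlDbType.Int).Value = pageIndex;
    command.Parameters.Add("@PageSize", SqlDbType.Int).Value = pageSize;
    connection.Open();
    using (SqlDataReader reader = command.ExecuteReader()) {
        // first resultset: the total record count
        reader.Read();
        int totalRecords = reader.GetInt32(0);
        // totalPages drives the paging UI
        int totalPages = (totalRecords + pageSize - 1) / pageSize;
        // second resultset: one page of orders
        reader.NextResult();
        while (reader.Read()) {
            orders.Add(PopulateOrderFromIDataReader( reader ));
        }
    }
}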
Tip 3 Connection Pooling
Setting up the TCP connection between your Web application and SQL Server can be an expensive operation. Developers at Microsoft have been able to take advantage of connection pooling for some time now, allowing them to reuse connections to the database. Rather than setting up a new TCP connection on each request, a new connection is set up only when one is not available in the connection pool. When the connection is closed, it is returned to the pool where it remains connected to the database, as opposed to completely tearing down that TCP connection.
Of course you need to watch out for leaking connections. Always close your connections when you're finished with them. I repeat: no matter what anyone says about garbage collection within the Microsoft .NET Framework, always call Close or Dispose explicitly on your connection when you are finished with it. Do not trust the common language runtime (CLR) to clean up and close your connection for you at a predetermined time. The CLR will eventually destroy the class and force the connection closed, but you have no guarantee when the garbage collection on the object will actually happen.
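The C# using statement is the easiest way to get this guarantee, since Dispose is called even when an exception is thrown. A minimal sketch (the connection string is a placeholder):

// Dispose runs when the block exits, even on an exception,
// returning the connection to the pool immediately.
using (SqlConnection connection = new SqlConnection(connectionString)) {
    connection.Open();
    // ... do the work ...
}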
To use connection pooling optimally, there are a couple of rules to live by. First, open the connection, do the work, and then close the connection. It's okay to open and close the connection multiple times on each request if you have to (optimally you apply Tip 1) rather than keeping the connection open and passing it around through different methods. Second, use the same connection string (and the same thread identity if you're using integrated authentication). If you don't use the same connection string, for example customizing the connection string based on the logged-in user, you won't get the same optimization value provided by connection pooling. And if you use integrated authentication while impersonating a large set of users, your pooling will also be much less effective. The .NET CLR data performance counters can be very useful when attempting to track down any performance issues that are related to connection pooling.
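One simple way to follow the second rule is to centralize the connection string so every caller shares it, and therefore shares one pool. A sketch, with a hard-coded string standing in for whatever configuration mechanism you actually use:

public sealed class Database {
    // One shared connection string means one connection pool.
    // The string here is a placeholder; load yours from configuration.
    private static readonly string connectionString =
        "server=(local);database=Northwind;Integrated Security=SSPI;";

    // Open late: callers open just before the work and close right after.
    public static SqlConnection GetOpenConnection() {
        SqlConnection connection = new SqlConnection(connectionString);
        connection.Open();
        return connection;
    }
}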
Whenever your application is connecting to a resource running in another process, such as a database, you should optimize by focusing on the time spent connecting to the resource, the time spent sending or retrieving data, and the number of round-trips. Optimizing any kind of process hop in your application is the first place to start to achieve better performance.
The application tier contains the logic that connects to your data layer and transforms data into meaningful class instances and business processes. For example, in Community Server, this is where you populate a Forums or Threads collection, and apply business rules such as permissions; most importantly it is where the Caching logic is performed.
Tip 4 ASP.NET Cache API
One of the very first things you should do before writing a line of application code is architect the application tier to maximize and exploit the ASP.NET Cache feature.
If your components are running within an ASP.NET application, you simply need to include a reference to System.Web.dll in your application project. When you need access to the Cache, use the HttpRuntime.Cache property (the same object is also accessible through Page.Cache and HttpContext.Cache).
There are several rules for caching data. First, if data can be used more than once it's a good candidate for caching. Second, if data is general rather than specific to a given request or user, it's a great candidate for the cache. If the data is user- or request-specific, but is long lived, it can still be cached, but may not be used as frequently. Third, an often overlooked rule is that sometimes you can cache too much. Generally on an x86 machine, you'll want to run a process with no higher than 800MB of private bytes in order to reduce the chance of an out-of-memory error, which practically limits how much you can cache.
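Once data qualifies, the basic read-through pattern is straightforward. A sketch (the cache key, the expiration policy, and the LoadSuppliersFromDatabase call are illustrative, not from the original):

// Check the cache first; fall back to the database on a miss.
// Requires System.Web and System.Web.Caching.
ArrayList suppliers = HttpRuntime.Cache["Suppliers"] as ArrayList;
if (suppliers == null) {
    suppliers = LoadSuppliersFromDatabase(); // placeholder data-access call
    // Cache for five minutes with an absolute expiration.
    HttpRuntime.Cache.Insert("Suppliers", suppliers, null,
        DateTime.Now.AddMinutes(5), Cache.NoSlidingExpiration);
}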
