ASP.NET Data Access
In this tutorial you will learn about new features in data access: data binding, the XPathBinder class, XPathBinder.Eval, connection strings, the factory class, batch operations, asynchronous commands, SQL bulk copy, ColumnMappings, enhancements for SQL Server 2005, ADO.NET class enhancements, DataTableReader, serializing a DataTable, the XmlSerializer, RemotingFormat and managing views of data.
ASP.NET 2.0 performs a kind of magic. It pushes the ADO.NET objects to the back end and leaves the programmer free to rely on the access code built into a set of data source controls, in which Microsoft encapsulates a significant amount of code. In this tutorial we shall look at some of the enhanced data binding syntax that has been made available, the storage of database connection strings, and the enhancements to the ADO.NET object model.
Data binding is the process of connecting a Web control to a data source element. The verbose DataBinder syntax of ASP.NET 1.x has been complemented by a much more compact syntax that supports the generation and parsing of data binding expressions. The overloaded static Eval method uses reflection to parse and evaluate an expression against a runtime object, and RAD tools rely on declaratively placed calls to this method to feed properties with dynamically changing values. Eval as it appears in ASP.NET 2.0 is simply a wrapper around the DataBinder.Eval method; the base method is invoked using the default container's data item. The method incorporated into the Page class is declared as follows:
protected object Eval(string expression)
When the DataBind method is called, the control is pushed onto a stack-based mechanism that tracks the current binding context for the page, and it is popped from the stack when its DataBind method returns. The syntax assumes a default Container.DataItem context object, and the GetDataItem method returns that object.
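In a template, the simplified syntax can be placed declaratively; the control, field and format names below are illustrative:

```aspx
<asp:Repeater ID="Employees" runat="server">
  <ItemTemplate>
    <!-- Eval is one-way: it reads the field from the current data item -->
    <asp:Label runat="server" Text='<%# Eval("LastName") %>' />
    <!-- An optional format string can be supplied -->
    <asp:Label runat="server" Text='<%# Eval("HireDate", "{0:d}") %>' />
  </ItemTemplate>
</asp:Repeater>
```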
The new Bind method provides two-way, read and write, binding. Unlike Eval, it works in both directions: when a bound text property is read back, the value is stored in a collection. Data-bound controls enabled for updates can then retrieve these values and use them to compose the parameter list of the insert or edit command that runs against the data source. The parameter name in the command must match the argument passed to Bind.
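A sketch of two-way binding inside an editable template; the control, field and data source names are illustrative. Note that the data source's update command must take a parameter with the same name as the field passed to Bind:

```aspx
<asp:FormView ID="EmployeeForm" runat="server" DataSourceID="EmployeesSource">
  <EditItemTemplate>
    <!-- Bind is two-way: the value typed here is collected and passed
         as the @LastName parameter of the update command -->
    <asp:TextBox runat="server" Text='<%# Bind("LastName") %>' />
  </EditItemTemplate>
</asp:FormView>
```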
The XPathBinder class is similar to DataBinder but works on XML data. A significant improvement over ASP.NET 1.x is that it can be associated with raw XML data. When a templated control such as a DataList or Repeater is bound to an XML data source, individual XML fragments can be bound inside the template using the XPathBinder object.
The new XPathBinder.Eval method accepts an XmlNode object along with an XPath expression and evaluates it to return the result; a format string can also be specified. XPathBinder.Eval casts the container object to IXPathNavigable. This is essential for the query to run, because the XPath API is built to work with objects that can provide a navigator. The XPathBinder syntax assumes a default container context, which is the object tracked for the data binder. XPathBinder.Eval gets a navigator object from the data source, evaluates the expression and converts the output value into a string.
The Select method of the XPathBinder executes a query and retrieves a node set, an enumerable collection of XML nodes. This collection can be assigned as a late-bound value to a data-bound control. In a template, the results of an XPath query run against the container object are indicated by the XPathSelect keyword. XPathSelect assumes a default data item context object, which must implement IXPathNavigable if no exception is to be thrown.
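A sketch of both XPath binding forms in a Repeater bound to an XmlDataSource; the file name, element names and XPath expressions are illustrative:

```aspx
<asp:XmlDataSource ID="BooksSource" runat="server"
    DataFile="~/App_Data/Books.xml" XPath="books/book" />
<asp:Repeater runat="server" DataSourceID="BooksSource">
  <ItemTemplate>
    <!-- XPath (XPathBinder.Eval) returns a single value as a string -->
    <%# XPath("@title") %>
    <!-- XPathSelect returns a node set that can feed a nested control -->
    <asp:Repeater runat="server" DataSource='<%# XPathSelect("authors/author") %>'>
      <ItemTemplate><%# XPath("@name") %></ItemTemplate>
    </asp:Repeater>
  </ItemTemplate>
</asp:Repeater>
```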
The .NET data providers have been enhanced in Whidbey with support for provider factories and batch operations. Connection strings are critical application parameters and contain sensitive data. Whidbey configuration files define a new section for holding connection strings, called connectionStrings:
<connectionStrings>
  <add name="ExForSysDB"
       connectionString="SERVER=…;DATABASE=…;UID=…;PWD=…;" />
  <add name="ExForSysAccessDB"
       connectionString="~/DATA/ExForSysDB.mdb" />
</connectionStrings>
The contents of this section can be manipulated by using the <add>, <remove> and <clear> elements. A web.config file can be placed in each application directory, and the connectionStrings collection can be customized at that level. Each string is identified by a name and can be referenced by that name throughout the application. These names are also used to link connection strings to other sections of the configuration file, such as the provider definitions for membership, profile and session state.
Connection strings defined in the Web.config file can be retrieved programmatically, since they are loaded into the new ConfigurationManager.ConnectionStrings collection. Any changes to the connection string parameters are not reflected in the page or business objects directly.
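A minimal sketch of reading a named connection string in code, assuming the ExForSysDB entry registered in the configuration fragment above; the collection is exposed by the ConfigurationManager class (in the System.Configuration assembly) in the released version of .NET 2.0:

```csharp
using System.Configuration;
using System.Data.SqlClient;

// Look up the connection string by the name it was registered under
string connString =
    ConfigurationManager.ConnectionStrings["ExForSysDB"].ConnectionString;

using (SqlConnection conn = new SqlConnection(connString))
{
    conn.Open();
    // ... execute commands against the connection ...
}
```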
In ASP.NET 2.0, connection strings can also be referenced declaratively. We will be dealing with this in greater detail in the next lesson of this series. For the present, it is necessary to understand that an expression builder object helps create expressions whose syntax is similar to the data binding syntax and which are evaluated when the page compiles. The content of the expression is extracted, transformed into code and injected into the C# code of the page. The expression has two parts, a prefix and a value, plus an underlying object, the expression builder. The expression builder determines the prefix and decides how to parse and expand the value. The built-in connection string expression builder accepts ConnectionStrings as its prefix and the name of the configuration entry as its value, and expands the expression accordingly.
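Declaratively, the ConnectionStrings expression builder lets a data source control reference the configuration entry by name; the SqlDataSource control and query below are illustrative:

```aspx
<asp:SqlDataSource ID="EmployeesSource" runat="server"
    ConnectionString="<%$ ConnectionStrings:ExForSysDB %>"
    SelectCommand="SELECT LastName, FirstName FROM Employees" />
```

The `<%$ … %>` expression is expanded at page compile time, so the connection string never has to be repeated in page markup.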
ASP.NET 2.0 gives a lot of importance to protecting sensitive data stored in the configuration system. Industry-standard XML encryption is used to encrypt specific sections of the configuration file. However, this is optional: it is enabled by referencing the section to be encrypted in the <protectedData> section of the web.config file. The type of encryption can also be specified by selecting the required provider from the list found in the <providers> section of the web.config file. Either the ASP.NET configuration API or the aspnet_regiis.exe tool can be used to encrypt the content. Though most managed configuration sections are eligible for protection, sections that are read before the managed configuration system initializes, such as <processModel>, cannot be encrypted this way.
ADO.NET 2.0 introduces provider factory classes, each derived from the base class DbProviderFactory. A factory represents a common entry point for a variety of services specific to the provider: CreateCommand returns a provider-specific command object, CreateCommandBuilder a command builder, CreateConnection a connection, CreateDataAdapter a data adapter, and CreateParameter a parameter object.
The DbProviderFactories class also offers a few static methods used to obtain the factory of a particular provider. The GetFactory method takes a string representing the invariant name of the provider, which is hardcoded for each provider in the configuration file where it is registered, and retrieves the assembly and class name information matching that invariant name. Since the factory class is not directly instantiated, GetFactory uses reflection to read the value of the static Instance property on the factory class. That instance is returned, and thereafter any method of the factory can be called.
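A sketch of provider-independent data access through a factory; the invariant name below is the one registered for the SQL Server provider, while the query is illustrative and the connection string is left elided:

```csharp
using System.Data.Common;

// Retrieve the SQL Server factory by its invariant name
DbProviderFactory factory =
    DbProviderFactories.GetFactory("System.Data.SqlClient");

// All objects created from the factory are provider-specific,
// but the calling code only sees the common base classes
using (DbConnection conn = factory.CreateConnection())
{
    conn.ConnectionString = "…"; // supply a real connection string
    DbCommand cmd = factory.CreateCommand();
    cmd.Connection = conn;
    cmd.CommandText = "SELECT COUNT(*) FROM Employees";
    conn.Open();
    object count = cmd.ExecuteScalar();
}
```

Switching the invariant name (for example to an Oracle or ODBC provider) is then the only change needed to target a different database.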
ADO.NET 2.0 enhances batch operations. A new property on the data adapter object controls how multiple records are grouped together and sent to the DBMS. The UpdateBatchSize property is defined on the DbDataAdapter class and is inherited by all data adapter objects. Its default value is 1; if it is set to a higher value, the modified rows are sent in batches of that size. It is recommended that the batch size be benchmarked before a higher value is assigned to this property, as an unsuitable value can result in performance loss.
The profiler tool can be used to observe what happens when a batch is submitted. When batch processing is turned off, a RowUpdating and a RowUpdated event are generated for each row. Turning batch processing on changes the behaviour of these events: several rows are processed before an update is actually performed, so several RowUpdating events fire before a single RowUpdated event does. In that event the Row property is null; the processed rows can instead be accessed using the CopyToRows method, which fills an array of DataRow objects. The right size for the array can be determined by examining the RowCount property of the RowUpdatedEventArgs class, which returns the number of rows processed in the batch.
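A sketch of batched updates, assuming a connectionString variable and an Employees table; both are illustrative:

```csharp
using System.Data;
using System.Data.SqlClient;

using (SqlConnection conn = new SqlConnection(connectionString))
{
    SqlDataAdapter adapter = new SqlDataAdapter("SELECT * FROM Employees", conn);
    SqlCommandBuilder builder = new SqlCommandBuilder(adapter);

    DataTable table = new DataTable();
    adapter.Fill(table);
    // ... modify rows in the table ...

    // Send up to 50 modified rows per round trip; the default of 1 disables batching
    adapter.UpdateBatchSize = 50;
    // Batched commands cannot return per-row results
    builder.GetUpdateCommand().UpdatedRowSource = UpdateRowSource.None;

    adapter.RowUpdated += delegate(object sender, SqlRowUpdatedEventArgs e)
    {
        // With batching on, e.Row is null; CopyToRows retrieves the batch
        DataRow[] rows = new DataRow[e.RowCount];
        e.CopyToRows(rows);
    };

    adapter.Update(table);
}
```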
The SQL Server provider has received special attention. The features of the SQL Server 2005 have been fully exploited. Asynchronous commands, bulk copy, notifications, multiple active resultsets and support for user defined CLR types are some of the features that have been enhanced.
Database operations are normally synchronous. However, the simulated asynchronous commands used with ADO.NET 1.x showed that a performance advantage could be gained. ADO.NET 2.0 provides a true asynchronous mode: other work proceeds while the command completes, and the user interface is not blocked in the meantime. Support for this has been built into the SqlCommand class, which offers three patterns for commands that work asynchronously: non-blocking, polling and callback. In the non-blocking pattern, the client starts the operation, does something unrelated, and then returns to collect the results. The client can also check the status of the asynchronous operation by polling for completion. Lastly, the client can start the database operation, continue with the application without waiting, and receive a callback when the operation is complete.
An asynchronous command begins with a BeginExecuteXXX call, which returns an IAsyncResult object that is later used to complete the call. The corresponding EndExecuteXXX call winds up the operation and syncs the results up with the rest of the application, but it blocks whenever the results are not yet ready. This can be avoided by polling for completion. Alternatively, a callback function can be passed to the BeginExecuteXXX method along with the state of the particular call. To make the code stop until another thread terminates and invokes the callback, a synchronization object such as a wait handle can be used. The specified callback function is invoked at the end of the operation, and the context of the call specified in the second argument to BeginExecuteXXX is packaged into the AsyncState property of the IAsyncResult object.
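A sketch of the callback pattern with an illustrative query; note that the asynchronous mode has to be enabled in the connection string:

```csharp
using System;
using System.Data.SqlClient;

// "Asynchronous Processing=true" must appear in the connection string
SqlConnection conn = new SqlConnection(
    "SERVER=…;DATABASE=…;Integrated Security=SSPI;Asynchronous Processing=true");
SqlCommand cmd = new SqlCommand("SELECT * FROM Employees", conn);
conn.Open();

// Start the operation; the command is passed as the state object
cmd.BeginExecuteReader(delegate(IAsyncResult ar)
{
    // Retrieve the context packaged into AsyncState
    SqlCommand original = (SqlCommand)ar.AsyncState;
    using (SqlDataReader reader = original.EndExecuteReader(ar))
    {
        while (reader.Read()) { /* process rows */ }
    }
}, cmd);
// The calling thread continues without waiting for the query
```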
SQL Bulk Copy
Bulk copy functionality provides a way of transferring large amounts of data into a SQL Server table. The ADO.NET 2.0 SqlBulkCopy class offers a managed, more flexible counterpart to server-side mechanisms such as the BULK INSERT statement. Data can be copied to SQL Server from an ADO.NET reader or a DataTable object. The SqlBulkCopy class exposes a number of properties. BatchSize, as the name suggests, specifies the number of rows in each batch; if the size is 0, the copy occurs in a single step. BulkCopyTimeout specifies the amount of time in seconds before the bulk copy operation times out; the default is 30 seconds. ColumnMappings is a collection object that defines the mapping between the data source and the destination table; the collection is empty by default and must be filled only if a mapping between source and destination is required. The class has two methods, Close and WriteToServer. WriteToServer has a number of overloads: one takes an IDataReader and copies all the rows from the specified data reader, one copies all the rows from a specified array of DataRow objects, one copies all the rows from a specified DataTable, and one takes a DataTable and a DataRowState and copies only the rows of the table that are in the specified state.
DestinationTableName specifies the name of the destination table. NotifyAfter specifies the number of rows to process before the SqlRowsCopied notification event is generated; by default only one event is raised. If NotifyAfter is set to a value greater than one, it can be used to track the progress of a bulk operation: the application receives a SqlRowsCopied event each time a block of rows has been processed.
If an error occurs during bulk copy, the operation is aborted and a SqlException is raised. The destination table and the source data must exist; primary key constraints are ignored during the operation, but a violated unique key constraint results in an exception being thrown. Using a bulk copy object after its Close method has been called also throws an exception. To abort an ongoing operation, the Abort property of the SqlRowsCopiedEventArgs object has to be set; an instance of this class is received as the argument of the SqlRowsCopied event. Abort is a Boolean property and has to be set to true to abort the bulk copy. It must be noted that batches already committed are unaffected when the operation terminates, while the current batch is rolled back automatically. The uncommitted rows of a bulk copy performed inside a user transaction have to be explicitly committed or rolled back within that transaction.
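A sketch tying these pieces together, assuming a destinationConnectionString variable and a sourceTable DataTable; the table and column names are illustrative:

```csharp
using System.Data;
using System.Data.SqlClient;

using (SqlBulkCopy bulk = new SqlBulkCopy(destinationConnectionString))
{
    bulk.DestinationTableName = "dbo.EmployeesArchive";
    bulk.BatchSize = 1000;       // rows per batch
    bulk.BulkCopyTimeout = 60;   // seconds

    // Map source columns to differently named destination columns
    bulk.ColumnMappings.Add("EmpId", "EmployeeId");
    bulk.ColumnMappings.Add("LastName", "Surname");

    // Raise SqlRowsCopied every 500 rows to track progress
    bulk.NotifyAfter = 500;
    bulk.SqlRowsCopied += delegate(object sender, SqlRowsCopiedEventArgs e)
    {
        if (e.RowsCopied > 100000)
            e.Abort = true;  // stop the operation
    };

    bulk.WriteToServer(sourceTable);
}
```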
Enhancements for SQL Server 2005
The .NET data provider for SQL Server has been enhanced to match the features built into SQL Server 2005 and its MDAC 9.0 libraries.
1. The enhancements that have been made to SQL Server 2005 include data type support, query dependency and notification and multiple active result sets.
2. SQL Server 2005 allows for a unified programming model for regular and large values. Large values can be stored in varchar(max), nvarchar(max) and varbinary(max) columns.
3. SQL Server 2005 supports user-defined CLR types. Valid .NET objects, for example a Point structure or a user-defined class, can be stored in and retrieved from SQL Server tables.
4. SQL server supports XML as a native type. XML data can be stored in columns by declaring the type of a given column as native XML.
5. SQL Notifications are made to the client whenever there is a change in the data at the SQL server level and the result set generated by a given query is modified. The SQL Server provider in ADO.NET uses the SqlNotificationRequest and SqlDependency classes to expose this functionality.
6. Multiple active result sets (MARS) in SQL Server 2005 is a natively implemented feature that allows an application to have more than one SqlDataReader open on a connection, each started by a different command. This gives a performance boost, as multiple readers are less expensive than multiple connections. The feature is switched on through the connection string (MultipleActiveResultSets=True) when SQL Server 2005 is the database server.
ADO.NET Class Enhancements
The System.Data namespace consists of classes such as DataSet, DataView and DataTable. The changes to these are minimal, but all the classes have been refined and enhanced. The compact serialization format used for sending the DataSet and DataTable over a .NET Remoting channel is a feature that is worth mentioning.
The new DataTableReader class retrieves the content of a DataTable or DataSet object in the form of one or more read-only, forward-only result sets, so the contents of the in-memory objects can be read quickly in a cursor-like fashion. An instance is obtained by calling the CreateDataReader method of the DataTable or DataSet object.
The Load method of the DataTable and the DataSet can be invoked to fill the DataTable or the DataSet with the contents of the DataReader object.
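A small self-contained sketch of both directions; the table and column names are illustrative:

```csharp
using System.Data;

// Build a small in-memory table
DataTable employees = new DataTable("Employees");
employees.Columns.Add("LastName", typeof(string));
employees.Rows.Add("Smith");
employees.Rows.Add("Jones");

// Obtain a read-only, forward-only cursor over the table
DataTableReader reader = employees.CreateDataReader();
while (reader.Read())
{
    string name = reader.GetString(reader.GetOrdinal("LastName"));
}

// Conversely, Load fills a table from any data reader
DataTable copy = new DataTable();
copy.Load(employees.CreateDataReader());
```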
Serializing a DataTable to XML has been made possible in ADO.NET 2.0 through the ReadXml and WriteXml methods. WriteXml has a long list of overloads, and the contents of the DataTable can be written to a stream, file, XML writer or text writer. Data and schema can be written together or separately, and the output format can be controlled by setting the XmlWriteMode parameter. Unlike the DataSet object's markup, the root node defaults to DocumentElement. The ReadXml method reads XML into a DataTable object from a specified stream, file or reader; it takes one parameter and returns a value drawn from the XmlReadMode enumeration.
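A minimal round trip; the file name and table layout are illustrative:

```csharp
using System.Data;

DataTable employees = new DataTable("Employees");
employees.Columns.Add("LastName", typeof(string));
employees.Rows.Add("Smith");

// Persist data and schema together
employees.WriteXml("employees.xml", XmlWriteMode.WriteSchema);

// Rebuild an equivalent table from the file;
// the return value reports how the content was interpreted
DataTable restored = new DataTable();
XmlReadMode mode = restored.ReadXml("employees.xml");
```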
The XmlSerializer converts an object's public properties and fields to an XML format for storage and transport. It is a generic facility that works with classes that have no circular references or that implement the IXmlSerializable interface. The serialization process assumes importance when ASP.NET uses the XmlSerializer class to encode XML Web service messages.
The .NET Framework 2.0 has provided the DataSet and DataTable classes with a new property called RemotingFormat. The property accepts values from the SerializationFormat enumerated type, Binary or Xml, and influences how the content is serialized. The binary format prevents large chunks of data from being moved when an instance of these classes is stored in an out-of-process session or travels through .NET Remoting.
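A sketch of requesting binary serialization; the DataSet name and file name are illustrative:

```csharp
using System.Data;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

DataSet data = new DataSet("Sales");
// ... fill the DataSet ...

// Request binary serialization instead of the default XML format
data.RemotingFormat = SerializationFormat.Binary;

BinaryFormatter formatter = new BinaryFormatter();
using (FileStream fs = File.Create("sales.bin"))
{
    formatter.Serialize(fs, data);  // much smaller payload than XML
}
```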
Managing Views of Data
The DataView class is a bindable view of the data stored in a DataTable and can be used for sorting, filtering and searching data rows; if so configured, a DataView can also be used for editing data. A number of DataView objects can be attached to a single DataTable object. In ADO.NET 2.0, the ToTable method can be used to create a new DataTable from the current DataView, and the new DataTable can then be subjected to XML serialization.
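A sketch of materializing a view, assuming an existing employees DataTable; the filter, sort and column names are illustrative:

```csharp
using System.Data;

DataView view = new DataView(employees);
view.RowFilter = "Country = 'USA'";
view.Sort = "LastName ASC";

// Materialize the filtered, sorted view as a standalone table;
// this overload keeps only the named columns and drops duplicate rows
DataTable usaNames = view.ToTable("UsaEmployees", true, "LastName", "FirstName");
```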
ADO.NET 2.0 has really changed the way data is handled. Most of these features are associated with the changes made in the SQL Server .NET provider and the new functionality of SQL Server 2005. The changes in the MDAC 9.0 libraries have also induced changes in the batch processing and bulk copy objects. The DataSet and DataTable classes can now be read through data table readers and filled from a reader object, and XML serialization has become possible for the DataTable object.
However, the biggest change in ASP.NET is the introduction of the DataSource object. We will be studying this in detail in the next lesson of this series.