Create a Custom Trace Listener, Part 1

Visibility into components can be a real challenge in a web application. Often, there is functionality in a low-level component that we need to trace because it is misbehaving or throwing an error it didn't throw before. Typically, some kind of scaffolding has to be created to put output somewhere we can see it, and that work has to be repeated in every component.

A better solution would be to have a lightweight tracing process that can be incorporated into a solution easily, preferably with minimal coding. Since this is such a universal need, Microsoft has provided a great means to do exactly this through built-in functionality in .NET.

.NET includes tracing capability out of the box in every component through the Debug and Trace classes in System.Diagnostics (for example, Trace.WriteLine), which can write output into the Visual Studio IDE. This is nice for local development, but of little value when your component is running on a remote machine. Thankfully, .NET allows us to hijack that trace output and do anything we want with it.

Often in a framework, I will have many levels of components and a need to debug one small part of one of them. Using a custom trace listener, I can put tracing information into a component, recompile it, and place it in the GAC on the server in question. Immediately, I get the trace output, regardless of where in the many layers of the framework the component lives. This is a huge time saver--there is no need to develop custom functions to write or respond to the tracing output, and I have complete control over where the messages go.

Developing a custom trace listener requires the following (a minimal sketch of both pieces follows the list):

1) Building a self-contained .dll that consumes the trace messages and can be placed in the GAC
2) Changing the web.config file of the application to tell it to use the custom listener
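
To make those two pieces concrete, here is a minimal sketch. The class name, namespace, assembly name, and PublicKeyToken below are placeholders--the real listener, including its database logic, is what I will walk through next time.

Imports System.Diagnostics

Public Class DatabaseTraceListener
    Inherits TraceListener

    ' TraceListener requires Write and WriteLine to be overridden;
    ' this is where the trace messages get persisted (file, database, etc.).
    Public Overrides Sub Write(ByVal message As String)
        ' persist the message
    End Sub

    Public Overrides Sub WriteLine(ByVal message As String)
        ' persist the message
    End Sub

End Class

And in web.config, the application is pointed at the strong-named, GACed assembly:

<system.diagnostics>
  <trace autoflush="true">
    <listeners>
      <add name="databaseListener"
           type="MyTracing.DatabaseTraceListener, MyTracing, Version=1.0.0.0, Culture=neutral, PublicKeyToken=[your token]" />
    </listeners>
  </trace>
</system.diagnostics>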


I take this a step further: I output the trace messages to a database and check whether a component's tracing setting is turned on or off. This allows tracing to be toggled in an environment with no code changes, but it requires a few more steps:

3) Set up a database to receive the trace messages
4) Set up a mechanism to determine whether an application's trace setting is on or off


Next time, I will discuss how to develop a custom trace listener and allow an application to use it. Then I will discuss the steps to turn this on and off in an environment.

Data Access Using Delegates, Part 2

Before jumping into the code for the data access, I want to post a quick overview of delegates for those of you not familiar with them.

Delegates are essentially type-safe function pointers that you can pass into a method like any other parameter. The delegate points to a function that the called method can call back into--to perform work, to provide updates, and so on.

Consider this example. Suppose I have a method that runs a long process, and I have a status bar that I want to update. The method could take as a parameter a delegate that will receive status updates to display. The code could look like this:


' The delegate defines the callback signature a status handler must match.
Delegate Sub ProcessUpdates(ByVal UpdateMessage As String)

Public Sub LongRunningProcess(ByVal ProcessUpdatesDelegate As ProcessUpdates)

    ProcessUpdatesDelegate.Invoke("Starting...")
    ' do something
    ProcessUpdatesDelegate.Invoke("Step 1 Completed")
    ' do something
    ProcessUpdatesDelegate.Invoke("Step 2 Completed")
    ' do something
    ProcessUpdatesDelegate.Invoke("Process Completed")

End Sub


Next, we make a call into LongRunningProcess to kick it off, passing the function it should call back to for processing the update messages. We do that this way:


' This signature matches the ProcessUpdates delegate exactly.
Public Sub ProcessUpdates(ByVal UpdateMessage As String)
    txtStatus.Text = UpdateMessage
End Sub

' do this somewhere in the main line code:
LongRunningProcess(AddressOf ProcessUpdates)



That's all there is to it! Notice that the signature of the ProcessUpdates method has to match the delegate signature exactly. By following this straightforward process, we can make a delegate, pass it to a method, and have the called method make calls back to the delegate as needed.

This is exactly the method we will use next time in our data access layer!

Clean Up Your Session Access With Session-Backed Classes

One of my favorite techniques for refactoring ugly web code into cleaner, OO code is creating session-backed classes. Often, developers go directly after session values, setting and getting them throughout their web pages. I hate this for a number of reasons--the biggest two being (1) decentralized session access is very difficult to manage: if you need to change a variable name, or track down everywhere a variable is used, the references can be spread out all over your site; and (2) session variable names bypass compile-time checking, so if you misspell a name somewhere, your code will not behave the way you expected.

Session-backed classes free you from these problems, give you a nice quasi-OO wrapper over the session, centralize access to the variables, and even provide IntelliSense when you are developing against them. Additionally, since creating a session-backed class is a trivial process, you can spend about an hour and create a session-backed class generator to take out the grunt work.

Let's look at the fundamentals you need for a session-backed class:

1) You need a new class that represents a logical unit (a user, an employee, a sale)
2) You need a constructor that takes a reference to your session management; this is optional if you are willing to assume you can use the current HTTP context
3) You need a list of properties that will be made available by this class
4) As a best practice, use a prefix for the session keys so the values can be easily grouped and it is clear which class created them


Now, let's take a look at some session-backed class code. Here is a simple class with one property, making use of the current HTTP context for session management:


Imports System.Web

Public Class Employee

    ' The property is backed entirely by the session; no private field is needed.
    Public Property Name() As String
        Get
            Return CStr(HttpContext.Current.Session(Prefix() & "Name"))
        End Get
        Set(ByVal value As String)
            HttpContext.Current.Session(Prefix() & "Name") = value
        End Set
    End Property

    ' The prefix groups this class's session keys and shows which class created them.
    Private Function Prefix() As String
        Return "Classes.Employee."
    End Function

End Class


That's really all there is to it. Referencing the property from anywhere else in your application gives you a clean, OO-style interface into the session:


    Dim CurrentEmployee As New Employee()
    SessionValue = CurrentEmployee.Name


Meanwhile, there is no additional code behind the scenes to serialize and deserialize an object, which is perfect when the class is working with values used by legacy apps that go directly after the session (i.e. apps that you can't easily change).
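
Item 2 in the list above mentions an optional constructor that takes a reference to your session management. A minimal sketch of that variation might look like the following--the class name is illustrative, and it assumes you pass in Page.Session (an HttpSessionState) rather than relying on HttpContext.Current:

Imports System.Web.SessionState

Public Class EmployeeSessionWrapper

    Private ReadOnly _session As HttpSessionState

    ' The caller supplies the session, e.g. New EmployeeSessionWrapper(Page.Session)
    Public Sub New(ByVal session As HttpSessionState)
        _session = session
    End Sub

    Public Property Name() As String
        Get
            Return CStr(_session(Prefix() & "Name"))
        End Get
        Set(ByVal value As String)
            _session(Prefix() & "Name") = value
        End Set
    End Property

    Private Function Prefix() As String
        Return "Classes.Employee."
    End Function

End Class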

Service Unavailable In IIS 6.0

If you are not aware, IIS 6.0 has a feature called rapid-fail protection that monitors failures in your web site's application pool. If enough failures occur in a certain period of time, the app pool is shut down. The default is 5 failures in 5 minutes.

I have not seen a whole lot of errors serious enough to trigger this, i.e. enough of a problem for the app pool to shut itself down. Our dev box gets beaten up pretty well by our apps in progress, and today was the first time I noticed the app pool being down.

This is an important thing to know if you ever see a "Service Unavailable" error where your page should be served up. It is also worth remembering that the same default is probably in place on your production and test boxes, unless your admins have changed it.

You can adjust this setting, or turn rapid-fail protection off entirely, on the Health tab of the app pool's properties in IIS.

A special thanks to Elliot Swan for making my job easier today. I just discovered his tool, Postable. Thanks, Elliot!

Reflection and Data Output, Part 1

Reflection is extremely useful for outputting data dynamically. Often, we need to be able to change the column order, or even which columns display, quickly. Reflection gives us a mechanism to late-bind the columns that will be output.

Let's consider an example. Suppose you want to output a table with a few columns--a user's name, country, and ID. My data access layer is going to provide me a list of objects with properties of the same names. I could hardcode my output process to use these names. But what if I expect them to change frequently? Or what if users will be able to change them themselves? Hardcoding is not an option.

Reflection allows us to solve this problem by reading configuration information and outputting our data columns based on this data.

You will need to add the following declarations:


' Cached property metadata for the type of object being displayed
Private pInfo() As System.Reflection.PropertyInfo
' Maps each display column to an index into pInfo
Private PropertyIndexes() As Integer


Then populate the PropertyInfo array:


' Use the first object in the collection to discover the type's properties
Dim t As Type = MyObjectCollection(0).GetType()
pInfo = t.GetProperties()


Then set the indexes of the fields to display:


ReDim PropertyIndexes(TotalColumns - 1)
' Loop through the array of property names to display
For i As Integer = 0 To TotalColumns - 1
    PropertyIndexes(i) = FindPropertyName(PropertyNameToDisplay(i))
Next


Finally, for each of your objects, you will loop through the list of columns to display, using this code to pull the correct values out of the data access object:


Dim output As Object
' j indexes the current object in the collection; i is the current display column
output = pInfo(PropertyIndexes(i)).GetValue(MyObjectCollection(j), Nothing)
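
The FindPropertyName helper used above is not shown in this post. A minimal sketch of one possible implementation--matching property names case-insensitively and returning an index into pInfo--might look like this:

Private Function FindPropertyName(ByVal PropertyName As String) As Integer
    ' Return the index into pInfo of the property with the given name, or -1 if none matches
    For i As Integer = 0 To pInfo.Length - 1
        If String.Compare(pInfo(i).Name, PropertyName, True) = 0 Then
            Return i
        End If
    Next
    Return -1
End Function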


Yes, I am throwing a lot at you, and it may not all make sense right now. If you are an expert with reflection, you probably see exactly what I am doing. If not, I will dive more deeply into the details next time.

Data Access Using Delegates, Part 1

Data access layer code is almost universally the same--you get a connection string, open a connection, execute your query, turn the data into objects, and clean up the objects you created. You have probably written hundreds of components that do this.

One of my biggest annoyances with data access code is that it is usually decentralized, with each DAL opening and closing the connection, creating and destroying objects, etc. Maybe I want to go back and add in logging to each component and be able to turn it on and off. I can't really do that if I have hundreds of tiny DALs. Also, how do I know that all the developers on the project have been diligent about closing and disposing of all their objects? I don't want to have to verify this is true across everyone's components.

These exact concerns led me to a different design approach--one that centralizes the tedious, repetitive pieces into a component that can be easily managed, while still providing developers all the functionality and efficiency they need. This is not an easy task, because the simplest approaches to a data problem (DataTables, for example) are typically not the fastest.

My requirements were to design a solution that would:
1) Centralize connection string management, so connection strings existed in exactly one place
2) Centralize object creation, and closing and disposing of objects, to ensure this always occurs
3) Centralize significant events, such as opening a connection or executing a command, to allow capturing of information for troubleshooting or profiling
4) Decentralize necessary functions, such as adding parameters to a stored procedure call
5) Allow for efficient data retrieval, rather than require an expensive retrieval method to meet the above criteria


Looking at these requirements, it didn't take long to realize that delegates were the best solution. Delegates allow efficient calls to be made to functions outside the centralized code, while providing a means to keep all the important code within a main data access component.
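
To make the idea concrete, here is a minimal sketch of what such a centralized component could look like. It is not the component I will build in the coming posts--the names (DataAccess, ExecuteProcedure, the "Main" connection string) are illustrative, and it assumes SQL Server and .NET 2.0 or later:

Imports System.Configuration
Imports System.Data
Imports System.Data.SqlClient

' Callers supply these delegates for the parts that vary per query.
Public Delegate Sub AddParametersDelegate(ByVal cmd As SqlCommand)
Public Delegate Sub ProcessRowDelegate(ByVal reader As SqlDataReader)

Public Class DataAccess

    Public Shared Sub ExecuteProcedure(ByVal procedureName As String, _
            ByVal addParameters As AddParametersDelegate, _
            ByVal processRow As ProcessRowDelegate)

        ' The connection string lives in exactly one place (requirement 1)
        Dim connectionString As String = _
            ConfigurationManager.ConnectionStrings("Main").ConnectionString

        Using conn As New SqlConnection(connectionString)
            Using cmd As New SqlCommand(procedureName, conn)
                cmd.CommandType = CommandType.StoredProcedure

                ' Decentralized work: the caller adds its own parameters (requirement 4)
                addParameters.Invoke(cmd)

                ' Significant events happen here, so logging or profiling can be added centrally (requirement 3)
                conn.Open()

                ' Efficient, forward-only retrieval; the caller turns rows into objects (requirement 5)
                Using reader As SqlDataReader = cmd.ExecuteReader()
                    While reader.Read()
                        processRow.Invoke(reader)
                    End While
                End Using
            End Using
        End Using   ' Connections and commands are always closed and disposed here (requirement 2)

    End Sub

End Class

Notice how the Using blocks guarantee the closing and disposing in requirement 2 no matter what the caller's delegates do, while the delegates keep the per-query work out of the shared component.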

I will discuss delegates briefly, then jump into how you can use this approach to simplify and enhance your data access components.

Hint - Use String.Format

If you are not currently using String.Format, you owe it to yourself (and your co-workers) to start.

String.Format changes code from looking like this (in VB.NET):
Dim MyString As String = _
    "Value of " & MyVariableName & " equals " & _
    MyVariableValue & "."


To looking like this:
Dim MyString As String = _
    String.Format("Value of {0} equals {1}.", _
        MyVariableName, MyVariableValue)

Let's look at the differences. First, the String.Format version is much more readable--it is clear what the resulting string will be without having to mentally piece the concatenation together. Second, the String.Format approach is significantly less error prone--no more recompiling because you missed an ampersand, a quote, or a space.

Plus, your coworkers will thank you. Whoever needs to read or maintain your code after you will save time and be able to make changes more easily. And they may even pick up the important skill of using String.Format....

Date Validation in ASP.NET

I don't think ASP.NET uses the most intuitive validator for data type validation. Every time I need to add date validation to a form, I find myself looking up which validator to use (a CompareValidator? a RangeValidator?) and which properties to set.

So, for reference, here is what you need to do:

  • Use a CompareValidator

  • Set the ControlToValidate property to point to the date textbox

  • Set the Type property to "Date"

  • Set the Operator property to "DataTypeCheck"

That's all there is to it!
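
In markup, that configuration might look like the following (the control IDs and error message are just examples):

<asp:TextBox ID="txtStartDate" runat="server" />
<asp:CompareValidator ID="valStartDate" runat="server"
    ControlToValidate="txtStartDate"
    Type="Date"
    Operator="DataTypeCheck"
    ErrorMessage="Please enter a valid date." />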

Using Web Services for Legacy Data

Web Services are a tremendous way to encapsulate data access. If I think my data access layer may be hit from multiple types of clients, I create a web service to deliver the data. This ensures easy access, no matter who may want to consume the data.

One of my current clients has an existing classic ASP site. This site handles the majority of their sales traffic. It has been in place for some time, and with plenty of initiatives for new systems, they have no desire to rewrite it in .NET. In addition to classic ASP, their site contains a number of COM components for data access. So, my goal was to use .NET to develop data access that could be used by the classic ASP and COM components, but allow a transition to new technologies when it made sense.

Note first that threading can be a big issue when it comes to connecting legacy technologies to .NET. Per Microsoft, you should not have a COM component calling a .NET component directly. This is another way that the Web Service protects you.

Architecturally, Web Services also give you a great means of load balancing. My client, like any large company, had a good-sized web farm handling requests. By exposing the data access layer over HTTP, the server most able to handle the request will do so--even if it is not the server processing the COM component or ASP page. In other words, a server may actually hand off the data request to a more idle server, and both machines have a hand in processing the web page.

As I mentioned above, Web Services are my tool of choice for providing data to many different types of clients, so they were the perfect choice in this case. The next problem is how to make the connection between the old technologies and a web service.

Long ago, the SOAP Toolkit was the way to solve this problem. But with this technology deprecated by Microsoft, it is clearly not the solution.



So when I need to connect classic ASP or COM components to a web service, I use a direct XMLHTTP request using the MSXML component. This is a very straightforward process, and setting up the configuration information is largely a matter of copying and pasting information from the .asmx page into your program.



First, you need to set up a few functions that you will use to make the call. I put the following code in an include file for classic ASP:




' The service URL and SOAPAction, copied from the service's .asmx page
strWSUrl = "http://server/webservices/MaterialRelease/service.asmx"
strAction = "http://tempuri.org/GetAvailableForecasts"

' The SOAP envelope for the request; SupplierID is supplied by the calling page
strXml = _
    "<?xml version=""1.0"" encoding=""utf-8""?>" & _
    "<soap:Envelope xmlns:xsi=""http://www.w3.org/2001/XMLSchema-instance"" xmlns:xsd=""http://www.w3.org/2001/XMLSchema"" xmlns:soap=""http://schemas.xmlsoap.org/soap/envelope/"">" & _
    " <soap:Body>" & _
    " <GetAvailableForecasts xmlns=""http://tempuri.org/"">" & _
    " <SupplierId>" & SupplierID & "</SupplierId>" & _
    " </GetAvailableForecasts>" & _
    " </soap:Body>" & _
    "</soap:Envelope>"

result = PostWebService(strWSUrl, strAction, strXml)
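
The PostWebService function itself is not shown here. A minimal sketch of one way to write it, using the MSXML ServerXMLHTTP component (the function and variable names are just placeholders, and you may need the 3.0 ProgID depending on what is installed on the server), might look like this:

Function PostWebService(strUrl, strSoapAction, strXmlBody)
    Dim objHttp
    Set objHttp = Server.CreateObject("MSXML2.ServerXMLHTTP.6.0")

    ' ASMX services use the SOAPAction header to route the call
    objHttp.open "POST", strUrl, False
    objHttp.setRequestHeader "Content-Type", "text/xml; charset=utf-8"
    objHttp.setRequestHeader "SOAPAction", strSoapAction
    objHttp.send strXmlBody

    ' Return the raw SOAP response; the caller parses out the values it needs
    PostWebService = objHttp.responseText
    Set objHttp = Nothing
End Function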

Hide a SharePoint Field

Often, for workflows or for managing documents, it is useful to have a hidden field. I have seen a lot of tricks for doing this (programmatically adding the field as hidden, setting up a UI to edit a list's hidden attributes, even changing the JavaScript on the edit page), but is there a way to do it out of the box, without needing to program?

Fortunately, the answer is yes, and it is pretty easy. You just need to do the following:

1) Allow management of content types. This option is in the List's Advanced Settings; make sure it is set to "Yes." Making this change exposes additional configuration options in the List settings pages.

2) Find the column you want to hide on the List settings page. When you click the column, you are presented with its options, including the ability to hide it.


It's that simple! Now you can create a column for many purposes--I frequently use this feature to hold a status value that facilitates workflows. The status can even be displayed in a view if needed, without being editable on the New or Edit item pages.

Good luck out there!