Tuesday, 24 January 2012

Publish using MSBuild like VS2010

A few weeks ago I talked about automating the process of updating, compiling and deploying from source code to a running web application. I’ve made some improvements since then; the most relevant was applying the web.config transformation phase. I admit it: it’s something I had read once at a glance and almost forgot, partly because I didn’t understand it very well on the fly. But this time I really needed it, so I gave it a second review and voilà! By the way, great job on the book “Pro ASP.NET MVC 3 Framework” from Apress. All I needed was to invoke MSBuild with the appropriate arguments so that it does exactly the same as the Publish dialog in Visual Studio 2010.

After spending some time doing Google research and consequently StackOverflowing a lot, I found that almost everybody agreed the solution was to modify the .csproj or .vbproj and include something like this:

<Target Name="PublishToFileSystem" DependsOnTargets="PipelinePreDeployCopyAllFilesToOneFolder">
    <Error Condition="'$(PublishDestination)'==''"
           Text="The PublishDestination property must be set to the intended publishing destination." />
    <MakeDir Condition="!Exists($(PublishDestination))"
             Directories="$(PublishDestination)" />
    <ItemGroup>
        <PublishFiles Include="$(_PackageTempDir)\**\*.*" />
    </ItemGroup>
    <Copy SourceFiles="@(PublishFiles)"
          DestinationFiles="@(PublishFiles->'$(PublishDestination)\%(RecursiveDir)%(Filename)%(Extension)')"
          SkipUnchangedFiles="True" />
</Target>
 

Then invoke MSBuild with /t:PublishToFileSystem and pass the property /p:PublishDestination with the location where the final compiled files should go. Everything looks great? Well, there is a small problem: I don’t want to modify the .csproj file. I just want to do exactly what VS2010 does when you right-click and select Publish, as I said before. I want my script to be reusable, without having to remember to include obscure XML fragments in each project. So the quest begins.
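Just for reference, if you did go the .csproj route, the call from PowerShell would be something like this (a quick sketch; the project path and destination folder are only illustrative):

$msbuild = $env:SystemRoot + "\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe"
& $msbuild "MyWebApp.csproj" "/t:PublishToFileSystem" "/p:PublishDestination=C:\PublishOutput"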

I started to learn a little about MSBuild syntax and quickly came to the file Microsoft.WebApplication.targets, which lives very deep inside the MSBuild folder but is easily located by opening any .csproj file. So, what’s inside this odd file? It defines all the targets used by VS2010 when compiling or deploying a project. Since the new target proposed by the stackoverflow.com people depends on the PipelinePreDeployCopyAllFilesToOneFolder target, I went hunting for it and discovered vital information about the parameters that target uses to copy the files. Then came the experimentation phase, and finally the stars lined up and voilà encore une fois! The magic combination is as follows:

MSBuild receives the .csproj as a positional parameter, no change there. The target, as mentioned before, is /t:PipelinePreDeployCopyAllFilesToOneFolder, and the properties are /p:Configuration=Release;BuildInParallel=true;PackageAsSingleFile=False;AutoParameterizationWebConfigConnectionStrings=false. All this means: build in Release mode (optimal settings for a production server), take advantage of multi-core processing, do not package the result as a .zip file, and do not insert replaceable placeholders into web.config; the latter is unnecessary because I don’t want to import the result with the IIS wizard. Another important property is /p:IntermediateOutputPath=..\TempObjWeb\, specified to avoid a mysterious compilation error after a deploy; it’s related to the web.config that remains inside the project folder and can’t be parsed because it’s not inside a folder configured as a virtual directory or application in IIS. The most important property is /p:_PackageTempDir, which I found by following the trail from target to dependent target inside Microsoft.WebApplication.targets; it sets the destination path for the compiled web elements, just like the dialog in VS2010.
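Putting it all together, the invocation ends up like this (a sketch using the same property values and folder names as the script below):

$msbuild = $env:SystemRoot + "\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe"
& $msbuild "LivostWeb.csproj" `
    "/t:PipelinePreDeployCopyAllFilesToOneFolder" `
    "/p:Configuration=Release;BuildInParallel=true;PackageAsSingleFile=False;AutoParameterizationWebConfigConnectionStrings=false" `
    "/p:IntermediateOutputPath=..\TempObjWeb\" `
    "/p:_PackageTempDir=..\WebCompiled\"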

Other improvements in my script are in the clean phase: this time all the temporary folders are deleted completely before the compilation begins. Continuing with deletions, when I copy the final compilation results (produced with aspnet_compiler) I now do a smart delete that excludes folders vital to my application, using -Exclude in PowerShell. I also added some new parameters to ease maintenance. There are still too many things to do, but as I keep learning I continuously improve my toolbox. Here is the entire script:

$aspnetcompiler = $env:SystemRoot + "\Microsoft.NET\Framework\v4.0.30319\aspnet_compiler.exe"
$msbuild = $env:SystemRoot + "\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe"
$repo = "C:\Users\Administrator\Desktop\livost-repo"
$webapp = $repo + "\LivostWeb"
$webtarget = $repo + "\LivostWebPublish"
$fullcomptarget = $repo + "\WebCompiled"
$comptarget = "..\WebCompiled\"
$sitename = "IIS:\Sites\mvclivost"
$urltest = "http://localhost:8000/Home"
#$destiny = "Debug"
$destiny = "Release"

import-module .\PScommon\psake\psake.psm1
import-module .\PSCommon\WebAdministration
Set-Alias -Name ipsake -Value Invoke-psake

$old_pwd = pwd
cd $repo
hg pull
hg update

$msbuild_arg0 = $repo + "\Livost.sln"
$msbuild_arg1 = "/p:Configuration=$destiny;BuildInParallel=true"
$msbuild_arg2 = "/t:Clean"
$msbuild_arg3 = "/m"
$msbuild_arg4 = "/v:m"
$msbuild_arg5 = "/nologo"
$msbuild_arg6 = "/clp:Verbosity=minimal"
$msbuild_args = @($msbuild_arg0, $msbuild_arg1, $msbuild_arg2, $msbuild_arg3, 
$msbuild_arg4, $msbuild_arg5, $msbuild_arg6)

Write-Host "Cleaning the solution"
Write-Host "Executing $msbuild $msbuild_args"
& $msbuild $msbuild_args > out.txt

# removing temporary folders
$cleanwebtarget = $webtarget + "\*"
rm $cleanwebtarget -Force -Recurse
$cleanfullcomptarget = $fullcomptarget + "\*"
rm $cleanfullcomptarget -Force -Recurse
$cleantempobj = $repo + "\TempObjWeb\*"
rm $cleantempobj -Force -Recurse

# keep the rebuild step 
$msbuild_arg2 = "/t:Rebuild"
$msbuild_args = @($msbuild_arg0, $msbuild_arg1, $msbuild_arg2, $msbuild_arg3, 
$msbuild_arg4, $msbuild_arg5, $msbuild_arg6)

Write-Host "Rebluilding the solution"
Write-Host "Executing $msbuild $msbuild_args"
& $msbuild $msbuild_args >> out.txt


# modified here (publish instead of rebuild): this change applies the
# web.config transformation
$msbuild_arg0 = '"' + $webapp + '\LivostWeb.csproj"'
$msbuild_arg1 = "/t:PipelinePreDeployCopyAllFilesToOneFolder"
$msbuild_arg2 = "/p:Configuration=$destiny;BuildInParallel=true;PackageAsSingleFile=False;AutoParameterizationWebConfigConnectionStrings=false"
$msbuild_arg3 = "/p:IntermediateOutputPath=..\TempObjWeb\"
$msbuild_arg4 = "/p:_PackageTempDir=$comptarget"
$msbuild_arg5 = "/nologo"
$msbuild_arg6 = "/clp:Verbosity=minimal"
$msbuild_args = @($msbuild_arg0, $msbuild_arg1, $msbuild_arg2, $msbuild_arg3, 
$msbuild_arg4, $msbuild_arg5, $msbuild_arg6)

Write-Host "Building the Web Application"
Write-Host "Executing $msbuild $msbuild_args"
& $msbuild $msbuild_args >> out.txt

$anc_arg0 = "-v"
$anc_arg1 = "/"
$anc_arg2 = "-p"
$anc_arg3 = $fullcomptarget
$anc_arg4 = "-f"
$anc_arg5 = $webtarget
$anc_arg6 = "-c"

$asncargs = @($anc_arg0, $anc_arg1, $anc_arg2, $anc_arg3, $anc_arg4, $anc_arg5, 
$anc_arg6)

if (-not (Test-Path $webtarget)) {
    mkdir $webtarget > $null
}

Write-Host "Precompiling web application"
Write-Host "Executing $aspnetcompiler $asncargs"
& $aspnetcompiler $asncargs >> out.txt

$webSite = Get-Item $sitename

$poolName = $webSite.applicationPool
$pool = Get-Item "IIS:\AppPools\$poolName"
    
if ((Get-WebAppPoolState -Name $poolName).Value -ne "Stopped") {
    Write-Host "Stopping the Application Pool"
    $pool.Stop()
    Start-Sleep 3
}

Write-Host "Smart delete..."
$todelete = $webSite.physicalPath + "\*"
rm $todelete -Force -Recurse -Exclude @("Files","_temp_upload", "App_Data")

Write-Host "Copying files..."
$source = $webtarget + "\*"
cp $source $webSite.physicalPath -Force -Recurse

Write-Host "Waiting a few seconds..."
Start-Sleep 5
Write-Host "Starting the Application Pool"
$pool.Start()

& "${env:ProgramFiles(x86)}\Internet Explorer\iexplore.exe" $urltest

cd $old_pwd
 

Sunday, 22 January 2012

Good practice for ASP.NET MVC Pager


When the data to be listed on a page becomes too long, everybody agrees it’s necessary to split the long flow into several small pieces; that’s known in the IT world as paging, as you surely know from day-to-day work on web applications. Every web technology gives us some way to do this without too much complexity, and ASP.NET MVC is no exception. After a quick Google research I found, of course, since this problem is so common, tons of implementations, but the most relevant (at least in my modest opinion) was the one at http://en.webdiyer.com/. There is another important implementation at https://github.com/martijnboland/MvcPaging that also has a NuGet package to ease distribution. Almost all the implementations I found were inspired by ScottGu's PagedList idea.
I inspected many others, but the general idea was, in one way or another, the same over and over again. Finally I decided to grab the first one and start using it as it comes out of the box. It was really easy to use: just make sure your view model has an IPagedList object, then specify the controller, action, pagination options, AJAX options and custom HTML attributes.
Long story short, everything went well until I hit a scenario that needed a parameter combination the author did not foresee. Wow! I really appreciate the number of overloads the author provides, but they were not enough for my scenario, and that’s the Open Source magic: I downloaded the project and inspected it in more detail. It really covered the most common ways to build the pager, but I needed to specify controller, action, route data and AJAX options at the same time. After all, I don’t think this is such an uncommon scenario: I had a filter on my list, and of course when changing the current page the filter information must be kept between round trips to the server.
The solution was actually simple: just add another overload to the Helper class and done. But I wanted to dig deeper into the project structure and implementation, and I discovered something I didn’t like too much (and it’s something I’ve seen in almost every project in my research): the logic and the view were strongly coupled. I am not a software philosopher, but I like good practices, and life has shown me they are important. When I say strongly coupled I mean the same class is responsible for these two different tasks and concerns.
The relation between the classes is like this:
+-------------+    Render Pager    +--------------+
| PagerHelper |------------------->| PagerBuilder |
+-------------+                    +--------------+

The idea was to split that big class into two classes, PagerModel and PagerView (obvious naming?). PagerModel is responsible for all the logic involved, such as the tedious calculations about page numbers, the number of pages, model integrity validations, etc., using as transport a class PagerItem, which holds data about each item to be shown in the pager but NOT how it is to be shown; the model is only responsible for returning a valid list of PagerItem. This separation of concerns is not just to comply with the “holy bible” of good design practices; I did it because I foresaw it was going to be useful in the near future.
My refactoring looks like this:
+-------------+ Render Pager   +-----------+  Get Model +------------+
| PagerHelper |--------------->| PagerView |----------->| PagerModel |
+-------------+                +-----------+            +------------+

That future moment arrived a few days ago, when I had to implement the same pager but this time with different markup: plain HTML links and no AJAX, despite the fact I’m using the unobtrusive scripts from the default template. With this new distribution it was easy for me to create a new type of view and, why not, make a little name change: PagerView became AjaxPagerView, and the new view is named HtmlPagerView. It looks this way:
            Render         +---------------+  Get Model             
       +------------------>| AjaxPagerView +-----------------+        
       |                   +---------------+                 V        
+------+------+                                         +------------+
| PagerHelper |                                         | PagerModel |
+------+------+                                         +------------+
       |                   +---------------+                 ^        
       +------------------>| HtmlPagerView +-----------------+        
            Render         +---------------+   Get Model             

This is my vision of reusable design: in this case, instead of writing a bunch of if-else or switch-case statements, I apply a mixture of the Strategy and Decorator design patterns, where each kind of view is a different “strategy” of rendering that “decorates” the model. This allows me to implement many other types of view in the future without affecting the core functionality and the actual logic: the model. The model class processes the input options, makes all the proper decisions and returns a data structure, in most cases called a view model, from which the view engine reads the data it needs. The view engine may depend in some way on the model element, but the model can NEVER depend on the view engine.
I hope this little refactoring reflection has made you think a bit more about real code reuse and the importance of using good practices and design patterns, not just to follow magic guidelines. I’d also like to stress the consequences of developing software without keeping these techniques in sight.
By the way, what do you think of my ASCII-UML? Not bad, is it?

Thursday, 12 January 2012

Extending Unobtrusive AJAX behavior

One of the nice features that comes with ASP.NET MVC 3.0 is, without a doubt, the addition of the unobtrusive scripts for AJAX and validation. In this case I’ll focus on the AJAX mechanism of this technique. It lets you write the least possible JavaScript and still achieve partial updates in a web page, and it has a big plus: if JavaScript is disabled in the browser (or anything else prevents js execution), no problem, everything keeps working, just without partial updates.
There are many scenarios where this is good enough and that’s all: write the server-side code in the controller and the appropriate HTML helpers using @Ajax, setting the target id of the element to update after the remote call, maybe a “loading” element, or even a confirm dialog to prevent unwanted posts such as deleting elements. The classic example is the typical administration backend of any website, where we have a list with a pager element at the bottom (usually a table) and, on another page, a form with the details for editing the values.
Now, if you wish to do something a little different from the previous example, the trouble starts. Imagine this simple scenario: a form with several fields ready to accomplish some task; if the operation succeeds, a partial update is performed below the form with more data to be entered, and if it doesn’t succeed, another partial update is performed above the form showing the server errors. After this somewhat messy brief, the point is: there is more than one target id to update, and the decision is taken only on the server; we can’t specify the target id just by declaring the HTML helper.
My solution for this issue: conventions to the rescue! I’ve made my own convention in this case. A script receives all AJAX responses, inspects the content looking for the concrete target id, and then puts the content inside the correct target element. In detail: every partial view must have a hidden field with id = partialviewid whose value is the target id to be updated on the client. This solves the issue because we can have n partial views, each one referring to a different target, so in the controller it’s possible to do any sort of if-else logic and return the appropriate partial view in each case.
Let’s talk about the script that parses the content. It consists of a function residing in an ordinary .js file, included by the layout page after all the unobtrusive scripts. It looks like this:
function receiveAjaxData(data) {

    var tempdiv = $("div[id=temporary-div-ajax]");
    if (!tempdiv.length) {
        $(document.body).append("<div id='temporary-div-ajax' style='display:none'></div>");
        tempdiv = $("div[id=temporary-div-ajax]");
    }

Here I ensure that temporary-div-ajax exists and is a child of body; this is where the content arriving from the server is placed.
    tempdiv.html(data.responseText);
    var hidden = $("input[type=hidden][id=partialviewid]", tempdiv);
    var loadfunc = hidden.attr("data-function-load");

Then I find the hidden field mentioned above and, if a function to execute was defined, it is grabbed here too.
    if (hidden.length) {
        // Remove the hidden after to have taken its value
        var pvid = hidden.val();
        hidden.remove();

        // Place the content into the real target
        var destiny = $("#" + pvid);
        destiny.empty();
        destiny.append(tempdiv.children());
      
        // Re-parse the destiny node, in order to the validator work properly
        $.validator.unobtrusive.parse(destiny);

        // if function exists then evalute it
        if (loadfunc) {
            eval(loadfunc + "();");
        }
        return false;
    }
    return true;
}

If the hidden field exists, the content is placed at its real destination. As you may have noticed, there is also the possibility of specifying the name of a function to execute once the partial view is placed into the target, which is useful for running view-specific startup logic. One important thing here is the return value: if the result is true, the jquery.unobtrusive engine continues as usual; if false, it stops and no further internal steps are executed.
How to use it? This question is becoming usual, eh?
Include this script after all the jquery.unobtrusive scripts, preferably in the layout page.
In every partial view include:
<input id="partialviewid" type="hidden" name="partialviewid" value="target-id" />

In every Ajax action link or form, specify the OnComplete callback in the AjaxOptions parameter:
@Ajax.ActionLink("Create", "Create", null,
    new AjaxOptions { OnComplete = "receiveAjaxData"}
)

I hope this simple example is useful in your day-to-day work without too much headache. Till the next!

Saturday, 7 January 2012

Automate source code to running web process on testing server

For those who develop web applications and share code with other developers, the process of keeping the test server up to date can be a little tedious and boring, even if you have the server one RDP click away. Well, as an aside, I do have it, but with a slight difference in bandwidth (I use a dial-up connection @ 56Kbps and the RDP window is 800x600), and believe me, this can be annoying.
The process consists of the following steps. When we (the developers) want to make a test in the “production” scenario, the first step is to commit all changes to the repository. I am using Mercurial as VCS, surprise?! I’ve set it up as an IIS website following the guide at http://stackingcode.com/blog/2011/02/24/running-a-mercurial-server-on-iis-7-5-windows-server-2008-r2, which helped me a lot; I recommend it to the HG fans. Let’s continue: after committing all changes to the repository on the server, we have to pull and update the local copy there on the server. Then clean and rebuild the solution, and do the real deploy with aspnet_compiler to get a folder with the content fully compiled according to MSDN. The deploy continues in IIS: go to Application Pools, stop the selected pool, copy the files to the destination folder and finally start the pool again. Optionally, you can open a web browser with the URL to verify everything is OK.
As you can see, there are several steps involved, and several clicks as well (remember, for me every click on the server is like a needle pricking my arm). How to solve it? There are two requirements for the solution I propose: the hg executable must be in the Windows PATH, and the other thing I did (I don’t know if it’s really necessary, but it worked for me) was to copy the content of the “C:\Program Files (x86)\MSBuild” folder from a machine with Visual Studio installed to the server (there was something about a Microsoft .targets file not being found). I admit I didn’t dig into that problem; if someone understands why it happens, comments are welcome.
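By the way, a quick way to verify the first requirement from PowerShell could be something like this (just a sketch, not part of the script below):

# fail early if Mercurial is not reachable from the PATH
if (-not (Get-Command hg -ErrorAction SilentlyContinue)) {
    Write-Error "hg.exe is not in the PATH"
}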
After a quick analysis of the ways to do it, I decided to use PowerShell as the script engine, due to its flexibility and high level of integration with .NET and any Windows component. This is how my script is built:
$aspnetcompiler = $env:SystemRoot + "\Microsoft.NET\Framework\v4.0.30319\aspnet_compiler.exe"
$msbuild = $env:SystemRoot + "\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe"
$repo = "C:\Users\Administrator\Desktop\livost-repo"
$webapp = $repo + "\LivostWeb"
$webtarget = $repo + "\LivostWebPublish"
$sitename = "IIS:\Sites\Default Web Site\mvclivost"
$urltest = "http://localhost/mvclivost/Home"

These are some variables that may change between servers and projects, such as the MSBuild and aspnet_compiler locations, the repository folder, the web and target folders, and the site name.
import-module .\PScommon\psake\psake.psm1
import-module .\PSCommon\WebAdministration
Set-Alias -Name ipsake -Value Invoke-psake

These imports are taken from this guide, which explains how to interact with IIS from PowerShell: http://www.yangq.org/2011/04/09/automate-asp-net-deployment-with-powershell-install-and-update
$old_pwd = pwd
cd $repo
hg pull
hg update

Here I save the original working directory and move to the local repository directory, then pull and update; really easy.
$msbuild_arg0 = $repo + "\Livost.sln"
$msbuild_arg1 = "/p:Configuration=Release;BuildInParallel=true"
$msbuild_arg2 = "/t:Clean"
$msbuild_arg3 = "/m"
$msbuild_arg4 = "/v:m"
$msbuild_arg5 = "/nologo"
$msbuild_arg6 = "/clp:PerformanceSummary;Verbosity=minimal"
$msbuild_args = @($msbuild_arg0, $msbuild_arg1, $msbuild_arg2, $msbuild_arg3, $msbuild_arg4, $msbuild_arg5, $msbuild_arg6)

Write-Host "Cleaning the solution"
Write-Host "Executing $msbuild $msbuild_args"
& $msbuild $msbuild_args > out.txt

Here I set up the parameters for the clean, writing to the console what’s going on and dumping the waterfall of MSBuild details to a file.
$msbuild_arg2 = "/t:Rebuild"
$msbuild_args = @($msbuild_arg0, $msbuild_arg1, $msbuild_arg2, $msbuild_arg3, $msbuild_arg4, $msbuild_arg5, $msbuild_arg6)

Write-Host "Rebluilding the solution"
Write-Host "Executing $msbuild $msbuild_args"
& $msbuild $msbuild_args >> out.txt

The same again, to rebuild the solution after having cleaned up the old compiled files.
$anc_arg0 = "-v"
$anc_arg1 = "/"
$anc_arg2 = "-p"
$anc_arg3 = $webapp
$anc_arg4 = "-f"
$anc_arg5 = $webtarget
$anc_arg6 = "-c"

$asncargs = @($anc_arg0, $anc_arg1, $anc_arg2, $anc_arg3, $anc_arg4, $anc_arg5, $anc_arg6)

if (-not (Test-Path $webtarget)) {
    mkdir $webtarget > $null
}

Write-Host "Precompiling web application"
Write-Host "Executing $aspnetcompiler $asncargs"
& $aspnetcompiler $asncargs >> out.txt

Setting up the parameters for aspnet_compiler was the hardest part for me; there are so many argument combinations to try! But I finally got it. I also make sure the target folder exists before proceeding with the precompilation.
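For clarity, the equivalent single-line call would be something like this (a sketch using the variables defined above):

& $aspnetcompiler -v / -p $webapp -f $webtarget -c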
$webSite = Get-Item $sitename

$poolName = $webSite.applicationPool
$pool = Get-Item "IIS:\AppPools\$poolName"
   
if ((Get-WebAppPoolState -Name $poolName).Value -ne "Stopped") {
    Write-Host "Stopping the Application Pool"
    $pool.Stop()
}

Write-Host "Copying files..."
$source = $webtarget + "\*"
cp $source $webSite.physicalPath -Force -Recurse

Write-Host "Waiting a few seconds..."
Start-Sleep 5
Write-Host "Starting the Application Pool"
$pool.Start()

As I mentioned before, this part is taken from that fellow’s blog. Here is where I interact with IIS: first I grab the AppPool object by its name and, if it isn’t stopped, I stop it. After that, I copy everything to the target physicalPath and wait about 5 seconds. Why? Sometimes these steps execute too fast and the services remain in an “irregular state”, such as starting up or stopping, and in that state the service doesn’t accept control messages such as start or stop; at least that’s what Microsoft says in the documentation for the error I sometimes get if I don’t wait those few seconds.
& "${env:ProgramFiles(x86)}\Internet Explorer\iexplore.exe" $urltest

cd $old_pwd

And finally I start a new instance of Internet Explorer to verify everything is OK, so ASP.NET does its first (and long) runtime compilation with me instead of some other unlucky guy.
Now, how to use it. Script execution must be allowed on the server; this can be done from a PowerShell console run as administrator by typing ‘Set-ExecutionPolicy RemoteSigned’. In the example I’ve put a run.cmd with the text ‘powershell -command ./automate.ps1’, which tells PowerShell to execute the script. Important: this script must be run with administrator privileges because of its interaction with IIS. You may have to modify the content of run.cmd and, instead of just ./automate.ps1, first cd to the folder where the script lives and then execute it as before. This is needed because when you run a cmd as administrator the current directory is changed automatically (and mysteriously) to C:\Windows\system32. I hope this post is useful to you, till the next!
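Such a run.cmd could look like this (a sketch; %~dp0 is the cmd way to get the folder containing the .cmd file itself):

@echo off
rem change to the folder where automate.ps1 lives before running it
cd /d "%~dp0"
powershell -command .\automate.ps1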