Updating Visual Studio Extensions

Today I wanted to do some overdue updating of my Visual Studio extensions. Among others, there was an update of NuGet from version 1.6 to 1.7 and of Visual Studio Achievements from version 1.6 to 2.0.

Unfortunately, the updates didn’t work as expected. For Visual Studio Achievements I got an error that the digital signature did not match and that the update therefore could not be installed. For NuGet the message was different, but the result was the same: the update could not be installed.

Some quick Google research revealed that for NuGet, problems during updates are somewhat expected. The recommended solution is to uninstall NuGet and then install the new version from scratch.

However, I found a hotfix for Visual Studio that is supposed to resolve the signature-mismatch problem I encountered with Visual Studio Achievements. This hotfix actually resolved the problem with the NuGet update as well 🙂

Populating SharePoint discussion boards using code

I was recently tasked with importing a legacy discussion board into SharePoint 2010. So I wrote a small application that dumps the old discussions to XML and then imports them into SharePoint 2010 using the SharePoint object model. Nothing could be easier, right? After all, a discussion board is just some kind of list.

Well, kinda. If you look closely, you will notice that discussion boards differ from regular lists in some details. For one, a discussion board has threads and replies. The replies can be shown threaded, so I have to keep track of which post is a reply to which other post.

SharePoint does some strange stuff to manage this. Each discussion thread is a folder. Knowing that, one might expect to find all replies inside this folder, which sounds totally reasonable to me. But that’s not the SharePoint way. Instead, all replies are siblings; the folders are just SharePoint’s way to distinguish threads from replies. I don’t really know how SharePoint manages the reply hierarchy internally, but it does work – somehow.

After these discoveries, let’s write some code. I assume the new discussion board already exists, so all that is left to do is import the existing posts from the old forum. First I create a new thread like this:

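// CheckUser and FormatDate are small helpers of the import tool (not shown):
// CheckUser maps the legacy creator to a SharePoint user field value,
// FormatDate converts the legacy timestamp into a SharePoint-compatible value.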
var newThread = SPUtility.CreateNewDiscussion(discussionBoard.Items, oldPost.Title);
newThread[SPBuiltInFieldId.Body] = oldPost.Text;
newThread[SPBuiltInFieldId.Created] = FormatDate(oldPost.Date);
string author = CheckUser(oldPost.Creator, web);
newThread[SPBuiltInFieldId.Author] = author;
newThread[SPBuiltInFieldId.Editor] = author;
newThread.Update();

Then I check the oldPost for replies and append them to my newly created thread.

foreach (var oldReply in oldPost.Replies)
{
    var newReply = SPUtility.CreateNewDiscussionReply(newThread);
    string replyAuthor = CheckUser(oldReply.Creator, web);
    newReply[SPBuiltInFieldId.Body] = oldReply.Text;
    newReply[SPBuiltInFieldId.Created] = FormatDate(oldReply.Date);
    newReply[SPBuiltInFieldId.Author] = replyAuthor;
    newReply[SPBuiltInFieldId.Editor] = replyAuthor;
    newReply[SPBuiltInFieldId.Title] = oldReply.Title;
    newReply.Update();
    newThread.Update();
}

The replies can be created recursively for replies to replies as well.
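A minimal sketch of such a recursive helper, assuming the legacy posts are represented by an OldPost type whose replies can themselves carry replies (the helper itself is hypothetical, the field assignments mirror the snippets above):

// Hypothetical helper: imports all replies of a post and recurses into nested replies.
private void ImportReplies(SPListItem parent, OldPost oldPost, SPWeb web)
{
    foreach (var oldReply in oldPost.Replies)
    {
        var newReply = SPUtility.CreateNewDiscussionReply(parent);
        newReply[SPBuiltInFieldId.Body] = oldReply.Text;
        newReply[SPBuiltInFieldId.Created] = FormatDate(oldReply.Date);
        newReply[SPBuiltInFieldId.Title] = oldReply.Title;
        string replyAuthor = CheckUser(oldReply.Creator, web);
        newReply[SPBuiltInFieldId.Author] = replyAuthor;
        newReply[SPBuiltInFieldId.Editor] = replyAuthor;
        newReply.Update();
        ImportReplies(newReply, oldReply, web); // replies to replies
    }
}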

This is actually all that is needed. With a little magic from the SPUtility class, I was able to import a couple of thousand messages in no time.

Localizing SpecFlow

At a recent user group meeting I was introduced to SpecFlow. This opened up a whole new world of formulating tests and specifications. Although I had been trying to formulate my tests in a BDD manner, inspired by JP Boodhoo’s and Stefan Lieser’s BDD examples, this feels much better.

So the next logical step was to move to natural German specifications instead of the original given-when-then syntax.

It turns out that switching the language is actually really easy, even though I couldn’t find anything about it on the web. You just have to adjust the app.config like this:

<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <configSections>
    <section name="specFlow"
      type="TechTalk.SpecFlow.Configuration.ConfigurationSectionHandler,
            TechTalk.SpecFlow"/>
  </configSections>
  <specFlow>
    <language feature="de-DE" tool="" />
  </specFlow>
</configuration>

And that’s all there is to it.
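With the language set to de-DE, the feature files can then be written using the German Gherkin keywords (Funktionalität, Szenario, Angenommen, Wenn, Dann, Und). A small made-up example:

Funktionalität: Taschenrechner
Szenario: Zwei Zahlen addieren
    Angenommen ich habe die Zahl 1 eingegeben
    Und ich habe die Zahl 2 eingegeben
    Wenn ich auf Addieren drücke
    Dann sollte das Ergebnis 3 sein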

Parameterized queries in MySQL

To do SQL right in the .NET world, you just don’t concatenate a search term into a static SQL string, because this opens the gates wide to SQL injection. So the following should not be used:

MySqlConnection connection = new MySqlConnection(_connectionString);
MySqlCommand command = connection.CreateCommand();
command.CommandText = "SELECT * FROM Forum where name = '" + forumName + "'";

Instead you should use a parameterized query. Easy, you might say: just add a placeholder to the SQL statement and off you go.

MySqlConnection connection = new MySqlConnection(_connectionString);
MySqlCommand command = connection.CreateCommand();
MySqlParameter forumNameParameter = new MySqlParameter("@forumName", forumName);
command.Parameters.Add(forumNameParameter);
command.CommandText = "SELECT * FROM Forum where name = @forumName";

Unfortunately, this doesn’t work. At least I didn’t get any results, even though my search term did exist in the database.

Looking at the actual SQL that was executed on the server, something became obvious:

SELECT * FROM Forum where name = @forumName

That’s not the SQL I was expecting. Somehow the parameter was not being substituted with the actual value. But why?

A quick cross-check: the same code does work against an MS SQL database!

Solution

@ is not a valid placeholder character in MySQL. Instead, ? should be used. The correct code looks like this:

MySqlConnection connection = new MySqlConnection(_connectionString);
MySqlCommand command = connection.CreateCommand();
MySqlParameter forumNameParameter = new MySqlParameter("?forumName", forumName);
command.Parameters.Add(forumNameParameter);
command.CommandText = "SELECT * FROM Forum where name = ?forumName";

So this finally worked.
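For completeness, here is how the whole thing could be wrapped up with proper disposal and execution – a sketch assuming the Forum table from above and a _connectionString field:

using MySql.Data.MySqlClient;

using (var connection = new MySqlConnection(_connectionString))
using (var command = connection.CreateCommand())
{
    command.CommandText = "SELECT * FROM Forum WHERE name = ?forumName";
    command.Parameters.Add(new MySqlParameter("?forumName", forumName));
    connection.Open();
    using (var reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            // process each matching row
            Console.WriteLine(reader["name"]);
        }
    }
}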

Mass-Updating Active Directory

I just love PowerShell! Although I’m not really fluent in the syntax yet, I find myself doing little things in PowerShell more and more often.

Today I noticed that in my previous task of creating 150 sample accounts I had missed out on the email address. So I just wrote a simple line of PowerShell. First off, I went to the OU by navigating to the AD provider with cd AD: and then changing to cd OU=Test,DC=demo,DC=local. That’s already cool. Then a simple line like

dir | foreach { $x = Get-AdUser $_; $y=$x.samAccountName; Set-Aduser -identity $x -emailaddress "$y@demo.local"; }
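The same could be written a little more compactly without the intermediate variables (a sketch, assuming the same OU):

Get-ADUser -Filter * -SearchBase "OU=Test,DC=demo,DC=local" |
    ForEach-Object { Set-ADUser $_ -EmailAddress "$($_.SamAccountName)@demo.local" }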

And since I was on a roll, I also reset the passwords for all users:

Get-ADUser -Filter 'Name -like "*"' -SearchBase "OU=Acme,DC=demo,DC=local" | Set-ADAccountPassword -Reset -NewPassword (ConvertTo-SecureString -AsPlainText "demo" -Force)

Sample Domain Data

Sometimes you just need a decent amount of sample data. Recently I set up an Active Directory to do some SharePoint 2010 development. To have a realistic baseline, I needed a decent number of fictitious users.

So instead of creating a ton of users named User1 through User150, I thought of something more elaborate: why not generate random user accounts?

After some Google research I found a blog post with lists of first and last names as CSV files. Fantastic! That looked like a promising starting point. To spice things up a notch, I also created lists of departments and roles. Together this gives quite a batch of user data.

To mix everything together, I created a little piece of PowerShell:

# Import list of Last names from Simple Text file
$lastname=import-csv '.\lastname.csv'
# Import list of First names Simple Text file
$firstname=import-csv '.\firstname.csv'
# Import list of roles, prefixes and departments
$roles=import-csv '.\role.csv'
$prefixs=import-csv '.\prefix.csv'
$departments=import-csv '.\department.csv'
# How many names to generate
$totalnames=150
# the Header for our new CSV file
$firstline='Firstname,Lastname,Position,Department,Phone'
# Create a file called "DomainUsers.csv"
Set-content -path 'DomainUsers.csv' -value $firstline
$firstnamecount=$firstname.Count
$lastnamecount=$lastname.Count
$rolecount=$roles.Count
$prefixcount=$prefixs.Count
$departmentcount=$departments.Count
# Go through and Generate some names
foreach ( $namecounter in 1..$totalnames )
{
    # Pick a random entry from each list (Get-Random -max is exclusive,
    # so using the full count yields valid zero-based indices)
    $lastnamenumber=(get-random -min 0 -max $lastnamecount)
    $firstnamenumber=(get-random -min 0 -max $firstnamecount)
    $rolenumber=(get-random -min 0 -max $rolecount)
    $prefixnumber=(get-random -min 0 -max $prefixcount)
    $departmentnumber=(get-random -min 0 -max $departmentcount)
    $FakeName=($firstname[$firstnamenumber].Firstname+','+$lastname[$lastnamenumber].Lastname)+','+
        ($prefixs[$prefixnumber].Prefix+' '+$departments[$departmentnumber].Department+' '+$roles[$rolenumber].Role).Trim()+','+
        $departments[$departmentnumber].Department+','+
        '555-'+(get-random -min 100 -max 999)+'-'+(get-random -min 1000 -max 9999)
    # Echo the New name to the Screen
    write-host $fakename
    # and write to the File
    add-content -path 'DomainUsers.csv' -value $fakename
}

I think the script doesn’t need much further explanation. The result is a CSV file with a bunch of random user account data.

The import into Active Directory is done in a second PowerShell script (just because I already had that one lying around).

param([string]$FileName, [string]$adpath)
Import-Module ActiveDirectory
function Import-Users([string]$UserFile)
{
    Import-Csv $UserFile | ForEach-Object {
        $accountName = $_.Lastname + $_.Firstname.Substring(0,2)
        $displayName = $_.Firstname + " " + $_.Lastname
        New-ADUser $accountName -SamAccountName $accountName `
            -Company "Acme Corp." -Department $_.Department `
            -DisplayName $displayName -GivenName $_.Firstname -Surname $_.Lastname `
            -OfficePhone $_.Phone -Title $_.Position `
            -CannotChangePassword $true -PasswordNeverExpires $true -Enabled $true `
            -AccountPassword (ConvertTo-SecureString -AsPlainText "demo" -Force) `
            -Path $spou
    }
}
$spou = "OU=Acme,$adpath"
Import-Users $FileName

This one is rather boring. To start the script, you supply the name of the CSV file containing the user data and the path to your domain in the form of “dc=demo,dc=local”. The snippet assumes that there is an OU called Acme where all the user accounts should be placed.
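An invocation could look like this (assuming the script was saved as ImportUsers.ps1):

.\ImportUsers.ps1 DomainUsers.csv "dc=demo,dc=local"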


Full Screen Mode in VMware

I just set up a new VM with Windows Server 2008 R2 to do some SharePoint 2010 development. To really get going, I switched the VM to full-screen mode to gain more screen real estate.

But that just gave me a blank, black screen. Resizing the window worked perfectly fine, but full screen did not. Even manually triggering the “fit client” command didn’t do any good.

The solution was trivial (as usual with this kind of error): you have to enable 3D acceleration in the VM settings. What a bummer.

Logging from multiple processes

When logging to a file with log4net (using the FileAppender), the FileAppender holds an exclusive lock on the file. This doesn’t cause any problems, not even when the application runs with multiple threads, because log4net is thread-safe.

This changes, however, when working with multiple processes that all share a common log4net configuration and thus all use the same FileAppender. In this case no single process may hold an exclusive lock. Fortunately, log4net has an appropriate configuration setting:

<lockingModel type="log4net.Appender.FileAppender+MinimalLock" />
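In context, a complete FileAppender configured this way could look roughly like this (file name and layout are just examples):

<appender name="LogFileAppender" type="log4net.Appender.FileAppender">
  <file value="application.log" />
  <appendToFile value="true" />
  <!-- MinimalLock acquires the file lock only for the duration of each write -->
  <lockingModel type="log4net.Appender.FileAppender+MinimalLock" />
  <layout type="log4net.Layout.PatternLayout">
    <conversionPattern value="%date [%thread] %-5level %logger - %message%newline" />
  </layout>
</appender>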

But constantly acquiring and releasing locks is quite costly and will slow down the overall performance of the application.

An alternative is to create an individual log file per process. Luckily, log4net supports variable expansion when generating log file names, so we can add the process id to the file name pattern:

<appender name="LogFileAppender" type="log4net.Appender.RollingFileAppender,log4net">
    <file type="log4net.Util.PatternString" value="Log[%processid]" />
[...]
</appender>

Exponentiation in C#

Computing 10 to the power of 2 shouldn’t be any problem:

int result = 10 ^ 2;

As if! The compiler firmly believes the result of this computation is 8. Damn! In C#, ^ is the operator for bitwise exclusive or (XOR): 10 is 1010 in binary and 2 is 0010, and 1010 XOR 0010 = 1000, which is 8. The correct way to compute this is

int result = (int)Math.Pow(10, 2); // Math.Pow returns a double, hence the cast
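As an aside, Math.Pow works on doubles. To stay in pure integer arithmetic, a small helper like this would do as well (my own sketch, not a framework method):

// Exponentiation by squaring for non-negative integer exponents
static int IntPow(int baseValue, int exponent)
{
    int result = 1;
    while (exponent > 0)
    {
        if ((exponent & 1) == 1)   // odd exponent: multiply in the current base
            result *= baseValue;
        baseValue *= baseValue;    // square the base
        exponent >>= 1;            // halve the exponent
    }
    return result;
}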

Logging in NHibernate 3.0

With the release of NHibernate 3.0, the way NHibernate handles logging has changed. Up to 2.1.2, NHibernate used log4net exclusively; the usage of log4net was tied directly into each NHibernate class.

Now this has been decoupled: NHibernate 3.0 introduces the LoggerProvider. So instead of

private static readonly ILog log = LogManager.GetLogger(typeof (Loader));

a new logger is created using

private static readonly IInternalLogger log = LoggerProvider.LoggerFor(typeof (Loader));

Even though this might not seem like a big deal, there is more to the LoggerProvider. It currently only supports log4net (well, and a NoLoggingLoggerFactory). To determine whether log4net is available, the LoggerProvider looks in the relative search path or base directory of the current AppDomain like this:

// look for log4net.dll
string baseDir = AppDomain.CurrentDomain.BaseDirectory;
string relativeSearchPath = AppDomain.CurrentDomain.RelativeSearchPath;
string binPath = relativeSearchPath == null ? baseDir : Path.Combine(baseDir, relativeSearchPath);
var log4NetDllPath = Path.Combine(binPath, "log4net.dll");
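The decision based on that path presumably boils down to a simple file-existence check, roughly like this (a sketch of the idea, not the verbatim NHibernate source; the factory name is assumed):

// If log4net.dll is found next to the application, use the log4net-backed
// factory; otherwise fall back to the NoLoggingLoggerFactory.
if (File.Exists(log4NetDllPath))
    return new Log4NetLoggerFactory();   // hypothetical factory name
return new NoLoggingLoggerFactory();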

What got me started was the fact that my log4net configuration wasn’t producing any log output, and I was extremely puzzled as to why. I checked my config a dozen times without finding any error. It had worked perfectly fine with NHibernate 2.1.2.

After looking at the LoggerProvider and the way logging is initialized, it struck me: my log4net assembly was located in the GAC – so the file lookup for log4net.dll could not detect it! Setting the log4net reference’s Copy Local property to true resolved the issue.