Wednesday, November 18, 2009

Visual Studio Team System & Shelve Sets

Another often overlooked (or at least underappreciated) feature of Visual Studio Team System is shelve sets. A shelve set is a way to store your pending changes on the server without actually checking them in, so you can back up or hand off partially finished work without affecting anyone else.

Using Shelve Sets is really simple. Go to the Pending Changes window, select the files you would like to ‘shelve’, enter a description in the Comments area, optionally select any work items you want to associate the Shelve Set with, and click the Shelve button in the Pending Changes toolbar.

A new window will appear asking you to give the shelve set a name, choose whether you want to preserve your changes locally (clearing this option will undo the changes to the selected files on your machine), and choose whether you want any check-in policies evaluated.

Once the shelve set is created, a copy of all the changes you selected is stored on the server. And like a traditional bookshelf, the server can hold multiple shelve sets at once.

Now that your changes are on the server, anyone else can search for and pull down these changes to their workspace.
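Everything above can also be done from the command line with tf.exe if that is more your style. Here is a rough sketch (the shelve set name, comment, and user name are just placeholders):

rem Shelve all of your current pending changes under the name "MyFeature"
tf shelve MyFeature /comment:"Half-finished feature work"

rem Pull a teammate's shelve set into your own workspace
tf unshelve MyFeature;DOMAIN\SomeOtherUser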

Here are a few of the common shelve set uses:

  • Performing a check-in policy evaluation (such as Code Analysis, StyleCop, etc.) without actually committing the code. This is a nice sanity check now and then while you are writing your code instead of waiting until the end.
  • Isolating changes for different work items. We’ve all been working on a new feature when we get an ‘urgent’ bug that requires us to stop what we are doing and fix it. Create a shelve set for your current changes (clearing the ‘preserve pending changes locally’ checkbox) and you now have a completely clean workspace for working on the ‘urgent’ bug.
  • Code reviews. Create a shelve set of your code for another team member to pull down and review on their system. Working with and exercising the code on their system allows for a much better code review than simply looking at a print out of the code or the end result.
  • Code handoffs. Heading out on vacation for a while or assigned to another team? Create a shelve set of your current progress and someone else can pull it down and continue working on it.

Best Practice

The best use that I’ve found for shelve sets is to back up your current progress before you go home every night (I call this shelve set WIP, for Work In Progress). A lot of things can happen between the time you leave work at night and come back in the morning: a leaky roof, a power surge, a failed hard drive, the latest virus outbreak, etc. If you’ve created a shelve set on the server before you went home, you’re covered; you can use another machine to pull down exactly what you were working on. Just be sure to clear the ‘preserve pending changes locally’ checkbox so that any files you have locked are released, allowing you to check them out on another machine without assistance from a TFS administrator.
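If you want to make the nightly WIP backup a one-liner (or even a scheduled task), something like the following should work from a Visual Studio command prompt, assuming I’m remembering the tf.exe switches correctly:

rem Overwrite last night's WIP shelve set and remove the changes from the local
rem workspace, which also releases any locks so another machine can check the files out
tf shelve WIP /replace /move /comment:"Nightly work-in-progress backup"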

Thursday, November 05, 2009

Visual Studio Team System and Cloak

When I first started using Visual Studio with TFS for a new position, I started reading the Team Development with Visual Studio Team Foundation Server guide since I had never used it before. I got about halfway through it before I started the new job and became so busy I no longer had time to finish reading it. Had I kept reading, I would have found a section that talked about the “cloak” menu item when viewing your source in the Source Control Explorer.

Say you have a root-level folder in Source Control with the following structure…

  • ProjectA
    • MainLine
    • Branches
      • v1.0
      • v1.0-SP1
      • v2.0

Rather than create a separate workspace mapping for each version, you would create a single workspace mapping at the top level (the ProjectA node). This makes it easy to keep everything in sync: just perform a Get Latest operation on the ProjectA node and you will instantly have the latest for everything.

There are a couple of problems with this approach. First, what if you are currently only working on the MainLine version? Do you really want the previous three versions stored on your disk? If the project is small this might not add up to much, but if it’s a major project it can easily consume tens of gigabytes of space.

The second problem comes into play when performing a Get Latest operation. With everything mapped from the root-level node, TFS needs to compare your entire workspace against the server in order to determine what has changed and what needs to be updated. If you are currently working only on the MainLine branch, do you really want or care about the changes in another branch? (Veteran coders will know that you can perform a Get Latest on specific folders, but this sometimes leads to performing the Get Latest too low in the tree and missing required dependency updates from higher up.)

The solution is to create a root-level workspace mapping as described above. When you are prompted to perform a Get Latest, say no. In the Source Control Explorer window, right-click any folder you don’t want included and select the Cloak menu item. This excludes the folder from your workspace mapping, so it is never downloaded or subsequently updated when you perform a Get Latest.

In our example, someone only working on the MainLine branch would cloak the Branches folder. If someone needs to be working on the MainLine as well as a previous version (like a Service Pack), they could cloak the individual branch folders they don’t care about.
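For the command-line inclined, the same mapping and cloaking can be set up with tf workfold. A rough example (the workspace name and local path are just placeholders):

rem Map the whole project once at the top level...
tf workfold /map $/ProjectA C:\src\ProjectA /workspace:MyWorkspace

rem ...then cloak the branches you don't currently care about
tf workfold /cloak $/ProjectA/Branches /workspace:MyWorkspace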

Nice!

Friday, August 28, 2009

Windows Virtual PC Hacking

I recently was able to upgrade my work laptop to Windows 7 RTM and have been quite happy with the experience so far. Everything seems to be running very quickly and I like a lot of the subtle improvements that have been made to the UI.

I did, however, hit a small snag with Virtual PC. You can install Virtual PC 2007 on Windows 7, but there are some known issues and your mileage may vary. I need to be able to run a 32-bit guest OS in order to use our VPN software, because it currently only supports 32-bit operating systems. When I was running Vista x64, I just used Virtual PC 2007 without issues. It was a little clunky to have to run a VM in order to remote into work, but it worked.

So I installed Windows Virtual PC (currently a Release Candidate). The first interesting (and confusing) thing to note is that the ‘management’ application is no longer present. When you use the Start menu to open ‘Virtual Machines’, it just takes you to the ‘C:\Users\{username}\Virtual Machines’ folder. But if you look closely at the action area of the window, you’ll notice commands like ‘Create Virtual Machine’. The management interface is built into the Explorer shell for this folder!

So I just copied my VPCs (both the .vmc and .vhd files) into the folder, double-clicked one of the .vmc files and…

No luck. Despite others claiming that all they had to do was double-click their .vmc files, I just got an error message stating that it couldn’t find the .vhd file. I double-checked the path to the .vhd file in the .vmc file, but it was correct.

I then figured I would create a new VM and see what happened. It turns out that Windows Virtual PC creates a new type of file with a .vmcx extension and places it into this folder (the .vhd files get created in the ‘C:\Users\{username}\AppData\Local\Microsoft\Windows Virtual PC\Virtual Machines’ folder by default). When I opened this up in my favorite text editor, it turned out to be a simple XML file that points to the .vhd and .vmc files. After some quick editing, I was able to boot my VMs as normal. Here is a sample .vmcx file:


<?xml version="1.0" encoding="UTF-16"?>
<!-- Microsoft Virtual Machine Description and Registration Settings -->
<vm_description>
<ram_size type="string">384 MB</ram_size>
<vmstate type="string">Powered down</vmstate>
<primary_disk1 type="string">C:\Users\{Username}\Virtual Machines\{VPC Folder}\{VPC HD}.vhd</primary_disk1>
<secondary_disk1 type="string">D</secondary_disk1>
<notes type="string" />
<vmc_path type="string">C:\Users\{Username}\Virtual Machines\{VPC Folder}\{VPC Configuration}.vmc</vmc_path>
</vm_description>



UPDATE: After sleeping on this for the night, I realized that the ‘Create Virtual Machine’ wizard has a section that allows you to select an existing .vhd file. This is the same functionality that has always existed, but I was so focused on what was different earlier that I forgot about that option. If you take this (faster and easier) route, you will have your old VMs up and running in no time. Be aware, though, that it will create a new .vmc file to use the existing .vhd files. If you want, just edit the newly created .vmcx file and have it point to the old .vmc file (removing the newly created one in the process).



Happy VM-ing!  :)

Monday, June 01, 2009

Vista gets a bad rap

Recently, I was ‘forced’ to upgrade one of my development machines to Vista in order to start playing with the Windows Azure cloud tools, due to their requirement of IIS 7. While I’m not sure how valuable publishing enterprise applications in the cloud is going to be, it is a good option for small to medium-sized and/or resource-constrained businesses.

So, with a bit of hesitation, I decided to wipe my machine, take the Vista plunge full on, and install Vista x64 SP1. I had some issues with the latest video driver provided by Dell, which would not allow me to run the ‘Aero’ theme (it limited my color depth to 16-bit instead of 32-bit), but once I rolled back to an earlier version everything ran smoothly.

After a couple of weeks running Vista with all of my development tools and applications installed, I have to say that although there is a bit of a learning curve when you are very accustomed to XP, I haven’t run into any real issues yet. In fact, the machine is running faster than 32-bit XP and has actually been fun to use.

Caveat: My machine (a Dell Latitude D830) has an Intel Core 2 Duo T7700 CPU @ 2.4GHz, 4GB RAM, a reasonably fast hard drive, and a middle-tier graphics card, so it is definitely Vista-capable and scores a Windows Experience Index of 3.4 (limited by the graphics card; all other stats are 4.8+).

I cannot speak to the experience of running 32-bit Vista, but I do know that when Microsoft wrote the x64 versions of Windows they were able to eliminate a lot of the previous pain points simply because of the new architecture.

Based on my experiences so far, I wouldn’t mind moving some of my other (capable) machines over to Vista. However, I might just wait for Windows 7 to be released. I know a few people who are currently running it and have nothing but good things to say about it. Time will tell.

Monday, May 25, 2009

Learning Silverlight

I’ve been very interested in Silverlight (and more importantly, XAML) since its introduction, but have never really had the time to invest in learning it until now.

Recently I’ve been performing a technical evaluation for an upcoming project and Silverlight is one of the proposed technologies that could be used. Naturally I started Googling around the web for some learning resources and stumbled across (yet another) series of videos by Mike Taulty posted on Microsoft’s Channel 9 website entitled “44 Amazing Silverlight 2.0 Screencasts”.

These short (usually under 10 minutes) screencasts are each focused on a particular aspect of Silverlight, starting from the very basics and progressively building to more complex topics. Mike has a very easy-going approach to explaining each topic and purposefully makes mistakes along the way to show you some of the common pitfalls/issues that you could encounter.

If you are at all interested in Silverlight (and I think you should be), I would highly recommend this video series as one of your first stops along the way.

Monday, May 18, 2009

Preparing for Visual Studio 2010 and .NET 4.0

While poking around Microsoft’s Channel 9 website, I stumbled across a series of videos dubbed “10-4”. Those of you that have had your morning coffee have probably already guessed that these videos are about Visual Studio 2010 (also referred to on the net as VSX) and the .NET 4.0 Framework. These videos are quick (less than 20 minutes each) and highlight what is new or has changed from previous versions.

I’ve watched about half of them so far and they have been helpful in identifying new technologies to incorporate into new or existing applications.

Give them a go on your lunch break or whenever you need a break from your current coding headache!

Wednesday, April 08, 2009

Upcoming "Code Analysis, Metrics, and Style" Presentation

I will be giving a presentation entitled "Code Analysis, Metrics, and Style" at the end of the month to the Visual Developers of Upstate New York (VDUNY) user group, which meets once a month at Microsoft's office in Rochester, NY. The presentation, divided into 3 parts as the title suggests, will focus on using several tools available to developers to help them write better quality code.

The 'Code Analysis' section will highlight using the built-in 'Code Analysis' tools available in the Visual Studio Team Editions as well as FxCop, which is the foundation for the built-in Code Analysis engine. These tools analyze the compiled IL code and look for common programming errors.
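If you don't have a Team Edition handy, FxCop also ships with a command-line runner that can be wired into a build. A minimal example (the assembly and report names are just placeholders):

rem Analyze a compiled assembly and write the findings to an XML report
FxCopCmd.exe /file:MyAssembly.dll /out:CodeAnalysisResults.xml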

The second section talks about using the Code Metric tools (again, available in the Visual Studio Team Editions) to determine the maintainability, complexity, and dependencies of your code. Using these metrics will help you determine what areas of the code you should focus on in order to make the software easier to maintain.

The last section will focus on using Microsoft's Source Analysis add-on for Visual Studio (StyleCop). In contrast to Code Analysis, Source Analysis analyzes your actual source code, looking for style issues such as requiring a copyright header in every file, placing braces on new lines, adding doc comments to methods, etc. While this may not seem that important, it can help enforce a common style across your entire codebase, making it easier for others to get up to speed with your code.

Hope to see you there.