Roll-out deployments with Visual Studio Team Services Deployment Groups

Disclaimer: This post assumes you are familiar with Team Build and Release Management.

Recently Microsoft announced the Deployment Groups preview for Visual Studio Team Services (VSTS). Until now, if you were using the latest version of Release Management, the only option for deployments was the agent-less one, meaning there is no need for an agent on the deployment machines (of course you still need a VSTS Build/RM agent). For me this has its advantages, enabling deployment scenarios in which you are not able to install an agent on the final machines, which is usually preferred by operations people, and by me as well (fewer things to take care of…)

But this also has disadvantages. For example, if you want to deploy an application to a set of servers using the current tasks, you first need to enable PowerShell Remoting by hand, and you also have to manage by yourself all the pipelines needed for a roll-out deployment (not deploying the latest version to all machines at once). So there is no easy way to deploy to just a subset of machines and gradually roll the new version out to all of them, to prevent service stops and to be more aware of possible failures.

With Deployment Groups, you first install the deployment agent on the final machines, and then you can add those machines to different deployment groups to be used in the pipelines, and also instruct Release Management to deploy gradually to the machines belonging to a Deployment Group. This can also be used to create different pipelines that deploy gradually, in a controlled way, to different sets of servers, so you can progressively provide the latest version of the application to different groups of users (this will sound familiar if you know how Facebook, Twitter or even VSTS deploy their new versions).

Let's start. First, of course, this is only "supported" for on-premises scenarios, or IaaS scenarios on any cloud; it is not supported on PaaS, as you won't be able to install the agent there.

To install the agent, if you have your own on-prem servers or you are using IaaS on a cloud other than Azure, you need to go to your VSTS account, to the Deployment Groups section under Build & Release, and you will see this:

image

The first time you enter here you will be prompted to create a new Deployment Group. When you click "Add deployment group" you will be asked to provide a name, and then you will be shown the following screen, with a PowerShell snippet to be executed on the machine where you want the agent installed. Be sure to select "Use a personal access token" if you want to use a token for authentication, select "Copy script to clipboard", paste it into an admin PowerShell console on the machine, and that's all.

image
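For reference, here is a rough sketch of what that generated registration script does. This is abridged and from memory, so treat it as an assumption and always use the exact snippet VSTS generates for your group; the group, account, project and token below are placeholders:

```powershell
# Sketch only: the real script generated by VSTS also downloads and extracts the agent package first.
New-Item -ItemType Directory -Force -Path "C:\vstsagent" | Out-Null
Set-Location "C:\vstsagent"
# ...download and unzip the deployment agent package here...
# register the agent against the deployment group using a personal access token (PAT)
.\config.cmd --deploymentgroup --deploymentgroupname "MyDeploymentGroup" `
             --url "https://myaccount.visualstudio.com" --projectname "MyTeamProject" `
             --auth PAT --token "<personal access token>" `
             --agent $env:COMPUTERNAME --runasservice --work "_work"
```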

If you are using a virtual machine on Azure, this is even simpler, as you can install the agent as an extension. Just make sure the Deployment Group exists in VSTS (just create it with a name), and then on your VM, under Extensions, select and configure the Team Agent extension:

image

Now, when creating the Release, in the pipelines for the environments, you can add a new phase of the Deployment Group type:

image

If we add a new phase like this, in its options we can see the following:

image

Notice we can even use tags on the machines inside the Deployment Group to have even more fine-grained control over which machines we are going to deploy a version of the application to. Also notice how, in the "Deploy to" options, we can deploy gradually (1/2 targets, 1/4 targets, 1 target, custom…) to the machines belonging to the Deployment Group. In this capture I have selected a Deployment Group with only one machine, which is why it shows "Not applicable"…

This gives us a lot of control. For example, we can take one quarter of the machines of the group out of service at once (without affecting the overall service), deploy the application there, put them back into service, then stop the next subset of machines, deploy, start, and so on for the rest of the machines. Also, if the deployment fails on any subset of the machines, it won't continue, thus not affecting the rest of the machines in the Deployment Group, giving us fine-grained control over the deployments.

Now we can start adding the tasks needed for our deployment, but notice these tasks execute directly on the machines we are deploying to. So where our previous agent-less deployments used, for example, Remote PowerShell tasks, here we use normal PowerShell tasks to execute any script needed on the target machines, as it is the deployment agent inside the target machine that executes them. This also applies to any file copy tasks and so on, as everything is launched locally.
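As a simple illustration, a hypothetical inline PowerShell step like this one runs locally on every target machine of the group (the app pool name, artifact folder and destination path are made up for the example):

```powershell
# Runs on the target machine itself, so local modules such as WebAdministration are available.
Import-Module WebAdministration

# take the site out of service, update the bits, bring it back into service
Stop-WebAppPool -Name "MyAppPool"
Copy-Item -Path "$env:SYSTEM_DEFAULTWORKINGDIRECTORY\Drop\WebApp\*" `
          -Destination "C:\inetpub\MyApp" -Recurse -Force
Start-WebAppPool -Name "MyAppPool"
```

Combined with the "Deploy to" fraction options above, Release Management repeats this on each subset of machines before moving on to the next one.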

I hope this gets you interested in Deployment Groups and thinking about them for your on-prem or IaaS deployments. You can find more info on Deployment Groups here: https://www.visualstudio.com/en-us/docs/build/concepts/definitions/release/deployment-groups/

Test and feedback extension

Earlier in October Microsoft announced general availability of the Test and Feedback extension for Google Chrome (and yes, there is no Edge version yet). This extension was previously called the Exploratory Testing extension, in case you already tried it before.

I have been using this extension with some customers since the early versions, and honestly, after seeing how it has evolved, I must send a lot of kudos to the team; it is a great way to share findings on bugs and tests, but also feedback. Basically, this extension allows teams (and stakeholders) to record sessions in web applications and share the results with the rest of the team in several work item types: as bugs, tasks or even test cases with the steps from the exploration.

I find this is a great way for stakeholders who are not used to writing long descriptions or acceptance feedback texts to give quick feedback and communicate unexpected behaviors quickly to developers. It also allows developers to receive quick and actionable feedback directly within VSTS and TFS.

As you can see in the next image, when we choose to create a new work item after recording, it stores, as steps with automatic image captures, everything done during the exploration:

image

It is also remarkable that you can add additional information such as notes, screenshots, and even video with voice recording!!! (and yes, my customers especially love the last two… it saves a lot of writing time). It also has a couple of working modes:

  • Standalone: you can use it without even a connection to a TFS/VSTS, and after the session it produces an HTML report of the session. I haven't used this mode much so far, as all of my customers are on TFS or VSTS, so I haven't found it very useful for my scenarios.
  • Connected mode: you connect to a team project on TFS or VSTS for the session, everything gets recorded in the exploratory testing sessions for TFS, and you can also create bugs, tasks or test cases with data from the session. This is the mode I have been using almost all of the time.

And how does this work? Well, first install the Google Chrome extension, and you will get this button: image on your Google Chrome toolbar. Just click it, and select the mode the first time (you can change it later):

image

Then just click on the play icon image and start recording. As there are already very good documents on the Visual Studio ALM Blog, let me summarize them for you:

  • General announcement and overview: general information about the extension and the main GA announcement.
  • Capture information as screenshots, video, notes, page load data and more: interesting to discover how to use the different types of information you can capture in your sessions and add later to work items.
  • Artifact creation: how to create the different types of work items or reports (in standalone mode) after the sessions, including the additional information.
  • Team collaboration: information about using it in the two different modes and with different access levels to TFS and VSTS, and how to consult the results afterwards in the form of reports, sessions or captured artifacts. To fully understand this one it is important to read the two previous ones. Especially interesting is how to use it with stakeholders who have limited access.

So go ahead, install the extension, and start sharing findings between teams and stakeholders; I'm sure you will like it as much as I do.

Sample VSTS Build and Release Management task for Yarn package manager

This weekend I wanted to try the new package manager Facebook created: Yarn. One of its big claims is being faster than plain npm, although it uses the same package repositories as npm. As far as I know, this is achieved through improvements in transferring the files, along with a local package cache, so you don't always have to go to npm to restore a package you have already restored for another project.

As everything worked smoothly locally, I decided to create a new build task for VSTS so I can use it in my builds. A first point: it is only meant to be used with your own agents, for several reasons. First, it has a demand that requires Yarn to be installed (hosted agents don't have it… yet…). Second, as said previously, it uses a local cache to be faster, so at this moment it made no sense to me to prepare something to be used with hosted agents; remember hosted agents are created on-the-fly, so unless you need to restore packages for several projects per build (the next scenario I will try to cover), Yarn will not be necessary (well, I agree it also improves download speed, but…).

So, for this first version, I just looked at the code of the current npm task and modified it to use Yarn; it is pretty simple and straightforward. The code is on my GitHub account: https://github.com/lfraile/YarnTask
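Conceptually, what the task ends up doing on the agent is no more than this (a sketch, assuming Yarn is already installed on the agent, which is what the task's demand enforces; the project folder is hypothetical):

```powershell
# Move to the folder containing package.json and restore packages with Yarn.
# Same registries as npm, but Yarn reuses the agent's local package cache between builds.
Set-Location "$env:BUILD_SOURCESDIRECTORY\MyWebApp"
yarn install
```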

Feel free to look at it, modify it, play with it and install it; I will keep reviewing it for improvements.

PS: Just as I was writing this post, I noticed something I should have looked at before… there is already a Yarn task in the Marketplace, hehehe. As I mention in my talk about creating custom tasks: always check whether something is already available before building it… so, well… my fault… but still, I will keep trying to improve mine.

Creating custom tasks for VSTS Team Build and Release Management slides

Recently I gave a talk about creating custom tasks for Team Build and Release Management for VSTS and also for Team Foundation Server, and as I'm starting to write here again (with lots of article ideas in my mind), I thought it would be a good idea to leave links to the content here as well.

It is in Spanish, but well, there is a bunch of useful links in the PPTX. It was mainly based on creating tasks with JavaScript, but I also cover TypeScript (well, both of them are JavaScript executed with NodeJS at the end of the day) and PowerShell.

So here are the slides:

And here is the source code used in the demos; it is a very basic set of tasks for .NET Core projects to restore packages, build and publish:
Have fun.

Configure Work Item Field as team field in Team Foundation Server

Recently, working with a customer, due to the team and project structure they have, and the reporting needs for that structure (a correct structure, BTW), we came to a situation in which dividing the teams by areas was not very useful and didn't help us with our work item and reporting strategy; as you have probably already noticed, it is not easy to create per-team reports with complex structures, since areas are tree views.

So I came across this article, which helps in creating a work item field to define the teams. I found it very useful for this and other situations; to be honest, I find it even more comfortable than using areas for this.

For what is coming in this blog post I assume you already know how to divide the work between teams within the same team project and feel comfortable with TFS work item customizations. Also, this article is entirely based on Team Foundation Server on-premises.

Basically the procedure (go to the article for the details) is:

  1. Define a Global List for your list of teams.
  2. Add a new field to Features, Epics (the article doesn't mention these first two, but we added it to them as well), Product Backlog Items, Bugs, Tasks and Test Plans; at the end of the day, any work item type that can be used to work in backlogs. It is important to define the field with the same name in all work item types, and also be sure to make it reportable as a dimension if you plan to use it in Reporting.
  3. Specify this field to have values from the Teams Global List as allowed values.
  4. Use the witadmin command-line tool to export the process configuration, and modify it to specify the new field as the field that defines the team a work item belongs to (<TypeField refname=”MyCompany.Team” type=”Team” />). A sketch of this round-trip is shown below.
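A rough sketch of that witadmin round-trip, with placeholder collection and project names (see the linked article for the complete, authoritative procedure):

```powershell
# Placeholders: adjust the collection URL, project name and field reference name to your environment.
$collection = "http://mytfs:8080/tfs/DefaultCollection"
$project    = "MyTeamProject"

# export the process configuration, edit it, and import it back
witadmin exportprocessconfig /collection:$collection /p:$project /f:ProcessConfiguration.xml
# inside the <TypeFields> element of ProcessConfiguration.xml add:
#   <TypeField refname="MyCompany.Team" type="Team" />
witadmin importprocessconfig /collection:$collection /p:$project /f:ProcessConfiguration.xml
```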

When you go to administer your team project (all of this is done at the project level) you will see the possibility to define the value of the new field for each team, so the backlogs and boards are filtered correctly. Make sure to specify this value for all of your teams; if you don't, you will receive alerts saying your team is not correctly configured. Also remember that a particular team in TFS can own several values of this field, which is particularly useful for product owner or management views.

IC686842

There is also another point in the article which allows you to specify, during creation from the product backlog view, the team you want the work item to belong to, something like the next image. But this configuration brought me a small problem: when someone from within one team selects a different team while creating the work item, the backlog view produces an error, as if it weren't able to save the work item, so I disabled this configuration.

image

And as a final conclusion: well, I haven't had this solution in production for long, let's say, but at this moment I find it very useful, as it allows me to improve some reports and queries so I can clearly see the team a work item belongs to, without any trick to truncate the area path or anything similar to make the information easy to filter and more readable, especially in reports.

If you need this or you are going to follow this article, please test it thoroughly before going live; customizations of work items are always tricky, especially at this level, where we are modifying the default behavior of TFS.

I also tested it in a TFS “15” preview environment, and it worked as expected, so it should keep working when moving forward to the next version of TFS.

Code Search extension for VSTS and Team Foundation Server “15”

Recently Microsoft made generally available a very interesting extension in the Visual Studio Marketplace: the Code Search extension. Installing it on your VSTS is as simple as going to the previous link and clicking Install, then selecting the Visual Studio Team Services account in which you want to install the extension; of course you need to be an administrator to install it.

To install it on Team Foundation Server “15”, it is just as simple: you install it during the installation phase of TFS.

InstallCSOnTFS

But what does this extension enable? Once you install it, in your team projects you will have, in the menu bar, a search box in which you can select to search code:

image

image

When you select code, you will be presented with some of the main options for searching code.

But there are even more options you can check in the help page.

The interesting thing about this Code Search extension is that it does not just look for text inside the code files (that would be easy); it searches across all projects, or just the ones you want.

But it also allows you to apply filters, for example to look only for classes named like the term you are looking for, or for comments, references and a lot more; I'm really impressed by how rich it is. Of course you can refine your queries with AND, OR and NOT terms.
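A few example queries, with the filter names as I remember them from the help page (check that page for the definitive list):

```
CustomerRepository            plain text search across the selected projects
class:CustomerRepository      only hits where the term is a class definition
comment:TODO AND ext:cs       filters combined with AND / OR / NOT
```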

Also it integrates with history, so when you find what you are looking for, you can see its history, compare with previous versions, and you can even see annotations within the code.

Just as a conclusion: a pretty nice extension you can install and start using on your VSTS to search for code, going far beyond the usual “look in files” functionality.

As for the technology it uses behind the scenes, if you look here you will see it uses:

  • Elasticsearch
  • Oracle Server JRE, yes for TFS you will need to install it on the server, but you can install Code Search on a separate server. Of course for VSTS you don’t need to care about this.
  • MarkdownDeep.
  • Roslyn (hype increasing on this one)
  • ANTLR

And about languages: it currently supports C#, C++, C and VB.NET, they recently added support for Java as well, and it is reasonable to expect they will keep updating the list of languages.

So go ahead, install it and try it; you can find more options and documentation here.

I’m back! Executing Entity Framework migrations from VSTS Release Management

Uff, it has been a long, long time since I last wrote here… but several people have asked me about my blog lately, so here I go, and rather than writing a bla-bla article about that long gap and why I haven't written much, let's get technical.

In this article I assume you have basic knowledge of creating Team Build definitions and Release Management definitions. I'm not covering those topics here, as it would make the article too long to read. If you are not familiar with them, I recommend reading about it here: https://www.visualstudio.com/en-us/docs/release/overview

When we deploy applications with a database, there are several ways to do it: usually we deploy differential scripts to update the DB, or use DACPAC and other technologies, but we can also use Entity Framework migrations, although I still have to think about whether it is the best way to do it…

Usually migrations are executed by Entity Framework initializers, but if we need to execute them before deployment, so we can be sure the DB is updated even before we deploy our application, there is a tool named “migrate.exe” included in the Entity Framework NuGet package.

There are a couple of steps we need to take to be able to execute this tool from Release Management:

  1. Build our migrations assembly
  2. Copy the migrate.exe tool from the tools folder of the Entity Framework NuGet package to the same directory as the migrations assembly
  3. Execute the migrations

Let's go with the first two steps, which we will do in a Team Build definition that publishes the results, as artifacts, to the Release Management definition. The build step is easy: usually our migrations assembly will be included in the solution we are building to deploy our application; if not, include it in the solution or build it in a separate build step in the same build definition. No tricks in this one.

Copying the migrate.exe tool involves a couple more steps. First I copy the binaries resulting from building the migration project to a separate folder we will publish as an artifact. This is done with a copy task in the build steps, which we configure this way:

image

The parameters:

  • Source folder: we point to the binaries resulting from building our migration assembly. Notice I have oversimplified the directory with /MigrationAssembly/; be sure to include the full path to it. I have used a couple of variables: $(build.sourcesdirectory), a system variable that points to the root of the sources downloaded by the agent, and a custom variable $(buildConfiguration), which holds the current build configuration (i.e. Debug, Release or whatever you use).
  • Contents: ** so we copy all the results.
  • Target folder: I'm copying to a new folder automatically created in the artifacts staging directory, as configured with the system variable $(build.artifactstagingdirectory). You don't need to create a complex folder structure under this one, but be sure to at least create a structure that allows you to separate the different results and artifacts.

Next step, copy the migrate.exe file; again we use a copy task:

image

With the parameters:

  • Source folder: we point to the NuGet packages folder, which is usually at the same level as the solution we are building, but be sure to check this carefully; this is probably one of the trickiest paths of this configuration.
  • Contents: “migrate.exe”, well, no comments…
  • Target folder: I'm copying to the same folder where we copied the output of the migration assembly in the previous step. This is very important for all of this to work, so be sure to check it twice (the sketch after this list shows roughly what these two copy steps amount to).
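As a rough sketch (not the actual tasks), these two copy steps amount to something like the following, using the environment-variable form of the build variables mentioned above; the folder names are illustrative:

```powershell
# BUILD_SOURCESDIRECTORY / BUILD_ARTIFACTSTAGINGDIRECTORY are the env-var form of $(build.sourcesdirectory)
# and $(build.artifactstagingdirectory); BUILDCONFIGURATION comes from the custom $(buildConfiguration) variable.
$src     = $env:BUILD_SOURCESDIRECTORY
$staging = Join-Path $env:BUILD_ARTIFACTSTAGINGDIRECTORY "migrations"
New-Item -ItemType Directory -Force -Path $staging | Out-Null

# 1) the migration assembly build output (path oversimplified, as noted above)
Copy-Item -Path "$src\MigrationAssembly\bin\$env:BUILDCONFIGURATION\*" -Destination $staging -Recurse -Force
# 2) migrate.exe from the tools folder of the Entity Framework NuGet package
Copy-Item -Path "$src\packages\EntityFramework.*\tools\migrate.exe" -Destination $staging -Force
```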

And the last step in the build: publish the artifacts. It is usually as simple as this one, which publishes the whole folder structure we have in the artifacts directory to a server artifact:

image

The final steps will look like this:

image

Once we have done this, we can queue this build definition and, once it has finished, check in the resulting artifacts that you have the binaries from the migration assembly along with the migrate.exe tool in the same folder.

For the Release to execute the migrate.exe file, it is just a simple matter of running a command line; of course, one gotcha here is to link the build definition with the Release Management definition (again, I assume you are already familiar with this).

So within the desired environment of our Release definition, we just add a Run on agent task of type Run script. One important point: remember this task runs on the agent, so you need to ensure your agent can communicate with your SQL Server or SQL Azure.

We will configure this task as follows, before the deployment task for the application:

image

The parameters we are using:

  • Path: here we configure the path to migrate.exe within the build artifacts we are using; you can take advantage of the “…” button to look for it. Again, remember: you must have linked your Release Management definition to your build definition for this to be available.
  • Arguments: there are different arguments you can use here; you can even just point to a *.config file with all the values (check the full documentation). In this case I just pointed to a custom variable containing the connection string (be sure to make it secret to protect it, hehe), and as I pointed to a connection string, it is mandatory to configure the “connectionProviderName” parameter, which in my case is just SQL Server. A rough example of the resulting command line is shown after this list.
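Roughly, the command line the Run script task ends up executing looks like this (the assembly name is hypothetical, and the connection string shown as a literal here would actually come from the secret custom variable referenced in the Arguments box):

```powershell
.\migrate.exe MyApp.Migrations.dll `
    /connectionString="Server=myserver;Database=MyDb;User Id=deploy;Password=<secret>" `
    /connectionProviderName="System.Data.SqlClient"
```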

Some important gotchas here: be sure to test your migrations thoroughly, and be sure to take the appropriate database backups for rollback scenarios. This is not easy, and you really have to take care of it, so have different Release Management environments in which to test all the deployments and migrations.

Once you have this, the next time you run this build and release definition your database will be (hopefully, if you have done it correctly) updated to the latest Entity Framework migration.

And hopefully, see you around here with more articles.

[TFSService] Starting to work with Team Foundation Server Service Preview

As many of you have surely already seen, at the Build conference the new Team Foundation Service Preview, aka TFS in Azure, was shown.

For those of you who have been lucky enough to get an invitation, here is a brief summary of where to start.

The first thing to remember is that, although with some differences, it amounts to a Team Foundation Server as we are used to seeing so far, in its 2010 version.

Once you have received the invitation and created your account (you need an invitation and a Live ID), you will get a URL of the form http://[whatever].tfspreview.com. This is the URL we use to connect to our service in the cloud and start working: the URL you will use to connect to your TFS Service Preview.

When you finish creating your account, it takes you to that URL, straight into the administration section.

Team Foundation Server Service Administration

From this screen, the first thing we do is create a team project to start working with (and, incidentally, I love the Metro interface we are going to see throughout Team Foundation Service). To create the project, simply press “Create team project”.

As in Team Foundation Server today, we are asked for the project name, a description, and the process template, which are also new; we can choose between:

  • Microsoft Visual Studio Scrum 2.0 – Preview 1
  • Microsoft MSF for Agile Software Development 6.0 – Preview 1
  • Microsoft MSF for CMMI Process Improvement – Preview 1

Creating Team Foundation Server Service project

After creating the project, we can connect to it from Visual Studio 2010 (or version 11 if you have already downloaded it).

Before connecting to our project, we must keep one thing in mind: with Team Foundation Service Preview we connect using a Live ID account, something Visual Studio 2010 is not prepared for, so we have to download additional software.

This software can be found in the administration section of our Team Foundation Service Preview. To get there, if you have closed the web page, go to http://[whatever].tfspreview.com and click on the Administration link at the top right.

In the third option we will see the link to download the software we need; just click it and download it. For Visual Studio 2010 you will need Service Pack 1 and the hotfix KB2581206.

Once it is downloaded and installed, we open Visual Studio 2010; the connection procedure is the same as always (menu option Connect to Team Foundation Server…).

When adding the Team Foundation Service Preview server to those available, you have to enter [whatever].tfspreview.com and take care to select an SSL connection, so the resulting URL is: https://[whatever].tfspreview.com/tfs

Connection to Team Foundation Server Service

When connecting, what is it going to ask for? A Live ID account with permissions on this Team Foundation Service project, so we can start working.

Login Team Foundation Server Service Preview

Once connected, we will see that we have a team project collection, and the project we created, available in our Team Explorer, ready to start working with Team Foundation Service Preview…

One difference you will notice, because it is a cloud service (and above all a preview), is that we do not have SharePoint or Reporting; we only have our beloved work items, source control and Team Build (I will talk about the latter later).

Team Explorer TFS Service Preview

And as you can see, our TFS is now on tfspreview.com

And from this point on, if time and circumstances permit, I hope to keep publishing introductions to this new little wonder we have available.

What do I mean with agile “multidisciplinary team”?

We return with topics that I see, live and experience with agile methodologies. Now it is the turn of multidisciplinary teams, something about which a lot has been written, with many different visions.

If we follow the general concept, the one many people have in mind, a multidisciplinary team is a team of generalists in which everyone knows how to do any task needed to complete the sprint. This has a number of benefits that are obvious:

  • It avoids bottlenecks
  • It facilitates spreading knowledge across the project team
  • We get multiple views on a single point

And more benefits we could go on listing.

However, there are things we lose, such as specialized knowledge of a particular topic.

Besides, it is quite difficult to find a good team of generalists covering all the tasks that may arise in a sprint; remember that we will perform tasks ranging from application front-ends to databases, application deployments, etc.

Therefore, in my view, to achieve a multidisciplinary team I rely on another approach. To me, a multidisciplinary team is a team that works as one and that, as a whole, has all the knowledge needed for the sprint; it does not mean that everyone knows how to do everything.

Here we take advantage of the specialized knowledge of some members of the team; at the same time, it would be naive to build a team of specialists only. In a team made up only of specialists, I shudder to think of the possible struggles of egos, each one fighting for his own field.

It is not bad to have a team with some experts in the fields we need, plus people who are more generalist, as long as they work as one, collaborate with each other and all share the sprint goal as the ultimate objective. And of course, the whole team must have, at the very least, general knowledge about everything needed to accomplish the objective of the sprint, so we avoid big bottlenecks and “islands” of knowledge.

And watch out: by specialist I don't mean people who can do only one thing; the more things we know the better, but each of us also has a more specific skill. As a friend of mine says, specialization is for flies; we humans know many things.

General knowledge is very rewarding, and the more varied the subjects, the more enriching it is; but knowing a lot about something and having a lot of experience in it also allows us not only to handle familiar situations, but to face unknown problems with our own weapons.

Documenting our Test Plans with Microsoft Test Manager

If you already know Microsoft Test Manager, you will know that it is a great tool to manage our functional test plans. However, to check the information about a project's test plans in Team Foundation Server, we need the tool itself, connectivity to the TFS server, and to go through each of the test suites. This is all very well for the day-to-day, but there are times when we have to share this information or produce documentation of these plans (yes, there is also documentation in agile projects).

To generate this documentation of our test plans and their runs, in the Visual Studio Gallery we have a very useful tool: Test Scribe, which you can download from here: http://visualstudiogallery.msdn.microsoft.com/e79e4a0f-f670-47c2-9b8a-3b6f664bf4ae

Once it is downloaded and installed into Microsoft Test Manager, we will have a new option in the main menu: Tools.

image

From this section we can perform two main actions: document a test plan, and obtain a report from a test run.

Documenting the test plan

image

With this option, what we do is select a test plan of the project on the Team Foundation Server we are connected to and press the Generate button.

This produces a Microsoft Word document with all the details of the selected test plan, including information such as:

  • The hierarchy of Test Suites
  • All the tests and their steps that make up the test plan
  • The configurations to test

Ultimately a quite complete overview of the test plan.

Summary of a test run

image

With this option, we select a test run, either manual or automated, and it generates a report that includes information such as:

  • Executed tests
  • The results of the tests
  • Defects found
  • Details of executions

Ultimately, these reports provide very interesting information and, best of all, in a format that we can share and consult without having to be connected to our Team Foundation Server.