Windows Phone 8.1 Device Migration (Backup & Restore)

Wow. I am very impressed with the work Microsoft has done in this regard. If my phone goes through the wash, or in my case, through the air propelled by an excited child, all I lose is the phone, i.e. a few hundred bucks. All I need to do is get a new phone, plug in my credentials (a few times for some apps) and everything is back as it was. And I mean everything:

  • Currently installed apps are all automatically installed
  • Start screen fully restored
  • Settings galore; even custom ringtones that I'm not using but had added at some point are available
  • Bluetooth pairings, WiFi networks
  • Text messages
  • Connected accounts, i.e. Facebook, LinkedIn, Twitter
  • Built-in Podcast subscriptions and which episodes were marked played
  • NextGen Reader settings and feed data were all correct

Did I say everything? There were 2 things I noticed missing:

  • Nokia Mix Radio mixes. This isn't strictly Windows Phone's fault, but since it is first party I expect better.
  • Cortana Reminders. Cortana is in beta so I can excuse it for now, but I hope this is resolved and reminders sync with the desktop in Windows 10.

One weird bit of behavior was the phone name. On http://windowsphone.com/ my new Nokia Lumia 930 was showing up as "Nokia Lumia 820", the same name as my previous, now broken, Lumia 820. Correcting the name took a little searching: plug your phone into a PC and rename it in File Explorer, just as you would a USB key.

This is the same sync behavior we see in Windows 8.1 between PCs. It will be interesting to see how this integrates with Windows 10 to make it all feel like multiple portals into my one cloud account. Make sure that you have backup enabled, otherwise you’ll have to do it all manually when changing phones.

Active Noise Cancelling Headphones–Sound Off

Recently I was fortunate enough to receive, courtesy of DVLUP, a pair of Nokia BH-940 Purity Pro Wireless Stereo Headset By Monster active noise cancelling headphones. Phew, that is Microsoft-inspired naming. The thing is, I was blessed to receive a pair of Bose QuietComfort 15 Limited Edition active noise cancelling headphones in a draw at a conference a couple of years ago. So I have hundreds of dollars worth of headphones I haven't paid for. If you know me you'll know I'm quite a scrooge, and I would never have splurged on headphones this expensive for myself. However, when the Bose failed I was very tempted to replace them. Fortunately their warranty and service were outstanding and in about 4 weeks they had shipped me a new pair, albeit not the limited edition. Paying for headphones like this when you use them all day, every day, is worth it.

Nokia BH-940 Purity Pro Wireless Stereo Headset By Monster

So now I have these two great headphones and I am seeing which I like best. I have always wanted the Bose to be wireless. I catch the train, which tests the noise cancelling, but the cable management is tiresome. The Purity Pro is Bluetooth with NFC pairing. Sweet. The Purity Pro also has a microphone for calls, a rechargeable battery for the active noise cancelling, controls for music, calls and volume, and amazingly pairs to two Bluetooth devices at once. For me, that's my phone and laptop on the train. Awesome. I love all those things about these headphones and they have been a massive step up from the Bose. When I open them up they turn on, and when I close them they turn off. A voice tells me when they are connected. If I leave them open and not on my head, they turn off noise cancelling to save power. If the battery does run flat I can plug them in and still use them without noise cancelling. There have been times with the Bose when I have taken them off and forgotten to turn them off. Do this overnight and bets are you have a flat battery and they are completely useless until you find a new battery.

Bose QuietComfort 15 Acoustic Noise Cancelling Headphones

It would seem from all of the above that the Purity Pro is much better than the Bose. However, the QuietComfort is worlds apart more comfortable. I can wear them all day, truly without knowing it. The cups fit fully over my ears and do not touch them. They create a good seal, which makes the active noise cancelling very good, much better than the Purity Pro. Although the Purity Pro is also over-ear, maybe I have big ears, because they are pressed by the edges and the inside of the cups. My ears get a bit sore after a few hours. It would probably be fine if I didn't know it could be way better. The Bose come with an iPhone-compatible hands-free microphone and controls on the wire. I'm one of the few with a Windows Phone though, so that didn't work. I even got an inline converter plug, which didn't work either. If the Bose had just wireless, without all the controls like the Purity Pro, I think that would be enough to not give the Purity Pro a look in. At the Bose store they said most of their customers use them for plane travel so wires are fine. Whether that is true or not, I don't know, but I still don't like wires.

Either way both these headphones are awesome and I would, and do, recommend both to anybody. Just pick which set of features is more important to you.

Windows 365 Anyone?

Just some unfounded speculation. All of Windows is under Joe Belfiore. Windows Phone is free for OEMs. All Windows Phone 8+ devices are supported for updates. Windows 8+ is free for OEMs on all PCs, tablets, 2-in-1s or otherwise with screens under 9”. Office Online is free. For Office on your devices, Office 365 starts at quite a low per-person price. The Windows 8 upgrade offer from Windows 7/Vista at release was very cheap.

With Windows Threshold, i.e. 9, I wonder if they are ready to remove the revenue from the Windows platform for consumers completely. Always-up-to-date Windows. Will they go completely free, or package it with an Office 365 account at no extra cost?

Update 9 Feb 2015: Although Windows 10 has since been announced as a free upgrade for consumer versions for the first year after release, Microsoft has just trademarked Windows 365. Perhaps for businesses?

TFS Local Workspace Limit

One of my colleagues hit this last week:

TF401190: The local workspace XXXXXX;XXXXXX has 112202 items in it,
which exceeds the recommended limit of 100000 items. To improve
performance, either reduce the number of items in the workspace,
or convert the workspace to a server workspace.

Fortunately he had a few older release branches he could remove. I didn’t know there was a recommended limit. I found this overview of local workspaces:

Local workspaces have scalability limitations due to their use of the local workspace scanner which checks for edited items. Local workspaces are recommended for most of our customers, because most workspaces fit into the “small” or “medium” category in our view – that is, they have fewer than 50,000 files and folders. If your workspace has more than 50,000 items, you may experience performance problems or TF400030 errors as operations exceed 45 seconds in duration. In this case, splitting your workspace up into multiple smaller workspaces (perhaps one workspace per branch), or switching to server workspaces is recommended.

Server workspaces are optimized for performance at large scale and can support workspaces with up to 10,000,000 items (provided your SQL Server hardware is sufficient).

It seems performance for local workspaces has been improved, or maybe the limit has been adjusted for average hardware upgrades. I can tell you though, our SQL Server hardware is not sufficient, largely because TFS is using too much disk space.
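
If you can't remove whole branches, cloaking the folders you don't need is another quick way to get the item count down. For example (the branch, workspace and collection names here are made up for illustration):

tf workfold /cloak "$/TeamProject/Releases/1.0" /workspace:MyWorkspace /collection:http://tfs:8080/tfs/DefaultCollection
tf get

The get removes the now-cloaked items from disk and from the workspace's item count. Converting to a server workspace, if you'd rather go that way, is done in Visual Studio from the Edit Workspace dialog, under Advanced, by changing Location from Local to Server.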


Increase TFS 2013 task board work item limit

Today one of our task boards hit this:

Board exceeded number of items allowed

No worries, we'll just follow the link and increase the limit. That however takes you to Customize the Task Board Page for Visual Studio 2012. From what I have done previously I knew this would not apply to TFS 2013. The command there exports the agile process config, and in 2013 this has all been combined into the one process config. Looking through my process config I could not find the IterationBacklog element, so I ran the command anyway and got:

Warning: This command is obsolete. Use 'witadmin exportprocessconfig' instead.

In the 2013 process config, although there is no IterationBacklog element, there are PortfolioBacklog, RequirementBacklog and TaskBacklog elements. The same workItemCountLimit attribute still applies and it goes on the TaskBacklog element. The details can be found in Configure and customize Agile planning tools for a team project.

The steps however are very simple:

  1. Export your process config
    witadmin exportprocessconfig /collection:http://tfs:8080/tfs/DefaultCollection /p:TeamProject /f:ProcessConfiguration.xml
  2. Add workItemCountLimit attribute to your TaskBacklog element

    <TaskBacklog category="Microsoft.TaskCategory" parent="Microsoft.RequirementCategory" pluralName="Tasks" singularName="Task" workItemCountLimit="800">
  3. Import your modified process config

    witadmin importprocessconfig /collection:http://tfs:8080/tfs/DefaultCollection /p:TeamProject /f:ProcessConfiguration.xml

Note that the default is 500 and the maximum allowed is 1500.


No need to use DateTime UTC again!

We were always taught: whenever storing or comparing dates, always store them as UTC. That's how I remember it at least. But that is not actually the correct answer. According to the detailed Coding Best Practices Using DateTime in the .NET Framework, the rules state that "a developer is responsible for keeping track of time-zone information associated with a DateTime value via some external mechanism". Which leads to the recommended Storage Strategies Best Practice #1:

When coding, store the time-zone information associated with a DateTime type in an adjunct variable.

I don't think I have ever seen this actually done. But no worries, since .NET 3.5 and SQL Server 2008 there is a new type to use. Today I was introduced to DateTimeOffset. This solves the storage and calculation issues by ensuring that the time zone offset is always stored with the date.

Represents a point in time, typically expressed as a date and time of day, relative to Coordinated Universal Time (UTC).

I think the code sample on the documentation page shows the difference and usefulness quite well.

using System;

public class DateArithmetic
{
   public static void Main()
   {
      DateTime date1, date2;
      DateTimeOffset dateOffset1, dateOffset2;
      TimeSpan difference;

      // Find difference between Date.Now and Date.UtcNow
      date1 = DateTime.Now;
      date2 = DateTime.UtcNow;
      difference = date1 - date2;
      Console.WriteLine("{0} - {1} = {2}", date1, date2, difference);

      // Find difference between Now and UtcNow using DateTimeOffset
      dateOffset1 = DateTimeOffset.Now;
      dateOffset2 = DateTimeOffset.UtcNow;
      difference = dateOffset1 - dateOffset2;
      Console.WriteLine("{0} - {1} = {2}", 
                        dateOffset1, dateOffset2, difference);
      // If run in the Pacific Standard time zone on 4/2/2007, the example 
      // displays the following output to the console: 
      //    4/2/2007 7:23:57 PM - 4/3/2007 2:23:57 AM = -07:00:00 
      //    4/2/2007 7:23:57 PM -07:00 - 4/3/2007 2:23:57 AM +00:00 = 00:00:00                        
   }
}

When would you prefer DateTime over DateTimeOffset? The BCL team introduced the type here and details when you may want to use DateTime over DateTimeOffset. Summarized, I would say it is when you are doing interop with OLE, or when you only care about the date and not the time, like birthdays.

DateTimeOffset is the new preferred type to use for the most common date time scenarios.

The main shortcoming is the loss of time zone information, since only the offset is stored. If you are really serious about dates and times and are heavily using them, you may consider Noda Time, started by Jon Skeet. My only issue with using DateTimeOffset virtually all the time in place of DateTime is that I have only found out about it years later.
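
To illustrate that shortcoming, here is a minimal sketch (the dates and offsets are chosen purely for illustration): a DateTimeOffset only carries the offset it was created with, not the time zone rule that produced it, so arithmetic that crosses a daylight saving boundary keeps the old offset.

using System;

public class OffsetOnly
{
   public static void Main()
   {
      // Sydney in January is UTC+11 (daylight saving time).
      DateTimeOffset summer = new DateTimeOffset(2015, 1, 15, 9, 0, 0, TimeSpan.FromHours(11));

      // Six months later Sydney is on UTC+10, but the value only knows its
      // original offset, so it still reports +11:00. A time-zone-aware library
      // like Noda Time (or TimeZoneInfo) is needed to get this right.
      DateTimeOffset winter = summer.AddMonths(6);
      Console.WriteLine(winter);   // e.g. 15/07/2015 9:00:00 AM +11:00
   }
}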


Analyzing Installer Log Files–Getting Started with Windows Installer Verbose Log Analyzer

The Windows Installer engine, msiexec.exe, is a bit of a black box. To see inside that box you can run your installers (MSIs) from the command line with verbose logging using:

msiexec /i MySetup.msi /lv* install.log

Opening up this text file and looking over it directly is not super helpful. Fortunately there is a tool for this, but it is obscure and you need to know what it is called and where it is. It is the Windows Installer Verbose Log Analyzer (WiLogUtl.exe). How is that for a name! It comes with the Windows SDK and is found in C:\Program Files\Microsoft SDKs\Windows\v<x>.0\Bin, where <x> is your SDK version, e.g. 7. I've uploaded WiLogUtl.exe, which is all you need, to OneDrive here for easier accessibility.
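
If you would rather skip the GUI, the SDK documentation also lists command-line switches for the tool; if I have them right, something along these lines analyzes a log quietly and writes the output pages to a folder:

WiLogUtl.exe /q /l C:\temp\install.log /o C:\temp\logout\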

Windows Installer Verbose Log Analyzer

The tool is quite cumbersome, but does the job really well. You start by pressing the Browse button and selecting your log file. The Open button only opens the file in Notepad, as below, which is a fairly pointless function. The magic is in the Analyze button.

Notepad install.log

This particular 514 KB, 2233-line log file is from a failed install. Finding the error by hand is tricky. After hitting Analyze, the error is the first thing shown.

Detailed Log View

So I had an error in my Custom Action CA_IISGetSelectedWebSite. The States button shows the Feature and Component states through the install. So if you have an unselected feature, you will see it here.

States

The Properties view is one of the most useful. It details what the properties were set to during different stages of the install; note the Client, Server and Nested options at the bottom.

Properties

The next useful view is the HTML Log. This is just a color-coded view of the log file with helpful navigation buttons. It makes getting at the detail simple and understandable.

HTML Log

If you have to debug Windows Installers this will save you a bunch of time and headache.


Visual Studio 2013 Cookbook Review

Disclaimer: I was provided a complimentary electronic copy of this book by the publisher. In no way was I directed on the content of my review. Opinions are mine alone.

Why would you buy a Visual Studio 2013 book instead of just using the Internet to find what you need? Simple: time and content. The amount of new features and new ways to do things in this version is substantial. Mining the Internet you are sure to miss many gems that could make a large difference to your day-to-day work. Going through this book you benefit from a team of people having done that work for you, ensuring everything is included and the content is correct, for your easy perusal.

  • Chapter 1 gives a very good overview of the new additions to the main tool you use each day.
  • Chapter 2 gives a very good starting place for Windows 8.1 development. It will get you past the new stuff quickly to allow you to get into the code.
  • Chapter 3 goes over Web Development with MVC 5 and the new One ASP.NET model.
  • Chapter 4 covers some "I'm not dead yet" WPF ("WPF remains the recommended choice for developing desktop applications on the Windows platform. Visual Studio 2013 itself is a WPF application", p 118), plus WF, WCF and .NET application development.
  • Chapter 5 covers the debugging enhancements.
  • Chapter 6 covers asynchrony in .NET.
  • Chapter 7 is C++. I skimmed this one, but the C++ developer should be thrilled, and it makes me want to learn DirectX.
  • Chapter 8 covers Team Foundation Server features from a user's perspective. Great features that many developers aren't utilizing that can very much help day-to-day. If you are managing a TFS server though, you'll want more than what is provided here.
  • Chapter 9 is on the languages TypeScript, Python and IronPython. TypeScript I knew and understand, but I have no idea about Python, and don't really want to, but it's good to know it's available if that's your flavour of choice.
  • Appendix contains recipes on Installers, Submitting Windows Store apps, Visual Studio Add-ins and extensions, and creating your own snippets.

The "There's more…" sections littered throughout are valuable on their own, revealing many power user features I was unaware of. The recipe format of the book makes it a great reference to have on hand. Going to the relevant recipe within the book will be a faster and more reliable process than going online.

I intended to flick through this book quickly and dive in on a couple of chapters. I could not help going through it all, however. The capabilities we have available to us, just from Visual Studio, are amazing. Going over it all I can't help but think of all the possibilities and opportunities. Now I must get back to coding.

If you want to get a hold of this book you can get it directly from the publisher, Amazon and many retailers.


Need Help: TFS tbl_Content Table and Database growth out of control

Recently our TFS database size peaked at over 570 GB. Granted, we do have a lot of people working against it and use it fully for source control, work items and builds. We used to have this problem, with tens of GB being added each week. The cause then, on TFS 2010, was test attachments, and a run of the Test Attachment Cleaner would clean it up. Kinda. We found after a while that although the tables were reporting smaller, we needed two SQL Server hotfixes to allow the space to actually be freed. After that though, hundreds of GB flowed free and things were good. These details are covered well in a post by Anutthara Bharadwaj.

We continued running the tool and then upgraded to TFS 2012 and were told (TFS 2010 SP1 Hotfix) the problem had now gone away. We stopped running the test attachment cleaner and later upgraded to where we are now on TFS 2013.


This year our system administrator noticed we were running out of space again. However, looking at the Disk Usage by Top Tables report, the tbl_Attachment table was not the problem; it was the tbl_Content table.


Grant Holliday's post tells us that the Content table holds the versioned files. In the forums there is this:

“If you have binary files, the deltafication of the files will add size to the table. For example, you might have 15 binary files and 1000 changes to the files – all that data needs to be stored somewhere.”

This got me to check my source out into a clean workspace and run Space Sniffer against it to spot whether anything big had been added. Our entire source is about 50 GB, so the total size isn't too far off. But Main is only 820 MB and the whole team is working on there. We have been doing lots of work, and compared to 4 months ago, when it was 730 MB, that's not much growth. We have many branches, but each should be less than a GB, and being stored as deltas they should add next to nothing. Checking the tbl_Content table itself showed that the biggest rows were years old and no new large binaries had been added.

I then came across a comprehensive post by Terje Sandstrom. It also contained some queries for TFS 2012 to determine the attachment content size. Here's where it doesn't make sense: the table size from the disk usage report does not match up, whatsoever, with what these queries returned. And the query from Grant Holliday showing monthly usage again shows huge amounts of data (60 GB per month) going into a table that is 680 MB. Which is correct, the SQL Server reports or the table queries?


I then ran the Test Attachment Cleaner in preview mode, and sure enough it said it would clean up GBs of files. So I ran a delete against the TFS database. While the cleaner was running, the queries showed the size of those tables dropping, and the Disk Usage report showed drops in the tbl_Attachment table, albeit at a much, much smaller scale. The total database size however was unchanged and the available space was getting less! On completion it said: Total size of attachments: 206030.5 MB!


The size reported by the database properties when I began was:

[screenshot]

After cleaning apparently 200GB it is now:

[screenshot]

Another suggestion was to delete old workspaces, which we have done. To my surprise this released about 5 GB from the content table. Git TFS can create a lot of workspaces.
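
For anyone wanting to do the same, old workspaces can be listed and removed from the command line. A rough sketch (the workspace and owner are placeholders; substitute what the listing shows you):

tf workspaces /owner:* /computer:* /format:detailed /collection:http://tfs:8080/tfs/DefaultCollection
tf workspace /delete "StaleWorkspace;DOMAIN\formeruser" /collection:http://tfs:8080/tfs/DefaultCollection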


Hence the problem. We are growing by up to about 5 GB per day. Our system administrator is doing an awesome job keeping up, but we need to know whether this growth is expected to continue, or whether there is guidance on how we can use less space.

Update 3 April 2014: I have put the question on the forums.

Update 4 April 2014: Something has happened overnight. I'm assuming the workspace clean caused it. We now have 115 GB of space available! What's odd though is that the tbl_Content size has dropped by that amount. What does that table have to do with workspaces? Some insight into how this works, so we can manage our systems, would be appreciated.


Update 7 April 2014: More space has flowed free over the weekend without doing anything else. A shrink is in progress. Pretty crazy that 34% of the content size was local workspaces?


Update 8 April 2014: Shrink complete. Looking much better now. Notice however that the tbl_Content table in a day has gone up a few gigabytes, which is quite concerning.


Update 11 Apr 2014: By the end of the week we have consumed around 5 GB in 3 days. I don't see any significant number of new workspaces created either, unless the build server running as TFS Service is creating them. I'm going to clean up TFS Service's workspaces, at the risk of breaking some builds, so that I can monitor them carefully. I now have 51 server workspaces for TFS Service, which does seem like a lot, but we do have 18 build agents and many build definitions.


Update 14 Apr 2014: I don't know what happened on Friday and over the weekend. The content has grown 2 GB but the database expanded a massive 60 GB! Even though the autogrowth is set to 1%. So a shrink is in progress.


Update 14 Apr 2014 #2: After the shrink it is back to the 2 GB growth for Friday and the weekend.


Update 16 Apr 2014: A couple of days' growth.


Update 22 Apr 2014: Over the long weekend it has done exactly the same thing: expanded by another 60 GB. I'm not going to shrink this time. There are now 54 server workspaces for TFS Service, only 3 more than 11 days ago, so nothing extreme there. The data in tbl_Content has grown from 282,901,304 KB on 7 April to 301,318,640 KB; 17.5 gigabytes. Taking out weekends and public holidays, that is almost 2 GB a day.


Update 12 May 2014: Usage has gone up consistently over the last couple of weeks. We have been shrinking regularly as it keeps expanding inappropriately. Here’s the current state:

[screenshots]

I then disabled the CodeIndex and ran the delete commands as Anthony Diaz suggested in the comments:

[screenshots]

It has removed about 500,000 records from the content table but only about 2 GB of data. We’ll see how it does for daily growth.
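
For reference, the code index is controlled through TFSConfig; if I have the syntax right, turning it off looks roughly like this, with your own collection name substituted (there is also a /destroyCodeIndex switch to remove what has already been indexed):

TFSConfig CodeIndex /setIndexing:off /collectionName:"DefaultCollection"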


Web Test playback does not use concurrent requests as recorded

I have been running performance tests using Visual Studio web tests and load tests. The goal has been to prove the raw load time and the scalability under increasing users. The web test recorder is quite good for simple record-replay. However, beware that a bunch of work is required to parse previous responses for keys to be passed back in subsequent requests, especially for more complex web applications, though not as much on simple web sites.

Here are the results for the single-user playback, with requests grouped into the relevant transactions that we needed to measure.

Web Test results

The odd thing was that when we timed the Open Report action manually with a stopwatch in the browser, we were typically getting under 5 seconds for completion. The reason for the difference became clear when watching the web test run: each request goes sequentially. The actual behaviour in the browser is not like that. Below is the timings column from the browser's network monitoring. The two highlighted rows are the requests that take the longest, but they run together.

Browser Timings

Adding up these requests sequentially gets a total time similar to what the web test replay presented. So to make the requests parallel in the web test, you need to set them as dependent requests of the first. Right-click the request and select Add Dependent Request.


This will add an http://localhost/ entry. Delete that request after you have dragged and dropped the requests that you want to run concurrently into the Dependent Requests folder.

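If you prefer to see what this produces, converting the recorded test to a coded web test shows the same structure. A rough sketch (the URLs are placeholders, not our real application):

using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;

public class OpenReportWebTest : WebTest
{
   public override IEnumerator<WebTestRequest> GetRequestEnumerator()
   {
      WebTestRequest openReport = new WebTestRequest("http://localhost/report/open");
      openReport.ThinkTime = 0;   // no artificial pause before the next request
      openReport.Cache = true;    // honour cache headers as a browser would

      // Dependent requests are issued alongside their parent request, the way a
      // browser fetches page resources, rather than strictly one after another.
      openReport.DependentRequests.Add(new WebTestRequest("http://localhost/report/data1"));
      openReport.DependentRequests.Add(new WebTestRequest("http://localhost/report/data2"));

      yield return openReport;
   }
}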

Rerunning the web test after this change resulted in timings much like what we experienced manually.

Improved Web Test results

Other things to look for: make sure think times are turned off (or set to 0 for all requests) and turn on cache control for all requests.

Update: So when does the web test recorder get the dependent requests right? Taking into account the Initiator column in the Network tab of the browser developer tools made it clear:

Initiator

These link and script tags are by default dependent requests for the page.


These XMLHttpRequest JavaScript calls are not seen by the web test recording as running in parallel and need to be manually put in as dependent requests, as above.
