There seems to be an issue installing Workflow Manager 1.0 Refresh on Windows Server 2012 R2. Upon completion, when clicking through to configure Workflow Manager, you are informed that neither Service Bus 1.0 nor (obviously) CU1 for Service Bus 1.0 has been installed.
Digging into the event log on the machine in question shows that VC Redist 11.0 or greater is required, and this is not installed automatically by WebPI.
On Windows Server 2012, VC redist 12.0 is installed automatically by WebPI and the installation of Workflow Manager 1.0 Refresh completes successfully.
The solution, obviously, is to install VC Redist 11.0 or 12.0 before attempting to install Workflow Manager 1.0 Refresh on Windows Server 2012 R2.
Back in January I did a post, "How long is my TFS 2010 to 2013 upgrade going to take?". I have now done some more work with one of the clients and have more data. Specifically, the initial trial was 2010 > 2013 RTM on a single-tier test VM; we have now done a test upgrade from 2010 > 2013.2 on the same VM, and also one to a production-quality dual-tier system.
The key lessons are:
- There are 150 more steps to go from 2013 RTM to 2013.2, so it takes a good deal longer.
- The dual-tier production hardware is nearly twice as fast at doing the upgrade, though the initial step (step 31, moving the source code) is not that much faster; it is the steps after this that gain. We put it down to far better SQL throughput.
DDD North is coming to the University of Leeds on Saturday 18 October.
It is now open for session submissions.
The ALM Rangers are again producing a list of useful tools and widgets for TFS. It can be found at aka.ms/widgets and should be updated regularly.
The output window in Visual Studio shows lots of useful information, but it doesn't timestamp any of it.
SharePoint deployments take ages, and I’m quite often distracted and forget whether I have triggered the deploy or not.
I want to quickly check the output window and see when I last triggered the deploy. I found two solutions:
1) Add a pre-build (pre-deployment) step.
Open project properties and add the following command:
ECHO ------ %TIME% ------
Now the output window displays a timestamp line every time a build (and therefore a deploy) starts.
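With the echo in place, the start of each build in the output window looks something like this (the timestamp and project name are illustrative, and the exact layout varies by Visual Studio version):

```
------ 14:32:07.15 ------
------ Build started: Project: MyIntranet.SharePoint, Configuration: Debug Any CPU ------
------ Deploy started: Project: MyIntranet.SharePoint, Configuration: Debug Any CPU ------
```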
2) Increase the MSBuild diagnostic level in options
By default Visual Studio shows ‘Minimal’ diagnostic information when compiling a project. Increasing this level will show more details, including timestamps at the start and end of the build, plus the elapsed build time.
- Click Tools –> Options
- Under Projects and Solutions –> Build and Run
- Change the MSBuild project build output verbosity to ‘Normal’
I am currently involved in moving some TFVC-hosted source to a TFS Git repository. The first step was to clone the source for a team project from TFS using the command
git tf clone --deep http://tfsserver01:8080/tfs/defaultcollection ‘$My Project’ localrepo1
and it worked fine. However, the next project I tried to move had no space in the source path:
git tf clone --deep http://tfsserver01:8080/tfs/defaultcollection ‘$MyProject’ localrepo2
This gave the error
git-tf: A server path must be absolute.
It turns out the problem was the single quotes. Remove these and the command worked as expected:
git tf clone --deep http://tfsserver01:8080/tfs/defaultcollection $MyProject localrepo2
It seems you should only use quotes when there are spaces in the path name.
Updated 11 June – After a bit more thought I think I have tracked down the true cause. It is not actually the single quote, but the fact that the command line had been cut and pasted from Word. This meant the quote was a ‘ not a '. Cutting and pasting from Word can always lead to similar problems, but it is still a strange error message; I would have expected an invalid-character message.
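The failure mode is easy to reproduce outside git-tf. This is a minimal sketch (the path name is hypothetical) assuming a shell, such as bash, where straight single quotes are quoting characters: the shell strips straight quotes before the tool sees the argument, but a curly quote pasted from Word is just an ordinary character, so the path git-tf receives no longer starts with `$`:

```shell
# Straight quotes are quoting characters: the shell removes them,
# so the argument the tool sees is $MyProject.
path_typed='$MyProject'

# Curly quotes pasted from Word (U+2018/U+2019) are ordinary characters:
# they travel with the argument, so the path now starts with a quote mark.
path_from_word='‘$MyProject’'

# Mimic git-tf's check that a server path is absolute (starts with $).
for p in "$path_typed" "$path_from_word"; do
  case $p in
    '$'*) echo "absolute server path: $p" ;;
    *)    echo "A server path must be absolute: $p" ;;
  esac
done
```

The second iteration fails the check, which matches the error message git-tf produced, even though the path looks correct on screen.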
Microsoft UK have been running technical events around the UK for a couple of years now, and it’s a great thing. Too many events are focused in the south of England and there are lots of IT pros north of the M25!
Starting on Monday, the latest series of events kicks off. The People-Centric IT roadshow content is being delivered by MVPs from across the UK and Ireland. Covering hot topics like Bring Your Own Device (BYOD) and information security, the sessions will talk about using the appropriate tooling from across the Microsoft stack to address these real-world problems.
You should think about attending if you want to learn more about:
- Using System Center and Intune for managing desktops, operating systems and devices.
- Managing desktop and application delivery with VDI
- Using technologies like DirectAccess, Work Folders and Dynamic Access Control to give easy and secure access to data.
- Using Active Directory Federation Services as part of your identity portfolio.
As always, the more people support these roadshows, particularly outside London, the more likely Microsoft is to deliver similar events.
More information about each of the days can be found on the event pages, here:
11th June 2014 – Edinburgh
12th June 2014 – Sunderland
13th June 2014 – Birmingham
16th June 2014 – Bristol
17th June 2014 – Reading
The Sunderland event is being supported by our good friends at NEBytes, too!
My X220 is a stalwart machine. It’s built like a tank and can be upgraded in a number of ways. Mine now has 16Gb of RAM and two SSDs which allow me to run multi-VM environments for development and demo. Unfortunately, however, there is no USB 3 on the laptop. That’s a pain if I need to copy stuff on and off via USB, or run VMs from a USB 3 pod.
I’ve tried adding USB 3 before – I bought a StarTech ExpressCard 54 with two ports. That singularly failed to work – the card is detected but the system either fails to recognise connected devices, or sees them yet can’t access them.
Whilst at Build I was involved in a conversation with the Kinect product team. They said that when using the Kinect with Windows they had issues with USB 3, and it boiled down to chipset issues, where the manufacturer hadn’t implemented something quite in accordance with spec. This spurred me to look for another ExpressCard, carefully looking for a different chipset.
Enter, stage left, Targus. Not a name I’d associate with this kind of peripheral, but careful reading of their specs showed their card to use a totally different chipset from the StarTech device.
I ordered mine from Amazon. It arrived the next day, plugged in and simply worked. It’s an ExpressCard 34, so only one port. However, it puts out enough power to run my Western Digital USB 3 pod without needing to use the included cable to draw additional power from another USB port. I get the same transfer speed from the disk as my colleagues’ USB 3-equipped W540 laptops, so I really can’t argue.
The one thing I did add was a 34-54 adapter (from Startech, ironically) to plug the hole left in my ExpressCard 54 slot.
With the current move to ultrabooks I can see no real replacement device for my X220 – a sealed unit with no more than 8Gb and a single drive doesn’t come close to my needs, and a 15” luggable workstation is just too heavy. Hopefully I can keep tweaking the X220 for a while yet.
Richard and I spend a good deal of time talking about Lab Manager and our environments. I’ve written here before about our migration to the latest versions of the various components of Lab and both Richard and I have delivered sessions at user groups and conferences.
Richard was in Belgium last week for Techorama, after which he was asked about the specifics of our setup. Between us, we came up with a diagram of our Lab Environment and Richard recently posted that to his blog. Hopefully some of you will find it useful.
Let’s get the disclaimer out of the way first: what I’ve done is absolutely unsupported by Microsoft. Just because it works for me does not guarantee it will work for you, and I am not in any way recommending that you follow my lead!
I use a great many virtual machines for customer work, internal projects and just tinkering. My ThinkPad X220T is tricked out with extra RAM and two SSDs. Space is still an issue, though, and I can’t squeeze any more storage into my little workhorse.
Windows Server 2012 introduced Data Deduplication – a fantastic feature that is saving us huge amounts of disk space on our SCVMM library. I’d love to be able to use that on Windows 8.1. Sadly, Microsoft didn’t see fit to enable the feature there.
There are a good many people out there who thought like I do, and some of them decided to figure out how to get data deduplication working on Windows 8.1. I’m not going to repeat those instructions here – I will instead post a link to the best of the articles I read before taking the leap, that of Mike Bijl.
Having installed and enabled the Data Deduplication feature, I enabled dedupe on my D drive – a 500Gb Crucial M500 SSD. Note that you cannot dedupe your OS partition – you need separate OS and data volumes to get anywhere with this process. I started with about 12Gb of free space, the rest gobbled up by ISO files and VHDs of installed VMs. Those dedupe beautifully, and I now have 245Gb of free space.
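Mike’s article covers getting the feature installed; once it is in place, the per-volume steps are just the standard Windows Server dedupe cmdlets. A sketch, assuming the D: data volume above (run from an elevated PowerShell prompt):

```powershell
# Enable deduplication on the data volume (not the OS volume).
Enable-DedupVolume -Volume D:

# Kick off an optimisation job now rather than waiting for the schedule.
Start-DedupJob -Volume D: -Type Optimization

# Check progress and how much space has been reclaimed.
Get-DedupStatus -Volume D:
```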
Have I encountered any problems yet? No. All my VMs run fine. I have a scheduled dedupe job running at noon to keep things tidy that has given no problems so far.
It is important to reiterate Mike’s point, however: If you enable dedupe on a volume and reinstall Windows 8.1 you will not be able to access any data on the drive until you re-enable dedupe (or stick the volume in a Windows Server 2012 or 2012 R2 machine). I’m happy with that – it’s no big deal for me. I would not, however, allow any of our developers to do this on their workstations, for example.
Doing all this, however, has got me thinking… Homegroup support is missing from Server 2012 and 2012 R2. I wonder if the same process might be used to enable features in the opposite direction…?