SCVMM 2008 Beta and non-admin access to remote machines – further information

Following my last blog post regarding SCVMM 2008 Beta and the issues I was seeing with non-admin access to remote machines via Hyper-V manager, I thought it would be beneficial to forward my query to the team concerned via Connect. Here’s the answer I got:

“What you are seeing is expected behaviour. When you add a Hyper-V host in SCVMM the Initialstore.xml file is no longer used for Hyper-V security. Instead SCVMM creates a new XML file and modifies it based on the user and admin roles that apply to that host in the SCVMM. That means that the step where you ran Azman and updated the Initialstore.xml file is lost. There is not a good workaround for this issue. The only thing that could be done is to add the user that needs access as a delegated administrator in SCVMM (with the right to administrator this specific host). Then SCVMM will update the XML file it uses with the correct info. Note that if you edit that file manually those changes will be lost when SCVMM refreshes it. It is called Hypervauthstore.xml.”

This is useful insofar as it does give me a way around the problem I was describing. It does, however, raise another issue: I don’t believe there is enough granularity in the delegated administrator role mentioned. I can only assign a host to a delegated administrator, not an individual guest. While I can limit which virtual machines a delegated administrator can log onto via user accounts, it would generate a lower administrative overhead if I could limit the machines that a delegated administrator can connect to (say, in the same way that TS Gateway works with RAPs and CAPs).

I’ll feed this suggestion back to the team via Connect.

Installation of SCVMM 2008 beta disables non-admin access to remote machines via Hyper-V manager

Yesterday I finally got around to installing SCVMM 2008 beta onto a virtual machine (mainly to help us with some virtual machine migrations we’ve got coming up).  I must say that I think SCVMM 2008 beta is very nice indeed!

On my Vista machine I use Tore Lervik’s Hyper-V Monitor Gadget for Windows Sidebar, and have done for some time.  With the number of virtual machines we run, I have found it an invaluable addition to my sidebar.

This morning however, when I tried to connect to one of the virtual machines listed by the gadget, I got an error message ‘An error occurred trying to find the virtual machine <GUID> on the server <servername>’.  In addition, when I tried to use Hyper-V manager, I received the error ‘The virtual machine management service is not available’.

We thought for a while that it was related to remote rights (WMI/DCOM) on the servers in question (well, technically it is…) and I spent a while trawling through John Howard’s articles relating to the required rights for remote management (well worth a read by the way).  Unfortunately even working through the articles didn’t solve my problem.

After a little more rummaging, it turns out that installation of the SCVMM agent onto the servers hosting the virtual machines I want to remotely manage is what is causing the problem.  Anyone who is a local admin on the servers in question can freely manage the remote virtual machines; if you’re not a local admin, you can’t.  There are two potential solutions to the problem:

  1. Uninstall the SCVMM agent from the servers in question (which would no longer allow us to manage them from SCVMM)
  2. Make anyone who needs to remotely manage virtual machines a local administrator on the servers in question

Let’s be honest, neither option is entirely appealing (it’s not that we don’t trust the people who need to remotely manage specific machines; I would just always prefer to work from a ‘minimum rights necessary’ point of view), but as we have some migrations coming soon for which SCVMM is going to be a real help, we’ve gone for the latter.

I hope that this is something that is corrected in the RTM version of SCVMM 2008!

Enigma, Bletchley Park and the Battle of the Atlantic

I attended a very interesting BCS talk last night hosted by the West Yorkshire Branch about Enigma, Bletchley Park and the Battle of the Atlantic.

Dr Mark Baldwin is a superb speaker; he spoke for two hours, without any notes, about the Enigma machine itself, the decoding efforts started by the Poles in the early 1930s, subsequent wartime efforts to break the codes, the machines used to aid in this process, the effects that code breaking had on the Battle of the Atlantic, and Bletchley Park itself. At the end of the talk, there was also the opportunity to examine a rare 4-rotor Enigma machine that Dr Baldwin had brought with him.

I was particularly intrigued to hear that the Germans thought that the sheer number of possible combinations that the Enigma machine allowed for (3 × 10¹¹⁴, a number significantly larger than the number of atoms in the observable universe!) precluded anyone being able to decode their messages; an assumption that persisted until many years after the war. The rotors used with the Enigma machine were also not rewired at any point during the war. In addition, because they assumed that nobody could read the messages produced by such a system, they made very little effort to break the codes produced by our Typex system!

I was also saddened to see the state that Bletchley Park is now in. Many of the huts where so much incredibly important work was carried out are in a very poor state, and some have already been destroyed. Bletchley Park receives no external funding and has been deemed ineligible for Heritage Lottery funding. I would urge you to sign the petition located at http://petitions.number10.gov.uk/BletchleyPark/ in the hope that the government will do something to help save this crucial piece of British history.

On a happier note, Robert mentioned to me that Black Marble does sponsor Bletchley Park! I look forward to being able to visit in the near future.

SharePoint Federated Search

One of the new features introduced with the recent SharePoint Infrastructure Update is the set of Enterprise Search features that shipped with Search Server 2008 and Search Server 2008 Express, but were not included in the original release of SharePoint 2007.  This includes some core Search platform performance updates, a unified administration dashboard and Federated Search.  The last of these is the one I’d like to discuss briefly here.

SharePoint has always had the facility to index external sources of data (e.g. an external web site); the new Federated Search capability of SharePoint 2007, however, also allows you to include sites that use the OpenSearch 1.0/1.1 standard.  The Federated Results web part allows you to display the results in a separate section of your search results page.
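The core of the OpenSearch standard is a URL template containing a {searchTerms} token, which the search front end substitutes with the user’s (URL-encoded) query. As a minimal sketch of that mechanism (the endpoint URL below is a made-up example, not a real Federated Location):

```python
# Sketch of how an OpenSearch URL template is resolved into a query URL.
# The template string below is hypothetical; real Federated Locations
# supply their own templates.
from urllib.parse import quote


def build_opensearch_url(template: str, query: str) -> str:
    """Substitute the user's query into an OpenSearch URL template."""
    return template.replace("{searchTerms}", quote(query))


template = "https://example.com/search?q={searchTerms}&format=rss"
print(build_opensearch_url(template, "federated search"))
# prints: https://example.com/search?q=federated%20search&format=rss
```

The search results page then renders whatever RSS/Atom results the resolved URL returns into the Federated Results web part.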

To make use of the Federated Search capabilities, you need to add Federated Locations, then add the Federated Search web part to the search results page and modify the settings of the web part.  The web part is typically added to the SharePoint search results page to provide extra sections at the right-hand side of the page showing results from the Federated Locations.

I won’t go through the steps to set up a Federated Location here, as there is a Microsoft video which describes the process very well.  Note however that you can modify the look and feel of the returned results from within the Add Federated Location page by modifying the XML shown on the page and that you can restrict the use of the Federated Location you are setting up to specific sites within your farm.  You can also pass credentials to the Federated Location if you need to do so.

Microsoft have also provided an online gallery of pre-configured Federated Search Connectors (Locations) at http://go.microsoft.com/fwlink/?LinkID=95798.  There is a link to this gallery at the top of the Manage Federated Locations page within the Search Administration area of the SSP on your farm. The following Federated Locations are installed by default with the Infrastructure Update:

  • Internet Search Results (Live Search)
  • Internet Search Suggestions (Live Search)
  • Local Search (unscoped local search index)

The following Federated Locations are available (as at 1st August 2008) from the online gallery:

  • Live.com News
  • Yahoo News
  • Wired
  • The Register
  • MSDN
  • Technet
  • Wikipedia
  • Encyclopedia Britannica
  • Yahoo
  • Flickr
  • Yahoo Images
  • YouTube
  • PodScope
  • Technorati
  • Google Blog Search

These are available for download and import directly from the online gallery; just follow the instructions at the bottom of the page!

Once you have set up the Federated Locations you wish to use, the Federated Results web part needs to be added to your search results page.  Navigate to your search results page and add the Federated Results web part to a web part zone on the page as you would any other web part. To change the Federated Locations the web part uses, modify the shared web part and drop down the list box at the top of the settings; this contains the items in your Federated Locations list.

Using these settings, you can also modify the number of results that are returned, the number of characters displayed in the summary and URL, whether to return results asynchronously, whether to show the loading image (a miniature version of the standard SharePoint ‘processing’ animation) and so on.  As usual, you can also target the web part via the Audience controls.

We use the TechNet and MSDN Locations provided on the online gallery, and very useful we find them too!

Deployment of Digital Signature ActiveX control from within SharePoint fails

I’ve recently been investigating a rather curious problem we’ve been experiencing with deployment of the Digital Signature ActiveX Control from SharePoint 2007 to browsers.

First a little background: A number of workflows we’ve developed recently make use of the capability of InfoPath forms to incorporate sections that can be digitally signed. In general these forms are developed in InfoPath 2007, and are served from a MOSS 2007 farm into a browser window for the user. The forms are developed on a PC which has UK-English language and locale settings and are served from installations of MOSS 2007 which are typically configured for UK-English as well.

The first time a user sees a form with the capability to be digitally signed displayed in their browser window, assuming the Digital Signature Control is not already installed on their computer, they are asked whether they would like to install it:

[Screenshot: the ‘install Digital Signature Control’ prompt]

Clicking ‘Yes’ opens another window which presents them with a license agreement to read through and agree to. Checking the ‘I agree…’ checkbox and clicking the ‘Next’ button:

[Screenshot: the license agreement window]

… results in nothing further happening. All rather curious.

After the usual investigation for such things (check the event log: nothing; check the MOSS logs: again nothing; turn on trace logging: again nothing), followed by discussion with the SharePoint support team and some Fiddler traces, we worked out that the Digital Signature Control installer (DSigRes.cab) could not be found on the SharePoint farm at the expected location of _layouts/2057/DSigRes.cab (2057 being the LCID for UK English). Indeed, when we looked for C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\12\TEMPLATE\LAYOUTS\2057, the folder didn’t exist!

We tried installing the MOSS 2007 language pack, which made no difference.

As a workaround, I created the 2057 folder under _layouts and copied the DSigRes.cab file from the 1033 (LCID for US English) folder into it. This appeared to cure the problem and the Control installed successfully.

At this point, the issue was escalated to the US InfoPath team. This morning the InfoPath team came back with their confirmation that this is indeed a bug with InfoPath and that it is not limited to UK-English, but at this time they do not have a complete list of the languages that it does affect.

They also confirmed that the workaround listed above is a reasonable solution at the moment.

When I hear that a Microsoft solution to this issue has been provided, I’ll let you know!

Aspnet_regiis -ga domain\user results in ‘an attempt was made to reference a token that does not exist.’

Today I had to run aspnet_regiis -ga domain\user to grant a specific domain user account access to the IIS metabase and other directories that are used by IIS.  The server in question is running Windows Server 2003 R2 x64 edition and after running aspnet_regiis from the C:\Windows\Microsoft.NET\Framework\v2.0.50727 directory, I received an error that read:

‘An error has occurred: 0x000703f0 An attempt was made to reference a token that does not exist.’

After a quick look at the server, I noticed that the installed .NET 2.0 Framework was actually the x64 version rather than the x86 (i.e. 32-bit) version that I had assumed.  When I ran aspnet_regiis from the C:\Windows\Microsoft.NET\Framework64\v2.0.50727 directory instead, the command completed successfully.
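As a sketch, the difference between the two invocations looks like this (DOMAIN\ServiceAccount is a placeholder for the account being granted access; adjust the framework version folder to match your installation):

```bat
REM 32-bit framework path - fails with the token error when the
REM installed .NET 2.0 Framework on an x64 server is the x64 version:
C:\Windows\Microsoft.NET\Framework\v2.0.50727\aspnet_regiis -ga DOMAIN\ServiceAccount

REM 64-bit framework path - completes successfully:
C:\Windows\Microsoft.NET\Framework64\v2.0.50727\aspnet_regiis -ga DOMAIN\ServiceAccount
```

In general, the Framework64 folder only exists on 64-bit installations, so its presence is a quick way to check which versions are available on a given server.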

Creating Custom Content Types for SharePoint using Visual Studio (or Notepad) and a few other tools…

There are a number of ways of generating Custom Content Types for SharePoint, some more difficult than others.  One of the easiest is to create your own using the SharePoint GUI via a web browser; however, while this is a nice ‘point and click’ approach, there is no easy way to extract the resultant custom columns and content types for use on another farm.

At the other end of the spectrum, you can hand code the required XML from scratch and generate the features for installation.  This has the advantage that you can re-use the features you generate, but the fairly obvious disadvantage of having to hand-craft XML for use in the features.

There are also a couple of ‘middle of the road’ approaches, one of which I’m going to comment on, and the other of which I’ll briefly discuss here.

The first ‘middle of the road’ approach uses the Codeplex SharePoint 2007 Content Type Viewer, details of the use of which can be found at http://dhunter-thinkingoutaloud.blogspot.com/2007/06/creating-content-types-with-content.html – I do however have a problem with the approach outlined and I’ll explain why:

The Content Types Viewer is a very useful utility that allows you to extract information about content types from a running instance of SharePoint, even those created via the GUI.  The utility provides you with the Fields (columns), the FieldRefs (for specifying in the content type) and the schema for the content type.

Using it in the way described in the above link, the user would copy the Fields data from the Content Types Viewer and generate a feature to install these fields, then create a content type feature containing the information from the FieldRefs.

The problem I have with this approach is that none of the information from the schema is used within the content type.  I’ll give an example of why this is bad:

  • Create a custom content type; it doesn’t matter what columns this contains as long as it contains at least one column that is a choice.  Fill in a number of choices for this column.
  • Create a list associated with your custom content type.
  • Extract the fields and content type using the Content Types Viewer as instructed.
  • Copy the fields and content type features to another farm, install and activate them.
  • Recreate your list using the GUI on your new farm and associate it with your content type.
  • Try to create a new entry in the list.
  • You should see that the choice column(s) contain no data!

The reason this happens is that the choice information you supplied for your custom content type is contained within the schema for that content type.
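For illustration, this is the sort of information that needs to be carried over into the Fields feature: a choice column’s CHOICES element from the schema, along the following lines (the GUID, names and choice values below are placeholders, not taken from a real farm):

```xml
<!-- Sketch of a choice column definition for a Fields feature.
     The CHOICES element is the schema information that a plain
     copy of the Fields data from the Content Types Viewer loses. -->
<Field ID="{11111111-1111-1111-1111-111111111111}"
       Name="ApprovalStatus"
       StaticName="ApprovalStatus"
       DisplayName="Approval Status"
       Type="Choice"
       SourceID="http://schemas.microsoft.com/sharepoint/v3"
       Group="Custom Columns">
  <CHOICES>
    <CHOICE>Pending</CHOICE>
    <CHOICE>Approved</CHOICE>
    <CHOICE>Rejected</CHOICE>
  </CHOICES>
</Field>
```

Without the CHOICES element, the column installs happily but presents an empty choice list on the new farm.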

To rectify the situation, do the following:

  • Save the list on your original farm as a template (.stp).
  • Copy this template to your new farm and upload it into the list template gallery.
  • Create a new instance of your list from the template.
  • Try to create a new entry in the list and your content type magically now contains the choice data you input on the original farm.

This happens because the schema information is contained within the list template file created on the original farm.  You can test this hypothesis by extracting the manifest.xml from the .stp file you created and looking at it – you should see your choices listed within <CHOICE></CHOICE> XML fields.

This works quite happily if you only ever want to transplant your custom content type/list combination to another farm; however, if anyone creates a list from scratch and associates it with the custom content type on the new farm, the list will not be able to function properly.

A better approach is to use the Content Types Viewer, but also include the required information from the schema.  While this involves a little more than ‘cut and paste’ from the Content Types Viewer, it does at least provide a complete content type for transplant to another farm.

Most of the information from the schema should be transplanted into the Fields feature, with the content type feature remaining the same.  A number of fields from the schema should not be transplanted as these cause errors when attempting to install/activate the feature.  These include:

  • Version
  • Customization
  • Aggregation

A few notes:

  • Generally you should use SourceID=http://schemas.microsoft.com/sharepoint/v3 in your fields feature.
  • There may be problems associated with referencing other fields from (for example) a calculated column, as these typically use a GUID to identify the column you wish to extract data from.  If you are referencing columns within the same content type, you can simply use the column name and dispense with the GUID.  If you wish to access information from another list, you have a problem, as the GUID for the list will be generated dynamically when the list is created.  In that case you need to create the lists you wish to reference first and extract the list IDs for use in your feature.  An alternative (in particular if you wish to wrap your features up into a SharePoint Solution) is to write a feature receiver that dynamically adds the appropriate columns with the correct GUIDs to your feature at activation.

I aim to discuss field and content type feature creation in more detail in a future post.