Swapping from Nvidia RAID to Software mirroring on a SunFire x2100

Our SunFire x2100 application servers, running Windows 2003 R2, are configured with the Nvidia RAID system (built into the motherboard) to mirror the pair of SATA drives in each server. Two of these servers keep losing their mirrors, but a third, and a SunFire x2200, do not. When this happens Windows hangs, which is not good in a server.

We have had a support call open with Sun and swapped bits of hardware, but it keeps happening. During the support calls we discovered that the Nvidia RAID is not true hardware RAID but a ‘software trick’, so it is not a great step up from letting Windows do the mirroring itself.

To try to isolate whether we have a hardware problem (motherboard, disk, etc.) or a driver problem, we decided to move from Nvidia RAID to Windows software mirroring. This was not as hard as I had expected, even though Nvidia RAID needs a special driver to be loaded rather than the standard SATA support built into Windows.

So the process is:

  1. Boot to BIOS setup (F2)
  2. In the integrated peripherals, disable the Nvidia RAID
  3. Boot to Windows (may take a couple of reboots to get to a login prompt, not sure why this is)
  4. When logged in you should now see two drives, copies of each other
  5. In Administrative Tools | Computer Management, select Disk Management and convert the two drives from basic to dynamic disks (this needs a reboot or two)
  6. When this is complete, go back into Disk Management and delete the partition on the second drive
  7. Right-click the partition on the primary drive and add the second disk as a mirror

And now you should be done, at least once the resync completes, which took about four hours for our 500GB disks, which are only about 10-20% full.
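For reference, steps 5 to 7 can also be done from the command line with diskpart instead of the Disk Management GUI. This is only a sketch: it assumes disk 0 is the boot disk, disk 1 is the second disk, C: is the volume to keep and E: is the stale copy on the second disk, so check your own layout with list disk and list volume first, and expect the same reboots as with the GUI.

rem Run these in a diskpart session (Start | Run | diskpart)
rem Assumes disk 0 = boot disk, disk 1 = second disk, C: = volume to mirror, E: = old copy
select disk 0
convert dynamic
select disk 1
convert dynamic
rem (reboot here if prompted, then continue)
rem remove the stale volume on the second disk so its space is free
select volume e
delete volume
rem add the second disk as a mirror of the C: volume
select volume c
add disk=1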

It goes without saying: make sure you have a backup before you start.

Let’s see if this fixes our problems.

Microsoft SQL 2005 starts then stops with 3414 error

I recently had one of our Windows 2003 servers lose its disk mirrors and lock up. When it was restarted it had two (virtually identical) drives, C: and E:. It booted off the primary mirror disk (C:) and all seemed OK except SQL.

I also tried booting off the secondary mirror (E:), but this would not boot (this drive, it turns out, had some bad blocks).

So I went back to the primary disk. The actual problem was that SQL Server started but then stopped after a few seconds, and the Windows event log showed the unhelpful 3414 error. I googled for this, but all that was mentioned were issues with DTC, which was not relevant as we do not use distributed transactions. There was nothing else on the web of note.

I had a look at the MSSQL.1\LOG directory and the SQL error logs there showed problems loading the various databases. So it seems that when the disks de-mirrored SQL was writing transaction logs, and they ended up corrupted. So in my case a generic 3414 error in the event log meant corrupt transactions that could not be rolled forward or back.
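If you hit the same thing, a quick way to see which databases failed recovery is to query sys.databases from the command line (the server name below is just a placeholder):

rem Databases with damaged logs typically show up as RECOVERY_PENDING or SUSPECT
sqlcmd -S myserver -E -Q "SELECT name, state_desc FROM sys.databases"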

More in hope than expectation, I copied the SQL data files and logs back from the faulty secondary drive (E:), restarted SQL, and this worked: SQL started without a problem! I was lucky the bad blocks were not near the SQL files. This saved me from having to rebuild the server and restore backups, especially as some of the DBs belonged to SharePoint, and a SharePoint SQL restore is rarely fun!

When installing Cassini why do I always forget this?

If you install the Cassini Personal Server on a PC you will often get the error “Cassini managed web server failed to start listening to port 80. Possible conflict with another web server on the same port.”

You of course think this is a firewall, another web server, or an anti-virus port blocker problem.

IT IS NOT!

OK, it might be those problems as well, but usually it is that you need to run

gacutil /i c:\cassini\cassini.dll

or just drag a copy of cassini.dll into the GAC (C:\Windows\Assembly).
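To check the assembly actually made it into the GAC, you can list it back out (assuming it is registered under the name Cassini):

gacutil /l Cassini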

Shame the installer does not do this.

Moving Community Server

Today I moved this blog from an old server to our nice shiny new ones. This meant splitting it, with the DB going to the dedicated SQL server and the front end to the new web server box. This caused a few problems.

The actual move was fine: just backup and restore the DB and copy over the ASP.NET web content. I then edited the web.config to point at the new server and hit some problems, some expected, some not.

  • First I altered the customErrors block to report full errors
  • In the SiteSqlServer setting I altered the server name to point at the new server; until this was done I got, unsurprisingly, a ‘server not found’ error.
  • Once the server name was right I got a ‘MyDomain\MyServerName$ could not connect’ error, so I created this user, giving it the rights listed in the database readme file found in the scripts subdirectory.
  • However this did not work; I then got a CS-generated form titled ‘Critical Error: Data Store Unavailable’ that told me to edit the entries I had just edited!

After much digging about I found the answer in the CS forums: you also have to give the user the aspnet_* roles in the CommunityServer database.
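In case it helps, this is roughly the sort of thing that fixed it from the command line. The server, database and account names are the placeholders used above, and the exact aspnet_* role names should be checked against your own CommunityServer database before adding the account to them:

rem List the aspnet_* roles present in the CommunityServer database
sqlcmd -S mysqlserver -E -d CommunityServer -Q "SELECT name FROM sys.database_principals WHERE type = 'R' AND name LIKE 'aspnet[_]%'"
rem Add the web server's machine account to each role it needs, for example:
sqlcmd -S mysqlserver -E -d CommunityServer -Q "EXEC sp_addrolemember N'aspnet_Membership_FullAccess', N'MyDomain\MyServerName$'"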

Hope this saves someone some time.

Let’s just rename that Team Foundation Server…

I have previously posted on the fun I had getting TFS running in our office. Well thus far it has been stable, other than some ‘user too stupid’ errors, and we have been fairly happy.

The next stage was to expose the TFS server out through our firewall to allow home working. This turned out to not be too bad (expect some posts on our experiences with ISA server soon) but raised an interesting issue.

As far as the Visual Studio Team clients were concerned, the TFS server had the physical PC server name in all its URLs, e.g. http://myserver:8080. This was not an issue within the office as the name could be resolved, but it was a problem on the Internet. Now I suppose we could have put in some hosts file entries to address this, but I really wanted to get it working as http://tfs.mydomain.co.uk:8080.

So we created a new DNS entry (resolving to the correct IP address both internally and externally). Once this DNS entry was created and the firewall set up, I could connect to the TFS server, pull down a project list, and check files in and out from home. But I was getting those damned little red crosses next to the documents and reports, and could not open the project SharePoint sites.

On checking the URLs used for these services, I saw that they were all still pointing to http://myserver with various SharePoint or Reporting Services directories on the end. Firstly, I had not exposed the default SharePoint and Reporting Services ports through the firewall, but that was easy to fix. The real problem was the use of the old name: how to change these entries? I think the best option would have been to install TFS with the full name in the first place! But I did not really want to do a reinstall.

So I had a search around and found that in the C:\Documents and Settings\[name]\Local Settings\Application Data\Microsoft\Team Foundation\1.0\Cache\[GUID] directory there is an XML file, RegProxyFileCache.xml. This contains the details used by the client and can be edited. I replaced the http://myserver entries with http://tfs.mydomain.co.uk. A snippet is shown below, from around line 250 of the file:

<RegistrationEntry>
  <Type>Reports</Type>
  <ChangeType>NoChange</ChangeType>
  <ServiceInterfaces>
    <ServiceInterface>
      <Name>BaseReportsUrl</Name>
      <Url>http://tfs.mydomain.co.uk/Reports</Url>
    </ServiceInterface>
    <ServiceInterface>
      <Name>DataSourceServer</Name>
      <Url>myserver</Url>
    </ServiceInterface>
    <ServiceInterface>
      <Name>ReportsService</Name>
      <Url>http://tfs.mydomain.co.uk/ReportServer/ReportService.asmx</Url>
    </ServiceInterface>
  </ServiceInterfaces>
  <Databases />
  <EventTypes />
  <RegistrationExtendedAttributes />
  <ArtifactTypes />
</RegistrationEntry>
<RegistrationEntry>
  <Type>Wss</Type>
  <ChangeType>NoChange</ChangeType>
  <ServiceInterfaces>
    <ServiceInterface>
      <Name>BaseServerUrl</Name>
      <Url>http://tfs.mydomain.co.uk</Url>
    </ServiceInterface>
    <ServiceInterface>
      <Name>BaseSiteUnc</Name>
      <Url>\\myserver\sites</Url>
    </ServiceInterface>
    <ServiceInterface>
      <Name>BaseSiteUrl</Name>
      <Url>http://tfs.mydomain.co.uk/sites</Url>
    </ServiceInterface>
    <ServiceInterface>
      <Name>WssAdminService</Name>
      <Url>http://myserver:17012/_vti_adm/admin.asmx</Url>
    </ServiceInterface>
  </ServiceInterfaces>
  <Databases />
  <EventTypes />
  <RegistrationExtendedAttributes />
  <ArtifactTypes />
</RegistrationEntry>

After this change is made, reloading Visual Studio makes the red crosses go away and the various features work.

However this does not answer the larger question of getting it set right in the first place for new clients; you don’t really want to have to edit each client’s config. I suspect making similar edits in C:\program files\microsoft visual studio 2005 team foundation server\tf Setup\eleadservices.xml might do the trick, but I have not confirmed this as yet.

I still hold with the comment that TFS is very much an ‘install it right first time’ sort of product.

Update on Virtual Server

Yesterday I posted about problems with remotely accessing the Virtual PCs from the Virtual Server console. It turns out the problem was domain name related: we had a different name on the internal DNS to that on the external side.

In effect the server was trying to call:

vmrc://virtualserver.mydomain.co.uk:5900/my pc

when we should have been calling

vmrc://virtualserver-external.mydomain.co.uk:5900/my pc

even though we had actually accessed the system via

http://virtualserver-external.mydomain.co.uk

This has been fixed by getting all our DNS entries in line.

Accessing Virtual Server via an ISA server

We have been moving over to ISA Server to allow better management of Internet resources. One problem we have had is publishing our Virtual Server so our teleworkers can get remote access to the test systems.

The ISA application publishing rule works fine to allow access to the main Virtual Server console page (you can create, start, and stop PCs, etc.), but if you click on a Virtual PC you get a ‘no connection’ screen, so you cannot actually use the Virtual PC.

However I found a way round this problem. When you get the ‘no connection’ screen, click the remote connect option at the top right and you can enter a URL in the form

vmrc://virtualserver.mydomain.co.uk:5900/my pc

and assuming you have allowed port 5900 through the ISA firewall it should work.

So it looks like the problem is that when you click the thumbnail of the PC to access it, some URL of a different form to the one shown above is used. I will try to find out why this is.

Look out for more posts on ISA on BM Bloggers as we get our teeth into it.