The blogs of Black Marble staff

Tech Ed EMEA IT: Day 3 - Server 2008 R2

We were in early today, looking forward to a session on SharePoint with Bill English. Sadly, that was cancelled, so Andy and I sat in on the Server 2008 R2 overview session presented by Iain McDonald. That was very interesting, and we learned a bit more about BranchCache. It doesn't look like it will replace WAN accelerators like Riverbed, because it doesn't appear to function at their low level. However, it does a similar thing at the file level. The client requests a file from the remote server, which instead replies with hashes. The client PC then requests the data matching those hashes from the local cache, improving performance. The cache itself is built on request, so it does not need to be pre-populated (which is good). I think WAN accelerators have nothing to fear from this, but for smaller organisations, or ones which aren't able to put the accelerators in (perhaps because their servers are hosted, for example), BranchCache looks like a very promising technology.
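The hash exchange described above can be sketched in a few lines. This is a purely illustrative Python sketch, not how BranchCache is actually implemented - the block size, hash algorithm and function names are all my own inventions - but it shows the idea: the server sends block hashes, and the client only pulls the blocks its local cache is missing.

```python
import hashlib

BLOCK_SIZE = 4  # tiny block size purely for illustration


def block_hashes(data: bytes) -> list[str]:
    """What the remote server sends back: a hash per content block."""
    return [hashlib.sha256(data[i:i + BLOCK_SIZE]).hexdigest()
            for i in range(0, len(data), BLOCK_SIZE)]


def fetch(remote_blocks: dict[str, bytes], cache: dict[str, bytes],
          hashes: list[str]) -> bytes:
    """Reassemble the file, preferring the local cache over the WAN."""
    out = b""
    for h in hashes:
        if h in cache:                 # cache hit: no WAN traffic needed
            out += cache[h]
        else:                          # cache miss: fetch and populate cache
            block = remote_blocks[h]
            cache[h] = block
            out += block
    return out
```

The second request for the same file is then served entirely from the local cache, which is why the cache never needs pre-populating.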

Something I saw and got excited about is DHCP failover. We don't suffer much from DHCP outages, but because the only way to sync up two DHCP servers is to export and import, it's very hard to build resilient services. DHCP failover should solve that, and it looks good.

Also, more on the .Net on Server Core front. The key 'takeaway' is that it is a subset of .Net 2, .Net 3 (WCF and WF, not WPF) and .Net 3.5 (WF additions and Linq). That makes sense - why include elements related to the GUI? However, 'subset' obviously means compatibility pitfalls, and I am still very interested to see where this goes.

We spoke to a few guys on the IIS stand yesterday about SharePoint on IIS7. I need to talk to the SharePoint guys about the same thing. The IIS chaps were optimistic that what I wanted to do would work, but no effort had yet been put into testing the scenario. As far as I am concerned, at the very least I want to be able to run my WFE servers as server core for security reasons. I'd really like to be able to deploy the app server roles to core as well, which falls in line with a single-purpose-server, virtualised strategy.

I'm writing this as I wait for the MED-V session to start. The brief intro to this given during the Windows 7 session made it sound exciting and I really hope to come away from this feeling energised. Whilst it's been a solid conference so far, there's not been much to give me a buzz - perhaps this is it. I'll take notes and try to post my thoughts later.

Going to conferences is worth it - well, the chats in the corridor certainly are.

I am down in Reading for the VBug conference where I am speaking on TFS tomorrow.

Whilst I was in the bar chatting to Roy Osherove from Typemock, the keynote speaker for the conference, he asked if I had looked at the SharePoint patterns and practices document that details using Typemock Isolator for unit testing in SharePoint.

On first look it seems very interesting; as usual at this point I just wonder how I missed the announcement of this document last month! Is it just me, or does everyone struggle to keep up with the blogs and sites you should read?

Update 7th Nov - 


TechEd EMEA IT: Day 2 - Threat Management Gateway

Andy and I are now in a TMG preview demo. This looks really interesting - we spoke to the guys at ATE last night and saw a few items that I hope to see now in more detail. TMG is ISA Server vNext - codenamed 'Nitrogen' and part of 'Stirling', the next wave of Forefront.

Stirling family members exchange information to allow a 'dynamic response' - triggering actions from different Forefront elements (client security, etc) based on alerts from other elements (e.g. the mail scanner). That looks really powerful.

New in TMG is web client protection - threat protection. Downloaded files are scanned for malware as they pass through. This blocks the download of malware and shows the user a message page. Finally - a way to save some users from themselves!

TMG can now also inspect SSL traffic! TMG encrypts the traffic between the client and itself using its own certificate, assuming the certificate from the actual site is valid. Notably, if you enable https inspection you can make TMG tell the users - warn them, if you like - that their 'secure' connection is being inspected. You can also exclude categories of sites from this inspection.

For large files, TMG will show the user a 'comforting' page informing them that the file has been downloaded by TMG and is being scanned for malware.

TMG inspects traffic and will try to detect if a download manager is being used. At that point the 'comforting' page won't be displayed. Interestingly, you can also block the download of encrypted zip files if you like - i.e. if TMG can't scan it, don't let it through.
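The 'if TMG can't scan it, don't let it through' policy for encrypted zips is easy to illustrate, because ZIP entries record encryption in bit 0 of their general-purpose flags. A hypothetical Python sketch of such a gateway check follows (the function names are mine, not TMG's, and this is a concept demo rather than anything TMG actually exposes):

```python
import io
import zipfile


def has_encrypted_entry(payload: bytes) -> bool:
    """Return True if any entry in the ZIP payload is encrypted.

    Bit 0 of an entry's general-purpose flag field marks it as encrypted,
    which means a malware scanner cannot look inside it.
    """
    with zipfile.ZipFile(io.BytesIO(payload)) as zf:
        return any(info.flag_bits & 0x1 for info in zf.infolist())


def allow_download(payload: bytes) -> bool:
    """Policy: block any archive the scanner cannot inspect."""
    return not has_encrypted_entry(payload)
```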

TMG can also now do URL filtering. This is category-based, so you can block categories of sites. The site lists can be acquired through an external service. You can override the https inspection for categories of sites as well - e.g. banking sites.

These are gathered into the heading of Web Access Policies, which cover URL filtering, https inspection and malware inspection.

Also interesting is the Intrusion Prevention System, which allows TMG to detect and block exploits for vulnerabilities even if the hotfix is not yet released (the SQL worm, for example). The demo of this was really cool, albeit in a geeky kind of a way. The exploit protection uses signatures which will be downloaded and deployed, and my understanding is that they are not limited to TMG.

The firewall can also now continue to run if the logging DB server goes away. TMG creates a log queue locally, continues to operate normally, and will update the DB when it comes back online. The log viewer also continues to work, albeit only accessing the local queued items.
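The queue-locally-and-drain behaviour described above is a common resilience pattern, and a minimal sketch makes it concrete. This is a hypothetical Python illustration (the class name, and using `ConnectionError` to signal a database outage, are my assumptions, not TMG internals):

```python
from collections import deque


class LogWriter:
    """Keep logging locally when the DB is down; drain when it returns."""

    def __init__(self, db):
        self.db = db            # any object with .write(entry); raises when down
        self.queue = deque()    # local queue used while the DB is offline

    def log(self, entry):
        self.queue.append(entry)
        self.flush()

    def flush(self):
        # Drain queued entries in order; stop (but keep them) if the DB
        # is still unreachable, so the caller carries on unaffected.
        while self.queue:
            try:
                self.db.write(self.queue[0])
            except ConnectionError:
                return
            self.queue.popleft()
```

A local log viewer would simply read `queue` directly, which matches the behaviour described: logging and viewing carry on, limited to the locally queued items, until the database reappears.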

This is all cool stuff. There's lots more too, but the things I've mentioned here are of use to everyone, whereas some of the other stuff covered is certainly less applicable to us at BM because of the way we work. Another solid-looking new product that I would recommend anybody look into, particularly if you're currently using ISA 2006.

TechEd EMEA IT: Day 2 - Windows 7 Feature Preview

So, the first session of the day was an extremely well-attended overview of Windows 7 features. When they talk about evolution rather than revolution with regard to Windows 7, I think that's accurate. It was very much about developing and extending the foundations of Vista.

A few things stuck out, however. An almost throwaway comment about DirectAccess requiring IPsec and IPv6 means that I must dig deeper, and that the technology, whilst cool, is almost totally useless to me, stuck behind two layers of NAT in a managed building. BranchCache was again mentioned with, again, no indication of how it works - more digging required.

Most pertinent to me, however, was the development of Bitlocker. I am typing this as I sit in the room waiting for the deep dive session on Bitlocker enhancements to start. The key new feature in Windows 7 is the ability to encrypt removable drives using Bitlocker. Interestingly, admins can also use policies to enforce encryption, at which point unencrypted drives become read only. Backwards compatibility ensures that 'Windows XP and Vista' can 'read' data from the drives. I'm guessing they can't write, and I'm also guessing (as it wasn't mentioned) that non-Windows systems need not apply.

That lack of cross-platform support (and now I'm talking about OSX and Linux) may anger some, but for our company's needs it's irrelevant. We already ensure no customer or sensitive data is copied onto removable storage, but being able to encrypt, and force the encryption of, all removable media attached to systems I own will help me guarantee that any data copied from our systems is stored securely.

NOTE: Having now been to the deeper dive on Bitlocker, the current build of Windows 7 has no downlevel support. I'm really hoping this will change prior to launch (the presenter was carefully non-committal, and probably rightly so at this stage). If it doesn't, the technology is a dead duck for us, as I can't guarantee being able to get all our machines up to Windows 7 in a reasonable timeframe.

Also of interest to me were the developments in deployment technologies. I will try to attend the appropriate sessions on these too - the ability to add new drivers to wim and vhd files offline (and post-sysprep) could be a big benefit to us in extending the life of our system images, particularly as we look towards more automated provisioning of virtual machines from vhd and wim files onto varied hardware (especially when I get my hands on hyper-v in Windows 7!).

Overall it was a very interesting session, albeit shallow. Windows 7 is exciting - not because it is new and cool, but almost precisely because it isn't. It is to Vista what Windows 2000 (and XP beyond it) was to NT4 - evolved, more stable, more trustworthy.

Barcelona Metro

I think that the Barcelona Metro is superb. So far over the last couple of days it’s been an extremely good, fast service that has got us around the city with no problems at all.

In addition, nobody seems to bat an eyelid when people take all sorts of things with them that you’d never expect to see at home. So far I’ve seen:

  • 2 guys carrying a mattress
  • a bloke on a BMX
  • a couple with two (well behaved) dogs not on leads

Tech Ed EMEA 2008 IT – Day 1 reflections

Today has been interesting. Rik and I started the day doing the sightseeing we had time for. The Gaudi cathedral had been particularly recommended, so with limited time at our disposal, that’s what we decided to see. We arrived at the gate just as it opened, and were in within a few minutes. The cathedral is very, very impressive, though there is an awful lot of construction work going on at the moment. It is an amazing structure, with a very impressive sense of light and space inside:


Following the trip to the cathedral, we headed back towards the convention centre to get lunch and to try to get into the main auditorium for the keynote early enough to get a good seat. I was glad that we made the effort as we managed to get seats near the front tucked off to one side. Here’s our view of the stage, and the auditorium once it had nearly filled:


The keynote by Brad Anderson was interesting, with a number of announcements and some very useful demos. I was particularly impressed with the drive towards virtualisation, and the available and forthcoming tools to help you manage the resulting data centre. There was a live migration demo using Server 2008 R2 which showed a virtual machine moving from one host to another with no interruption of service. In addition, Gemini was demonstrated: a self-service BI offering allowing anyone within the organisation to view and manipulate data from sources such as SQL Server. The most impressive part of the demonstration as far as I was concerned was the ease (and speed!) with which the data could be published to SharePoint for consumption within the business:


Also mentioned were items such as the Cross Platform Extensions for SCOM, allowing monitoring and management of non-Microsoft systems, and server Application Virtualisation, which separates the server OS from the server application so that each can be managed (and patched) separately - all very interesting! A number of announcements were also made; for example, the System Center Operations Manager 2007 R2 beta will be available for download at the end of November.

From there it was off to the first session, Planning and Operations Tools for SharePoint, which provided some useful pointers and allowed the possibility of some feedback to the managers of the solution accelerators programme.

After the sessions this afternoon, Rik and I spent some time wandering around the Ask The Expert area generally asking awkward questions of most of the people we could find.

All in all it’s been a very useful first day.

patterns & practices Acceptance Test Engineering Guidance

Robert blogged about the new beta release of the patterns & practices Acceptance Test Engineering Guidance document. I have had a chance to do a quick read now and I have to say I am impressed. If nothing else it gives a great comparative look at waterfall and agile methods for delivery, and a review of many types of acceptance testing.

As with many of the p&p documents it is not exhaustive in what it covers, but what it does give is an excellent and detailed starting point for you to make the decisions that are right for your project. It does not give all the answers, just most of the right questions.

Tech Ed EMEA IT 2008: Day 1 - Keynote

So, the keynote was interesting. Much of the content I had seen before, but there were some demos that were interesting and a few snippets that made me take note.

For example, I had not understood that the acquisition of Kidaro will enable interaction between applications running within a virtual machine and the host desktop in ways that are not currently achievable. That the technology will ship as part of a new Desktop Optimisation Pack was news. I believe the technology is named MED-V - Microsoft Enterprise Desktop Virtualisation.

Softgrid was also mentioned as a solid way to achieve application virtualisation - a technology that I have not previously had the chance to play with, but which is most definitely on my To Do list - I can think of a few specific practical uses for us. One of the 'announcements' of the keynote was the RTM of Application Virtualisation 4.5 (the solution formerly known as SoftGrid, I believe). Critically, the team behind application virtualisation are working on virtualising server applications. That has big implications for simplifying the deployment of new virtualised solutions and the stack of differencing disks and other VHDs needed.

Also of note - Server 2008 R2 includes the ability to live migrate virtual machines. What I did not know until today was that Server 2008 R2 M3 is available for download. I can feel some testing coming on...

On the subject of virtualisation, the release of System Center Virtual Machine Manager, including support for Hyper-V, was also 'announced'. I believe we've been running that for about a week now and I am pretty impressed with it (we're currently migrating our Virtual Server 2005 VMs to Hyper-V - I'll post about that experience another time).

What was new to me was the idea being worked on of using M - the modelling language launched as part of Oslo - to create models of systems which can then be provisioned using SCVMM. For the creation of development and test environments that sounds cool!

All of this is part of a concerted (if a little low-key, I thought) push to position Microsoft as the cost effective (read, cheaper!) solution for virtualisation and virtualisation management.

A couple of enviro-quickies:

  • Microsoft is the largest commercial purchaser of servers in the world and is bringing a new datacenter on-stream roughly once per quarter.
  • Their new DC in Quincy, WA is built next to a hydro-electric dam to ensure a clean source of energy.
  • The upcoming Dublin, Ireland DC will use natural air cooling, not air-con (and I'd love to hear more about that).

Announcement quickies:

  • SCOM 2007 R2 beta will be available for download at the end of November.
  • Centro - Essential Business Server will be 'announced' on November 12th.
  • Identity Lifecycle Manager '2' RC is now available.

A key new feature in Server 2008 R2 is the availability of ASP.Net on Server Core. That has big implications for SharePoint and you can bet I will be talking to the guys from Microsoft about that one later!

Also interesting were a few new Server 2008 R2 features:

  • DirectAccess - devices can connect securely over the internet without requiring a VPN. We currently use ISA Server but there are limitations. This might be handy...
  • Bitlocker to Go - encryption for USB drives (and other removable storage, I assume). Definitely interested in that one.
  • BranchCache - branch office caching solution for data. Sounds like WAN acceleration a la Riverbed to me, and the demo did nothing to change that view. Does this mean the caching server has to be the gateway for the WAN? What does it support in terms of applications, protocols etc? Another one to discuss during the week.

Tech Ed EMEA IT: Day 1 - Waiting for the Keynote

It's an exercise in surreality. I've just walked through tunnels reminiscent of THX1138, to emerge in a wonderful blue-bathed auditorium, and they're playing the Akira soundtrack (specifically the bit from just after the first nuclear explosion). Weird.

Andy and I travelled all the way from Bradford, and the first guy we strike up conversation with... is from Salford! What are the odds?

Anyway, here's a pic of the view from our seats. More after the keynote...

The Tech Ed stage - waiting for the keynote

PDC Review

Having just returned from PDC in LA, here are my highlights from the week.

Windows Azure - this is the OS for the cloud. Microsoft have learnt from their experiences and created a secure, scalable platform for developing and deploying your web applications.

.Net Services - a set of services hosted in the cloud to help you develop cloud-based or cloud-aware applications. .Net Services consists of 3 main components: Access Control, Service Bus and Workflow. Access Control uses standards-based identity systems, including LiveId, to help secure your cloud applications. Whether a service is hosted behind your firewall or in the cloud, the Service Bus allows you to connect your applications and services together across the internet. Workflow Services is a cloud-based host for your WF workflows and includes a set of management tools and APIs.

Oslo - a platform for model-driven development, consisting of a modelling language called M; a tool for interacting with models called Quadrant; and a Repository, which is a SQL Server-based database for storing and sharing models. M is used to define the domain-specific data model, create a grammar for entering data, and create a way of visualising the data.

Dublin - this is the codename for Microsoft's application server. Dublin is a robust and scalable host for WF and WCF applications and will be used to support the Oslo modelling technologies.

Visual Studio 10 - there are some nice cool features in VS10, including impact analysis, historical debugging and better test management. Impact analysis looks at the code that has been changed and identifies the unit tests affected by the changes, allowing them to be run easily. Historical debugging allows debugging to be carried out after a fault has occurred: rather than stepping forward through the system, you can rewind and see its state. Some issues are difficult to reproduce or step through without affecting the system; having the ability to replay the sequence after the fault has occurred and interrogate the data will help developers fix problems more efficiently. Historical debugging can be tied into the better test management by allowing testers to run through their test scenarios and, when a fault occurs, mark the test as a failure and then send the whole test information - including a video (if selected) and the historical debug information - through to the developer to fix. This will also help to eliminate the faults that cannot be reproduced in the development environment.

Windows 7 - another version of the Windows operating system, which should use less memory and be faster than Vista. In addition there will be multi-desktop support in Remote Desktop, on-the-fly virtual hard drive support (drives which can then be made bootable if required) and better home networking.