BM-Bloggers

The blogs of Black Marble staff

Book Review: Windows Virus and Malware Troubleshooting by Andrew Bettany and Mike Halsey

Summary: A very useful volume that discusses what malware is, how to defend against it and how to remove it. Clear and simple instructions are given on ways to improve security on your PC, as well as how to deal with malware that may end up on your PC. Recommended.

Presented in a very easy-to-read style, this book immediately appeals thanks to the clear, concise and no-nonsense approach it takes when discussing malware: what it is, how it can attack and affect your PC, how to defend against it, and what to do if the worst should happen and your PC gets infected.

The first chapter provides a nice potted history of viruses and malware on PCs, discussing the various types and how both the proliferation and seriousness of infections have risen from the very first, typically benign examples to modern-day infections such as the ransomware that has been in the news so much recently.

Chapter 2 deals with prevention and defence, and introduces the many security features that are built into modern versions of Microsoft Windows to help stop the initial infection. There’s a clear progression in security features as newer versions of Windows have been introduced, and it’s interesting to compare the versions of Windows that were most susceptible to the recent ‘WannaCry’ ransomware attack. Looking at the features discussed (and having been to a few presentations on the subject), this provides an excellent set of reasons for an upgrade to Windows 10 if you’ve not already done so!

Chapter 3 discusses defence in depth and includes information on firewalls, including the Windows firewall, as well as organisational firewalls (i.e. hardware firewalls and appliances) and how to build a multi-layer defence. While at first glance this section appears to be targeted more at the organisational user, it's actually also aimed at the home user with a hardware router/firewall combination, and some clarification that this is the case would, I feel, have been useful here. This chapter also, somewhat bizarrely, includes a section on keylogging software, which I feel would have been more at home in the first chapter.

This chapter also provides some information on blacklists and whitelists (i.e. internet filtering) and the Internet of Things (IoT). For both of these sections I feel there's perhaps a bit of a lost opportunity: a brief discussion of the filtering options available might have been helpful for home users (e.g. my Netgear router at home comes complete with an OpenDNS-based filtering option that can be enabled and configured quickly and easily, and seems to provide reasonable protection), and further information on IoT security recommendations, particularly changing the default username and password on devices, would also be beneficial here.

Chapter 4 deals with identifying attacks, starting with how malware infects a PC, and provides pointers on how to identify both internal and external attacks. I was very pleased in this section to see information on social engineering and the role that this plays in malware infections.

Chapter 5 provides a very useful list of external resources that can be utilised to help protect your PC and clean a malware infection, including the Microsoft Malware Protection Center, a great location for finding updates, additional security recommendations and products etc. This chapter also provides some limited information on third-party tools that are available. Again, I would have liked to see a more expansive list here, and it’s worth mentioning that many anti-virus vendors provide a free option of their products.

Chapter 6 deals with manually removing malware, and for me this was probably the most useful part of the book. What do you do when malware has ended up on your PC despite your best efforts and you're now having issues running the automated tools to get rid of it? This chapter helps in that scenario, providing steps to identify what's running on the PC, suspend and/or kill the process, and remove the infection. In particular I'm pleased to see the Microsoft Sysinternals tools discussed (albeit briefly) as they are my 'go to' toolset when dealing with an infection on a PC. If you're interested in these tools and how they can be used, it's worth looking at some of Mark Russinovich's 'Case of the Unexplained' videos, in which Mark goes through the use of these tools in more detail.

There are one or two downsides; the book is only a slim volume. This cuts both ways: being slim, more people are likely to read it end-to-end and therefore get the most benefit from it, but in one or two areas a few more details would be appreciated. For such a slim volume it's also more expensive than I would hope at an RRP of £14.99, which may limit its take-up.

All in all however this is a very easily accessible book that provides great guidance on how to secure your PC, what to watch out for and how to deal with a malware infection. I’ll be encouraging a few people I know to buy a copy and read it!

Title: Windows Virus and Malware Troubleshooting
Author(s): Andrew Bettany, MVP and Mike Halsey, MVP
Publisher: Apress
ISBN-13: 978-1-4842-2606-3

Test-SPContentDatabase False Positive

I was recently performing a SharePoint 2013 to 2016 farm upgrade and noticed an interesting issue when performing tests on content databases to be migrated to the new system.

As part of the migration of a content database, it’s usual to perform a ‘Test-SPContentDatabase’ operation against each database before attaching it to the web application. On the farm that I was migrating, I got mixed responses to the operation, with some databases passing the check successfully and others giving the following error:

PS C:\> Test-SPContentDatabase SharePoint_Content_Share_Site1

Category        : Configuration
Error           : False
UpgradeBlocking : False
Message         : The [Share WebSite] web application is configured with
                  claims authentication mode however the content database you
                  are trying to attach is intended to be used against a
                  windows classic authentication mode.
Remedy          : There is an inconsistency between the authentication mode of
                  target web application and the source web application.
                  Ensure that the authentication mode setting in upgraded web
                  application is the same as what you had in previous
                  SharePoint 2010 web application. Refer to the link
                  "http://go.microsoft.com/fwlink/?LinkId=236865" for more
                  information.
Locations       :

This was interesting, as all of the databases were attached to the same content web application and had been created on the current system (i.e. not migrated to it from an earlier version of SharePoint), and therefore should all have been in claims authentication mode. Also of note is the reference to SharePoint 2010 in the error message; I guess the cmdlet hasn't been updated in a while…

After a bit of digging, it turned out that the databases that threw the error when tested had all been created and had some initial configuration applied, but nothing more. Looking into the configuration, no users had been granted permissions to the site (except for the default admin accounts added as the primary and secondary site collection administrators when the site collection was created), but an Active Directory group had also been given site collection administrator permissions.

A quick peek at the UserInfo table for the database concerned revealed the following (the screenshot below is from a test system used to replicate the issue):

UserInfo Table

The tp_Login entry highlighted corresponds to the Active Directory group that had been added as a site collection administrator.

Looking at Trevor Seward's blog post 'Test-SPContentDatabase Classic to Claims Conversion' showed what was happening. When the Test-SPContentDatabase cmdlet runs, it looks for the first entry in the UserInfo table that matches all of the following rules:

  • tp_IsActive = 1 AND
  • tp_SiteAdmin = 1 AND
  • tp_Deleted = 0 AND
  • tp_Login not LIKE ‘I:%’

In our case, having an Active Directory group assigned as a site collection administrator matched this set of rules exactly, so the query returned a result and the message was displayed, even though the database was indeed configured for claims authentication rather than classic-mode authentication.
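
To illustrate, the check can be sketched like this (a hypothetical reconstruction in TypeScript, not Microsoft's actual code; the claims-encoded login values are illustrative):

```typescript
// A sketch of the check Test-SPContentDatabase appears to perform against
// the UserInfo table (a hypothetical reconstruction, not Microsoft's code).
interface UserInfoRow {
    tp_Login: string;
    tp_IsActive: number;
    tp_SiteAdmin: number;
    tp_Deleted: number;
}

// True when a row looks like a classic-mode administrator: an active,
// non-deleted site admin whose login is not a claims-encoded identity.
function looksLikeClassicAdmin(row: UserInfoRow): boolean {
    return row.tp_IsActive === 1
        && row.tp_SiteAdmin === 1
        && row.tp_Deleted === 0
        && !row.tp_Login.toLowerCase().startsWith("i:");
}

// A claims-encoded user identity ("i:" prefix) passes the check...
const claimsUser: UserInfoRow = {
    tp_Login: "i:0#.w|domain\\someadmin",
    tp_IsActive: 1, tp_SiteAdmin: 1, tp_Deleted: 0,
};

// ...but an AD group added as site collection administrator is encoded
// with a "c:" prefix, so it trips the warning even in a claims database.
const adGroup: UserInfoRow = {
    tp_Login: "c:0+.w|s-1-5-21-1111-2222-3333-4444",
    tp_IsActive: 1, tp_SiteAdmin: 1, tp_Deleted: 0,
};

console.log(looksLikeClassicAdmin(claimsUser)); // false
console.log(looksLikeClassicAdmin(adGroup));    // true
```

The group's claims-encoded login starts with 'c:' rather than 'i:', which is why it satisfies the final rule and triggers the warning.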

For the organisation concerned, having an Active Directory group configured as the site collection administrator for some of their site collections makes sense, so they'll likely see the same message next time they upgrade. Obviously in this case it was a false positive and could safely be ignored, and indeed attaching the databases that threw the error to a 2016 web application didn't generate any issues.

Steps to reproduce:

  1. Create a new content database (to keep everything we’re going to test out of the way).
  2. Create a new site collection in the new database adding site collection administrators as normal.
  3. Add a domain group to the list of site collection administrators.
  4. Run the Test-SPContentDatabase cmdlet against the new database.

Book your free place at a Global DevOps Bootcamp venue for the 17th June 2017 event

Are you enthused by all the news from Build 2017?

Do you want to find out more about VSTS, DevOps and Continuous Delivery?

 

Well why not take the chance to join us on June 17th at Black Marble, or one of the over 25 other venues around the world for the first Global DevOps Bootcamp?


The Global DevOps Bootcamp is a free one-day event hosted by local, passionate DevOps communities around the globe. Find your local venue on the Global DevOps Bootcamp website or search for Global DevOps Bootcamp on EventBrite.

Learn about the latest DevOps trends, 'get your hands dirty during the Hackathon', gain insights into new technologies and share experiences with other community members, all based around the concept of "From Server to Serverless in a DevOps world". The Global DevOps Bootcamp is all about DevOps on the Microsoft Stack.

 

Remember, places are limited at all venues, so make sure you get your name down soon to avoid disappointment.

Options migrating TFS to VSTS

I did an event yesterday on using the TFS Database Import Service to do migrations from on-premises TFS to VSTS.

During the presentation I discussed some of the other migration options available. Not everyone needs a high-fidelity migration that brings everything over. Some teams may want to bring over just their current source, or just a subset of it. Maybe they are making a major change in work practices and want to start anew on VSTS.

To try to give an idea of the options, I have produced this flow chart to help with the choices:

Click for a PDF version


It mentions a few 3rd party tools in the flowchart, so here are some useful links

Also, if you find yourself in the orange box at the bottom and don't want to use the TFS Database Import Service for some reason, have a look at this post I did on Microsoft's UK Developers site. It might give you some ideas.

o7 is back in Store

Several years ago, for the launch of Windows Phone 7, we built a game based on work we did on teaching AI with .NET.

o7 was born.  A great game, and now it is back in glorious UWP for Windows 10.

Try it out and let us know what you think.


Get it here.

 

b

Regional Director

 

Once again I am so very proud to announce I have been selected to continue as a Regional Director for Microsoft for another two years.

The Regional Directors are a truly extraordinary set of individuals and I am humbled every time I'm in their company; not only for their extraordinary depth of knowledge and level of technical skill, but also for a passion as deep as mine for helping and supporting the community of developers across the world.


For those out there supporting my endeavours for helping the community, thank you.

 

b.

DDD is Back for the 12th Time

 

After last year’s successful reboot, DDD is back in Reading for its twelfth outing, on Saturday, 10th of June.

 

Again DDD is free for all to attend, thanks to some great sponsors – and if you would like to be considered, please email ddd@blackmarble.com.

Our aim is to make DDD accessible to all. Making it free is key to this, but it also leaves us a limited budget for the nice-to-haves; we will always try, though.

I would love to see new speakers and new topics in the DDD line-up, so please submit a session.  I'm afraid we can't cover your costs, but so many great speakers have come out of DDD, many of them now prized on the international developer conference circuit.

 

Session submission has now finished, so please vote on which sessions you would like to see at http://developerdeveloperdeveloper.com/


I look forward to seeing you all in June, and we will be announcing more DDD dates at the event.

 

b.

Policing and HoloLens

 

Recently we had a great day demonstrating the innovations we have made with HoloLens for the Police to a room full of jaded but ultimately enthusiastic technology journalists.

It was great to see their responses to the solutions we have put together, and heartening to read some of the things they had to say.


Alice Bonasio (Tech Trends) produced this great piece on how MR CSI isn’t SciFi anymore! It was great to see how she could see the potential in what we had produced, “With tuServ you can effectively have a fully functional portable Command and Control Centre.”

 

With HoloLens and tuServ, we have envisaged a real-world solution that can make a difference to how police officers do their job.

Back Again and HoloLens

 

I have been a bit quiet on the blogging front for a while as I have been deep in planning for community events and steeped in the joys of HoloLens.

 

The results: two more DDD community events on the way (Reading and the North), and Black Marble has been made one of Microsoft's HoloLens Agency Readiness Partners.  So few companies have achieved this that it's great recognition for all the hard work we have put in, and I'm looking forward to what we can achieve.

 

What have we been doing with HoloLens?

 

Under the banner of tuServ we have built a mobile command and control solution for the Police…

 


 

The reactions and feedback have been amazing and I am so proud of both our tuServ and HoloLens teams for producing a great product.

But we did not stop there; we have also built a prototype Scene of Crime tool for HoloLens. More on that to come.

 

Good to be Back

 

b

 

Debugging Typescript in Visual Studio Code

This is one of those posts I write as a reminder to myself. I have struggled to get debugging working in VS Code for TypeScript files: if I set breakpoints in the underlying generated JavaScript they worked, but they did not work if set in the TypeScript source file. There are loads of walkthroughs and answers on Stack Overflow, but all with that vital little bit (for me) missing. So this is what I needed to do for my usage scenario…

Whilst developing a Node-based VSTS extension I have the following structure:

Git repo root (folder)
--- .vscode (folder)
------ launch.json
--- mystuff.src (the source .TS files)
------ script.ts
------ tsconfig.json
--- mystuff (the target folder for the .JS files)

I develop my TypeScript in the .src folder and use the TSC compiler to generate the .JS files into the target folder, running the 'tsc -p .' command whilst sitting in the .src folder.

Note: I actually run this tsc command using Gulp, but this post does not need to go into the detail of that.

The key thing is to make sure the two .json files have the correct options.

The tsconfig.json is:

{
  "compilerOptions": {
    "target": "ES6",
    "module": "commonjs",
    "watch": false,
    "outDir": "../mystuff/",
    "sourceMap": true
  },
  "exclude": [
    "node_modules"
  ]
}

The important lines are "outDir" and "sourceMap":

  • The path to generate the .JS to: for me it was important to generate to a different folder, as this made it easier to create the VSTS extension packages without shipping the .TS files by accident. This was part of my debugging problem; if the .ts and .js files are in the same folder there should be no issues.
  • Creating the source map (the .JS.MAP files), which enables debugging.

The launch.json is:

{
  "version": "0.2.0",
  "configurations": [
    {
      "type": "node",
      "request": "launch",
      "name": "Node – my stuff",
      "program": "${workspaceRoot}/mystuff.src/script.ts",
      "outFiles": ["${workspaceRoot}/mystuff/*.js"]
    }
  ]
}

The critical lines, and the ones I messed up, are:

  • "program" must point at the .TS file
  • "outFiles" must point to the location of the .JS and .JS.MAP files in the target folder, and those square brackets [] are vital

Once I had all this in place I could set breakpoints in the .TS file and they worked.
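
For reference, the file being debugged can be as simple as the following (a hypothetical example). After running 'tsc -p .', the mystuff folder should contain script.js plus script.js.map, and the end of script.js should carry a '//# sourceMappingURL=script.js.map' comment, which is what lets VS Code map breakpoints back to the .TS source:

```typescript
// script.ts: a minimal hypothetical file to verify the breakpoint setup.
function greet(name: string): string {
    // A good line to set a test breakpoint on.
    return `Hello, ${name}`;
}

console.log(greet("world")); // prints "Hello, world"
```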