Proxy multiple, or all, TPCs in TFS 2010

September 20th, 2010

MS have recently updated their information on proxying TPCs to include multiple or all Team Project Collections.

http://msdn.microsoft.com/en-us/library/ms400735.aspx

This means you no longer need an OS for each proxy server.

Upgrade WSS2.0 to WSS3.0 on Server 2008

July 27th, 2010

There’s a lot of forum activity from people bewildered about how to migrate from WSS2 to WSS3, especially when it involves a change of underlying OS too.

I started this process myself about a month ago and downloaded a document from MS about the migration process – the document is here (do not download it, at 128 pages, it’s about 127 pages too long)

I’ll now show you in a few easy steps how to migrate and upgrade, using the following assumptions

  • You have a WSS2.0 instance (either as part of TFS 2005/08 or not) on a server called ‘oldserver’
  • You have a test server called ‘testserver’ with an installation of WSS3.0 running on SQL 05/08.
  • Your migration destination is a server called ‘newserver’, and is also running WSS3.0

First of all you’ll be wanting to download and run the pre-scan tool. This has to be run on the ‘oldserver’ as it will mark the database as being ready to upgrade. Follow the instructions here
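If memory serves, the basic invocation once you’ve extracted the tool is just the following, run from the folder you unpacked it into (check the instructions above in case your version needs a config file passing in):

prescan.exe /all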

After this tool has been run, you’ll be wanting to take a SQL backup of your WSS content database (I just used SQL Server Management Studio). If it’s part of TFS 2005/08 it’s probably called something like ‘WSS_Content_TFS’. (You can find the name of your old content db by checking the WSS2.0 Central Admin pages.)
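If you’d rather script the backup than click through Management Studio, something along these lines should do the job from a command prompt on ‘oldserver’ (just a sketch; the database name and backup path are examples, so substitute your own):

sqlcmd -S oldserver -E -Q "BACKUP DATABASE [WSS_Content_TFS] TO DISK = N'C:\Backups\WSS_Content_TFS.bak'"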

Now, restore your database on ‘testserver’ (again I used SQL Management Studio). Once restored, you’ll need to tell your WSS3.0 instance to attach the db. We do this through the SharePoint command line utility stsadm. The command for adding the content db into WSS is as follows:

stsadm -o addcontentdb -url http://testserver -databasename WSS_DB_NAME -databaseserver testserver

Now you have a WSS2.0 database attached to your testserver, which is running WSS3.0, so you’ll be wanting to upgrade that now… Again, using stsadm:

stsadm.exe -o upgrade -inplace -url http://testserver

So now we have a WSS3.0 instance with a bang-up-to-date db attached. It’s now just a case of ‘moving’ this db onto your new server. If you’re moving to TFS 2010, you will almost certainly want to edit the Site Collection URL. For this reason (and many more) we use the stsadm commands again. At this point, I’m just interested in moving the root Site Collection from my testserver to the newserver, so we run the following

stsadm.exe -o backup -url "http://testserver" -filename sitecollection.bak

This will create a file called sitecollection.bak in the following folder:
c:\Program Files\common files\microsoft shared\web server extensions\12\bin

Copy that file to the same folder on your ‘newserver’, then log on to the newserver and run the following command to restore the collection:

stsadm.exe -o restore -url "http://newserver/sites/whateveryouwant" -filename sitecollection.bak

You’ve now got an up-to-date WSS3.0 site collection running on your new server, whilst the old server remains running WSS2.0.

One or more files in the restored site collection will exceed the maximum supported path length

July 27th, 2010

If you’re using the backup/restore functionality of stsadm for moving WSS3.0 site collections, you may well come across this error.

In my situation, I’m moving a site collection from

http://name

to

http://longname/sites/something

The total difference in path length between the old and new URLs was 22. So my task was to ensure that no path length in my old site collection exceeded 260 (the limit) minus 22, i.e. 238, as otherwise paths in the new Site Collection would exceed the 260 limit once migrated.

In order to examine the path lengths, you’ll be wanting to run the following query against your WSS content db:

SELECT
LEN(DirName + N'/' + LeafName) AS Total,
DirName,
LeafName
FROM
Docs WITH (NOLOCK)
ORDER BY Total DESC

Then it’s usually a case of just tinkering with your longest URLs until nothing exceeds your particular limit.
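If you only want to see the offending rows, a small variation on the query above (using my limit of 238 as an example) narrows the list down:

SELECT
LEN(DirName + N'/' + LeafName) AS Total,
DirName,
LeafName
FROM
Docs WITH (NOLOCK)
WHERE
LEN(DirName + N'/' + LeafName) > 238
ORDER BY Total DESC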

Search across Team Project Collections with Search Server Express 2010

July 24th, 2010

If you’re using TFS 2010, you’ll be familiar with the idea of Team Project Collections. Team Project Collections are completely independent from one another, and so you’ll need to decide how many Team Projects will live together in each TPC. To determine your approach to this, you’ll need to consider the following:

Each TPC is a distinct database in your Data Tier, and therefore a TPC is now your lowest unit of restore. Whilst this gives some much-needed restore granularity (which TFS 2005/2008 did not have), it also means that each Team Project Collection will be mapped to its own SharePoint Site Collection. If you’re using WSS3.0 this means that you will not be able to search content across these Site Collections. The end result is that your developers may not be able to search each other’s documents/work items etc.

To demonstrate the problem…

TPC to SC

To solve this problem, you can use MS Search Server Express 2010. I’d heartily recommend using a standard SQL Server 2008 instance for the databases, as the alternative is to use the Windows Internal Database which SharePoint can install automatically (and which is very unpleasant to have to manage).

Once you’ve installed the pre-requisites (which come with the download) and SSE itself, choose the ‘Farm’ option (not standalone), as this will allow you to specify a SQL instance. Then, in the Search Application, you can simply create a new content source and point the indexer at your SharePoint root Site Collection address. The indexer will then crawl all your Site Collections and your users can then search across all the TFS Project Portals.

No Search Results in Windows SharePoint Services or TFS Project Portal?

July 24th, 2010

If you’ve got a WSS3.0 Installation, (as part of TFS 2010 or not) then you may be struggling to get ANY search results from your project portal.

This is caused by a bug in WSS3.0 (with SP2) whereby SharePoint will not index any Site Collection content, UNLESS a root Site Collection exists.

Typically with TFS 2010, you would normally simply create a site collection at
http://servername/sites/
and this is where your TFS Site Collections will live.

Unfortunately, unless you can actually browse to content at
http://servername
You’ll never see any search results!

The easiest way to fix this is simply to create a new site collection in the root. If you’re doing this as part of TFS 2010, you’re best off making a Wiki Site, or at least NOT choosing a ‘Team’ Site, otherwise the Site Collection will moan about not being connected to a TFS 2010 Team Project Collection…
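If you’d rather create the root site collection from the command line than through Central Admin, stsadm will do it for you; something like this should work (the owner details are placeholders, and WIKI#0 is the Wiki Site template):

stsadm -o createsite -url http://servername -ownerlogin DOMAIN\someadmin -owneremail someadmin@yourdomain.com -sitetemplate WIKI#0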

TFS 2010 Builds fail to GET files larger than 2MB

July 24th, 2010

You might find that in TFS 2010 your builds fail whenever they attempt to GET a file over 2MB in size. This occurs if you’re using IIS7.5 (on Windows Server 2008 R2) and is in fact another IIS bug, and not a TFS 2010 problem.

This issue occurs because the idle connection time-out for the HTTP service expires prematurely. This causes the network connection to disconnect unexpectedly. You will get a message through Visual Studio informing you which files failed to download, usually with this error message:

Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host

To solve this issue, you’ll need to install a patch on one (or all) of your Application Tier Servers. The patch and conditions for install can be found here

http://support.microsoft.com/kb/981898

Scale-out Application Tier Servers for TFS 2010

July 24th, 2010

Just a quick post about scaling out Application Tier servers in TFS 2010.

To scale out TFS 2010 and Windows SharePoint, you can follow the TFS Guides, and it’s usually a fairly painless process as long as you follow all of the pre-requisites. Scaling out SQL Reporting Services does have a slightly bigger gotcha though.

Unless you’ve installed SQL Server Enterprise Edition, you cannot scale out your SQL RS, as you will simply get a message saying it’s an unsupported feature.

This can be something of a kick in the teeth if you’ve successfully scaled out everything else, and then fail at the last hurdle…

The good news is, SQL Server 2008 Standard can be very easily upgraded to Enterprise Edition without disrupting your SQL RS instance; it’s just a case of running the Enterprise install media and following the upgrade option.

This post gives more detail
http://www.sqldev.org/sql-server-reporting-services/how-to-upgrade-sql-server-reporting-services-2008-standard-edition-to-enterprise-edition-on-the-fron-13865.shtml

Cannot upload large files to WSS3.0 running on IIS7?

July 24th, 2010

This is an issue a lot of people have been seeing recently. Unfortunately, due to the massive number of different apps which sit atop IIS, it can manifest itself in a huge variety of ways. You might see the error through Windows SharePoint like this:

Error message when you try to upload a large file to a document library on a Windows SharePoint Services 3.0 site: “Request timed out”

Or you might see the error through Team Foundation Server, like this:

—————————
Microsoft Visual Studio
—————————
Attachment upload failed. Check that you have a network connection and that the Team Foundation Server is available. If the problem persists, contact your Team Foundation Server administrator.
—————————
OK
—————————

Or any other number of errors from various applications.

The problem lies with IIS7’s maxAllowedContentLength. This is by default set to a value of 30000000 bytes (~28.6 MB).

To fix the issue, make the following change to your web.config file:

<system.webServer>
  <security>
    <requestFiltering>
      <requestLimits maxAllowedContentLength="2000000000" />
    </requestFiltering>
  </security>
</system.webServer>

You can change the value of '2000000000' to be whatever you want.
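If you’d rather not edit web.config by hand, the same change can usually be made with appcmd (run from %windir%\system32\inetsrv); something like this should work, adjusting the site name and value to suit:

appcmd.exe set config "Default Web Site" -section:system.webServer/security/requestFiltering -requestLimits.maxAllowedContentLength:2000000000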

stsadm is not recognised as an internal or external command

March 26th, 2010

If you’re getting this message, you’re probably trying to run the stsadm command line utility, but you’ve not yet added the stsadm path to your environment variables.

stsadm is located here

c:\Program Files\common files\microsoft shared\web server extensions\12\bin

Along with some other SharePoint command line tools.
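If you’d rather not click through the System Properties dialogs, setx will append it for you from a command prompt (note that it writes the combined result into your user PATH, and only new command windows will pick it up):

setx PATH "%PATH%;C:\Program Files\common files\microsoft shared\web server extensions\12\bin"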

How to find a domain controller

December 1st, 2009

Found this today which is marvellous.

Start -> Run -> nslookup
set type=all
_ldap._tcp.dc._msdcs.DOMAIN_NAME
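Alternatively, if you have the nltest utility to hand (it ships in the Support Tools on older versions of Windows), the following lists the DCs for a domain in one go:

nltest /dclist:DOMAIN_NAME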

Facebook: VP of Technology talks about how Facebook works…

October 15th, 2009

Scalability is often one of the biggest challenges you can face when your website grows, and it often grows *very* quickly. Producing systems and architecture that scale with your growth is usually spectacularly complex and expensive. When you’re talking about a site with 300 million users that’s really only about 5 years old, you have a pretty fascinating case study. Follow the link to hear Jeff Rothschild describe how Facebook meets these challenges in his presentation entitled High Performance and Massive Scale.

mms://video-jsoe.ucsd.edu/calit2/JeffRothschildFacebook.wmv

Key stats from the presentation about Facebook’s infrastructure/scale include:

  • Facebook has 30,000 servers supporting its operations*
  • Facebook stores 80 billion images (20 billion unique images, each stored in four sizes)
  • Facebook serves up 600,000 photos a second
  • Facebook logs 25 terabytes of data per day into a Hadoop cluster
  • Facebook services about 120 million queries to its memcache tier per second
  • Facebook has c. 230 Engineers, which is a ratio of roughly 1.1 million active users per Engineer
  • Facebook operates a shared nothing architecture wherever possible
  • Facebook’s development work on PHP, MySQL, memcache and various other projects has virtually all been made open source
  • Facebook sends over 1 billion outbound (transactional, typically notification) emails per day

* This number changes daily. The bulk of them are web servers, to handle the low runtime efficiency of PHP.

One of the most interesting things for anyone currently using PHP in a commercial environment is Facebook’s current development of a PHP compiler, which they estimate will give them a 50-70% increase in runtime efficiency; this might give them a small amount of breathing room in terms of current server requirements. More interestingly for the wider PHP community, if this compiler is made open source (which I’m absolutely sure it will be) it could give PHP a real boost in the popularity stakes.


IIS7 Request Filtering via the GUI or appcmd.exe

September 28th, 2009

Request Filtering in IIS7 is a nice feature allowing an excellent level of customisation on virtually every aspect of an HTTP request.

To enable request filtering in Win 2008, follow these steps:

  1. On the taskbar, click Start, point to Administrative Tools, and then click Server Manager.
  2. In the Server Manager hierarchy pane, expand Roles, and then click Web Server (IIS).
  3. In the Web Server (IIS) pane, scroll to the Role Services section, and then click Add Role Services.
  4. On the Select Role Services page of the Add Role Services Wizard, select Request Filtering, and then click Next.

Request Filtering for IIS7

Once installed, you set the various options through appcmd.exe. Or, if you prefer, you can install the IIS7 Admin Pack, which adds a GUI for easier configuration of this feature.

A list of the various options can be found here:
http://www.iis.net/ConfigReference/system.webServer/security/requestFiltering
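As a quick taste of what you can do with appcmd once the feature is installed, the following sketch (using .bak as an example extension) should block any request for .bak files:

appcmd set config /section:requestFiltering /+fileExtensions.[fileExtension='.bak',allowed='false']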

Enabling Server side includes in IIS7

July 22nd, 2009

A relatively little-used feature now, but still important if your HTML uses server side includes. Mainly used for templated HTML structures which pull various blocks of layout into the page, this feature is not enabled by default in IIS7.

To switch the feature on, firstly:
Control Panel > Programs and Features > Turn Windows Features On or Off

control panel

Next you need to enable the SSI feature as shown:

enabling ssi includes in IIS7 on Vista

If you’re on Vista, the next time you go into the Handler Mappings in IIS you’ll see that the .shtml, .shtm and .stm file types have been automatically added with the correctly configured handler. If you’re not using Vista, you may need to add these mappings manually through the Add Module Mapping section.

This can be easily tested by setting up a test .shtml page which calls in another file using the standard include directive:

<!--#include file="foo.html" -->

As long as the content of that file pulls through, you’ll know it’s working.
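If you want a complete test case, something as small as this should do; save it as test.shtml with a foo.html containing any old markup in the same folder:

<html>
<body>
<p>Before the include.</p>
<!--#include file="foo.html" -->
<p>After the include.</p>
</body>
</html>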

Creating nlite image discs for your PC/server

July 10th, 2009

For anyone who regularly (or even irregularly) has to re-install operating systems in a business environment, you’ll know exactly how time consuming, dull and repetitive this can be. Thankfully there are products out there for making this process a little easier, and this one is even free!

One such product is called nlite (or vlite for Vista products). It’s not a fully-fledged imaging product, but instead takes a huge amount of the hassle out of the re-installation process by bundling together service packs, Windows updates, drivers and pretty much anything else you might want into one neat ISO. What this means for administrators who look after just a few different models of workstation is that you can build up and keep a library of up-to-date disc images, so that in the event of needing to re-install, you’ve got a disc on hand which will leave the machine in a much more ready state than a standard OS disc.

So, first of all, download nlite

http://www.nliteos.com/download.html

Once installed, launch the app.

nlite in action

Now, you’ll need to point the installation at your OS disc, for nlite to start to build its own image.

At this point, you’ll be prompted to choose which of nlite’s features you want to use;

nlite features

the most important (read as: useful) features are

1. Slipstreaming Service packs and updates
2. Including Drivers

You can then customise pretty much every aspect of Windows: services, languages and all kinds of other features can be removed to save space or for security. The real risk here is that you might rip something out that you’ll later need, so take care.

In the customisation section there are some really good features, such as being able to set up your default Windows Explorer views, customise your ‘Start’ toolbar and many, many more. One of my personal favourites is the ability to add the OS, Service Pack and version to the bottom right-hand corner of your desktop background. Until I saw this in nlite, I didn’t even know it existed in Windows!

Once you’ve finalised all of your options, you’ll get a page where you’re prompted to begin the build

nlite ready to build

nlite will then build you a bootable image which you simply write to disc with your favourite CD writing software and you’re ready to install your machine.

Automating VMWare ESXi snapshots through Scheduled Tasks

July 7th, 2009

Earlier I wrote about Creating ESXi snapshot backups with ghettoVCB.sh. Now, the next logical step is to be able to automate these snapshots so you don’t need to ssh to the ESXi host and run the script manually and wait for the result.

We can use plink and Windows Scheduled Tasks to achieve this.

In order to use Plink, the file plink.exe will need to be either on your PATH or in your current directory. To add the directory containing Plink to your PATH environment variable:

Start -> Control Panel -> System -> Advanced -> Environment Variables -> System Variables -> Path -> Edit, and enter the path for your plink.exe file.

OK, so once plink is set up correctly, you can use the following command to connect to your ESXi host and set your script running with the appropriate ‘vmbackups’ file for the relevant machine.

E:\backup\putty\plink.exe root@10.6.40.36 -pw password "nohup /vmfs/volumes/datastore1/scripts/ghettoVCB.sh /vmfs/volumes/datastore1/scripts/vmbackupsbuilder > /vmfs/volumes/datastore1/scripts/backuplog.txt &"

Picking the above command apart:

1. E:\backup\putty\plink.exe (the path to plink.exe)

2. root@10.6.40.36 -pw password (the username, IP address and password of your ESXi Host)

3. "nohup /vmfs/volumes/datastore1/scripts/ghettoVCB.sh /vmfs/volumes/datastore1/scripts/vmbackupsbuilder > /vmfs/volumes/datastore1/scripts/backuplog.txt &" (this is the path to your ghettoVCB.sh file, the path to your vmbackups file, and a command to log the output)

We also run this as a ‘nohup’ command so that the snapshot can continue without anyone being continuously logged into the host.

You probably want to run this without the nohup as a first test, to make sure everything is working OK.

Once you’re happy with the command and it runs successfully for you, you can simply add this command to a .bat file, and then use the standard Windows Task Scheduler to schedule it to run as frequently as you wish.
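If you’d rather create the scheduled task from the command line too, schtasks will do it; something like the following (the task name, .bat path and schedule are just examples):

schtasks /create /tn "ESXi nightly snapshot" /tr "E:\backup\esxi-snapshot.bat" /sc daily /st 02:00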

Spotlight on Windows, fantastic windows management tool, and it’s free!

July 6th, 2009

I’ve seen a lot of software meant for managing the performance of Windows servers. Obviously a lot of those tools are extremely specific (for services like SQL and Exchange etc.), but for standard Windows machines absolutely nothing I’ve seen matches the amazing GUI of Spotlight on Windows. Initially the GUI looks like it’s trying too hard, but actually it’s an amazing blend of static and real-time info. Here’s how it looks:

Spotlight on Windows

In terms of content, it offers pretty much everything you’d expect:

  • CPU usage
  • CPU queue length
  • LAN usage
  • Disk I/O
  • Processes
  • Memory usage
  • Virtual Memory usage
  • Memory queue
  • Page file usage
  • Fixed Disk usage

It also shows the various buses between these objects and the pages/sec moving around the motherboard; this feature is sorely missing from many other management products, and it’s often key to determining what’s going on with your server. Each monitor has a helpful explanation in case you’re feeling a little knowledge-light.

My advice for this product would be as follows.

1. Download and Install from the Quest Software site (it’s free!)

2. Set-up your connections (File -> Connect)

3. Ensure Spotlight successfully connects and choose a 6 hour calibration period.

This last step is the most important: ideally you want to let Spotlight gather data from the machine over a time-span where it’s under some load, i.e. a representative snapshot of its daily load. Otherwise you’ll spend a lot more time in future customising the alert levels for usage which you know is “normal”.

Once all your connections are calibrated you’ll have a management tool which can tell you more about a server’s performance in a single glance than you’d think possible.

Thanks to Quest Software.

IBM RAID adapters – will it fit?

July 6th, 2009

For those of us who work with IBM server hardware, there’s a baffling array of RAID cards which may or may not fit into your server’s architecture. To save you locating and flicking through the technical manual for each one, the following page is a wonderfully comprehensive guide to virtually every IBM RAID card ever, with a matrix at the foot of the page showing which cards are compatible with which servers.

http://www.redbooks.ibm.com/abstracts/tips0054.html

Fantastic resource!

Where’s the Disk Cleanup option in Windows Server 2008?

July 3rd, 2009

Windows Server 2008 doesn’t have the Disk Cleanup option switched on by default. In order to enable it, you’ll need to install the ‘Desktop Experience’ feature, which actually contains rather a lot of other stuff. Not quite sure why Microsoft have bundled so much into ‘Desktop Experience’ when 99% of people will just want the Disk Cleanup option; just another MS idea which *could* have been good, but not quite.

To enable the ‘Desktop Experience’

Step 1 – Open the Server Manager (Start -> Administrative Tools -> Server Manager)

Step 2 – In the Features Summary section, click on Add Feature.

Step 3 – Select the Desktop Experience feature and click through the various ‘Next’ steps to install

Step 4 – Reboot your machine and then you should find the Disk Cleanup option now appears in the usual place. (Right click on disk name -> properties)
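Alternatively, if you’d rather skip the GUI, the same feature can be installed from an elevated command prompt on Server 2008; something like this should do it (it will reboot the machine if needed):

ServerManagerCmd -install Desktop-Experience -restart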

That’s it; you’ll now be able to delete the swathes of junk that Windows builds up, easily and from one place. If you’re going to defrag a disk, always run this first and get rid of as much junk as you can.

Adding Remote Desktop MMC Snap-in for Windows Vista

July 3rd, 2009

A lot of sys admins use the RDP snap-in on either Windows XP or Windows Server 2003 because it makes it far easier to manage multiple machines. Unfortunately, this feature isn’t available in any version of Windows Vista.

However, due to the similarities between 2003 and Vista, you can get it working, with a bit of tinkering, here’s how.

Step 1 – Download the adminpak.msi for Windows Server 2003 and install it on your Vista machine. Both

Windows Server 2003 Service Pack 1 (SP1) Administration Tools Pack and Windows Server 2003 R2 Administration Tools Pack

can be installed onto Vista without error. Unfortunately, you might run into one or two error messages if you use Active Directory Users and Computers on your Vista machine.  So there’s more to be done.

Step 2 – Open an elevated command prompt (right click on ‘Cmd’ and choose ‘Run as Administrator’) and register the following DLLs:

regsvr32 /s adprop.dll
regsvr32 /s azroles.dll
regsvr32 /s azroleui.dll
regsvr32 /s ccfg95.dll
regsvr32 /s certadm.dll
regsvr32 /s certmmc.dll
regsvr32 /s certpdef.dll
regsvr32 /s certtmpl.dll
regsvr32 /s certxds.dll
regsvr32 /s cladmwiz.dll
regsvr32 /s clcfgsrv.dll
regsvr32 /s clnetrex.dll
regsvr32 /s cluadmex.dll
regsvr32 /s cluadmmc.dll
regsvr32 /s cmproxy.dll
regsvr32 /s cmroute.dll
regsvr32 /s cmutoa.dll
regsvr32 /s cnet16.dll
regsvr32 /s debugex.dll
regsvr32 /s dfscore.dll
regsvr32 /s dfsgui.dll
regsvr32 /s dhcpsnap.dll
regsvr32 /s dnsmgr.dll
regsvr32 /s domadmin.dll
regsvr32 /s dsadmin.dll
regsvr32 /s dsuiwiz.dll
regsvr32 /s imadmui.dll
regsvr32 /s lrwizdll.dll
regsvr32 /s mprsnap.dll
regsvr32 /s msclus.dll
regsvr32 /s mstsmhst.dll
regsvr32 /s mstsmmc.dll
regsvr32 /s nntpadm.dll
regsvr32 /s nntpapi.dll
regsvr32 /s nntpsnap.dll
regsvr32 /s ntdsbsrv.dll
regsvr32 /s ntfrsapi.dll
regsvr32 /s rasuser.dll
regsvr32 /s rigpsnap.dll
regsvr32 /s rsadmin.dll
regsvr32 /s rscommon.dll
regsvr32 /s rsconn.dll
regsvr32 /s rsengps.dll
regsvr32 /s rsjob.dll
regsvr32 /s rsservps.dll
regsvr32 /s rsshell.dll
regsvr32 /s rssubps.dll
regsvr32 /s rtrfiltr.dll
regsvr32 /s schmmgmt.dll
regsvr32 /s tapisnap.dll
regsvr32 /s tsuserex.dll
regsvr32 /s uddi.mmc.dll
regsvr32 /s vsstskex.dll
regsvr32 /s w95inf16.dll
regsvr32 /s w95inf32.dll
regsvr32 /s winsevnt.dll
regsvr32 /s winsmon.dll
regsvr32 /s winsrpc.dll
regsvr32 /s winssnap.dll
regsvr32 /s ws03res.dll

Step 3 – At this point a lot of people have had mixed success in attempting to add ‘Remote Desktops’ through the MMC snap-in; it just doesn’t appear in there for some people. The easiest way around it is simply to add a shortcut to the tsmmc.msc file. I’ve added this to my Desktop, and it can be launched without elevated permissions to bring up the Remote Management screen.

Step 4 – Now it’s just a question of adding your connections and you’re away.

Enjoy.

Download and install VMWare Infrastructure Client

July 3rd, 2009

There are a lot of people searching and posting about the VMware Infrastructure Client that’s used for managing your ESXi host; people don’t seem to be able to find and download it from the web.

There’s a good reason for this: it’s not widely available online! To install it, simply visit the IP address of your ESXi host in your favourite internet browser, and you can download the Infrastructure Client from the ESXi host directly.

Hope this helps some of the confused!