MS have recently updated their guidance on proxying TPCs to cover multiple (or all) Team Project Collections.
This means you no longer need a separate OS instance for each proxy server.
There’s a lot of forum activity from people bewildered about how to migrate from WSS2 to WSS3, especially when it involves a change of underlying OS too.
I started this process myself about a month ago and downloaded a document from MS about the migration process. The document is here (don’t download it; at 128 pages, it’s about 127 pages too long).
I’ll now show you in a few easy steps how to migrate and upgrade, using the following assumptions:
- You have a WSS2.0 instance (either as part of TFS 2005/08 or not) on a server called ‘oldserver’
- You have a test server called ‘testserver’ with an installation of WSS3.0 running on SQL 05/08.
- Your migration destination is a server called ‘newserver’, and is also running WSS3.0
First of all you’ll be wanting to download and run the pre-scan tool. This has to be run on the ‘oldserver’ as it will mark the database as being ready to upgrade. Follow the instructions here
After this tool has been run, you’ll be wanting to take a SQL backup of your WSS content database (I just used SQL Server Management Studio). If it’s part of TFS 2005/08 it’s probably called something like ‘WSS_Content_TFS’. You can find the name of your old content db by checking the WSS2.0 Central Admin pages.
Now, restore your database on ‘testserver’ (again I used SQL Management Studio). Once restored, you’ll now need to tell your WSS3.0 instance to attach the db. We do this through the SharePoint command line utility stsadm. The command for adding the content db into WSS is as follows:
stsadm -o addcontentdb -url http://testserver -databasename WSS_DB_NAME -databaseserver testserver
Now, you have a WSS2.0 database attached to your testserver, which is running WSS3.0 so you’ll be wanting to upgrade that now… Again, on stsadm, we use
stsadm.exe -o upgrade -inplace -url http://testserver
So now we have a WSS3.0 instance with a bang-up-to-date db attached. It’s now just a case of ‘moving’ this db onto your new server. If you’re moving to TFS 2010, you will almost certainly want to edit the Site Collection URL. For this reason (and many more) we use the stsadm commands again. At this point, I’m just interested in moving the root Site Collection from my testserver to the newserver, so we run the following
stsadm.exe -o backup -url "http://testserver" -filename sitecollection.bak
This will create a file called sitecollection.bak in the following folder:
c:\Program Files\common files\microsoft shared\web server extensions\12\bin
Now, copy that file to the same folder on your ‘newserver’. Then log on to your newserver and run the following command to restore the collection:
stsadm.exe -o restore -url "http://newserver/sites/whateveryouwant" -filename sitecollection.bak
You’ve now got an up-to-date WSS3.0 site collection running on your new server, whilst the old server remains running WSS2.0.
If you’re using the backup/restore functionality of stsadm for moving WSS3.0 site collections, you may well come across this error.
In my situation, I was moving a site collection from
The total difference in path length between the old and new URLs was 22. So my task was to ensure that no path length in my old site collection exceeded 260 (the limit) minus 22, i.e. 238, as otherwise the new Site Collection would exceed the 260 limit once migrated.
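To make the arithmetic concrete, here’s a quick sketch (in python, using the example server names from the walkthrough above; with these example URLs the difference works out at 21 rather than my real-world 22):

```python
# A quick sanity check of the path-length budget (example URLs, not my real ones)
old_base = "http://testserver"
new_base = "http://newserver/sites/whateveryouwant"

SHAREPOINT_LIMIT = 260  # WSS3.0's maximum path length

# Restoring under the longer base URL grows every path by the difference
growth = len(new_base) - len(old_base)
budget = SHAREPOINT_LIMIT - growth  # longest path allowed in the old collection

print(f"growth={growth}, budget={budget}")  # growth=21, budget=239
```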
To examine the path lengths, you’ll be wanting to run the following query in your WSS content db:
SELECT LEN(DirName + N'/' + LeafName) AS Total,
       DirName,
       LeafName
FROM Docs WITH (NOLOCK)
ORDER BY Total DESC
Then it’s usually a case of just tinkering with your longest URLs until nothing exceeds your particular limit.
If you’re using TFS 2010, you’ll be familiar with the idea of Team Project Collections. Team Project Collections are completely independent from one another, and so you’ll need to decide how many Team Projects will live together in each TPC. To determine your approach, you’ll need to consider the following:
Each TPC is a distinct database in your Data Tier, and therefore a TPC is now your lowest unit of restore. Whilst this gives some much needed restore granularity (which TFS 2005/2008 did not have), it also means that each Team Project Collection will be mapped to a SharePoint Site Collection. If you’re using WSS3.0 this means that you will not be able to search content across these Site Collections. The end result is that your developers may not be able to search each other’s documents, work items, etc.
To demonstrate the problem…
To solve this problem, you can use MS Search Server Express 2010. I’d heartily recommend using a standard SQL Server 2008 instance for the databases, as the alternative is the Windows Internal Database which SharePoint can install automatically (and which is very unpleasant to have to manage).
Once you’ve installed the pre-requisites (which come with the download) and SSE itself, choose the ‘Farm’ option (not Standalone); this will allow you to specify a SQL instance. Then, in the Search Application, you can simply create a new content source and point the indexer to your SharePoint root Site Collection address. The indexer will then crawl all your Site Collections and your users can search across all the TFS Project Portals.
If you’ve got a WSS3.0 Installation, (as part of TFS 2010 or not) then you may be struggling to get ANY search results from your project portal.
This is caused by a bug in WSS3.0 (with SP2) whereby SharePoint will not index any Site Collection content UNLESS a root Site Collection exists.
Typically with TFS 2010, you would simply create a site collection at
and this is where your TFS Site Collections will live.
Unfortunately, unless you can actually browse to content at
You’ll never see any search results!
The easiest way to fix this is simply to create a new site collection in the root. If you’re doing this as part of TFS 2010, you’re best off making a Wiki Site, or at least NOT choosing a ‘Team’ Site, otherwise the Site Collection will moan about not being connected to a TFS 2010 Team Project Collection…
You might find that in TFS 2010 your builds fail whenever they attempt to GET a file over 2MB in size. This occurs if you’re using IIS7.5 (on Windows Server 2008 R2) and is in fact another IIS bug, not a TFS 2010 problem.
This issue occurs because the idle connection time-out for the HTTP service expires prematurely. This causes the network connection to disconnect unexpectedly. You will get a message through Visual Studio informing you which files failed to download. Usually with this error message:
Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host
To solve this issue, you’ll need to install a patch on one (or all) of your Application Tier servers. The patch and conditions for install can be found here.
Just a quick post about scaling out Application Tier servers in TFS 2010.
To scale out TFS 2010 and Windows SharePoint, you can follow the TFS guides, and it’s usually a fairly painless process as long as you follow all of the pre-requisites. Scaling out SQL Reporting Services does have a slightly bigger gotcha though.
Unless you’ve installed SQL Server Enterprise Edition, you cannot scale out your SQL RS, as you will simply get a message saying it’s an unsupported feature.
This can be something of a kick in the teeth if you’ve successfully scaled out everything else, and then fail at the last hurdle…
The good news is, SQL Server 2008 Standard can be very easily upgraded to Enterprise Edition without disrupting your SQL RS instance; it’s just a case of running the Enterprise install media and following the upgrade option.
This is an issue a lot of people have been seeing recently. Unfortunately, due to the massive number of different apps which sit atop IIS, it can manifest itself in a huge variety of ways. You might see the error through Windows SharePoint like this:
Error message when you try to upload a large file to a document library on a Windows SharePoint Services 3.0 site: “Request timed out”
Or you might see the error through Team Foundation Server, like this:
Microsoft Visual Studio
Attachment upload failed. Check that you have a network connection and that the Team Foundation Server is available. If the problem persists, contact your Team Foundation Server administrator.
Or any other number of errors from various applications.
The problem lies with IIS7’s maxAllowedContentLength. This is by default set to a value of 30000000 bytes (~28.6 MB).
To fix the issue, make the following change to your web.config file:
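A minimal sketch of the fragment (the element names are standard IIS7 request-filtering config; the 2000000000 figure is just an example byte limit of roughly 1.86 GB):

```xml
<configuration>
  <system.webServer>
    <security>
      <requestFiltering>
        <!-- maxAllowedContentLength is measured in bytes -->
        <requestLimits maxAllowedContentLength="2000000000" />
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>
```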
You can change the value of ‘2000000000’ to be whatever you want.
If you’re getting this message, you’re probably trying to run the stsadm command line utility, but you’ve not yet added the stsadm path to your environment variables.
stsadm is located here
c:\Program Files\common files\microsoft shared\web server extensions\12\bin
Along with some other SharePoint command line tools.
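A quick-and-dirty way to get it onto your machine-wide PATH is from an elevated command prompt (a sketch; setx /M writes the machine-level PATH, and the folder below assumes a default install location):

```bat
rem Append the SharePoint bin folder to the machine-wide PATH (elevated prompt)
setx PATH "%PATH%;C:\Program Files\Common Files\Microsoft Shared\web server extensions\12\BIN" /M
```

Open a new command prompt afterwards; already-open prompts won’t pick up the change.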
Scalability is often one of the biggest challenges you can face; when your website grows, it often grows *very* quickly. Producing systems and architecture that scale with your growth is usually spectacularly complex and expensive. When you’re talking about a site with 300 million users that’s really only about 5 years old, you have a pretty fascinating case study. Follow the link to hear Jeff Rothschild describe how Facebook meets these challenges in his presentation entitled High Performance and Massive Scale.
Key stats from the presentation about Facebook’s infrastructure/scale include:
- Facebook has 30,000 servers supporting its operations*
- Facebook stores 80 billion images (20 billion images, each in four sizes)
- Facebook serves up 600,000 photos a second
- Facebook generates 25 terabytes of logging data per day, which goes into a Hadoop cluster
- Facebook serves about 120 million queries to its Memcache per second
- Facebook has c. 230 Engineers, which is a ratio of roughly 1.1 million active users per Engineer
- Facebook operates a shared nothing architecture wherever possible
- Facebook’s development work on php, mysql, memcache and various others has virtually all been made open source
- Facebook sends over 1 billion outbound (transactional, typically notification) emails per day
* Changes daily. The bulk of these are webservers, needed to compensate for the low runtime efficiency of php.
One of the most interesting developments for anyone currently using php in a commercial environment is Facebook’s current development of a php compiler, which they estimate will give them a 50-70% increase in runtime efficiency. This might give them a small amount of breathing room in terms of current server requirements. More interestingly for the wider php community, if this compiler is made open source (which I’m absolutely sure it will be), it could give php a real boost in the popularity stakes.
Request Filtering in IIS7 is a nice feature allowing an excellent level of customisation at virtually every level of an HTTP request.
To enable request filtering in Win 2008, follow these steps:
- On the taskbar, click Start, point to Administrative Tools, and then click Server Manager.
- In the Server Manager hierarchy pane, expand Roles, and then click Web Server (IIS).
- In the Web Server (IIS) pane, scroll to the Role Services section, and then click Add Role Services.
- On the Select Role Services page of the Add Role Services Wizard, select Request Filtering, and then click Next.
Once installed, you set the various options through appcmd.exe. Or, if you prefer, you can install the IIS7 Admin Pack, which provides a GUI for easier configuration of this feature.
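For example, to deny requests for a particular file extension from the command line (a sketch using appcmd’s attribute syntax; ‘.bak’ is chosen arbitrarily here, and this needs an elevated prompt):

```bat
rem Deny any request for a .bak file via Request Filtering
%windir%\system32\inetsrv\appcmd.exe set config /section:requestFiltering /+"fileExtensions.[fileExtension='.bak',allowed='false']"
```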
A list of the various options can be found here:
A relatively little-used feature now, but still important if your HTML uses server side includes. Mainly used for templated html structures which pull various blocks of layout into the page, this feature is not enabled by default in IIS7.
To switch the feature on, firstly:
Control Panel > Programs and Features > Turn Windows Features On or Off
Next you need to enable the SSI feature as shown:
If you’re in Vista, the next time you go into the Handler Mappings in IIS you’ll see that .shtml, .shtm and .stm file types have been automatically added with the correctly configured handler. If you’re not using Vista, you may need to add these mappings manually through the Add Module Mapping section.
Should be relatively easily tested by setting a test .shtml page to call another section using the standard include:
<!--#include file="foo.html" -->
As long as the content of that file pulls through, you’ll know it’s working.
For anyone who regularly (or even irregularly) has to re-install operating systems in a business environment, you’ll know exactly how time consuming, dull and repetitive this can be. Thankfully there are products out there for making this process a little easier, and this one is even free!
One such product is called nlite (or vlite for vista products). It’s not a fully fledged imaging product, but instead allows you to take a huge amount of the hassle out of the re-installation process by bundling service packs, windows updates, drivers and pretty much anything else you might want into one neat iso. What this means for administrators who look after just a few different models of workstation is that you can build up and keep a library of up-to-date disc images, so that in the event of needing to re-install, you’ve got a disc on hand which will leave the machine in a much more ready state than a standard OS disc.
So, first of all, download nlite
Once installed, launch the App
Now, you’ll need to point the installation at your OS disc, for nlite to start to build its own image.
At this point, you’ll be prompted to choose from nlite’s various features; the most important (read as: useful) are
1. Slipstreaming Service packs and updates
2. Including Drivers
You can then customise pretty much every aspect of Windows, ripping out services, languages and all kinds of other features to save space, or for security. The real risk here is that you might rip something out that you’ll later need, so take care.
In the customisation section there are some really good features, such as being able to set up your default windows explorer views, customising your ‘start’ toolbar and many many more. One of my personal favourites is the ability to add the OS, Service Pack and version to the bottom right hand corner of your desktop background. Until I saw this in nlite, I didn’t even know that it existed in Windows!
Once you’ve finalised all of your options, you’ll get a page where you’re prompted to begin the build
nlite will then build you a bootable image which you simply write to disc with your favourite CD writing software, and you’re ready to install your machine.
Earlier I wrote about Creating ESXi snapshot backups with ghettoVCB.sh. Now, the next logical step is to be able to automate these snapshots so you don’t need to ssh to the ESXi host and run the script manually and wait for the result.
We can use plink, and Windows Scheduled tasks to achieve this result.
In order to use Plink, the file plink.exe will need to be either on your PATH or in your current directory. To add the directory containing plink.exe to your PATH environment variable:
Start -> Control Panel -> System -> Advanced -> Environment Variables -> System Variables -> Path -> Edit, and enter the path to your plink.exe file.
OK, so once plink is set up correctly, you can use the following command to connect to your ESXi host and set your script running with the appropriate ‘vmbackups’ file for the relevant machine.
E:\backup\putty\plink.exe firstname.lastname@example.org -pw password "nohup /vmfs/volumes/datastore1/scripts/ghettoVCB.sh /vmfs/volumes/datastore1/scripts/vmbackupsbuilder > /vmfs/volumes/datastore1/scripts/backuplog.txt &"
Picking the above command apart:
1. E:\backup\putty\plink.exe (the path to plink.exe)
2. email@example.com -pw password (the username, IP address and password of your ESXi Host)
3. "nohup /vmfs/volumes/datastore1/scripts/ghettoVCB.sh /vmfs/volumes/datastore1/scripts/vmbackupsbuilder > /vmfs/volumes/datastore1/scripts/backuplog.txt &" (the path to your ghettoVCB.sh file, the path to your vmbackups file, and a command to log the output)
We also run this as a ‘nohup’ command so that the snapshot can continue without anyone being continuously logged into the host.
You probably want to run this without the nohup as a first test, to make sure everything is working OK.
Once you’re happy with the command and it runs successfully for you, you can simply add it to a .bat file, and then use the standard Windows Task Scheduler to run it as frequently as you wish.
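As a sketch, the .bat file and the scheduling step might look like this (host name, paths, credentials and times below are examples only):

```bat
rem backup-vms.bat - fire off the ESXi snapshot script over SSH
E:\backup\putty\plink.exe root@esxihost -pw password "nohup /vmfs/volumes/datastore1/scripts/ghettoVCB.sh /vmfs/volumes/datastore1/scripts/vmbackups > /vmfs/volumes/datastore1/scripts/backuplog.txt &"

rem Then, from an elevated prompt, schedule it nightly at 01:00
schtasks /create /tn "ESXi VM backups" /tr "E:\backup\backup-vms.bat" /sc daily /st 01:00
```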
I’ve seen a lot of software meant for managing the performance of windows servers, obviously a lot of those tools are extremely specific (for services like SQL and Exchange etc) but for standard windows machines absolutely nothing I’ve seen matches the amazing GUI of Spotlight on Windows. Initially the GUI looks like it’s trying too hard, but actually it’s an amazing blend of static and real-time info. Here’s how it looks
In terms of content, it offers pretty much everything you’d expect:
CPU queue length
Virtual Memory usage
Page file usage
Fixed Disk usage
It also shows the various buses between these objects and the pages/sec moving around the motherboard; this feature is sorely missing from many other management products, and it’s often key to determining what’s going on with your server. Each monitor has a helpful explanation in case you’re feeling a little knowledge-light.
My advice for this product would be as follows.
1. Download and Install from the Quest Software site (it’s free!)
2. Set-up your connections (File -> Connect)
3. Ensure Spotlight successfully connects and choose a 6 hour calibration period.
This last step is the most important; ideally you want to let Spotlight gather data from the machine over a time-span where it’s under some load, i.e. a representative snapshot of its daily load. Otherwise you’ll spend a lot more time in future customising the alert levels for usage which you know is “normal”.
Once all your connections are calibrated, you’ll have a management tool which can tell you more about a server’s performance in a single glance than you’d think possible.
Thanks to Quest Software.
For those of us who work with IBM server hardware, there’s a baffling array of RAID cards which may or may not fit into your server’s architecture. Rather than having to locate and flick through the technical manual for each one, the following page is a wonderfully comprehensive guide to virtually every IBM RAID card ever, with a matrix at the foot of the page showing which cards are compatible with which servers.
Fantastic resource!
Windows Server 2008 doesn’t have the Disk Cleanup option switched on by default. In order to enable it, you’ll need to install the ‘Desktop Experience’, which actually contains rather a lot of other stuff. I’m not quite sure why Microsoft have bundled so much into ‘Desktop Experience’ when 99% of people will just want the Disk Cleanup option; just another MS idea which *could* have been good, but isn’t quite.
To enable the ‘Desktop Experience’
Step 1 – Open the Server Manager (Start -> Administrative Tools -> Server Manager)
Step 2 – In the Features Summary section, click on Add Feature.
Step 3 – Select the Desktop Experience feature and click through the various ‘Next’ steps to install.
Step 4 – Reboot your machine and then you should find the Disk Cleanup option now appears in the usual place. (Right click on disk name -> properties)
That’s it; you’ll now be able to delete the swathes of junk that Windows builds up, easily and from one place. If you’re going to de-frag a disk, always run this first and get rid of as much junk as you can.
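Once it’s installed, Disk Cleanup can also be driven from the command line, which is handy for repeat runs (a sketch; the profile number 1 is arbitrary):

```bat
rem Pick the cleanup options once and save them under profile 1...
cleanmgr /sageset:1
rem ...then re-run those saved options whenever you like
cleanmgr /sagerun:1
```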
A lot of Sys Admins use the RDP Snap-in, on either Windows XP or Windows Server 2003, because it makes it far easier to manage multiple machines. Unfortunately, this feature isn’t available in any version of Windows Vista.
However, due to the similarities between 2003 and Vista, you can get it working, with a bit of tinkering, here’s how.
Step 1 – Download the adminpak.msi for Windows Server 2003 and install it on your Vista machine. Both
can be installed onto Vista without error. Unfortunately, you might run into one or two error messages if you use Active Directory Users and Computers on your Vista machine. So there’s more to be done.
Step 2 – Open an elevated permission command prompt (right click on ‘Cmd’ and choose ‘Run as Administrator’) and register the following DLLs:
regsvr32 /s adprop.dll
regsvr32 /s azroles.dll
regsvr32 /s azroleui.dll
regsvr32 /s ccfg95.dll
regsvr32 /s certadm.dll
regsvr32 /s certmmc.dll
regsvr32 /s certpdef.dll
regsvr32 /s certtmpl.dll
regsvr32 /s certxds.dll
regsvr32 /s cladmwiz.dll
regsvr32 /s clcfgsrv.dll
regsvr32 /s clnetrex.dll
regsvr32 /s cluadmex.dll
regsvr32 /s cluadmmc.dll
regsvr32 /s cmproxy.dll
regsvr32 /s cmroute.dll
regsvr32 /s cmutoa.dll
regsvr32 /s cnet16.dll
regsvr32 /s debugex.dll
regsvr32 /s dfscore.dll
regsvr32 /s dfsgui.dll
regsvr32 /s dhcpsnap.dll
regsvr32 /s dnsmgr.dll
regsvr32 /s domadmin.dll
regsvr32 /s dsadmin.dll
regsvr32 /s dsuiwiz.dll
regsvr32 /s imadmui.dll
regsvr32 /s lrwizdll.dll
regsvr32 /s mprsnap.dll
regsvr32 /s msclus.dll
regsvr32 /s mstsmhst.dll
regsvr32 /s mstsmmc.dll
regsvr32 /s nntpadm.dll
regsvr32 /s nntpapi.dll
regsvr32 /s nntpsnap.dll
regsvr32 /s ntdsbsrv.dll
regsvr32 /s ntfrsapi.dll
regsvr32 /s rasuser.dll
regsvr32 /s rigpsnap.dll
regsvr32 /s rsadmin.dll
regsvr32 /s rscommon.dll
regsvr32 /s rsconn.dll
regsvr32 /s rsengps.dll
regsvr32 /s rsjob.dll
regsvr32 /s rsservps.dll
regsvr32 /s rsshell.dll
regsvr32 /s rssubps.dll
regsvr32 /s rtrfiltr.dll
regsvr32 /s schmmgmt.dll
regsvr32 /s tapisnap.dll
regsvr32 /s tsuserex.dll
regsvr32 /s uddi.mmc.dll
regsvr32 /s vsstskex.dll
regsvr32 /s w95inf16.dll
regsvr32 /s w95inf32.dll
regsvr32 /s winsevnt.dll
regsvr32 /s winsmon.dll
regsvr32 /s winsrpc.dll
regsvr32 /s winssnap.dll
regsvr32 /s ws03res.dll
Step 3 – At this point a lot of people have had mixed success in attempting to add ‘Remote Desktops’ through the MMC Snap-in; it just doesn’t appear in there for some people. The easiest way around it is simply to add a shortcut to the tsmmc.msc file. I’ve added this to my Desktop, and it can be launched without elevated permissions to bring up the Remote Management screen.
Step 4 – Now it’s just a question of adding your connections and you’re away.
Enjoy.
There’s a lot of people searching and posting about the VMWare Infrastructure Client that’s used for managing your ESXi host; people don’t seem to be able to find it for download on the web.
There’s a good reason for this: it’s not widely available online! To install it, simply visit the IP address of your ESXi host in your favourite internet browser, and you can download the Infrastructure Client from the ESXi host directly.
Hope this helps some of the confused!