How to Work Around the Visual Studio 2010 Publish Web Site via FTP Bug

by APIJunkie 5. October 2010 08:03

After years of deploying web site files using custom FTP scripts, Remote Desktop file transfer, and the like, we decided to give the VS 2010 Publish Web FTP feature a try.

It worked flawlessly the first couple of times until we started receiving errors on the staging/deployment server that were not reproducible on the development machines.

After doing some digging around, we found out that the web publish process was the culprit.

It turns out some files were not being replaced/updated when their versions changed on the development machines.

In our case they were web server controls sitting in one of the web site's subfolders.

For some reason the web publish process did not detect the changes, even though earlier versions of the same files had been deployed correctly. There appears to be a bug in the file change detection algorithm inside the FTP publish web process.

To solve the problem we had to force the publish web process to detect changes by deleting all the old files before deploying a new version.

To do this, make sure you check the “Delete All Existing Files Prior to Publish” option in the Publish Web dialog's FTP publish options.

It makes deployment slower, but it guarantees you get the latest version of every file each time you publish.
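If you deploy with your own FTP script rather than the dialog, the same "delete everything first" effect can be approximated before re-uploading. Below is a minimal Python sketch using the standard `ftplib`; the function name `delete_remote_tree` and the assumption that `DELE` on a directory raises a permanent (5xx) error are illustrative choices, not anything Visual Studio itself does.

```python
from ftplib import FTP, error_perm

def delete_remote_tree(ftp, path):
    """Recursively delete everything under `path` on the server.

    Approximates the "Delete All Existing Files Prior to Publish"
    option for script-based deployments. `ftp` is a connected
    ftplib.FTP instance; the root folder itself is left in place.
    Assumes DELE on a directory fails with a permanent error.
    """
    for name in ftp.nlst(path):
        # Some servers return bare names, others full paths.
        entry = name if name.startswith(path) else f"{path}/{name}"
        try:
            ftp.delete(entry)               # plain file: just remove it
        except error_perm:
            delete_remote_tree(ftp, entry)  # directory: empty it first...
            ftp.rmd(entry)                  # ...then remove the folder
```

With a live connection you would do something like `ftp = FTP(host); ftp.login(user, pwd); delete_remote_tree(ftp, "/wwwroot")` before uploading the fresh build.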

Good luck!


ASP.NET | IIS | Web Development

