Links.  Pesky things.  Yet necessary.  You do your best to cultivate quality links, but what happens when poor quality links point to you?  Links happen, right?  So how can you protect yourself from malicious link building?  How do you tell the engine you just don't want to associate your content with "that" site?
Over the past few months we’ve been hearing from the industry that webmasters want more control and the ability to tell us they don't trust a link pointing at their content.  While this is not a new conversation, our latest redesign work has enabled us to take action on this topic.

Today we’re announcing the Disavow Links feature in Bing Webmaster Tools.  Use the Disavow Links tool to submit page, directory, or domain URLs that may contain links to your site that seem "unnatural" or appear to be from spam or low quality sites.  This new feature can be easily found in the Configure Your Site section of the navigation.

With the Disavow Links tool, you can quickly and easily alert Bing about links you don’t trust.  Using the drop-down, you can submit signals to us at the page level, the directory level, or the domain level.  We’ll note the “type” of location (page, directory, or domain) and the date you told us of the action.  You can also easily export the list as you build it over time.

These signals help us understand when you find links pointing to your content that you want to distance yourself from for any reason.  You should not expect a dramatic change in your rankings as a result of using this tool, but the information shared does help Bing understand more clearly your intent around links pointing to your site.

There is no limit on the number of links you can disavow via this tool.
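As a hypothetical illustration (the URLs below are made up, not real sites), the three scopes described above might look like this:

```
Page:      http://spam-site.example/links/page1.html
Directory: http://spam-site.example/links/
Domain:    spam-site.example
```

A page entry disavows a single URL, a directory entry covers everything beneath that path, and a domain entry covers the whole site.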

SEO Analyzer updates
We've made a couple of important updates to this just-released tool, with the goal of greater flexibility and usefulness.
  1. If you use the SEO Analyzer to fetch a webpage from your site and do a real-time scan for best-practices compliance, it's important to know that this action will ignore any robots.txt directives you have in place.  We had people saying the system was timing out when trying to scan their pages, and in every instance we noted the content they tried to scan was blocked by a robots.txt notation.  This change affects only SEO Analyzer actions.  Bingbot still follows your robots.txt directives as normal.  If you want to test a robots.txt compliance command, you can use the Fetch as Bingbot feature, as we've previously noted.
  2. SEO Analyzer can now follow redirects on your site.  So if you give it an active URL, the tool will follow the redirect and render its report based on the first page it reaches that returns a 200 OK header response.  As a bonus, you can see each hop in the redirect chain via a handy Show/Hide link, enabling you to see each URL and response code along the way (outlined in orange below).  The tool will follow up to 6 redirects. (Really, you shouldn't have that many anyway. ;) )
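To expand on the first update above: a robots.txt rule like the hypothetical sketch below (the `/private/` path is made up for illustration) would normally keep Bingbot out of a directory, but the SEO Analyzer will now scan pages under it anyway, so a scan that used to appear to time out will succeed.

```
User-agent: bingbot
Disallow: /private/
```

Bingbot itself continues to honor rules like this during normal crawling; only the on-demand SEO Analyzer scan bypasses them.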
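The redirect-following behavior in the second update can be sketched roughly as follows.  This is a toy Python illustration of the logic, not the tool's actual implementation: it uses a made-up mapping of URLs to responses instead of real HTTP, follows up to 6 hops, records each URL and status code along the way (the "chain" the Show/Hide link reveals), and stops at the first 200 OK.

```python
MAX_HOPS = 6  # the tool follows at most 6 redirects

def follow_redirects(url, responses):
    """Follow a redirect chain, recording (url, status) for each hop.

    `responses` is a toy stand-in for real HTTP: it maps each URL to a
    (status_code, location_or_None) pair.  Stops at the first 200 OK,
    or gives up after MAX_HOPS redirects.
    """
    chain = []
    for _ in range(MAX_HOPS + 1):
        status, location = responses[url]
        chain.append((url, status))
        if status == 200 or location is None:
            break  # reached a final page (or a dead end)
        url = location
    return chain

# Hypothetical two-redirect chain ending in a 200 OK.
hops = {
    "http://example.com/old":     (301, "http://example.com/interim"),
    "http://example.com/interim": (302, "http://example.com/new"),
    "http://example.com/new":     (200, None),
}
print(follow_redirects("http://example.com/old", hops))
# → [('http://example.com/old', 301), ('http://example.com/interim', 302), ('http://example.com/new', 200)]
```

The report would then be rendered against the last URL in the chain, the one that answered 200 OK.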