IIS Search Engine Optimization Toolkit
Add/remove rules to robots.txt programmatically
Last post Oct 02, 2012 10:05 AM by Jagermeister
Sep 11, 2012 08:10 AM|Jagermeister|LINK
I would like to know if it is possible to add or remove rules in robots.txt programmatically from ASP.NET. Is there any API to do this? I don't need anything related to site analysis or sitemaps, just simple Allow/Disallow rules in robots.txt.
I can see that going into IIS, clicking 'SEO', and adding a new rule to exclude a site is quite easy, but I need a way to do that programmatically. I would also like to avoid opening the robots.txt file and parsing it myself.
Sep 11, 2012 08:15 AM|fab777|LINK
Have a look at the StreamWriter class: http://msdn.microsoft.com/en-us/library/system.io.streamwriter.aspx
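To illustrate the suggestion: a minimal sketch of appending a Disallow rule to robots.txt with StreamWriter. The file path and rule text are placeholders; in an ASP.NET application you would typically resolve the path with Server.MapPath("~/robots.txt"):

```csharp
using System.IO;

class RobotsAppendExample
{
    static void Main()
    {
        // Placeholder path to robots.txt in the site root; in ASP.NET this
        // would usually come from Server.MapPath("~/robots.txt").
        string path = @"C:\inetpub\wwwroot\robots.txt";

        // Passing 'true' opens the file in append mode, so the new rule
        // is added after any existing content.
        using (StreamWriter writer = new StreamWriter(path, true))
        {
            writer.WriteLine("Disallow: /private/");
        }
    }
}
```

Note that this only appends; it does not check whether the rule already exists or which User-agent section it lands in.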
Sep 14, 2012 03:59 AM|fab777|LINK
Hi, did it fit your needs?
Oct 02, 2012 10:05 AM|Jagermeister|LINK
Thank you for your reply, and sorry for the late response. I had email subscription enabled, but the notification went to my spam folder, so I thought no one had replied :/
Using the StreamWriter class would mean editing the file directly, which is precisely what I would like to avoid.
Since this action can be performed through the IIS user interface, I had assumed there would be an API to perform it programmatically, but after going through the documentation and forums I'm starting to think there is none.
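If a dedicated API really does not exist, removing a rule would mean reading and rewriting the file, which is the approach the original question hoped to avoid. For completeness, a minimal sketch of that fallback (the path and rule text are placeholders, and this does a simple line comparison rather than full robots.txt parsing):

```csharp
using System;
using System.IO;
using System.Linq;

class RobotsRemoveExample
{
    static void Main()
    {
        // Placeholder path and rule; adjust for the actual site.
        string path = @"C:\inetpub\wwwroot\robots.txt";
        string ruleToRemove = "Disallow: /private/";

        // Read all lines, drop any line matching the rule (ignoring
        // surrounding whitespace and case), and rewrite the file.
        string[] remaining = File.ReadAllLines(path)
            .Where(line => !line.Trim().Equals(
                ruleToRemove, StringComparison.OrdinalIgnoreCase))
            .ToArray();

        File.WriteAllLines(path, remaining);
    }
}
```

A simple line match like this ignores robots.txt structure (User-agent groups, comments), so it is only a starting point, not a robust editor.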