


Using session cookies captured from the embedded browser is the easiest login method to use since it requires the least configuration. Each time you want to initiate the website scan, do the following:

1. In Scan website | Crawler engine select HTTP using Windows API WinInet.
2. Fill Scan website | Paths | Website domain address first as it makes the next step easier.
3. In Scan website | Crawler login click the button Open embedded browser and login before crawl.
4. Navigate to the login section of the website and login like you normally would.
5. Depending on the program version: click the button Copy session cookies if available.
6. You can now close the embedded browser window.

This combination will ensure that A1 Sitemap Generator has access to all cookies transferred during the login.

Note if you did not use the normal installer: when using Internet Explorer in embedded mode, it will by Microsoft design default to behave as an older browser version. This can cause problems with a few websites.
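To make the effect of this method concrete, here is a minimal sketch, assuming a Python environment with the requests library, of what reusing a browser session cookie looks like at the HTTP level. The domain, cookie name, and cookie value are placeholders, and this is not how A1 Sitemap Generator itself is implemented; it only illustrates why handing the crawler the login cookies lets it reach pages that require a logged-in user.

```python
# Minimal sketch: reuse a session cookie captured from a browser login
# so later requests are treated as coming from the logged-in user.
# "example.com", "PHPSESSID" and the cookie value are placeholders.
import requests

session = requests.Session()

# Set the cookie exactly as it appeared after logging in via the browser.
session.cookies.set("PHPSESSID", "abc123sessionvalue", domain="example.com")

# Requests made through this session now send the cookie automatically,
# so pages behind the login can be fetched and crawled.
response = session.get("https://example.com/members/")
print(response.status_code)
```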

Website Login Method: Protocol Based Login and Authentication Methods

Historically, POST form login has been tested most with the HTTP using Indy engine for internet and localhost option in Scan website | Crawler engine | Default path type and handler.

To use this solution, you will need to understand what data is passed when you login to a website, so you can configure A1 Sitemap Generator to send the same. You can use a FireFox plugin called Live HTTP Headers to see the headers transferred during the login process:

1. Clear all HTTP headers already collected.
2. Try to make a website login in the FireFox browser.
3. Now focus on the logged HTTP header data from the first entry / page.
4. Notice the website address FireFox connects to.
5. Notice the content (POST data query string) it sends.
6. Use this data to configure the headers to send.

Having done that, you just copy-and-paste the appropriate values into the A1 Sitemap Generator login configuration. If you are looking for an alternative to FireFox Live HTTP Headers, you can check out Fiddler (for Internet Explorer) and WireShark (a general tool).

Website Login - Post Form / Session Cookies: Details and Demo Project

We have created a demo project that tests crawler login support for websites that use session cookies. Session cookies are the most commonly used method for website login systems.
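To illustrate the data involved in the post form / session cookies approach, here is a minimal sketch, again assuming Python with the requests library, of a POST form login followed by a request that reuses the resulting session cookie. The login URL and form field names are placeholders; in practice you would substitute the address and POST data query string captured with Live HTTP Headers (or Fiddler / WireShark). It only mirrors the protocol exchange and is not A1 Sitemap Generator's own implementation.

```python
# Minimal sketch: replay the POST data a browser sends during login,
# keep the session cookie the server returns, then fetch a protected page.
# The URL and form field names are placeholders taken from a captured login.
import requests

session = requests.Session()

login_url = "https://example.com/login.php"   # address the browser posts to
post_data = {                                 # POST data query string as key/value pairs
    "username": "myuser",
    "password": "mypassword",
    "remember": "1",
}

# Perform the login; any cookies set in the response are stored in the session.
login_response = session.post(login_url, data=post_data)
login_response.raise_for_status()

# Later requests reuse those cookies, which is what allows a crawl of
# pages that are only visible to logged-in users.
protected = session.get("https://example.com/members/")
print(protected.status_code)
```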
