How to supply a user ID and password / Blocked by robots.txt


Hi, I used the spider to collect 10,000 URLs with interesting information from a paid site where I have a user ID and a password. When I use XPathOnUrl, it returns some answers, but both the email and the phone number are replaced by "access for user". How can I get the email and the phone number, when I already have the address and other non-strategic info?

Is there a place where I could provide the user ID and the password?

Or is it impossible ?

Thanks.


Some sites accept login credentials in HTTP headers, which can be added via Global HTTP Settings or custom HTTP settings for XPathOnUrl.

I would use the Inspector tools in your browser (the Network tab of the developer tools) to see how your credentials are being passed to this site, then replicate those headers in the tool's HTTP settings.
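If you want to check the header approach outside the tool first, here is a minimal sketch in Python. It assumes the site accepts HTTP Basic auth (many paid sites instead use a session cookie, which you would copy from the Network tab the same way); the URL and credentials are placeholders.

```python
import base64
import urllib.request

# Placeholder credentials -- substitute your own for the paid site.
user, password = "myuser", "mypassword"

# HTTP Basic auth is a base64-encoded "user:password" string
# sent in the Authorization header.
token = base64.b64encode(f"{user}:{password}".encode()).decode()
auth_header = f"Basic {token}"

# The same line can be pasted into the tool's HTTP settings as:
#   Authorization: Basic <token>
req = urllib.request.Request(
    "https://example.com/listing/123",  # placeholder URL
    headers={"Authorization": auth_header},
)
# urllib.request.urlopen(req)  # uncomment to actually fetch the page
```

If the fetched page now shows the email and phone instead of "access for user", the same `Authorization` header (or the cookie you observed) is what belongs in Global HTTP Settings.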