I'm attempting to set the request header 'Referer' to spoof a request coming from another site. We need the ability to test that when a specific referrer is used, a specific form is returned to the user; otherwise an alternative form is given.
I can do this within poltergeist by:
page.driver.headers = {"Referer" => referer_string}
but I can't find the equivalent functionality for the Selenium driver.
How can I set request headers in the capybara selenium driver?
WebDriver doesn't contain an API to do it. See issue 141 in the Selenium tracker for more info. The title of that issue says it's about response headers, but it was decided that Selenium won't contain an API for request headers within the scope of that issue. Several issues about adding an API to set request headers have been marked as duplicates: first, second, third.
Here are a few possibilities I can propose:
Use another driver/library instead of Selenium.
Write a browser-specific plugin (or find an existing one) that allows you to add headers to requests.
Use browsermob-proxy or some other proxy.
I'd go with option 3 in most cases; it's not hard (see the sketch below the note).
Note that Ghostdriver has an API for it but it's not supported by other drivers.
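For illustration, here is a minimal sketch of option 3 using the browsermob-proxy Python client; the binary path, site URLs, and Referer value are placeholders you would adapt to your setup:
from browsermobproxy import Server
from selenium import webdriver

# Start BrowserMob Proxy (path to the downloaded binary is a placeholder)
server = Server("/path/to/browsermob-proxy/bin/browsermob-proxy")
server.start()
proxy = server.create_proxy()

# Override the Referer header on every request routed through the proxy
proxy.headers({"Referer": "http://the-other-site.example/"})

# Point the browser at the proxy
options = webdriver.ChromeOptions()
options.add_argument("--proxy-server={}".format(proxy.proxy))
driver = webdriver.Chrome(options=options)

driver.get("http://site-under-test.example/")
# ... assertions ...
driver.quit()
server.stop()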
For those using Python, you may consider Selenium Wire, which can set request headers as well as give you the ability to inspect requests and responses.
from seleniumwire import webdriver # Import from seleniumwire
# Create a new instance of the Chrome driver (or Firefox)
driver = webdriver.Chrome()
# Create a request interceptor
def interceptor(request):
del request.headers['Referer'] # Delete the header first
request.headers['Referer'] = 'some_referer'
# Set the interceptor on the driver
driver.request_interceptor = interceptor
# All requests will now use 'some_referer' for the referer
driver.get('https://mysite')
Install with:
pip install selenium-wire
I had the same issue. I solved it by downloading the Modify Headers Firefox add-on and activating it with Selenium.
The code in Python is the following:
fp = webdriver.FirefoxProfile()
path_modify_header = 'C:/xxxxxxx/modify_headers-0.7.1.1-fx.xpi'
fp.add_extension(path_modify_header)
fp.set_preference("modifyheaders.headers.count", 1)
fp.set_preference("modifyheaders.headers.action0", "Add")
fp.set_preference("modifyheaders.headers.name0", "Name_of_header") # Set here the name of the header
fp.set_preference("modifyheaders.headers.value0", "value_of_header") # Set here the value of the header
fp.set_preference("modifyheaders.headers.enabled0", True)
fp.set_preference("modifyheaders.config.active", True)
fp.set_preference("modifyheaders.config.alwaysOn", True)
driver = webdriver.Firefox(firefox_profile=fp)
Had the same issue today, except that I needed to set different referer per test. I ended up using a middleware and a class to pass headers to it. Thought I'd share (or maybe there's a cleaner solution?):
lib/request_headers.rb:
class CustomHeadersHelper
cattr_accessor :headers
end
class RequestHeaders
def initialize(app, helper = nil)
@app, @helper = app, helper
end
def call(env)
if @helper
headers = @helper.headers
if headers.is_a?(Hash)
headers.each do |k,v|
env["HTTP_#{k.upcase.gsub("-", "_")}"] = v
end
end
end
@app.call(env)
end
end
config/initializers/middleware.rb
require 'request_headers'
if %w(test cucumber).include?(Rails.env)
Rails.application.config.middleware.insert_before Rack::Lock, "RequestHeaders", CustomHeadersHelper
end
spec/support/capybara_headers.rb
require 'request_headers'
module CapybaraHeaderHelpers
shared_context "navigating within the site" do
before(:each) { add_headers("Referer" => Capybara.app_host + "/") }
end
def add_headers(custom_headers)
if Capybara.current_driver == :rack_test
custom_headers.each do |name, value|
page.driver.browser.header(name, value)
end
else
CustomHeadersHelper.headers = custom_headers
end
end
end
spec/spec_helper.rb
...
config.include CapybaraHeaderHelpers
Then I can include the shared context wherever I need it, or pass different headers in another before block. I haven't tested it with anything other than Selenium and RackTest, but it should be transparent, as header injection happens before the request actually hits the application.
I wanted something a bit slimmer for RSpec/Ruby so that the custom code only had to live in one place. Here's my solution:
/spec/support/selenium.rb
...
RSpec.configure do |config|
config.after(:suite) do
$custom_headers = nil
end
end
module RequestWithExtraHeaders
def headers
$custom_headers.each do |key, value|
self.set_header "HTTP_#{key}", value
end if $custom_headers
super
end
end
class ActionDispatch::Request
prepend RequestWithExtraHeaders
end
Then in my specs:
/specs/features/something_spec.rb
...
$custom_headers = {"Referer" => referer_string}
If you are using JavaScript and only need to support Chrome, Puppeteer is the best option, as it has native support for modifying headers.
Check this out: https://pptr.dev/#?product=Puppeteer&version=v10.1.0&show=api-pagesetextrahttpheadersheaders
Although for cross-browser usage you might check out the @requestly/selenium npm package. It is a wrapper around the Requestly extension that enables easy integration in selenium-webdriver. The extension can modify headers.
Check out: https://www.npmjs.com/package/@requestly/selenium
It's true that you cannot set request headers directly through the WebDriver API.
However, you can work around this by using the browser DevTools protocol (I tested with Edge and Chrome), and this works perfectly.
According to the documentation, you have the possibility to add custom headers:
https://chromedevtools.github.io/devtools-protocol/tot/Network/
Please find below an example.
[Test]
public async Task AuthenticatedRequest()
{
await LogMessage("=== starting the test ===");
EdgeOptions options = new EdgeOptions {UseChromium = true};
options.AddArgument("no-sandbox");
var driver = new RemoteWebDriver(new Uri(_testsSettings.GridUrl), options.ToCapabilities(), TimeSpan.FromMinutes(3));
//Get DevTools
IDevTools devTools = driver;
//DevTools Session
var session = devTools.GetDevToolsSession();
var devToolsSession = session.GetVersionSpecificDomains<DevToolsSessionDomains>();
await devToolsSession.Network.Enable(new Network.EnableCommandSettings());
var extraHeader = new Network.Headers();
var data = await Base64KerberosTicket();
var headerValue = $"Negotiate {data}";
await LogMessage($"header values is {headerValue}");
extraHeader.Add("Authorization", headerValue);
await devToolsSession.Network.SetExtraHTTPHeaders(new Network.SetExtraHTTPHeadersCommandSettings
{
Headers = extraHeader
});
driver.Url = _testsSettings.TestUrl;
driver.Navigate();
driver.Quit();
await LogMessage("=== ending the test ===");
}
This example is written in C#, but the same should work in Java, Python, and the other major platforms; a rough Python sketch follows.
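The Python version is an assumption on my part, using Selenium 4's execute_cdp_cmd (which only works with Chromium-based drivers) and a placeholder token:
from selenium import webdriver

driver = webdriver.Chrome()

# Enable the Network domain, then attach the extra header via the DevTools protocol
driver.execute_cdp_cmd("Network.enable", {})
driver.execute_cdp_cmd(
    "Network.setExtraHTTPHeaders",
    {"headers": {"Authorization": "Negotiate <your-token-here>"}}
)

driver.get("https://your-test-url.example")
driver.quit()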
Hope it helps the community.
If you use the HtmlUnitDriver, you can set request headers by modifying the WebClient, like so:
final case class Header(name: String, value: String)
final class HtmlUnitDriverWithHeaders(headers: Seq[Header]) extends HtmlUnitDriver {
super.modifyWebClient {
val client = super.getWebClient
headers.foreach(h => client.addRequestHeader(h.name, h.value))
client
}
}
The headers will then be on all requests made by the web browser.
Of the solutions already discussed above, the most reliable one is Browsermob-Proxy.
But when working with a remote grid machine, Browsermob-Proxy isn't really helpful.
This is how I fixed the problem in my case. Hopefully it will be helpful for anyone with a similar setup.
Add the ModHeader extension to the Chrome browser.
How to download ModHeader? Link
ChromeOptions options = new ChromeOptions();
options.addExtensions(new File("C://Downloads//modheader//modheader.crx"));
// Set the desired capabilities with the ChromeOptions
DesiredCapabilities capabilities = new DesiredCapabilities();
capabilities.setCapability(ChromeOptions.CAPABILITY, options);
// Instantiate the remote driver with the capabilities
WebDriver driver = new RemoteWebDriver(new URL(YOUR_HUB_URL), capabilities);
Go to the browser extensions and capture the Local Storage context ID of the ModHeader
Navigate to the URL of the ModHeader to set the Local Storage Context
// set the context on the extension so the localStorage can be accessed
driver.get("chrome-extension://idgpnmonknjnojddfkpgkljpfnnfcklj/_generated_background_page.html");
Where `idgpnmonknjnojddfkpgkljpfnnfcklj` is the value captured in step 2.
Now add the headers to the request using JavaScript:
((JavascriptExecutor) driver).executeScript(
    "localStorage.setItem('profiles', JSON.stringify([{" +
    "  title: 'Selenium', hideComment: true, appendMode: ''," +
    "  headers: [" +
    "    {enabled: true, name: 'token-1', value: 'value-1', comment: ''}," +
    "    {enabled: true, name: 'token-2', value: 'value-2', comment: ''}" +
    "  ]," +
    "  respHeaders: []," +
    "  filters: []" +
    "}]));");
Where token-1, value-1, token-2, value-2 are the request header names and values to be added.
Now navigate to the required web application:
driver.get("your-desired-website");
You can do it with PhantomJSDriver.
PhantomJSDriver pd = ((PhantomJSDriver) ((WebDriverFacade) getDriver()).getProxiedDriver());
pd.executePhantomJS(
"this.onResourceRequested = function(request, net) {" +
" net.setHeader('header-name', 'header-value')" +
"};");
Using the request object, you can also filter, so that the header isn't set for every request.
If you just need to set the User-Agent header, there is an option for Chrome:
from selenium import webdriver
from selenium.webdriver.chrome.options import Options
chrome_options = Options()
chrome_options.add_argument('--headless')
chrome_options.add_argument('user-agent=Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/73.0.3683.103 Safari/537.36')
driver = webdriver.Chrome(options=chrome_options)
Now the browser sends the custom User-Agent header.
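To confirm the override took effect (this assumes the driver created above), the following should print the custom string:
print(driver.execute_script("return navigator.userAgent"))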
Related
I am able to successfully send requests to a sandbox via Postman, given by a provider following their specs (see images below).
Successful request (see below).
In order to do that, aside from the respective headers and parameters (see image 2), I have to add an SSL/TLS certificate (.pfx), given that the server requires a two-way handshake and therefore needs an SSL client certificate:
Authorization (see below).
Headers (see below).
Body (see below).
Now, I am trying to do it programmatically using .NET Core 6, but I keep running into the same problem:
And here is my code:
public static string GetAccessToken(IConfiguration _config)
{
string UserName = Environment.GetEnvironmentVariable("USER_NAME");
string Password = Environment.GetEnvironmentVariable("PASSWORD");
var client = new RestClient("https://connect2.xyz.com/auth/token");
X509Certificate2 FullChainCertificate = new X509Certificate2("Path/to/Cert/cert.pfx", "test");
client.Options.ClientCertificates = new X509CertificateCollection() { FullChainCertificate };
client.Options.Proxy = new WebProxy("connect2.xyz.com");
var restrequest = new RestRequest();
restrequest.Method = Method.Get;
restrequest.AddHeader("Accept", "*/*");
restrequest.AddHeader("Cache-Control", "no-cache");
restrequest.AddHeader("Content-Type", "application/x-www-form-urlencoded");
restrequest.AddHeader("Authorization", "Basic " + Convert.ToBase64String(Encoding.Default.GetBytes($"{UserName}:{Password}")));
restrequest.AddParameter("grant_type", "client_credentials");
RestResponse response = client.Execute(restrequest);
AccessTokenPointClickCare accessToken = JsonConvert.DeserializeObject<AccessTokenPointClickCare>(response.Content);
string strToken = accessToken.access_token;
return strToken;
}
Now, as the error seems to show, it has to do with the certificates (apparently), but I don't know if something in the code is wrong or if I'm missing something.
It is worth noting that this code did run on someone else's PC with the same setup, but of course with that person's own .pfx; for the rest it is essentially the same, and it does work in my Postman.
Finally, as the title of this question suggests, the only other thing I can think might be affecting the request is the host. In Postman there is a field where I have to place the host name of the server: https://connect2.xyz.com/auth/token
I made it work by switching to a new Windows 10 machine. Researching other Stack Overflow threads, I found the answer: .NET Core 5 "HandshakeFailure" when making HTTPS request
So I conclude it has to do with the ciphers.
I'm stuck on a problem where I receive this error every time I make a POST request to my actix-web server:
CORS header 'Access-Control-Allow-Origin' missing
My JavaScript (Vue.js running on localhost:3000):
let data = //some json data
let xhr = new XMLHttpRequest();
xhr.open("POST", "http://localhost:8080/abc");
xhr.setRequestHeader("Content-Type", "application/json");
xhr.onload = () => {
console.log(xhr.responseText);
}
xhr.send(JSON.stringify(data));
My actix-web server (running on localhost:8080):
use actix_cors::Cors;
use actix_web::{App, HttpServer};

#[actix_web::main]
async fn main() {
HttpServer::new(move || {
let cors = Cors::default()
.allowed_origin("http://localhost:3000/")
.allowed_methods(vec!["GET", "POST"])
.allowed_header(actix_web::http::header::ACCEPT)
.allowed_header(actix_web::http::header::CONTENT_TYPE)
.max_age(3600);
App::new()
.wrap(cors)
.service(myfunc)
})
.bind(("0.0.0.0", 8080))
.unwrap()
.run()
.await
.unwrap();
}
My Cargo.toml dependencies:
[dependencies]
actix-web = "4"
actix-cors = "0.6.1"
...
Any ideas?
Okay, so I've done some testing. If you're writing a public API, you probably want to allow all origins. For that you may use the following code:
HttpServer::new(|| {
let cors = Cors::default().allow_any_origin().send_wildcard();
App::new().wrap(cors).service(greet)
})
If you're not writing a public API... well, I'm not sure what they want you to do. I've not figured out how to tell the library to send that header. I guess I will look at the code.
UPDATE:
So funny story, this is how you allow specific origins:
let cors = Cors::default()
.allowed_origin("localhost:3000")
.allowed_origin("localhost:2020");
BUT, and oh boy is that but juicy: the Access-Control-Allow-Origin response header is only set when there is an Origin request header. That header is normally added by the browser in certain cases. So I added it myself (using the developer tools in the browser). What did I get? "Origin is not allowed to make this request". I had set my Origin header to localhost:3000. Turns out, the actix library simply discards that header if no protocol was provided (e.g. http://); I assume it discards it if it deems its format invalid. That internally results in the header being the string "null". Which is, checks notes, not in the list of allowed origins.
And now the grand finale:
Your Origin header needs to be set (by either you or the browser) to: "http://localhost:3000".
Your configuration needs to include: .allowed_origin("http://localhost:3000").
After doing that, the server will happily echo back your origin header in the Access-Control-Allow-Origin header. And it will only send that one.
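To sanity-check this outside the browser, here is a quick sketch using Python's requests library; the URL, port, and route are taken from the question, and whether the preflight succeeds depends on your final CORS configuration:
import requests

# Simulate the browser's CORS preflight; note the Origin includes the scheme
resp = requests.options(
    "http://localhost:8080/abc",
    headers={
        "Origin": "http://localhost:3000",
        "Access-Control-Request-Method": "POST",
        "Access-Control-Request-Headers": "content-type",
    },
)
print(resp.headers.get("Access-Control-Allow-Origin"))  # expect: http://localhost:3000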
I've no idea if any of that is what the standard specifies (or not). I encourage you to read through it, and if it doesn't comply, please open an issue on GitHub. I would do it myself, but I'm done with programming for today.
Cheers!
I'm trying to figure out how to modify request headers for the Microsoft Graph Toolkit graph client and have them apply when using the GET component. I don't necessarily want to override the entire graph client.
For reference, here's how you do it:
// Already likely on page, adding as reference
TeamsProvider.microsoftTeamsLib = microsoftTeams;
const provider = new TeamsProvider(config);
const options = {
authProvider: provider,
fetchOptions: { headers: {'ConsistencyLevel':'eventual'}}
};
const client = Client.initWithMiddleware(options);
provider.graph = new Graph(client)
// Now set provider with new graph
Providers.globalProvider = provider
I have gotten this to work without having to re-initialize the graph client by doing this:
Providers.globalProvider.graph.client.config.fetchOptions = { headers: {'ConsistencyLevel':'eventual'}}
You can also remove the request headers by setting fetchOptions to {}. This can be toggled at any time between requests, and can be set even after the globalProvider has been initialized without having to re-create the middleware chain.
I posted a comment in the issues section of the MGT repo here.
I made a small program to test Microsoft Cognitive Services, but it always returns:
{
"code":"InternalServerError",
"requestId":"6d6dd4ec-9840-4db3-9849-a6497094fa4c",
"message":"Internal server error."
}
The code I'm using is:
#!/usr/bin/env python
import httplib, urllib, base64
headers = {
# Request headers
'Content-Type': 'application/json',
'Ocp-Apim-Subscription-Key': '53403359628e420ab85a516a79ba1bd0',
}
params = urllib.urlencode({
# Request parameters
'visualFeatures': 'Categories,Tags,Adult,Description,Faces',
'details': '{string}',
})
try:
conn = httplib.HTTPSConnection('api.projectoxford.ai')
conn.request("POST", "/vision/v1.0/analyze?%s" % params,
'{"url":"http://static5.netshoes.net/Produtos/bola-umbro-neo-liga-futsal/28/D21-0232-028/D21-0232-028_zoom1.jpg?resize=54g:*"}', headers)
response = conn.getresponse()
data = response.read()
print(data)
conn.close()
except Exception as e:
print("[Errno {0}] {1}".format(e.errno, e.strerror))
Am I doing something wrong, or is it a general server problem?
The problem is in the params variable. When defining which visual features you would like to extract, you can also specify which details to extract from the image, as described in the documentation. The details field, if used, must be initialized with one of the valid string options (currently only "Celebrities" is supported, which identifies which celebrity is in the image). In this case, you initialized the details field with literally the placeholder noted in the documentation ('{string}'). That caused the system to give an internal error.
To correct that, you should try:
params = urllib.urlencode({
# Request parameters
'visualFeatures': 'Categories,Tags,Adult,Description,Faces',
'details': 'Celebrities',
})
(PS: I have already reported this behavior to Microsoft Cognitive Services.)
I'm writing a Flex application that polls an XML file on the server to check for updated data every few seconds, and I'm having trouble preventing the data from being cached, so the app fails to respond when the file is updated.
I've attempted to set the following headers using the IIS control panel, without any luck:
CacheControl: no-cache
Pragma: no-cache
I've also attempted adding a random HTTP GET parameter to the end of the request URL, but it seems to be stripped off by the HttpService class before the request is made. Here's the code implementing it:
http.url = "test.xml?time=" + new Date().getMilliseconds();
And here's the debug log that makes me think it failed:
(mx.messaging.messages::HTTPRequestMessage)#0
body = (Object)#1
clientId = (null)
contentType = "application/x-www-form-urlencoded"
destination = "DefaultHTTP"
headers = (Object)#2
httpHeaders = (Object)#3
messageId = "AAB04A17-8CB3-4175-7976-36C347B558BE"
method = "GET"
recordHeaders = false
timestamp = 0
timeToLive = 0
url = "test.xml"
Has anyone dealt with this problem?
The cache control HTTP header is "Cache-Control" ... note the hyphen! It should do the trick. If you leave out the hyphen, it is not likely to work.
I used getTime() to turn the date into a numeric string, and that did the trick. I also changed GET to POST. There were some issues with different file extensions being cached differently; for instance, a standard dynamic extension like .php or .jsp might not be cached by the browser, while a plain .txt file might be.
private var myDate:Date = new Date();
[Bindable]
private var fileURLString:String = "http://www.mysite.com/data.txt?" + myDate.getTime();
Hopefully this helps someone.
I also threw a ton of header parameters at it, but they never fully did the trick. Examples:
// HTTPService called service
service.headers["Pragma"] = "no-cache"; // no caching of the file
service.headers["Cache-Control"] = "no-cache";