box_auth() without localhost - r

I'm trying to use the boxr package to link my box account to R-Server.
I get as far as step 3 of the box_auth() instructions in the boxr PDF https://cran.r-project.org/web/packages/boxr/boxr.pdf
A window pops up and I authorise the connection, then I get the error 'Safari can't connect to the server'.
I have no knowledge of how Apache or web development works, so forgive my naivety; I've come to understand that the problem is I don't have localhost set up on my Mac.
I'm unable to turn these features on because doing so requires admin rights, and my company won't allow users to have them.
Is there something else I can put in the redirect_uri box apart from localhost that will allow this to authenticate?
Thanks

The issue I had was mostly that authenticating Box through R-Server isn't supported: https://github.com/brendan-r/boxr/issues/23
To get around this, I used my personal laptop to authenticate locally, then uploaded the .Renviron and .boxr-oauth files to R-Server (which is the advice in the GitHub post).
This was slightly tricky, as R wasn't showing the .boxr-oauth file, but I managed to copy it to a folder, zip the folder, then upload that to R-Server.
Now running the box_auth() function authenticates as it should.

As of v0.3.5 (November 2019), boxr has a new alternative authentication method designed for remote servers, box_auth_service(), which closes issue 23. It's slightly different from the OAuth flow, because it uses "Service" accounts as the actors instead of "User" accounts. But service accounts are what's needed to maintain security within an organization, so we opted for that. Please open an issue on the repo if you run into any problems.
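A minimal sketch of the service-account flow on a headless server. It assumes a Box "Custom App" set up for JWT authentication with its config JSON downloaded; the file path is illustrative, and the token_file argument name is worth double-checking against ?box_auth_service:

    library(boxr)

    # Authenticate as the app's service account (no browser round-trip needed)
    box_auth_service(token_file = "~/.boxr-auth/token.json")

    # Subsequent calls act as the service account, not a user account:
    box_ls(0)  # list the service account's root folder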


Unable to copy to Amazon S3 using Full Administrator access and Full S3 access

I had a perfectly working instance of a WP-CLI WordPress plugin that uploads files to S3 using the AmazonS3FullAccess policy. I migrated servers, and the copy started failing with "Failed to copy or write".
I even added full administrator access to the IAM policy, just to see what happens when there are no restrictions, and the copy still fails. Any idea what might be wrong?
Things I have tried: ensuring the time on the new server is correct (via NTPD synchronization); cross-checking the environment (PHP version, etc.). The application files are exactly the same. I also used the hosts-file method to check the previous server, and it is still working well.
Solved the problem by creating new access keys. For some reason, it seems that migrating a server makes the old access keys stop working? Ah, well.
P.S. I also downgraded the policies right back, to only what the application needs.
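As an aside (not from the original answer): when a copy fails despite broad policies, it helps to confirm which credentials the server is actually resolving and whether they can write to the target bucket. A sketch with boto3; the bucket name is illustrative:

    import boto3

    # Which identity do the currently-resolved keys belong to?
    print(boto3.client("sts").get_caller_identity()["Arn"])

    # Can those keys actually write to the bucket the plugin targets?
    boto3.client("s3").put_object(
        Bucket="my-bucket", Key="connectivity-test.txt", Body=b"ok"
    )

If put_object fails here too, the problem is the credentials or policy rather than anything in WordPress.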

Automatically unblocking executables downloaded from the web site

I have an intranet web site, written in ASP.NET and served over HTTPS, that lets users download an executable (currently a .NET console application).
However, on many machines the executable can't be run right away after download: users need to right-click it, go to Properties, and click Unblock, which makes the app uncomfortable to use (users often have to download and run it repeatedly, and every time it is a new file, as it is generated by code).
Is there any way to make this executable automatically unblocked? Modifying the client machines is not an option, but I can do anything on the server.
From the beginning I thought this was impossible, as it is a security protection, but Chrome somehow does it. If I take a new PC with IE installed, type "Chrome" into Bing, and install it, I don't have to unblock the executable.
So far I've tested this only on Windows 10 with Chrome and IE, but I am pretty sure older Windows versions have this problem as well.
The mechanism for showing the untrusted-executable dialog is based on alternate data streams. The metadata is added by Windows or the browser when you download something from a network source, so it is not possible for your file or web server to influence this behaviour. Windows, for its part, has a ruleset it uses to apply the flags, which you can find in the security-zone settings of your Internet Options.
NTFS has a neat little feature that allows a file to have multiple contents, known as alternate data streams. This is an NTFS-only feature, so you won't find it on other file systems. It lets you store extra data in a file that is not per se visible to the user and cannot easily be discovered by a standard Windows user. Windows uses these alternate data streams to mark the origin of a file, especially when it was downloaded from the internet or an intranet. The stream used for this is called "Zone.Identifier" and holds the ID of the zone the file was copied from. When you decide to trust a file, you basically tell Windows to remove that data stream.
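You can see (and clear) the stream yourself; a small sketch in Python on Windows, with an illustrative file path:

    import os

    path = r"C:\Downloads\app.exe"
    ads = path + ":Zone.Identifier"   # the alternate data stream added on download

    try:
        with open(ads) as f:
            print(f.read())           # typically "[ZoneTransfer]" then "ZoneId=3" (3 = Internet)
    except FileNotFoundError:
        print("No Zone.Identifier stream; the file is not marked as downloaded.")
    else:
        os.remove(ads)                # deleting just the stream "unblocks" the file

Deleting the stream is exactly what the Unblock button in the file's Properties dialog does.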
Windows uses the concept of zones to classify such files. Windows knows four zones in total: Internet, Intranet, Trusted Sites, and Restricted Sites. You can alter the settings and rules for these in the Internet Options dialog, on the Security tab.
Security remark: before changing the trust-zone settings across your company, consider the security risks thrice. Doing so allows any executable from those trusted sources to run, potentially opening the way for malicious executables that can then be started by already-infected PCs or by the users themselves.
The correct way to resolve the issue is to sign the executable with a trusted, valid code-signing certificate, preferably one with EV (Extended Validation). Windows checks the certificate when you run the file and allows it to run without further action, as it is signed with a trusted certificate.
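For example, with signtool from the Windows SDK (the certificate file, password, and timestamp URL are illustrative); since the executables are generated per download, this would run on the server as part of the generation step:

    signtool sign /fd SHA256 /f company-codesign.pfx /p <password> ^
        /tr http://timestamp.digicert.com /td SHA256 GeneratedApp.exe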

The current session is not interactive

I have developed a web page that displays certificates using the X509Certificate2UI class. It works fine when running on localhost: it displays all certificates, I can choose one, and I then use the chosen certificate to digitally sign a PDF document. But when I deployed it to a web server, it shows no certificates and throws the exception "The current session is not interactive". Has anyone faced this problem? If so, please guide me on how to resolve it. I'm really confused about why it behaves this way.
Certificates are often kept in "stores", and user stores are sometimes tied to Windows users. Try setting "Load User Profile" on the IIS app pool to make the session interactive. However, this may not be enough to make it interactive in the way your code assumes.
Make sure you understand which store you are using at runtime.
I would write the code so that it does not require an interactive session: store the certificate in the machine store or in files.
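To illustrate the "in files" option (sketched in Python with the cryptography package just to show the shape; in .NET the equivalent is constructing an X509Certificate2 from the file). The path and password are illustrative, and no interactive store or dialog is involved:

    from cryptography.hazmat.primitives.serialization import pkcs12

    with open(r"C:\secrets\signing-cert.pfx", "rb") as f:
        key, cert, extra_certs = pkcs12.load_key_and_certificates(
            f.read(), b"pfx-password"
        )

    print(cert.subject)  # confirm the expected certificate was loaded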

How to execute an exe or open a file on the local machine from a website or HTML (like GitHub's "Clone in Windows")

These are my requirements:
How do I open a PDF file located on my local machine from an HTML page?
How do I execute an exe file located on my local machine from a website?
This is like what GitHub does with its "Clone in Windows" option.
I need to implement exactly the same operation: I have a button, and when I click it, it needs to run an application.
Thanks in advance.
You installed GitHub for Windows on your computer, and that installation registered the protocol github-windows: with the GitHub for Windows executable as its handler. Nothing special going on here.
The only chance I see is to register your own URL scheme (as you said, myapp-pdf: or something like it).
Then you can redirect (or open a new window) to a URL with your custom scheme, and the browser should start your application, passing it the URL as a command-line parameter.
Create a custom URL scheme and map it to the application
Let me just explain what I figured out from your inputs.
As mentioned above, I need to create a URL scheme for my application.
I need to register the scheme, and the path of the application to be executed, in the Windows registry. This needs to be handled during installation (see the sketch at the end of this answer).
http://msdn.microsoft.com/en-us/library/aa767914(v=vs.85).aspx
That link shows how to add a particular scheme to the Windows registry and how to specify the application to be executed, the way mailto: is mapped to Outlook.
Thanks to SO for providing the details here:
how do I create my own URL protocol? (e.g. so://...)
Caveat
We need to check the security issues that may arise from using this approach.
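For illustration, here is roughly what the installer-time registration could look like, as a sketch using Python's winreg module. The scheme name and executable path are illustrative; "%1" receives the full URL at launch, and writing under HKEY_CLASSES_ROOT needs sufficient rights:

    import winreg

    SCHEME = "myapp-pdf"
    COMMAND = r'"C:\Program Files\MyApp\MyApp.exe" "%1"'

    # Mark HKEY_CLASSES_ROOT\myapp-pdf as a URL protocol
    root = winreg.CreateKey(winreg.HKEY_CLASSES_ROOT, SCHEME)
    winreg.SetValueEx(root, "", 0, winreg.REG_SZ, "URL:%s Protocol" % SCHEME)
    winreg.SetValueEx(root, "URL Protocol", 0, winreg.REG_SZ, "")

    # HKEY_CLASSES_ROOT\myapp-pdf\shell\open\command holds what to launch
    cmd = winreg.CreateKey(root, r"shell\open\command")
    winreg.SetValueEx(cmd, "", 0, winreg.REG_SZ, COMMAND)

    winreg.CloseKey(cmd)
    winreg.CloseKey(root)

After that, a plain link such as <a href="myapp-pdf:C:/docs/file.pdf">Open</a> makes the browser prompt to launch the registered handler.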

Issue with WebDAV on a Mac machine

A user with permission to create folders and components is not able to copy and paste items through WebDAV. The user is on Mac OS X Lion. The error he gets is that he does not have read and write permissions. Is there any resolution?
The WebDAV Connector is enabled by default server-side, per the SDLLiveContent documentation, at least for SDL Tridion 2011.
Only valid items are allowed via WebDAV, which includes binaries (multimedia in Tridion), XML components, and other types.
It seems the user doesn't have read and write permissions for the given folder. You can confirm by having them attempt to create folders or components in the same folder in the Content Manager Explorer (CME).
Is it possible that the Mac does not authenticate properly to Windows? In that case, you should be able to see the failed connections in your server logs. Is this user (or any other Mac user) able to use WebDAV successfully in any other folders?
As has been suggested already, you will have to do some detective work to determine what exactly is failing. Tridion permissions do not change based on the client you use, so if they work from one client, they must work from another (excluding authentication issues here).
Go to your Windows Event Viewer, open the Tridion Content Manager log, and check for error messages written to it when you try to copy content from the Mac.
Post the exact error message you're getting. I doubt that Tridion is literally telling you "user is don't have read and write access".
Bottom line: if it works from Windows and not from the Mac, the issue is not with the WebDAV server but with the WebDAV client.
I also fail to see the programming question on this one...
What version of Tridion are you using? The first thing you should do is check the Event Log for error messages (as Peter Kjaer already suggested). If you don't find anything there, you can try to enable WebDAV debug logging by modifying the cc_crtd_def.xml file in the Tridion\webdav\WebDAVcartridges\Default\ folder: change the loglevel property (as far as I remember, it should be 4 for debug). A log file should then be created in the same folder or in the webdav folder. Try to find the exact error message in that log file and post it here.
