Changing users' email addresses with the GitLab API

I need to write a Python script for GitLab that lets me change users' email addresses.
The problem is that I can change various user attributes, such as "bio",
but I can't change "email".
The script reports a successful change, but in fact the email does not change.
I am working as the root user, using its token.
To change user attributes, I use this construction:
import gitlab

gl = gitlab.Gitlab(arg.url, private_token=arg.token)
user = gl.users.list(username='name')[0]   # look up the user by username
user.bio = f"{user.username}#EXAMPLE.COM"  # attributes like bio update fine
user.save()
I also tried using plain requests calls instead of the python-gitlab library, but the result was the same.
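For reference, a minimal sketch of the email update itself, reusing the question's arg.url / arg.token. GitLab normally holds a changed email as unconfirmed until the user confirms it, so the admin-only skip_reconfirmation parameter is included here as an assumption about why the change appears not to stick:

import gitlab

gl = gitlab.Gitlab(arg.url, private_token=arg.token)       # same connection as above
user = gl.users.list(username='name')[0]

user.email = f"{user.username}@example.com"                # placeholder address for illustration
user.skip_reconfirmation = True                            # admin-only flag on PUT /users/:id (assumption about the cause)
user.save()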

Related

Telegraf - how to monitor multiple Tomcat instances?

I managed to gather data from a single Tomcat instance into Telegraf as follows.
[[inputs.tomcat]]
## URL of the Tomcat server status
url = "http://127.0.0.1:19090/manager/status/all?XML=true"
## HTTP Basic Auth Credentials
username = "admin"
password = "fD*(*DSS"
## Request timeout
# timeout = "5s"
## Optional SSL Config
# ssl_ca = "/etc/telegraf/ca.pem"
# ssl_cert = "/etc/telegraf/cert.pem"
# ssl_key = "/etc/telegraf/key.pem"
## Use SSL but skip chain & host verification
# insecure_skip_verify = false
Now I want to monitor multiple Tomcat instances, but there does not seem to be an example of how to monitor more than one. Does anybody know?
The answer turned out to be very simple. Just declare the inputs.tomcat block multiple times as follows.
[[inputs.tomcat]]
## URL of the Tomcat server status
url = "http://127.0.0.1:19090/manager/status/all?XML=true"
## HTTP Basic Auth Credentials
username = "admin"
password = "fD*(*DSS"
[[inputs.tomcat]]
## URL of the Tomcat server status
url = "http://127.0.0.1:29090/manager/status/all?XML=true"
## HTTP Basic Auth Credentials
username = "admin"
password = "fD*(*DSS"
So, as far as I recall, there are a couple of ways.
1) The easiest way is to use separate configuration files: create tomcat1.conf under the /etc/telegraf/telegraf.d/ folder using the same plugin you mentioned above (inputs.tomcat), then create tomcat2.conf and so on for each Tomcat instance. This way you can monitor multiple Tomcat instances. See if that helps! The con of this approach is that you have to create N tomcatXX.conf files under the telegraf.d folder (which can easily be fixed if you create these files on the fly while provisioning the machine with Ansible or a similar tool, templating the file and iterating over the tomcatXX list).
2) The other way, which may help as well, uses just one configuration file.
In that one configuration file, use the following plugins together to capture what you are looking for. PS: if you use the inputs.exec plugin, the output generated by your custom script (the one you call from inputs.exec) must be in a format that Telegraf and InfluxDB can understand and store (InfluxDB line protocol), or you'll see some minor errors; see a few of my posts about those.
exec plugin: https://github.com/influxdata/telegraf/tree/master/plugins/inputs/exec
http_* plugins (especially http_response): https://github.com/influxdata/telegraf/tree/master/plugins/inputs/http_response
filestat plugin: https://github.com/influxdata/telegraf/tree/master/plugins/inputs/filestat
logparser plugin: https://github.com/influxdata/telegraf/tree/master/plugins/inputs/logparser
procstat plugin: https://github.com/influxdata/telegraf/tree/master/plugins/inputs/procstat
Look at the plugin links above for what they do and how to set them up in Telegraf; that would get you most of what you are looking for if you don't want multiple conf files for each Tomcat instance (see the procstat sketch below).
https://github.com/influxdata/telegraf/tree/master/plugins/inputs contains all input plugins (see if there are some that you may be interested in).
See if you can use the prefix property efficiently to distinguish between the various metrics/events coming from these plugins.
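As an illustration of the single-file route, the procstat plugin mentioned above can be declared once per Tomcat JVM, matched on a command-line pattern (a sketch; the catalina.base paths and instance tags are assumptions about the local setup):

[[inputs.procstat]]
  ## pgrep -f pattern matching the first Tomcat's JVM
  pattern = "catalina.base=/opt/tomcat1"
  [inputs.procstat.tags]
    instance = "tomcat1"

[[inputs.procstat]]
  ## pgrep -f pattern matching the second Tomcat's JVM
  pattern = "catalina.base=/opt/tomcat2"
  [inputs.procstat.tags]
    instance = "tomcat2"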

ASP Identity: Binding Sessions To Request Url (Sub Domains)

I don't think the title of the question is particularly accurate, but it's the best title I could come up with.
In short, I have an MVC app hosted on Microsoft Azure. The app was built for multiple institutions (each connecting to a separate database), but the login module (ASP Identity) uses a central database (users are identified by their institution code). So during deployment, a sub-domain is created (still pointing to the app on Azure).
My problem is that the app has no regard for the request URL: sessions are maintained across domains. This is a serious problem because I cache user data (by session). So if a user logs in on "domain1.myapp.com", then opens another tab and logs into "domain2.myapp.com", all data cached for the user logged in on "domain1" will be used for the user logged in on "domain2". The app doesn't bother to fetch data for the user on "domain2", since the key for that data is already present in the session cache.
Okay, I hope the problem is understood. How do I get past this?
Ideas? Implementations?
EDIT 1
I insert data into the cache like this:
HttpRuntime.Cache.Insert("KEY", "VALUE", null, DateTime.Now.AddMinutes(30), Cache.NoSlidingExpiration);
Your caching strategy needs to change when you cache per user and per domain. The simplest approach is just to add the domain and user name to the cache key, which will keep each cache in a separate per user and per domain bucket. Make sure you put a delimiter between the values to ensure uniqueness of the key.
var domain = HttpContext.Request.Url.DnsSafeHost;
var user = HttpContext.User.Identity.Name;
var key = "__" + domain + "_" + user + "_" + "KEY";
HttpRuntime.Cache.Insert(key, "VALUE", null, DateTime.Now.AddMinutes(30), Cache.NoSlidingExpiration);
Note that if you use HttpContext.Session, it will automatically put different domain information into separate buckets because it is based on a cookie (which by default is domain specific). However, HttpContext.Session comes with its own set of problems.
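If the same pattern is needed in several places, it can be wrapped in a small helper (a sketch; PerDomainCache and its method names are illustrative, not part of ASP.NET, and the "__" prefix and "_" delimiter follow the example above):

using System;
using System.Web;
using System.Web.Caching;

public static class PerDomainCache
{
    // Build a key that is unique per domain and per user.
    private static string BuildKey(HttpContextBase context, string key)
    {
        var domain = context.Request.Url.DnsSafeHost;
        var user = context.User.Identity.Name;
        return "__" + domain + "_" + user + "_" + key;
    }

    public static void Insert(HttpContextBase context, string key, object value)
    {
        HttpRuntime.Cache.Insert(BuildKey(context, key), value, null,
            DateTime.Now.AddMinutes(30), Cache.NoSlidingExpiration);
    }

    public static object Get(HttpContextBase context, string key)
    {
        return HttpRuntime.Cache.Get(BuildKey(context, key));
    }
}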

Are there new Facebook restrictions affecting the Rfacebook package?

I want to get some data from Facebook, so I wanted to create an application to get a token for 60 days, like I did a few months ago. Back then everything worked well; I just followed the steps from this tutorial:
http://thinktostart.com/analyzing-facebook-with-r/
It was enough to create an "empty" application and run this in R with the proper id and secret:
fb_oauth <- fbOAuth(app_id="123456789", app_secret="1A2B3C4D",extended_permissions = TRUE)
fill in the website page as http://localhost:1410/, and authentication was complete and I was able to get some data from Facebook. It seems that it is not so easy anymore.
When I try to follow exactly the same steps, it now seems that I have to fill out my application (with a description, photos, ...) and "send" it for review.
Do you have a similar problem, or am I just missing something? I just want to use information from Facebook for my own use, not for business or anything like that. Is there any (other) way to get a token for R that lets me get some information from Facebook without filling out the application? I don't think that filling it with fake data would pass Facebook's verification.
I just want to use information from facebook for my own use
Then you don’t need to submit it for review.
See https://developers.facebook.com/docs/apps/faq#roles – it explains that you can ask any user that has a “role” in the app (meaning admin, developer or tester) for any permission without prior review.
For one, this is of course implemented this way so that people can actually test the functionality they are developing. And it is also an "official loophole" for apps such as yours, which are for "private use" only and not meant to be used by the general public in the first place.
(And this has nothing whatsoever to do with the Rfacebook package – it is the same for all apps, no matter what framework/SDK they might be using.)
UPDATE
As @CBroe said earlier, you do not need an approved app; you just need to add the users of the app as admins in the app's Roles menu in Facebook Developers.
Follow these steps and you will get your permanent FB token:
Create a new application at https://developers.facebook.com/apps/ with the basic setup
Fill in the app name in lower case, without the words Facebook or FB, for the display name and namespace, and set the category to Business
In "Settings/Basic" I added a new "Website" platform with the URL of http://localhost:1410/ and localhost as the "App Domain"
In the "Settings/Advanced" tab I added http://localhost:1410/ as the Valid OAuth redirect URIs
Then, run this code:
library(httr)

# appid / appsecret are the id and secret of your Facebook app
app <- oauth_app('facebook', appid, appsecret)
Sys.setenv("HTTR_SERVER_PORT" = "1410/")

tkn <- oauth2.0_token(
  oauth_endpoints('facebook'), app, scope = c('ads_management', 'read_insights'),
  type = 'application/x-www-form-urlencoded', cache = FALSE)

save(tkn, file = "~/Documents/RFiles/fb_token") # save the token for future use
Make sure you put 'read_insights' in scope, otherwise you are not telling Facebook what kind of permissions you want the app to have.
Finally you can use the token:
library(Rfacebook)
load("~/Documents/RFiles/fb_token")

Is it possible to access a shared note via the Evernote SDK?

I'm wondering if it's possible to use the Evernote SDK to access a note that a user has shared publicly, based on its URL?
Obviously you can pull the page itself down without the API, and you can't write to it either way, but I was wondering if it is possible to get a read-only copy via the API, so that you could get the note data without having to attempt an unreliable screen scrape.
Yes, you can. The shared note URL is of the format: hostname/shard/shardId/noteGUID/noteKey.
You can parse this URL to get all the fields separated out. Then use the authenticateToSharedNote API.
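For illustration, a minimal sketch of splitting such a URL into its fields (the example URL is made up and the exact path layout may differ between share types, so adjust the indices to the URL you actually receive):

import java.net.URI;

public class SharedNoteUrlParser {
    public static void main(String[] args) {
        // hostname/shard/shardId/noteGUID/noteKey, as described above
        URI url = URI.create("https://www.evernote.com/shard/s123/8a1b2c3d-note-guid/abcdef0123456789");
        String[] parts = url.getPath().split("/");  // ["", "shard", "s123", "8a1b2c3d-note-guid", "abcdef0123456789"]

        String shardId = parts[2];
        String noteGuid = parts[3];
        String noteKey = parts[4];

        System.out.println(shardId + " / " + noteGuid + " / " + noteKey);
    }
}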
You can then use the AuthenticationResult to create a note store:
String sharedNoteStoreUrl = authResult.getNoteStoreUrl(); // authResult returned by authenticateToSharedNote
TBinaryProtocol sharedNoteStoreProt = new TBinaryProtocol(new THttpClient(sharedNoteStoreUrl));
NoteStore.Client sharedNoteStore = new NoteStore.Client(sharedNoteStoreProt, sharedNoteStoreProt);
You can then access the note with the getNote API, using the auth token from the AuthenticationResult above.

Sitecore - build URL with Agent

I have an email-sending class; when the item is activated, it generates a link to the dashboard as follows:
Item dashboardItem = DatabaseManager.WebDatabase.GetItem("/sitecore/content/Public/Pages/Users/Dashboard");
string url = LinkManager.GetItemUrl(dashboardItem, opt);
The URL is generated as http://mysite/Pages/Users/Dashboard, which is the expected behaviour. This is the user-accessible URL.
I am trying to generate the same email from a scheduled task. But when the task runs and executes this code, the URL is generated as follows:
http://127.0.0.1/sitecore/content/Public/Pages/Users/Dashboard
It seems that when we use the scheduler, LinkManager cannot identify the URL mapped to the item. How can I generate the user-accessible URL from the scheduled task?
This happens because the scheduled task is running in a different SiteContext.
In the code of your task, you should manually switch to the SiteContext that contains the item you are linking to.
Like this:
using (new Sitecore.Sites.SiteContextSwitcher(
    Sitecore.Sites.SiteContext.GetSite("your_site_name")))
{
    // load item & generate url here ...
}
your_site_name is the site name that is configured in the <sites> configuration.
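Putting the pieces together, a sketch of what the scheduled-task code could look like (the site name and dashboard path come from the question and answer above; Factory.GetDatabase("web") stands in for the question's DatabaseManager.WebDatabase, and AlwaysIncludeServerUrl is an assumption, used so the task emits an absolute URL):

using Sitecore.Data;
using Sitecore.Data.Items;
using Sitecore.Links;
using Sitecore.Sites;

public class DashboardLinkBuilder
{
    public string BuildDashboardUrl()
    {
        // Switch to the site whose host name and link settings should be used.
        using (new SiteContextSwitcher(SiteContext.GetSite("your_site_name")))
        {
            Database web = Sitecore.Configuration.Factory.GetDatabase("web");
            Item dashboardItem = web.GetItem("/sitecore/content/Public/Pages/Users/Dashboard");

            UrlOptions opt = LinkManager.GetDefaultUrlOptions();
            opt.AlwaysIncludeServerUrl = true; // scheduled tasks usually need a full http://... URL

            return LinkManager.GetItemUrl(dashboardItem, opt);
        }
    }
}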
