Initially, I had created a case to report a problem with credentials assignment to Cloudant. But, after a few iterations with support, I am no longer able to view my own case via the link I get in the support e-mail.
I only get the message "You do not have the right permissions to view cases."
So, I try to open a new case - but then I get "You do not have the right permissions to open cases" and a description telling me what to do.
Following the steps (Creating an access group for working with cases), I can complete the first ones (From the menu bar, go to Manage > Access (IAM), select Access groups, and click Create), but on the Access groups page there is no Create button or any other way to create a new access group.
So, I'm not even able to ask for support any more...
You can always open support cases via email to support@cloudant.com - if you provide your Cloudant account name (the one that ends with -bluemix) it’s easier for support to locate you.
You still have a -bluemix account - look at the URL when opening the Cloudant dashboard. Here’s an instance I just created using my internal IBM creds...
https://5217efab-4dcf-4ea0-a1c7-a0ea017a8ccd-bluemix.cloudant.com
I am trying to give access to a Power BI (PBI) workspace to an Active Directory (AD) group made up of a few users. When the users log in to the PBI service, they can't see the workspace. The AD group these users belong to is set up as a Distribution List. There is another, separate workspace I created for users in an AD group of type Mail Enabled Security; those users can see that workspace with no issues. The level of permission the AD group was given on this Distribution List PBI workspace was Viewer. When the users are added to the workspace individually, they can see it. Could someone kindly confirm whether the AD group type has to be Mail Enabled Security for the users to see the Power BI workspace?
According to the Power BI documentation, a PBI workspace also supports AD groups of the Distribution List type.
See the link
https://learn.microsoft.com/en-us/power-bi/admin/service-admin-rls
Thank you for your replies. Much appreciated.
Edited: Hi Andrey, I added an extra image, which I got from a blog post. It's confusing whether the group has to be a security group or whether distribution lists are also allowed for PBI workspaces. According to this image, distribution lists are also allowed.
I also want to add that the PBI workspace here was created as the new workspace type, not the Classic type. What the image under point 2 in the link says is confirmed by the Microsoft PBI documentation.
https://learn.microsoft.com/en-us/power-bi/collaborate-share/service-give-access-new-workspaces
To keep it simple, I didn't mention that these groups are being used to access a couple of reports inside the workspace. These reports use roles that maintain Row Level Security. I thought the workspace would still be shown to the users in the group even though they might not get access to the individual reports inside. Am I being too optimistic here?
Edit 2:
Thank you everyone. The issue has been resolved without me doing anything; it seems it was a delay in syncing the changes across Office 365/AD accounts/Power BI. Just for the record, I will leave this post here, hoping it might help someone in my situation in the future.
I'm new to App Maker and, for the most part, to advanced web development. In 2016 I created a very rudimentary AMP-stack page used in my employer's office to take leads by phone and email; afterward the estimators log in and claim them (place their name in a field) to remove them from the available leads. I used Adobe CS5 Dreamweaver, which, as I'm sure you all know, no longer supports its PHP backend since PHP has changed so much. By the way, I know very little about PHP or MySQL; that's why I used Dreamweaver, and I'm now moving toward App Maker. I also have no scripting background, which is where I'm stuck now, I think.
It took me a while, but I figured out how to set up App Maker (we have no sysadmin, so I dug around until I got it working). I now know the basics of App Maker; I even paid to take the App Maker University Bootcamp course, which did open my eyes to the correct way to build pages. On to my issue...
A lead comes in and shows in the main list; an estimator views the details and clicks one of two checkboxes. 1.) Pass (not interested, do not list it anymore in my view)
2.) Claim (Move into claimed status, now owns this lead).
I have not started using database relations and I am unsure whether this is an instance where I should use them, but for now I just have Claim and Pass as Boolean table fields. For each of these I also have accompanying fields: Claim Date, Claim Estimator and Pass Date, Pass Estimator (multiple estimators can pass, but only one can claim).
I cannot figure out how to have the backend enter the date and the user's email when the Pass or Claim checkbox is clicked. I have tried adding code to onClick and onValidate and nothing seems to work. I'm confident I am looking in the wrong direction; please help.
One way to solve this would be to use an onBeforeSave event at the model level.
// Model-level onBeforeSave handler: stamp who claimed or passed the lead and when.
// Session.getActiveUser().getEmail() runs server-side and returns the current user's e-mail.
if (record.Claim) {
  record.ClaimEstimator = Session.getActiveUser().getEmail();
  record.ClaimDate = new Date();
} else if (record.Pass) {
  record.PassEstimator = Session.getActiveUser().getEmail();
  record.PassDate = new Date();
}
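Attach that to the model's onBeforeSave event (under the model's Events section) and it runs on the server every time a record is saved, so the date and the estimator's e-mail get stamped without any client-side onClick or onValidate code.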
We were checking the newly implemented Google Analytics for our mobile app and, surprisingly, there are a lot of visitors from multiple countries, even though we haven't released our app to any store and it's just a beta among 5 main users.
After checking the Google Analytics report in detail we found that it got spammed by a bot called "Trumps Bot"; when this happens on your account you can see the following line in your language section:
“Secret.ɢoogle.com You are invited! Enter only with this ticket URL. Copy it. Vote for Trump!”
There are a lot of solutions available for keeping this data out of your reports using a filter, but I was wondering if there is any concrete way to permanently remove this data from my reports, and also whether there is anything we can do to avoid such data in the future, as it's seriously affecting our business strategy.
Due to the technology used by Google Analytics, the only way to eliminate this referral is to use a filter: check for one common point across all of these hits. In this case it is a hard one, because all of the parameters change except for the language, for a well-known reason: so that you see the spam.
So try to filter on that one (the language); in my case it works.
I highly recommend you read the community policy; this could be considered an off-topic question.
Analytics spammers are always trying to find new ways of getting attention, and with this one, this spammer hit it big.
It is not possible to permanently remove it unless you delete the whole property, but you can create an advanced segment to get a clean view.
The most important part, though, is blocking it so that it doesn't pollute your data. For this particular type of spam you should create a custom exclude filter on the language dimension with this expression:
\s[^\s]*\s|.{15,}|\.|,
That expression will block any hit that doesn't use a proper language. That combined with a valid hostname filter should prevent most of the current spam and save you a lot of headaches.
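If you want to sanity-check that expression before putting it in a filter, here is a quick illustration in JavaScript (the example strings are my own; GA filter patterns use the same partial-match semantics as an unanchored regex):
var badLanguage = /\s[^\s]*\s|.{15,}|\.|,/;
// a real language code slips through the exclude filter
console.log(badLanguage.test('en-us'));                                     // false
// the spam "language" value is caught, so the hit gets excluded
console.log(badLanguage.test('Secret.google.com You are invited! ...'));    // true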
If you need help, you can check this step by step guide for building these filters and creating the advanced segment to remove it from your historical data.
Here is also a related question.
Log in to your Google Analytics account
Select the ADMIN section
Click on All Filters -- Add Filter
Give the filter a name such as -- Include only website traffic
In the Predefined section, select Include Only
I'm getting some very strange behavior in DTM. When our page loads (from a local instance of the website) we get the expected call going out with the proper dev report suite. When a custom link call is made from that page, for some reason DTM sends it with a production report suite. If I look in Adobe Analytics for the custom link name reported under the prod rsid, it does not show up in there.
Any ideas on what is going on and how I can fix this issue?
This is my shot in the dark based on what you have said, and it is based on the assumption that your statements are true (e.g. you aren't seeing pink elephants, that the request was indeed showing your prod rsid in the proper portion of the request url, that you did in fact check your prod rsid after an acceptable amount of time has past, no segment or other filter shenanigans, etc..: in short, that you do know how to accurately perform the basic QA song and dance).
Under that assumption, the below is a scenario that can plausibly reproduce what you are describing. I could be partially right or totally off for your specific situation, but there's really no way for me to know for sure without having access to your DTM instance.
The Scenario
Long story short, it sounds like you have a blend of custom code and DTM automatic settings enabled, and DTM is overriding and/or not caring about your custom code for link tracking.
More specifically, it sounds to me like you have AA implemented as a tool in DTM, and in the config settings, you have your production and staging rsids specified in the text fields.
Then in the General section, you either do NOT have values specified for Tracking Server and Tracking Server Secure, or else they are set to the wrong values.
Then, in the Library Management section, you have either selected "Managed by Adobe" in which case DTM takes care of the library, or else you have selected "Custom" and you are adding the library yourself AND you have NOT checked "Set report suites using custom code below".
Then, somewhere in DTM (e.g. the Library Management > Custom code box, or Customize Page Code codebox) you have code that pops rsid stuff (e.g. s.account, s_account, dynamicAccountList stuff), and possibly also trackingServer and trackingServerSecure.
Finally, you (like most other people, because DTM's double script include for staging vs. prod is.. dumb) just use the prod script include on your page, and either use the debug/staging mode or rely on whatever rsid routing logic you've setup to route to dev.
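To make that concrete, the kind of custom code block I mean in the tool config usually looks something like this (the hostname and rsids here are made up, not yours):
// hand-rolled rsid routing inside the tool's custom code box (example values only)
if (document.location.hostname === 'www.mysite.com') {
  s.account = 'myprodrsid';                  // production report suite
  s.trackingServer = 'metrics.mysite.com';
} else {
  s.account = 'mydevrsid';                   // dev/staging report suite
  s.trackingServer = 'metrics.mysite.com';
}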
So.. when the page is first loaded, DTM loads the AA library and it sets variables and stuff based on what you specified in the tool config. During this time, it is also popping any custom code blocks you have in the tool config, which may or may not override what you have specified in the tool config fields, depending on what you enabled. Then after that, it pops stuff you have in page load rules (if any), etc..
But then comes the link click.. As I have mentioned in other posts on SO, DTM has this caveat (IMO bug) about how it references the AA object after the initial page load/AA request: basically, it doesn't. Instead, it makes use of internal methods (the main one being a .getS() method) to create a new instance of the AA object, based on whatever things you have configured in the tool config section. Well here's the rub.. it does NOT account for or execute any custom coding you have done in code boxes in the tool config section.
So that basically happens whenever an event based or direct call rule is triggered, and it effectively screws you. Why does DTM do this? I do not know. IMO Adobe needs to change this feature caveat bug. Either they should refactor DTM to execute the code boxes, OR they could, you know.. just reference the original AA object created, like any normal script would do..
But in any case..
So for example, my theory here is that page loads fine, points to dev rsid based on your setup. But then you click a link and an event triggers, and DTM makes a new AA object not caring about your custom code, so all it has to go on is what you have in the tool's config fields.
Since DTM doesn't actually have any rules around the prod vs. dev rsids you specify in those fields (you have to write custom code in the custom code boxes - that DTM ignores!), it just pops the prod rsid, because that's the script include you have on your page.
Then as far as not seeing the data actually show up in your prod rsid: again, since DTM ignores what you set in your custom code boxes, it's defaulting to what is specified in the trackingServer fields in the tool config, and my assumption here is they are either blank or wrong (you should be able to look at the request url to adobe to verify this). This theory is because you said the prod rsid is right, and you see a request being made. So the next culprit would be wrong tracking server specified.
So, that is my theory of what's going on. Maybe it's all right, maybe it's some right, hopefully it may point you in the right direction at least.
Edit:
If you can confirm that this is indeed how you have things setup, then you will naturally ask "Okay, well what do I do about that?". As I have said in a lot of my other SO answers.. basically, your only option is to uncheck all the settings that make DTM automate AA, and in all your rules, keep the AA section disabled and whatever AA vars you wanna set, set them yourself and make the s.t() or s.tl() call yourself in a 3rd party script code box, so that it continues to reference and pop based off the originally instantiated AA object.
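For reference, a bare-bones sketch of what that looks like in a rule's 3rd party script code box (the variable names and values are made up; the point is that you set the vars and fire the call on the original s object yourself):
s.linkTrackVars = 'eVar1,events';
s.linkTrackEvents = 'event1';
s.eVar1 = 'some value';
s.events = 'event1';
// 'o' = custom link type, last argument is the link name you'll see in reports
s.tl(true, 'o', 'my custom link');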
Update
Based on your comments below, okay so yeah.. that sounds like what I described, and accounts for prod rsid popping. As for data not showing up in report.. so if you are certain tracking server is set correct (the request url looks good) then this isn't a DTM issue. Here are some other explanations for why the data wouldn't show up:
Are you sure the request is being sent to your prod rsid? I don't know what you are looking at to verify this, but this is where you should be looking: In the request URL to AA: "http://[trackingServer value]/b/ss/[s.account value]/1..."
Click request isn't making it to Omniture. Verify in a packet sniffer that the request is actually made and that you are getting a 200 OK or NS_Binding_Aborted response.
You aren't waiting long enough to check for the data. Even basic hit data and looking at "real time" reports takes a little bit of time to show up.
You have a segment/filter active that's not jiving with the data you are trying to look at. Make sure that you don't have anything applied. Or, if you are using those things to find your data (and aren't seeing it), ensure that you are correctly applying it.
You recently created the rsid and the "go live" date hasn't passed yet. Data will not show up in the report suite until up to 24 hours after the specified "go live" date.
You have a vista rule in place that's affecting data showing up. Some companies have a vista rule in place for a number of reasons and there are a million ways it could affect data (e.g. routing to a different report suite). For shits and grins, check your dev (or other rsids) to see if your data showed up there. Even if that doesn't make sense, at least it's a step forward.
You have a bots / ip exclusion rule in place that's catching data from your location.
The data sent in from the link click isn't relevant to the report. For example, maybe you are looking at e.g. prop10 report and prop10 isn't actually sent in the click request.
I know a lot of these are basic things to check, and no doubt you've checked, but check again. Have someone else check for you to be sure. I'm not questioning your abilities here, but even the best of coders forget to cross their t's and dot their i's sometimes, and manage to miss obvious things. If you are sure about all of these then contact Adobe ClientCare, because I really can't think of anything else that wouldn't involve an issue with Adobe's backend.
I ran into a similar problem with my implementation. Essentially what I did was set the s.account variable directly inside the doPlugins, so it would be set on all tracking calls. I wrote specifics here also: DTM Tracking Account
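In case it helps, a rough sketch of that approach (the hostname and rsids are placeholders):
s.usePlugins = true;
s.doPlugins = function (s) {
  // doPlugins runs right before every s.t()/s.tl() request,
  // so the report suite gets forced to the right value on every call
  s.account = (document.location.hostname === 'www.mysite.com') ? 'myprodrsid' : 'mydevrsid';
};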
I am playing with the Google Analytics API and found that, when I get the web property list, there is a defaultProfileId which is very useful. It can help me skip the queryProfiles call, saving one request and making the whole app work faster.
But I noticed that some web properties just don't have the defaultProfileId field.
Just for the information, most of these cases happen with a tracking ID like UA-XXXX-1.
Any tips?
Thanks!
You are correct, webProperty does not always return a defaultProfileId. I was also unable to find any information on the Web Properties page as to how the API decides what a default profile id is. I submitted a bug report for it with the Analytics dev team; you can find it at: defaultProfileId - not always sent with a WebProperty. Let's hope they come back with a response; you are correct that this is a very useful feature.
Yes, you are probably going to have to query the profiles every time to get the correct profile you are after.
I just found this:
https://www.googleapis.com/analytics/v3/management/accounts/~all/webproperties/~all/profiles?oauth_token={Token}
There should be a way of working that so you make one request for the accounts, one to get all the Web Properties, then one to get all the Profiles.
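For what it's worth, a rough sketch of calling that endpoint from JavaScript ('ACCESS_TOKEN' is a placeholder for a valid OAuth 2.0 token with the analytics.readonly scope):
// list every profile across all accounts and web properties in one call via the ~all endpoint
var url = 'https://www.googleapis.com/analytics/v3/management/accounts/~all/webproperties/~all/profiles';
fetch(url, { headers: { Authorization: 'Bearer ' + 'ACCESS_TOKEN' } })
  .then(function (res) { return res.json(); })
  .then(function (data) {
    (data.items || []).forEach(function (p) {
      // each item carries accountId, webPropertyId and the profile id you would
      // otherwise have hoped to read from defaultProfileId
      console.log(p.accountId, p.webPropertyId, p.id, p.name);
    });
  });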