I'm building a website in which folks register for services to make passive income. Additionally, these services offer referral bonuses.
By default, the page uses my personal referral links: the icons displayed on it link out with my referral URLs.
I'd love for users to be able to create an account on the website and have it generate a referral URL for them. When they share that URL, I want the page to look identical to the original, but with the icons now pointing to their referral links instead of mine.
I can create a WordPress site that lets users log in. I can also assign them a generic identifier, and I can create a submission form that stores their specific referral URLs in the site's user data.
What I can't do is automatically change the icon URLs to match their referral links.
Any ideas?
I've found that automatically changing the hyperlinks behind the icons is essentially a database task. Dynamic pages on Wix will also allow custom per-user URLs.
However, the most effective way to do this is to use a paid affiliate system.
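If you want to stay on a self-built WordPress site rather than paying for an affiliate system, one DIY option is to keep the page static and swap the icon hrefs at render time based on who shared the link. Below is a minimal sketch, assuming a ?ref= query parameter carrying the sharer's username, a user meta key pattern referral_url_{service} filled in by your submission form, and a shortcode named referral_icon; all of those names are illustrative, not part of your existing setup.

```php
<?php
/**
 * Minimal sketch only: swap the icon link for the referral URL of the
 * user named in a ?ref= query parameter, falling back to the site
 * owner's default link. The shortcode name, the "ref" parameter and the
 * "referral_url_{service}" meta key are illustrative assumptions.
 */

// Example usage in a page:
// [referral_icon service="serviceA" default="https://example.com/signup?ref=me" icon="https://example.com/icons/serviceA.png"]
add_shortcode( 'referral_icon', function ( $atts ) {
    $atts = shortcode_atts( array(
        'service' => '',
        'default' => '',
        'icon'    => '',
    ), $atts, 'referral_icon' );

    $url = $atts['default']; // your own referral link by default

    // Shared pages look like https://example.com/some-page/?ref=USERNAME (assumption).
    if ( isset( $_GET['ref'] ) ) {
        $login = sanitize_user( wp_unslash( $_GET['ref'] ) );
        $user  = get_user_by( 'login', $login );

        if ( $user ) {
            // The submission form saves each user's link into user meta,
            // e.g. under "referral_url_serviceA" (assumption).
            $custom = get_user_meta( $user->ID, 'referral_url_' . $atts['service'], true );
            if ( $custom ) {
                $url = $custom;
            }
        }
    }

    return sprintf(
        '<a href="%s" rel="nofollow"><img src="%s" alt="%s" /></a>',
        esc_url( $url ),
        esc_url( $atts['icon'] ),
        esc_attr( $atts['service'] )
    );
} );
```

Each user's share link would then just be the normal page URL with ?ref=their-username appended: the page stays identical, and only the link targets change.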
I have created a website that uses registration forms where users enter a profile, and I would like to use your AutoFill plugin to let them easily populate those profile form fields from their LinkedIn profiles. In your documentation: https://learn.microsoft.com/en-gb/linkedin/consumer/integrations/self-serve/plugins/autofill-plugin
It says "domains must be allowlisted for LinkedIn AutoFill to properly function. Contact your LinkedIn representative for more information."
Can I ask you to add the following domains to this allowlist?
The site in question is https://go-digital-platform.thisiscrowdlab.com/
Is it possible to also add a local dev URL, http://events.local? (Only if possible, as the one above is more important.)
Or do let me know who I should contact to add domains to the allowlist for this plugin.
Many thanks,
I have a nopCommerce website, and when testing it in the Facebook Sharing Debugger, it shows the canonical URL as "/login?ReturnUrl=%2F". Google Search Console also says that my sitemap is not valid XML but an HTML page; I think Google's bots are also being redirected to the login page.
Facebook Sharing debugger screenshot here
Google Search Console screenshot here
My ACL Public Store rules are all set to enabled for customers, and as you can see, you can visit all the pages without logging in.
Why are the Google and Facebook bots redirected to the login page? How can I fix this?
The solution is to make sure that the table [Customer_CustomerRole_Mapping] has these two records:
Customer_Id | CustomerRole_Id
2 | 4
3 | 4
These records are for the built-in accounts used by search engine crawlers and similar system visitors; they give those accounts the same permissions as a Guest.
You should check the "Search engine friendly page name" field on all pages included in your sitemap.xml.
There is often a missing SEname field on pages such as content management (topic) pages, products, manufacturers, blog posts, etc.
If you have plenty of URLs inside your sitemap and it is hard to find the culprit, you can disable/enable the options below in https://yoursitename.com/Admin/Setting/GeneralCommon:
Sitemap includes categories
Sitemap includes manufacturers
Sitemap includes products
This helps narrow down which of these page types has a missing SEname field.
If that does not help, you should use a database query to find which pages do not have an SEname record.
I have set up a WordPress multisite that uses the same GA tracking ID for every site, to make things much simpler in our Google Analytics dashboard (plus there is a limit of around 50 individual IDs you can track with one account, I think?).
The WP multisite is set up on subdirectories, not subdomains, e.g.
mysite.com.au/sitename1
mysite.com.au/sitename2
Anyway, some of the admins of the individual sites want access to the Google Analytics for their site. Upon investigating, I found that they would be able to see stats for the other sites in the multisite as well as their own. We definitely do not want this.
Is there a way to limit a GA user account so that they can only view pages within their own site (a fixed subdirectory), for example:
mysite.com.au/sitename1
and not be able to view information from any other site's subdirectories? I'm a beginner with Google Analytics, and I heard I might be able to set up particular views, goals or campaigns to achieve this.
You can do this with Views (the downside: each property can only have 25 Views).
You will find Views in the right-hand column of the Admin page. Add a new View through the dropdown, then set a filter on that View that only includes the relevant subdirectory (/sitename1).
After this, click "User Management" in the View column and grant access to the user who should see that View.
My question is quite similar to this question. However, my concerns are not fully answered there, so I am posting a separate question.
I will try to be as detailed as possible here.
I have to build a website (SaaS), say abc.com, where registered users would each get their own section of the site, like abc.com/def or pqr.abc.com.
Now, some of those users might want to use their own domains, e.g. 123.com or xyz.com.
All of these websites need to have an identical backend (dashboard). Most importantly, a visitor should be able to type a search term on the main website (abc.com) and the search should return results from ALL sites, including the subdirectories/subdomains (abc.com/def or pqr.abc.com) and the custom domains (xyz.com).
I am not well versed in other frameworks, so I figured WordPress could be a good solution.
My approach was to assign every registered user the Author role, so they can create/edit only their own content. I would then add a custom post type for the exact type of content they can add, and use dashboard-customising plugins (like Adminimize) to control which admin menus the authors can see. This way I would be able to define/force the fields they can use when adding content, restrict the custom taxonomies and terms they can use, and still be able to search through the content created by any user.
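For what it's worth, the custom post type and locked-down taxonomy part of that approach is straightforward; here is a minimal sketch, with listing and listing_type as placeholder names (they are not taken from the question):

```php
<?php
/**
 * Minimal sketch only: a custom post type authors can write to, plus a
 * taxonomy whose terms only an admin can manage. "listing" and
 * "listing_type" are placeholder names, not taken from the question.
 */
add_action( 'init', function () {
    register_post_type( 'listing', array(
        'label'       => 'Listings',
        'public'      => true,
        'has_archive' => true,
        // Limit the edit screen to the fields you want authors to use.
        'supports'    => array( 'title', 'editor', 'thumbnail' ),
    ) );

    register_taxonomy( 'listing_type', 'listing', array(
        'label'        => 'Listing Types',
        'hierarchical' => true,
        'public'       => true,
        // Authors may assign existing terms but cannot add/edit/delete them.
        'capabilities' => array(
            'manage_terms' => 'manage_options',
            'edit_terms'   => 'manage_options',
            'delete_terms' => 'manage_options',
            'assign_terms' => 'edit_posts',
        ),
    ) );
} );
```

Restricting term management to manage_options means only admins can add/edit/delete terms, while Authors can still assign them to their own posts.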
The only issue here is how to give each user their own domain.
Then I heard of domain mapping. Is it possible to map domains like xyz.com to abc.com in such a way that whenever a user types 123.com (or xyz.com) in the address bar, they are served the content of abc.com but still see 123.com (or xyz.com) in their address bar?
I believe this is called masked domain forwarding. I tried it and partially succeeded: when a visitor types 123.com (or xyz.com) in the address bar, they are served the content of abc.com while still seeing 123.com (or xyz.com) in their address bar. The problem is that when users go to 123.com/wp-admin/, instead of getting to the login screen they see a blank screen.
I am not sure if the setup is correct, or if this is even achievable using WordPress.
Another alternative could be WordPress multisite, but it has limitations for my case:
1. Searching across all sites in the network is going to be a very expensive operation.
2. I would not be able to force identical custom taxonomy terms across all sites. I can create the taxonomies and terms in code, put that in a plugin, and network-activate it (a sketch of this follows below). That would work for new terms, but when I decide to delete/edit a term, I would have to log in to each site's dashboard to sync the terms.
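As referenced in point 2, here is a minimal sketch of what such a network-activated plugin could look like; the taxonomy name listing_type and the term list are placeholders, and it only adds missing terms rather than propagating renames or deletions, which is exactly the sync problem described above:

```php
<?php
/**
 * Plugin Name: Shared Taxonomy Terms (sketch)
 * Network: true
 *
 * Minimal sketch only: a network-activated plugin that registers a shared
 * taxonomy and seeds the same starting terms on whichever site is being
 * loaded. "listing_type" and the term list are placeholders. Note that
 * this only ADDS missing terms; renaming or deleting a term still has to
 * be synced on every site, which is the limitation described above.
 */
add_action( 'init', function () {
    // Registered here as well so the sketch is self-contained; attach it
    // to your custom post type instead of 'post' as needed.
    register_taxonomy( 'listing_type', array( 'post' ), array(
        'label'        => 'Listing Types',
        'hierarchical' => true,
    ) );

    $canonical_terms = array( 'Dropshipping', 'Affiliate', 'Rentals' ); // placeholder terms

    foreach ( $canonical_terms as $term ) {
        if ( ! term_exists( $term, 'listing_type' ) ) {
            wp_insert_term( $term, 'listing_type' );
        }
    }
} );
```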
So, is there a way with WordPress to achieve what I am trying to do: custom domain names, identical dashboards that can be controlled/dictated by the admin (me), and the ability to search through all the sites/domains?
If not with WordPress, then is there any other framework with which I can do this?
On our website built on WordPress, we changed the name of one of our custom post types from 'A' to 'B' and also changed the hierarchy of a few categories.
Now the problem is that Google is still indexing/crawling the old 'A' CPT name and the old category structure, which leads either to random pages (because WordPress guesses and shows a page matching the keywords in the URL) or to 404 errors.
What can we do (via Webmaster Tools) to make Google re-index our whole site and start honoring the new structure? Thanks.
Here is a brief explanation of Google's indexing policy:
The process
The crawl process begins with a list of web addresses from past crawls and sitemaps provided by website owners. As Google's crawlers visit these websites, they look for links to other pages to visit. The software pays special attention to new sites, changes to existing sites and dead links.
Computer programs determine which sites to crawl, how often and how many pages to fetch from each site. Google doesn't accept payment to crawl a site more frequently for your web search results. They care more about having the best possible results because in the long run that's what's best for users and, therefore, their business.
Choice for website owners
Most websites don't need to set up restrictions for crawling, indexing or serving, so their pages are eligible to appear in search results without having to do any extra work.
That said, site owners have many choices about how Google crawls and indexes their sites, through Webmaster Tools and a file called “robots.txt”. With the robots.txt file, site owners can choose not to be crawled by Googlebot, or they can provide more specific instructions about how to process pages on their sites.
Site owners have granular choices and can choose how content is indexed on a page-by-page basis. For example, they can opt to have their pages appear without a snippet (the summary of the page shown below the title in search results) or a cached version (an alternate version stored on Google's servers in case the live page is unavailable). Webmasters can also choose to integrate search into their own pages with Custom Search.
Read more here and here.