Get Twitter handle for an NFT contract address

I'm looking to get the official Twitter handle for an OpenSea-verified project, programmatically.
I've tried calling the OpenSea "collections" API, but the twitter_username field is rarely populated with anything but null, even for verified projects.
I've also tried scraping the data manually with fetch, but I got 1020 errors, likely due to Cloudflare protections.
Has anyone used Moralis NFT or some other GraphQL-like service to obtain the Twitter handle of a given NFT project (starting from a contract address)?

You can try the Ubiquity API. It has an endpoint for collection metadata which also includes the Twitter account name:
https://ubiquity.api.blockdaemon.com/v1/nft/ethereum/mainnet/collection?contract_address=0xBC4CA0EdA7647A8aB7C2061c2E118A18a936f13D
This will give you the following response:
{
  "collection": {
    "id": "4203aedd-7964-5fe1-b932-eb8c4fda7822",
    "name": "Bored Ape Yacht Club",
    "description": "The Bored Ape Yacht Club is a collection of 10,000 unique Bored Ape NFTs— unique digital collectibles living on the Ethereum blockchain. Your Bored Ape doubles as your Yacht Club membership card, and grants access to members-only benefits, the first of which is access to THE BATHROOM, a collaborative graffiti board. Future areas and perks can be unlocked by the community through roadmap activation. Visit www.BoredApeYachtClub.com for more details.",
    "logo": "https://ubiquity.api.blockdaemon.com/v1/nft/media/ethereum/mainnet/collection/4203aedd-7964-5fe1-b932-eb8c4fda7822/logo.png",
    "banner": "https://ubiquity.api.blockdaemon.com/v1/nft/media/ethereum/mainnet/collection/4203aedd-7964-5fe1-b932-eb8c4fda7822/banner.jpeg",
    "verified": true,
    "contracts": [
      {
        "address": "0xBC4CA0EdA7647A8aB7C2061c2E118A18a936f13D",
        "name": "BoredApeYachtClub",
        "symbol": "BAYC",
        "description": "",
        "image_url": "https://ubiquity.api.blockdaemon.com/v1/nft/media/ethereum/mainnet/",
        "type": "ERC721"
      }
    ],
    "meta": {
      "discord_url": "https://discord.gg/3P5K3dzgdB",
      "external_url": "http://www.boredapeyachtclub.com/",
      "twitter_username": "BoredApeYC"
    },
    "sub_collection": []
  }
}
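If it helps, here's a minimal Python sketch of pulling the handle out of that response. It assumes the endpoint above; passing the Blockdaemon API key as a bearer token is my assumption, so check Ubiquity's auth docs for the exact scheme:

import requests

API_KEY = "YOUR_API_KEY"  # placeholder; assumed to be sent as a bearer token
BASE = "https://ubiquity.api.blockdaemon.com/v1/nft/ethereum/mainnet"

def twitter_handle(contract_address):
    resp = requests.get(
        BASE + "/collection",
        params={"contract_address": contract_address},
        headers={"Authorization": "Bearer " + API_KEY},
    )
    resp.raise_for_status()
    # The handle lives at collection.meta.twitter_username (see response above).
    meta = resp.json()["collection"].get("meta") or {}
    return meta.get("twitter_username")

print(twitter_handle("0xBC4CA0EdA7647A8aB7C2061c2E118A18a936f13D"))  # BoredApeYC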

Related

How to list all active rooms in Janus with SFU plugin?

I'm trying to build a simple A-Frame chat hub like Mozilla Hubs, using networked-aframe with Janus as the adapter. I've installed everything on one server and everything is working fine.
However, I'm trying to set a limit on the maximum number of users connected to one 'room', because otherwise the browser might crash from rendering too many avatars, and then redirect new users to a new randomly generated room automatically.
Is there a way to do this using the available Janus API? So far I've tried the Janus signalling API for the SFU plugin, because it's the only reference that mentions how to get the number of users in one room, although not directly.
The example request body:
{
  "kind": "join",
  "room_id": room ID,
  "user_id": user ID,
  "subscribe": [none|subscription object]
}
The example result is below, but I don't think this is the way to achieve what I want, because I need a list of ALL rooms, not just one room:
{
  "success": true,
  "response": {
    "users": {"room_alpha": ["123", "789"]}
  }
}
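One possible workaround, since a full room list doesn't seem to be exposed: the join response above already tells you who is in the room you joined, so you could enforce the cap at join time and bounce to a fresh random room when a room is full. A rough Python sketch of that logic (send_join is a hypothetical stand-in for your signalling call, and the cap of 20 is an arbitrary assumption):

import random

MAX_USERS = 20  # arbitrary cap; tune to what your clients can render

def pick_room(send_join, room_id):
    # send_join(room_id) is assumed to perform the "join" request above
    # and return the parsed JSON response.
    while True:
        resp = send_join(room_id)
        users = resp.get("response", {}).get("users", {})
        # The users map is keyed by room, e.g. {"room_alpha": ["123", "789"]}.
        occupants = next(iter(users.values()), [])
        if len(occupants) < MAX_USERS:
            return room_id  # room has capacity; stay here
        # Room is full: redirect to a new randomly generated room.
        room_id = "room_{}".format(random.randrange(10**6))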

IPFS URI format: https://ipfs.io/ipfs/<CID> vs. ipfs://<CID>?

Here is my test tokenURI.json file with the image URI that I pass to my token contract's setTokenURI():
{
  "attributes": [
    {
      "trait_type": "location",
      "value": "West Awesomeville"
    },
    {
      "display_type": "date",
      "trait_type": "created",
      "value": 1535250800
    }
  ],
  "description": "My awesome NFT.",
  "image": "https://ipfs.io/ipfs/QmaUXii41ESnUMxLJUoVcrEeXowz7RHcdTiumvrBmUvcwG?filename=test4.png",
  "name": "NFT 1"
}
Which is the best IPFS URI form to use, especially if I want to load this NFT into OpenSea?
The IPFS docs recommend:
https://ipfs.io/ipfs/<CID>
but the OpenSea docs recommend:
ipfs://<CID>
Which form is better and why?
In the JSON above I'm using the first form, recommended by IPFS. It works, but loading into OpenSea is slow and somewhat unpredictable.
The form OpenSea recommends is shorter, with no gateway. Would the image load faster in OpenSea if I used the second form?
IPFS docs: Address IPFS on the Web
OpenSea docs:
If you use IPFS to host your metadata, your URL should be in the format ipfs://CID. For example, ipfs://QmTy8w65yBXgyfG2ZBg5TrfB2hPjrDQH3RCQFJGkARStJb.
The ipfs:// URL is the better way, because gateways can go down. The IPFS pinner you're using (pinata.cloud?) can also go down, or you can stop paying them and your content will disappear.
OpenSea is not likely to care: as long as they can find your metadata and images from the URI returned by the contract, they will list your item, and there's a way somewhere to do a metadata refresh (if you do a reveal).
If I can also make a suggestion, it might be a good idea to include a way to update the baseURI in the contract, just in case.
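For what it's worth, if you store the ipfs:// form on-chain, any plain HTTP client still has to rewrite it through a gateway at fetch time; OpenSea does this with its own gateway. A small illustrative Python helper (the ipfs.io gateway here is just an example choice):

def ipfs_to_gateway(uri, gateway="https://ipfs.io/ipfs/"):
    # Rewrite ipfs://<CID>[/path] into a gateway URL for ordinary HTTP clients.
    # Keeping the ipfs:// form in your metadata stays gateway-agnostic;
    # the gateway is chosen only when the content is actually fetched.
    if uri.startswith("ipfs://"):
        return gateway + uri[len("ipfs://"):]
    return uri  # already an http(s) URL

# ipfs_to_gateway("ipfs://QmTy8w65yBXgyfG2ZBg5TrfB2hPjrDQH3RCQFJGkARStJb")
# -> "https://ipfs.io/ipfs/QmTy8w65yBXgyfG2ZBg5TrfB2hPjrDQH3RCQFJGkARStJb"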

Is it possible to mention people in a status update via the LinkedIn API v2?

I need to tag a company/person in a LinkedIn post that is generated by code. When I use #name in the string posted to LinkedIn, the post turns out to be just plain text, not a tag that notifies the person.
Yes, it is possible to mention organisations or people within a share. You can do this by adding the annotations parameter for every organisation or person you want to mention. For example:
{
  "annotations": [
    {
      "entity": "urn:li:organization:1337",
      "length": 8,
      "start": 6
    }
  ],
  "text": "Hello LinkedIn world!"
}
You can read more about sharing an update and mentioning in the LinkedIn docs here: https://learn.microsoft.com/en-us/linkedin/marketing/integrations/community-management/shares/share-api#share-text-and-mentions.
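For an end-to-end picture, here is a hypothetical Python sketch of posting such a share via the v2 Share API. The endpoint and body shape follow the docs linked above as I read them; the owner URN and OAuth token (which needs the w_member_social scope) are placeholders you'd adapt:

import requests

ACCESS_TOKEN = "YOUR_OAUTH_TOKEN"  # placeholder; needs the w_member_social scope

share = {
    "owner": "urn:li:person:YOUR_MEMBER_ID",  # placeholder URN
    "text": {
        "text": "Hello LinkedIn world!",
        "annotations": [
            {
                # start/length select the substring "LinkedIn" (offset 6, 8 chars)
                "entity": "urn:li:organization:1337",
                "start": 6,
                "length": 8
            }
        ]
    }
}

resp = requests.post(
    "https://api.linkedin.com/v2/shares",
    json=share,
    headers={
        "Authorization": "Bearer " + ACCESS_TOKEN,
        "X-Restli-Protocol-Version": "2.0.0"
    },
)
resp.raise_for_status()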

To get more than 5 reviews from Google Places API

I am building an application where I extract Google reviews using the Google Places API. When I read the documentation at "https://developers.google.com/maps/documentation/javascript/places", I found that I can get only the top 5 reviews. Is there any option to get more reviews?
In order to have access to more than 5 reviews with the Google API, you have to purchase Premium Data Access from Google.
That premium plan will grant you access to all sorts of additional data points, but you'll have to shell out a pretty penny.
If you are a business owner wanting to retrieve all of your reviews, you can do so, but first you have to get verified; you can do this through the My Business API. More info here: https://developers.google.com/my-business/
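If you go the verified-owner route, a hypothetical Python sketch of listing reviews through the My Business API (v4) might look like this; the account/location IDs and OAuth token are placeholders, so check the linked docs for the current endpoint:

import requests

ACCESS_TOKEN = "YOUR_OAUTH_TOKEN"  # placeholder; must be authorized for My Business
ACCOUNT_ID = "1234"    # placeholder account id
LOCATION_ID = "5678"   # placeholder location id

resp = requests.get(
    "https://mybusiness.googleapis.com/v4/accounts/{}/locations/{}/reviews".format(
        ACCOUNT_ID, LOCATION_ID),
    headers={"Authorization": "Bearer " + ACCESS_TOKEN},
)
resp.raise_for_status()
for review in resp.json().get("reviews", []):
    print(review.get("starRating"), (review.get("comment") or "")[:80])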
There is a feature request for that: Issue 7630: Response to Include More Than 5 Reviews. I'd recommend you "star" it to receive updates.
Unfortunately, there's no way to get more than 5 reviews in the Places API unless you are the verified business owner, as Tekill said.
But it looks like there are some external services that can get all the reviews. My guess is that they scrape them from Google Maps directly.
Some of these services are Wextractor, ReviewShake and AllReviews.
Alternatively, you can use a third-party solution like SerpApi to scrape all the reviews of any place. It's a paid API with a free trial.
Each page fetches 10 results. To implement pagination, use the start parameter, which defines the result offset (e.g., 0 (default) is the first page of results, 10 is the 2nd page, 20 is the 3rd page, etc.).
Example Python code (also available in other languages):
from serpapi import GoogleSearch

params = {
    "engine": "google_maps_reviews",  # SerpApi's Google Maps Reviews engine
    "place_id": "0x89c259a61c75684f:0x79d31adb123348d2",
    "api_key": "SECRET_API_KEY"
}

search = GoogleSearch(params)
results = search.get_dict()    # run the search and parse the JSON response
reviews = results['reviews']
Example output:
"reviews": [
{
"user": {
"name": "Waylon Bilbrey",
"link": "https://www.google.com/maps/contrib/107691056156160235121?hl=en-US&sa=X&ved=2ahUKEwiUituIlpTvAhVYCc0KHbvTCrgQvvQBegQIARAx",
"thumbnail": "https://lh3.googleusercontent.com/a-/AOh14GjOj6Wjfk1kSYjhvH7WIBNMdl4nPj6FvUhvYcR6=s40-c0x00000000-cc-rp",
"reviews": 1
},
"rating": 4,
"date": "a week ago",
"snippet": "I've been here multiple times. The coffee itself is just average to me. The service is good (the people working are nice). The aesthetic is obviously what brings the place some fame. A little overpriced (even for NY). A very small cup for $6 where I feel like the price comes from the top rainbow foam decor , when I'm going to cover it anyways. If it's for an insta pic then it may be worth it?"
},
{
"user": {
"name": "Amber Grace Sale",
"link": "https://www.google.com/maps/contrib/106390058588469541899?hl=en-US&sa=X&ved=2ahUKEwiUituIlpTvAhVYCc0KHbvTCrgQvvQBegQIARA7",
"thumbnail": "https://lh3.googleusercontent.com/a-/AOh14Gj84nHu_9V_0V4yRbZcr-8ZTYAHua6gUBP8fC7W=s40-c0x00000000-cc-rp-ba3",
"local_guide": true,
"reviews": 33,
"photos": 17
},
"rating": 5,
"date": "2 years ago",
"snippet": "They really take pride in their espresso roast here and the staff is extremely knowledgeable on the subject. It’s also a GREAT place to do work although a table is no guarantee; you might have to wait for a bit. My almond milk cappuccino was very acidic at the end which wasn’t expected but I could still tell the bean was high quality. Their larger lattés they put in a tall glass cup which looks really really cool. Would definitely go again.",
"likes": 2,
"images": [
"https://lh5.googleusercontent.com/p/AF1QipMup24_dHrWtNN4ZD70EPsiRMf_tykcUkPw6A1H=w100-h100-p-n-k-no"
]
},
{
"user": {
"name": "Kelvin Petar",
"link": "https://www.google.com/maps/contrib/100859090874785206875?hl=en-US&sa=X&ved=2ahUKEwiUituIlpTvAhVYCc0KHbvTCrgQvvQBegQIARBG",
"thumbnail": "https://lh3.googleusercontent.com/a-/AOh14GhdIvUDamzfPqbYIpwhnGJV2XWSi77iVXfEsiKS=s40-c0x00000000-cc-rp",
"reviews": 3
},
"rating": 4,
"date": "3 months ago",
"snippet": "Stumptown Cafe is the perfect place to work or catch up with friends. Never too loud, never too dead. Their lattes and deliciously addicting and the toasts are tasty as well. Wifi is always fast, which is a huge plus! The staff are the friendliest, I highly recommend this place!"
},
...
]
You can check out the documentation for more details.
Disclaimer: I work at SerpApi.
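Since the answer mentions the start offset, here's a short sketch of paging through several pages with the same client; fetching exactly three pages is an arbitrary choice for illustration:

from serpapi import GoogleSearch

all_reviews = []
for offset in (0, 10, 20):  # pages 1-3; each page returns 10 reviews
    params = {
        "engine": "google_maps_reviews",
        "place_id": "0x89c259a61c75684f:0x79d31adb123348d2",
        "start": offset,  # result offset described above
        "api_key": "SECRET_API_KEY"
    }
    results = GoogleSearch(params).get_dict()
    all_reviews.extend(results.get("reviews", []))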
Adding to @miguev's answer: at the moment there's no way to get more than the 5 top reviews without using premium APIs (according to a Google Maps person I talked with), and that's pricey.
We tried to sign up for the Google Maps Platform Premium Plan to show them on pages like this, but Google said it's no longer available for sign-up or new customers. Right now we're limited to 5 reviews only.

Using webhooks with Google Analytics

I'm trying to integrate my CRM with Google Analytics to monitor lead changes (from lead to sale) and so on. As I understand it, I need to use the Google Measurement Protocol: receive webhooks from the CRM and translate them into Analytics conversions.
But in fact, I don't really understand how to do it. I need to write some script to translate the webhook payload for Analytics, but where do I need to host that script? Are there some templates? And so on.
So, if you know some tutorials/courses/freelancers that could help me with integrating webhooks with Analytics, I need your advice.
Example of webhook from CRM:
{
  "leads": {
    "status": {
      "id": "25399013",
      "name": "Lead title",
      "old_status_id": "7039101",
      "status_id": "142",
      "price": "0",
      "responsible_user_id": "102525",
      "last_modified": "1413554372",
      "modified_user_id": "102525",
      "created_user_id": "102525",
      "date_create": "1413554349",
      "account_id": "7039099",
      "custom_fields": [
        {
          "id": "427183",
          "name": "Checkbox custom field",
          "values": ["1"]
        },
        {
          "id": "427271",
          "name": "Date custom field",
          "values": ["1412380800"]
        },
        {
          "id": "1069602",
          "name": "Checkbox custom field",
          "values": ["0"]
        },
        {
          "id": "427661",
          "name": "Text custom field",
          "values": ["Валера"]
        },
        {
          "id": "1075272",
          "name": "Date custom field",
          "values": ["1413331200"]
        }
      ]
    }
  }
}
"Webhook" is a fancy way of saying that your CRM can call a web based service whenever something interesting happens (i.e. the CRM can "hook" into a web based application). E.g. if a new lead is created you can call an url with the lead details as parameters.
Specifics depend on your CRM, but when you set up a webhook there should be a field to set a url; the script that evaluates the CRM data is located at the URL.
You have that big JSON thing as your example - No real way to tell without knowing your system, but I assume that is sent as request body. So in your script you evaluate the request body, extract the parameters you want to send to analytics (be mindful that you are not allowed to store personally identifiable information) and sent it via the measurement protocol as described in the documentation linked in the other answer.
Depending on the system you might even be able to call the measurement protocol without having a custom script in between (after all the measurement protocol is an url with a few parameters).
This is an awfully generic answer, but then the question is really broad.
I've done just this in my line of work.
You first need to decide on a data model for how you would like the CRM data to look within Google Analytics. This could be just mapping Google Analytics' event category, event label and event action to your data, or perhaps using custom dimensions and metrics.
Then, to make it most useful, you would like to link the CRM activity of a customer to their online activity. You can do this if they log in online. In that case, you can set the cid and/or uid of the user to your CRM id.
Then, if you send in a GA hit with the same cid/uid in your Measurement Protocol hit, you will link the online sessions with your offline CRM activity.
To make the actual record hit Google Analytics, you will need to program something that takes the CRM data and turns it into a Measurement Protocol hit, which is essentially just a URL with the correct parameters. Look here for reference: https://developers.google.com/analytics/devguides/collection/protocol/v1/reference
An example could be: http://www.google-analytics.com/collect?v=1&tid=UA-123456-1&cid=5555&t=pageview&dp=%2FpageA
We usually have this as a separate process that fires when the CRM data is written to its database (the webhook in your example). If it's a lot of data, you should probably implement checks to see if the hit was successful, and caching in case the service is not online; there's an optional parameter (queue time) that gives you 4 hours' leeway in sending data.
Hope this gets you at least started.
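To make that concrete, here's a minimal sketch of the in-between script, assuming a Flask endpoint that receives the CRM webhook above; the mapping of lead fields to event category/action/label and the tid/cid values are assumptions you'd replace with your own data model:

from flask import Flask, request
import requests

app = Flask(__name__)

@app.route("/crm-webhook", methods=["POST"])
def crm_webhook():
    lead = request.get_json()["leads"]["status"]
    # Build a Measurement Protocol v1 "event" hit (see the reference linked above).
    hit = {
        "v": "1",
        "tid": "UA-123456-1",        # placeholder property id
        "cid": lead["account_id"],   # assumption: reuse a CRM id as the client id
        "t": "event",
        "ec": "crm",                 # hypothetical event category
        "ea": "status_change",       # hypothetical event action
        "el": lead["status_id"],     # hypothetical event label
    }
    resp = requests.post("https://www.google-analytics.com/collect", data=hit)
    return ("", 204) if resp.ok else ("upstream error", 502)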
