Smartsheet API Bulk Update Multiple Sheets - smartsheet-api-2.0

I have had a lot of success with the Smartsheet REST API. I am currently trying to perfect the "work at scale" component of a few of my integrations. As the subject suggests, I am trying to figure out whether it is possible to construct a single JSON payload that will update multiple sheets with one call. I can construct the payload for a single sheet and then loop through a list of those sheets. However, with the potential of needing to update thousands of sheets, the aforementioned functionality would drastically improve my processing time. Any suggestions? Does a comprehensive list of "Bulk Operations" exist?

I'm glad to hear of your success so far. But no, there is no way to update more than one sheet in a single call. You can, however, update multiple rows of a single sheet in one call.
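For reference, here is a minimal sketch of that row-level bulk update in Python against the raw REST endpoint (the token, sheet ID, row IDs, and column ID below are placeholders). Across thousands of sheets you would still issue one call per sheet, though you can run those calls in parallel.

import requests

TOKEN = "YOUR_ACCESS_TOKEN"                # placeholder
SHEET_ID = 4583173393803140                # placeholder sheet ID
COLUMN_ID = 7960873114331012               # placeholder column ID

# One payload, many rows: the Smartsheet API 2.0 "Update Rows" call
# accepts an array of row objects in a single PUT.
rows = [
    {"id": 1111111111111111, "cells": [{"columnId": COLUMN_ID, "value": "Done"}]},
    {"id": 2222222222222222, "cells": [{"columnId": COLUMN_ID, "value": "In Progress"}]},
]

resp = requests.put(
    f"https://api.smartsheet.com/2.0/sheets/{SHEET_ID}/rows",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=rows,
)
resp.raise_for_status()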

Related

How to Combine multiple files in BizTalk?

I have multiple flat files (CSV), each with multiple records, and the files arrive in random order. I have to combine their records using unique ID fields.
How can I combine them if there is no common unique field across all of the files, and I don't know which one will arrive first?
Here are some example files:
In reality there are 16 files.
There are far more fields and records than in this example.
I would avoid trying to do this purely in XSLT/BizTalk orchestrations/C# code. These are fairly simple flat files: load them into SQL, and create a view to join your data up.
You can still use BizTalk to pick up and load the files. You can also still use BizTalk to execute the view or stored procedure that joins the data and sends your final message.
There are a few questions that might help guide how this would work here:
When do you want to join the data together? What triggers that (a time of day, a certain number of messages received, a certain type of message, a particular record, etc)? How will BizTalk know when it's received enough/the right data to join?
What does a canonical version of this data look like? Does all of the data from all of these files truly get correlated into one entity (e.g. a "Trade" or a "Transfer" etc.)?
I'd probably start by defining my canonical entity, and then work towards getting a "complete" picture of that canonical entity, using SQL for this kind of case.

2sxc: Merge multiple streams into a Default one

I have an app that can have one or more streams.
Example:
Books by author A
Books by author B
Books by author C
So my queries can have one or more relationship filters.
Assuming I want to use only one template for several views, and a view can have multiple streams, I can't hard-code the name of each stream in my template. How can I do that?
Basically, in my template I would like a single combined list even if I have multiple streams:
AsDynamic(Data["Default"]) // This should get all the streams in my data
Is that possible? Maybe by aggregating them in a Visual Query?
I tried creating one Out stream fed by many, but when I give them the same name I get an error.
At the moment this is not possible (2sxc 8.5.6). There are a few problems with this idea:
the same item could occur multiple times, which is not supposed to happen in a stream
you would probably lose the "which author was this for" information
For now, I recommend merging them in JS or in server-side code if this is what you need.

Exporting all Marketo Leads in a CSV?

I am trying to export all of my leads from Marketo (we have 20M+) into a CSV file, but there is a 10k row limit per CSV export.
Is there any other way that I can export a CSV file with more than 10k rows? I tried searching for various data loader tools on Marketo Launchpoint but couldn't find one that would work.
Have you considered using the API? It may not be practical unless you have a developer on your team (I'm a programmer).
Marketo Lead API
If your leads are in Salesforce and Marketo/Salesforce are in parity, then instead of exporting all your leads, do a sync from Salesforce to the new MA tool (if you are switching). It's a cleaner, easier sync.
For important campaigns etc., you can create smart lists and export those.
There is no 10k row limit for exporting Leads from a list. However, there is a practical limit, especially if you choose to export all columns (instead of only the visible columns). I would generally advise exporting a maximum of 200,000-300,000 leads per list, so you'd need to create multiple Lists.
As Michael mentioned, the API is also a good option. I would still advise creating multiple Lists, so you can run multiple processes in parallel, which will speed things up. You will need to look at your daily API quota: the default is either 10,000 or 50,000 calls per day. At the maximum batch size of 300, 10,000 API calls allow you to download 3 million Leads (10,000 × 300 = 3,000,000).
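To illustrate, a hedged Python sketch of paging through one static list with the standard REST endpoint (Get Leads by List Id); the instance host, list ID, and token are placeholders, and you would run one such process per list to parallelize:

import requests

BASE_URL = "https://<munchkin-id>.mktorest.com"   # placeholder instance host
TOKEN = "YOUR_ACCESS_TOKEN"                       # from the identity endpoint

def export_list(list_id):
    leads, page_token = [], None
    while True:
        params = {"access_token": TOKEN, "batchSize": 300}  # 300 is the max batch size
        if page_token:
            params["nextPageToken"] = page_token
        r = requests.get(f"{BASE_URL}/rest/v1/lists/{list_id}/leads.json", params=params)
        r.raise_for_status()
        body = r.json()
        leads.extend(body.get("result", []))
        page_token = body.get("nextPageToken")
        if not page_token or not body.get("result"):
            break                                           # no more pages
    return leads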
I am trying out Data Loader for Marketo on Marketo Launchpoint to export my lead and activity data to my local database. Although it cannot transfer Marketo data to a CSV file directly, you can download leads to your local database and then export them to get a CSV file. For reference, we have 100K leads and 1 billion activity records.
You might have to run it multiple times for 20M leads, but the tool is quite easy and convenient to use, so it may be worth a try.
There are four steps to bulk-extract leads from Marketo:
1. Create a job
2. Enqueue the export lead job
3. Poll the job status
4. Retrieve your data
http://developers.marketo.com/rest-api/bulk-extract/bulk-lead-extract/
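For what it's worth, here is a hedged Python sketch of those four steps against the bulk extract endpoints documented at that link. The instance host and token are placeholders, and the date-range filter is just one example of the required filter; at the time of writing a single job's date range is capped at 31 days, so 20M leads would mean creating many jobs.

import time
import requests

BASE_URL = "https://<munchkin-id>.mktorest.com"   # placeholder instance host
HEADERS = {"Authorization": "Bearer YOUR_ACCESS_TOKEN"}

# 1. Create a job
job = requests.post(
    f"{BASE_URL}/bulk/v1/leads/export/create.json",
    headers=HEADERS,
    json={
        "format": "CSV",
        "fields": ["firstName", "lastName", "email"],
        "filter": {"createdAt": {"startAt": "2017-01-01T00:00:00Z",
                                 "endAt": "2017-01-31T00:00:00Z"}},
    },
).json()
export_id = job["result"][0]["exportId"]

# 2. Enqueue the export lead job
requests.post(f"{BASE_URL}/bulk/v1/leads/export/{export_id}/enqueue.json",
              headers=HEADERS)

# 3. Poll the job status until it finishes
while True:
    status = requests.get(
        f"{BASE_URL}/bulk/v1/leads/export/{export_id}/status.json",
        headers=HEADERS,
    ).json()["result"][0]["status"]
    if status in ("Completed", "Failed", "Cancelled"):
        break
    time.sleep(30)

# 4. Retrieve your data (the finished CSV)
csv_data = requests.get(
    f"{BASE_URL}/bulk/v1/leads/export/{export_id}/file.json",
    headers=HEADERS,
).text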

Should I use Wordpress Transient API in this case?

I'm writing a simple Wordpress plugin for work and am wondering if using the Transients API is practical in this case, or if I should seek out another way.
The plugin's purpose is simple. I'm making a call to the USZip Web Service (http://www.webservicex.net/uszip.asmx?op=GetInfoByZIP) to retrieve data. Our sales team uses a Lead Intake sheet that the plugin will run on.
I wanted to reduce the number of API calls, so I thought of setting a transient for each zip code as the key and store the incoming data (city and zip). If the corresponding data for a given zip code already exists, then no need to make an API call.
Here are my concerns:
1. After a quick search, I realized that transient data is stored in the wp_options table, and storing this data would balloon that table in no time. Would this cause a significant performance issue if the DB becomes huge?
2. Is it horrible practice to create this many transient keys? It could easily become thousands in a few months' time.
If using Transient is not the best way, could you please help point me in the right direction? Thanks!
P.S. I opted for the Transients API vs. the Options API. I know zip codes don't change often, but they sometimes do, so I set an expiration time of 3 months.
A less-inflated solution would be:
Store a single option called uszip with a serialized array inside the option
Grab the entire array each time and simply check if the zip code exists
If it doesn't exist, fetch the data and save the whole array again
You should make sure you don't hit the upper bounds of a serialized array in this table (9,000 elements), considering that around 43,000 zip codes exist in the US. However, you will most likely only ever store a very localized subset of those zip codes.
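As an illustration of that pattern, here is a language-agnostic sketch in Python. The real plugin would use get_option()/update_option() in PHP; fetch_zip_info() is a hypothetical stand-in for the USZip call, and the JSON file stands in for the single uszip option row.

import json, os

CACHE_FILE = "uszip_cache.json"        # stands in for the single 'uszip' option

def fetch_zip_info(zip_code):
    # Hypothetical stand-in: call the USZip GetInfoByZIP operation here.
    raise NotImplementedError

def get_zip_info(zip_code):
    cache = {}
    if os.path.exists(CACHE_FILE):
        with open(CACHE_FILE) as f:
            cache = json.load(f)       # grab the entire array each time
    if zip_code in cache:              # cache hit: no web-service call needed
        return cache[zip_code]
    info = fetch_zip_info(zip_code)    # cache miss: fetch once
    cache[zip_code] = info
    with open(CACHE_FILE, "w") as f:
        json.dump(cache, f)            # save the whole array again
    return info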

Is it possible to hook into GDI+ and save all strings sent to DrawString?

I need to get a large table (300K+ rows) from an application and there is no export function.
After a lot of unsuccessful attempts I'm left with a copy-paste macro that goes one row at a time. If there were a way to capture the strings as they are drawn, I could get a page (40 rows) at once.
If you aren't doing this for commercial purposes, you can use Detours to hook DrawString quite easily. There are some examples using Detours in CodingTheWheel's blog series on building a poker bot. Even if the Detours option is unavailable, there is plenty of information on Windows API hooking on the web.
