Replace single quotes - google-forms

I am using Google Forms to collect data and then importing it into Google Fusion Tables. The problem is when someone types an apostrophe ('). The script I am using is here. Could I edit it to replace or remove all apostrophes before syncing? If possible, I would prefer to do it inside the Google Spreadsheet using some formula. I think I could also use another script just to do this, but I am not really sure how to go about that.
I am using the form to gather applications for dogs, so people of all ages will be using it. I really wish I could just say "don't use apostrophes", but people will not always see that.
Thanks for any help!

Sure, here is an easy way with JS:
Working Fiddle: https://jsbin.com/hosagu/3/edit?js,console
var str = "don't";
var test = "f's'a'g'd'a's'd'g'";
console.log(str.replace(/\'/g, "")); // "dont"
console.log(test.replace(/\'/g, "")); //"fsagdasdg"
As a script or custom function:
function REMOVEAPOSTROPHE(str) {
  return str.replace(/'/g, ""); // strip every apostrophe from the input string
}
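Since the question mentions preferring to do this inside the spreadsheet: one option is a small Apps Script bound to the response sheet that cleans each row as it arrives. This is only a sketch, assuming it is attached to the spreadsheet's "On form submit" trigger; the function name is just an example.
function cleanResponse(e) {
  // e.range is the row the form submission just wrote to the sheet
  var row = e.range;
  var values = row.getValues()[0].map(function (v) {
    // strip apostrophes from text answers, leave numbers/dates untouched
    return typeof v === "string" ? v.replace(/'/g, "") : v;
  });
  row.setValues([values]); // write the cleaned values back in place
}
Alternatively, a plain formula such as =SUBSTITUTE(A2, "'", "") in a helper column avoids scripting entirely.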

Are there Tags that could solve this?

New to coding; trying to see which tags, if any, would make this code work.
So I'm a beginner with a basic understanding, more of a graphics/design person than a coder. I found a CodePen by WEDOO that has exactly what I need, and I want to try just swapping in my "animationData" to see if I can get it to work, and then modify it as needed for my test (I will be assigning the button code to various objects for the SVG). I can't seem to find the right "tags", or determine whether it's referencing an external script. I'd imagine it just needs the right information to function... is that correct?
Thanks in advance!
var animation = bodymovin.loadAnimation({
container: targetAnim,
path: 'https://s3-us-west-2.amazonaws.com/s.cdpn.io/914929/data-testo4.json...
My output from Bodymovin, in a JSON file:
var animationData =
{"v":"5.4.2","fr":29.9700012207031,"ip":0,"op":149.000006068894...
Does it make sense to think that I need to replace the var animation info (the part that points at targetAnim) with the code in the JSON file? So far, putting in the var animationData breaks things and does nothing (visually).
bodymovin.loadAnimation can be passed either a URL to a Bodymovin JSON file via the path option, OR you can pass the animation JSON inline by setting the animationData option instead.
In your case it would end up looking something like:
var animation = bodymovin.loadAnimation({
  container: targetAnim,
  animationData: {"v":"5.4.2","fr":29.9700012207031,"ip":0,"op":149.000006068894...
  ...
});
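Pulled together, a minimal self-contained version might look like the sketch below; the container id, renderer, and loop/autoplay settings are illustrative assumptions, and animationData is the JSON object exported from Bodymovin.
// Sketch: load a Bodymovin/Lottie animation from inline JSON.
// Assumes the bodymovin (lottie-web) library is already loaded on the page
// and an element with id "anim" exists to hold the rendered SVG.
var targetAnim = document.getElementById('anim');
var animationData = { /* paste the exported JSON object here */ };
var animation = bodymovin.loadAnimation({
  container: targetAnim,          // DOM element to render into
  renderer: 'svg',
  loop: true,
  autoplay: true,
  animationData: animationData    // inline JSON instead of the path option
});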

Stackmob fetchExtended not working

I have started to use Stackmob as a backend for a simple app I am building.
In StackMob I have set up a relationship between two schemas and want to use .fetchExpanded to grab all of the data from StackMob; see this fiddle (you will need to view the console to see the output):
http://jsfiddle.net/mcneela86/65Rax/
.fetchExtended(1);
The same code works when using '.fetch' instead of '.fetchExpanded'.
Has anyone come across this before?
Would really appreciate any help.
OK, I found a workaround for this.
Instead of using '.fetchExtended(1);' I will just use '.fetch();', and when defining the model I will change the following:
var Bike = StackMob.Model.extend({
schemaName: "bikes"
});
to:
var Bike = StackMob.Model.extend({
schemaName: "bikes?_expand=1"
});
This seems to remove the need for '.fetchExtended(1);'
Hope this helps someone else.
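For anyone trying this, here is a rough sketch of how the workaround reads in context. StackMob's JS SDK is built on Backbone, so the usual success/error fetch callbacks apply; the primary key field and id value below are placeholder assumptions.
// Model whose schema name carries the expand parameter as a workaround.
var Bike = StackMob.Model.extend({
  schemaName: "bikes?_expand=1"   // expand related objects one level deep
});
var bike = new Bike({ bike_id: "some-object-id" }); // hypothetical id field/value
bike.fetch({
  success: function (model) {
    // related objects from the linked schema arrive inline
    console.log(model.toJSON());
  },
  error: function (model, response) {
    console.log("fetch failed", response);
  }
});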

Add a timestamp and info to a saved CSS/HTML file

Looking for a way to add a custom timestamp (and maybe some extra info) inside a CSS/HTML file every time you save the file. Is there an extension for Brackets, Sublime, or Dreamweaver that does this, or perhaps some other way to do it?
Thanks
I think this post will give you a Sublime Text plugin that should do what you ask.
In Brackets, you can configure the snippets extension to insert a timestamp manually, like the answer above for Sublime. Here's how: https://stackoverflow.com/a/18844762/1172352.
Similar to the Sublime answer, it would be a bit trickier to do it automatically every time you save. There's not yet a clean hook in Brackets for pre-save processing. Several extensions get around this by listening for a post-save event and saving quickly a second time. You could probably write a timestamp-auto-inserter extension by borrowing their code for that pattern.
Anything that runs automatically would also need a little extra code to find the old timestamp and replace it -- both the snippets solution here and the Sublime solution above just insert the timestamp wherever the cursor/selection is. A regular expression should do the trick for detection.
You'd also want to screen out other file types. Bringing it all together, it would look something like this for Brackets:
function documentSavedHandler(event, doc) {
    // TODO: need a little extra code here to ignore save events triggered
    // by ourself, to avoid an infinite loop
    var langId = doc.getLanguage().getId();
    if (langId === "html" || langId === "css") {
        // use a regexp against doc.getText() to locate the old timestamp,
        // yielding {line, ch} positions for its start and end
        var posStart = /* ... */, posEnd = /* ... */;
        doc.replaceRange(timestampStr, posStart, posEnd);
        CommandManager.execute(Commands.FILE_SAVE);
    }
}
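The handler also needs to be wired to the post-save event in the extension's main module. From memory it looks roughly like the following; treat the exact module paths and event name as assumptions to double-check against the current Brackets source.
// Rough wiring sketch for a Brackets extension's main.js
var DocumentManager = brackets.getModule("document/DocumentManager"),
    CommandManager  = brackets.getModule("command/CommandManager"),
    Commands        = brackets.getModule("command/Commands");
// listen for the post-save notification and run the timestamp updater
$(DocumentManager).on("documentSaved", documentSavedHandler);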

Fix serialized data broken due to editing MySQL database in a text editor?

Background: I downloaded a *.sql backup of my WordPress site's database, and replaced all instances of the old database table prefix with a new one (e.g. from the default wp_ to something like asdfghjkl_).
I've just learnt that WordPress uses serialized PHP strings in the database, and what I did will have messed with the integrity of the serialized string lengths.
The thing is, I deleted the backup file just before I learnt about this (as my website was still functioning fine), and I have installed a number of plugins since. So there's no way I can revert, and I would therefore like to know two things:
How can I fix this, if at all possible?
What kind of problems could this cause?
(This article states that a WordPress blog, for instance, could lose its settings and widgets. But this doesn't seem to have happened to me, as all the settings for my blog are still intact. I have no clue, though, as to what could be broken on the inside, or what issues it might pose in the future. Hence this question.)
Visit this page: http://unserialize.onlinephpfunctions.com/
On that page you should see this sample serialized string: a:1:{s:4:"Test";s:17:"unserialize here!";}. Take a piece of it: s:4:"Test";. That means "string", 4 characters long, then the actual string. I am pretty sure that what you did caused the numeric character counts to fall out of sync with the strings: for example, replacing wp_ with asdfghjkl_ turns something like s:10:"wp_options" into s:10:"asdfghjkl_options", even though the new string is actually 17 characters long. Play with the tool on the site mentioned above and you will see that you get an error if you change "Test" to "Tes", for example.
What you need to do is get those character counts to match your new strings. If you haven't corrupted any of the other encoding -- removed a colon or something -- that should fix the problem.
I ran into this same problem after trying to change the domain from localhost to the real URL. After some searching I found the answer in the WordPress documentation:
https://codex.wordpress.org/Moving_WordPress
I will quote what is written there:
To avoid that serialization issue, you have three options:
Use the Better Search Replace or Velvet Blues Update URLs plugins if you can access your Dashboard.
Use WP-CLI's search-replace if your hosting provider (or you) have installed WP-CLI.
Run a search and replace query manually on your database. Note: Only perform a search and replace on the wp_posts table.
I ended up using WP-CLI which is able to replace things in the database without breaking serialization: http://wp-cli.org/commands/search-replace/
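For reference, the basic shape of that command is shown below; the URLs are placeholders, and running with --dry-run first reports what would change without touching the database.
# Placeholder URLs; WP-CLI recalculates serialized string lengths as it replaces.
wp search-replace 'http://localhost/mysite' 'https://www.example.com' --dry-run
wp search-replace 'http://localhost/mysite' 'https://www.example.com'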
I know this is an old question, but better late than never, I suppose. I ran into this problem recently, after inheriting a database that had had a find/replace executed on its serialized data. After many hours of research, I discovered that this was because the string counts were off. Unfortunately, the data had lots of escaping and newlines, I didn't know how to count the lengths in some cases, and there was so much of it that I needed something automated.
Along the way, I stumbled across this question, and Benubird's post helped put me on the right path. His example code did not work in production use on complex data containing numerous special characters and HTML with very deep levels of nesting, and it did not properly handle certain escaped characters and encoding. So I modified it a bit and spent countless hours working through additional bugs to get my version to "fix" the serialized data.
// do some DB query here
while ($res = db_fetch($qry)) {
    $str = $res->data;
    $sCount = 1; // don't try to count manually, which can be inaccurate; let serialize do its thing
    $newstring = unserialize($str);
    if (!$newstring) {
        preg_match_all('/s:([0-9]+):"(.*?)"(?=;)/su', $str, $m);
        # preg_match_all("/s:([0-9]+):(\"[^\"\\\\]*(?:\\\\.[^\"\\\\]*)*\")(?=;)/u", $str, $m); // alternate: almost works but leaves quotes in $m[2] output
        # print_r($m); exit;
        foreach ($m[1] as $k => $len) {
            /*** Possibly specific to my case: Spyropress Builder in WordPress ***/
            $m_clean = str_replace('\"', '"', $m[2][$k]); // convert escaped double quotes so that HTML will render properly
            // if a newline is present, it will output directly in the HTML
            // nl2br won't work here (must find literally; not with double quotes!)
            $m_clean = str_replace('\n', '<br />', $m_clean);
            $m_clean = nl2br($m_clean); // but we DO need to convert actual newlines also
            /*********************************************************************/
            if ($sCount) {
                $m_new = $m[0][$k] . ';'; // we must account for the missing semi-colon not captured in the regex!
                // NOTE: If we don't flush the buffers, things like <img src="http://whatever" can be replaced with <img src="//whatever" and break the serialize count!!!
                ob_end_flush(); // not sure why this is necessary but cost me 5 hours!!
                $m_ser = serialize($m_clean);
                if ($m_new != $m_ser) {
                    print "Replacing: $m_new\n";
                    print "With: $m_ser\n";
                    $str = str_replace($m_new, $m_ser, $str);
                }
            }
            else {
                // manual counting path (used when $sCount is 0)
                $m_len = (strlen($m[2][$k]) - substr_count($m[2][$k], '\n'));
                if ($len != $m_len) {
                    $newstr = 's:' . $m_len . ':"' . $m[2][$k] . '"';
                    echo "Replacing: {$m[0][$k]}\n";
                    echo "With: $newstr\n\n";
                    $str = str_replace($m[0][$k], $newstr, $str); // fixed: replace the original match, not the undefined $m_new
                }
            }
        }
        print_r($str); // this is your FIXED serialized data!! Yay!
    }
}
A little geeky explanation of my changes:
I found that trying to count with Benubird's code as a base was too inaccurate for large datasets, so I ended up just using serialize to be sure the count was accurate.
I avoided the try/catch because, in my case, the try would succeed but just returned an empty string. So, I check for empty data instead.
I tried numerous regexes, but only a modification of Benubird's would accurately handle all cases. Specifically, I had to modify the part that checked for the ";" because it would match on CSS like "width:100%; height:25px;" and break the output. So I used a positive lookahead to match only when the ";" was outside of the set of double quotes.
My case had lots of newlines, HTML, and escaped double quotes, so I had to add a block to clean that up.
There were a couple of weird situations where data would be replaced incorrectly by the regex and then the serialize would count it incorrectly as well. I found NOTHING on any sites to help with this and finally thought it might be related to caching or something like that and tried flushing the output buffer (ob_end_flush()), which worked, thank goodness!
Hope this helps someone... Took me almost 20 hours including the research and dealing with weird issues! :)
This script (https://interconnectit.com/products/search-and-replace-for-wordpress-databases/) can help update a SQL database with the proper URLs everywhere without running into serialized data issues, because it updates the character counts that would otherwise fall out of sync wherever serialized data occurs.
The steps would be:
1. If you have already imported a messed-up database (widgets not working, theme options missing, etc.), just drop that database using phpMyAdmin. That is, remove everything in it. Then export, and keep at hand, an un-edited dump of the old database.
2. Now import the (un-edited) old database into the newly created one. You can do this via an import, or by copying the DB over in phpMyAdmin. Notice that so far we haven't done any search and replace yet; we just have the old database content and structure in a new database with its own user and password. Your site will probably be inaccessible at this point.
3. Make sure you have your WordPress files freshly uploaded to the proper folder on the server, and edit your wp-config.php to make it connect to the new database.
4. Upload the script into a "secret" folder (just for security reasons) at the same level as wp-admin, wp-content, and wp-includes. Do not forget to remove it all once the search and replace has taken place, because you risk exposing your DB details to the whole internet.
5. Now point your browser to the secret folder and use the script's fine interface. It is very self-explanatory. Once it has been used, completely remove it from the server.
This should leave your database properly updated, without any serialized data issues: the new URL will be set everywhere, and the serialized data character counts will be updated accordingly.
Widgets will be carried over, and theme settings as well -- two of the typical places that use serialized data in WordPress.
Done and tested solution!
If the error is due to the length of the strings being incorrect (something I have seen frequently), then you should be able to adapt this script to fix it:
foreach ($strings as $key => $str)
{
    try {
        unserialize($str);
    } catch (exception $e) {
        // find each s:<length>:"<value>" chunk and recompute its length
        preg_match_all('#s:([0-9]+):"([^;]+)"#', $str, $m);
        foreach ($m[1] as $k => $len) {
            if ($len != strlen($m[2][$k])) {
                $newstr = 's:' . strlen($m[2][$k]) . ':"' . $m[2][$k] . '"';
                echo "len mismatch: {$m[0][$k]}\n";
                echo "should be: $newstr\n\n";
                $strings[$key] = str_replace($m[0][$k], $newstr, $str);
            }
        }
    }
}
I personally don't like working in PHP, or placing my DB credentials in a public file. I created a Ruby script to fix serialization that you can run locally:
https://github.com/wsizoo/wordpress-fix-serialization
Context Edit:
I approached fixing serialization by first identifying serialization via regex, and then recalculating the byte size of the contained data string.
$content_to_fix.gsub!(/s:([0-9]+):\"((.|\n)*?)\";/) {"s:#{$2.bytesize}:\"#{$2}\";"}
I then update the specified data via an escaped SQL update query.
escaped_fix_content = client.escape($fixed_content)
query = client.query("UPDATE #{$table} SET #{$column} = '#{escaped_fix_content}' WHERE #{$column_identifier} LIKE '#{$column_identifier_value}'")

Quotes in Asp.NET MVC View

In the view:
@{
    Layout = "~/Views/Shared/site.cshtml";
    ViewBag.Title = "Organizations";
    var organizations = "";
    foreach (var org in Model)
    {
        organizations += "'" + org.Name + "':'" + org.Id + "',";
    }
    organizations = organizations.Substring(0, organizations.Length - 1);
}
The resulting value: organizations = "'Passport':'14f0eac0-43eb-4c5f-b9fe-a09d2848db80','Bank':'ad1d77d8-7eb1-4a4c-9173-b0f2f7269644'";
I output the data in a JS code section.
But when viewing the source code of the page in the browser, I am not getting what I wanted.
What's the problem? How do I get normal quotes?
JS: "data": "#organizations"
Result in the rendered page: "data": "&#39;Passport&#39;:&#39;14f0eac0-43eb-4c5f-b9fe-a09d2848db80&#39;,&#39;Bank&#39;:&#39;ad1d77d8-7eb1-4a4c-9173-b0f2f7269644&#39;" (the apostrophes come out HTML-encoded)
OK, cool question. Try this source:
@{
    var model = Model.ToArray().Select(org => string.Format("'{0}':'{1}'", org.Name, org.Id));
    var data = string.Join(",", model);
}
@Html.Raw(data)
What if you change
"data": "#organizations"
to
"data": "#Html.Raw(organizations)"
#Html.Raw(content)
And check this one out: http://jeffreypalermo.com/blog/what-is-the-difference-in-lt-variable-gt-and-lt-variable-gt-in-asp-net-mvc/
In your code example here, the culprit is the
@organizations
that generates the encoded quotes. You could instead use:
@Html.Raw(organizations)
Okay, that's great, but by creating JSON, you are doing work the framework could be doing for you. You probably want a model that the framework can serialize for you. This way you don't even need any code in your view header at all.
@Html.Raw(JsonConvert.SerializeObject(Model.ToDictionary(m => m.Name, m => m.Id)))
Above, note that I'm using Json.NET, which you probably want to pull into your project via NuGet, because Microsoft is moving to it. There is no point in figuring out how to use the old JSON serializer for this. Also note that unless you are using the model for other purposes, you might choose to do the dictionary conversion outside the view, reducing the amount of code in the view.
Next, if you are embedding JSON in the view, you might want to consider one of two other options. If the amount of data will be small, consider using the join method proposed by eli (here), but instead encoding it in HTML5 data- attributes and loading it with JavaScript. That way you keep JavaScript out of your view. It's just one more confusing step when debugging JavaScript to be hunting for variables that are initialized by dynamically generated HTML.
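A minimal sketch of the client-side half of that approach (the element id, attribute name, and jQuery dependency are all just illustrative assumptions):
// Read the JSON the view rendered into an HTML5 data- attribute, e.g.
// <div id="organizations-data" data-organizations='{"Passport":"14f0...","Bank":"ad1d..."}'></div>
var organizations = $("#organizations-data").data("organizations");
// jQuery parses the attribute's JSON automatically, so this is a plain object:
console.log(organizations.Passport);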
Better yet, create a reusable HTML helper method to transform your model into data attributes: How to use dashes in HTML-5 data-* attributes in ASP.NET MVC
Finally, if there are MANY JSON elements, consider sending the data separately. Render it with a simple Json() method in your controller, and fetch it with a simple jQuery $.getJSON() call in your client-side code. See here for an example.
Good luck!
