So I'm working on moving a web app to Azure and the one thing that is tripping me up is how to deal with image uploads. Currently, I'm handling uploads with ImageResizer and saving them to a directory:
Protected Function DoHeaderUpload() As Guid?
    If HeaderUploadControl.HasFile Then
        If HeaderUploadControl.PostedFile.ContentType.StartsWith("image/") Then
            If HeaderUploadControl.PostedFile.ContentLength < 3145728 Then ' 3MB max size
                Dim myFileGuid = System.Guid.NewGuid()
                Dim i As ImageResizer.ImageJob = New ImageResizer.ImageJob(HeaderUploadControl.PostedFile,
                    "~/Images/Headers/" & myFileGuid.ToString() & ".jpg",
                    New ImageResizer.Instructions("height=100;format=jpg;mode=crop"))
                i.Build()
                Return myFileGuid
            Else
                HeaderUploadError.Text = "<br>File exceeds 3MB maximum size"
                HeaderUploadError.IsValid = False
            End If
        Else
            HeaderUploadError.Text = "<br>Only image uploads are supported"
            HeaderUploadError.IsValid = False
        End If
    End If
    Return Nothing ' no file, or validation failed
End Function
I found a code excerpt that will populate a blob from the memory stream output from ImageResizer (link).
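For reference, the general shape of that technique is roughly the following (a hedged C# sketch, not the linked excerpt; the container name, connection string, and variable names are placeholders):

using System.IO;
using ImageResizer;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

// Assumes the classic Azure Storage client library.
CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
CloudBlobContainer container = account.CreateCloudBlobClient().GetContainerReference("headers");
CloudBlockBlob blob = container.GetBlockBlobReference(myFileGuid + ".jpg");

using (var ms = new MemoryStream())
{
    // Have ImageResizer write the processed JPEG to the stream instead of a file.
    new ImageJob(postedFile, ms, new Instructions("height=100;format=jpg;mode=crop")).Build();
    ms.Position = 0; // rewind before uploading
    blob.UploadFromStream(ms);
}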
However, I'm a bit unclear on how to embed/reference these images in my site. Currently I'm just doing this (for example):
HeaderImage.Src = String.Format("/Images/Headers/{0}.jpg", .HeaderImage.ToString())
I guess I could just include the full URL prefix to the blob, but that seems like a bad idea (and would break things when running locally for testing).
Could I use a virtual path that points to an Azure Files share instead of Azure Blob?
I saw AzureReader2 for ImageResizer which is tempting, but it seems super expensive. It seems like that could be a pretty easy solution as well, since it looks like it auto-redirects references with a certain prefix to the Azure Blob location.
I guess the question is: what is the easiest way to do this? I've kind of been going in circles with the various options.
Thanks!
I ended up prefixing image paths with the appropriate blob-storage URL. To make it work both locally and remotely, I added a web.config transform:
<appSettings>
  <add key="BlobURLPrefix" value="http://myapp.blob.core.windows.net"
       xdt:Locator="Match(key)"
       xdt:Transform="SetAttributes(value)"/>
</appSettings>
Then when I need to reference an image location, I can do something like this:
FullUrl = ConfigurationManager.AppSettings("BlobURLPrefix") & ImagePath
Not entirely elegant, but it suits my purposes reasonably well and is pretty easy to implement.
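For completeness, the base web.config entry that the transform overrides can just hold an empty value, so local runs fall back to site-relative paths (this base entry is my assumption; it wasn't shown above):

<appSettings>
  <add key="BlobURLPrefix" value="" />
</appSettings>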
In an ASP.NET application, is there a way to get the NLog output to go into a buffer in memory?
I'd like to make a circular buffer that would display the log's content on a web page.
What about writing to the memory target?
Then you could read the entries as follows:
var target = LogManager.Configuration.FindTargetByName<MemoryTarget>("target1");
var logs = target.Logs;
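If the memory target isn't already set up in NLog.config, a minimal programmatic registration might look like this (the target name "target1" matches the snippet above; the log levels are illustrative):

using NLog;
using NLog.Config;
using NLog.Targets;

var config = new LoggingConfiguration();
var memoryTarget = new MemoryTarget { Name = "target1" };
config.AddTarget("target1", memoryTarget);
config.LoggingRules.Add(new LoggingRule("*", LogLevel.Info, memoryTarget));
LogManager.Configuration = config;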
You can specify custom targets with NLog: https://github.com/nlog/NLog/wiki/Targets
It does not look like one exists that does exactly what you need, but you can write your own!
https://github.com/NLog/NLog/wiki/Extending-NLog
Why don't you read the tail n lines from the actual log file and display them? Or send the entries to a database and display the table? It seems this approach would add more overhead and memory constraints to the ASP.NET site.
Background: I downloaded a *.sql backup of my WordPress site's database, and replaced all instances of the old database table prefix with a new one (e.g. from the default wp_ to something like asdfghjkl_).
I've just learnt that WordPress uses serialized PHP strings in the database, and what I did will have messed with the integrity of the serialized string lengths.
The thing is, I deleted the backup file just before I learnt about this (as my website was still functioning fine), and installed a number of plugins since. So, there's no way I can revert back, and I therefore would like to know two things:
How can I fix this, if at all possible?
What kind of problems could this cause?
(This article states that, a WordPress blog for instance, could lose its settings and widgets. But this doesn't seem to have happened to me as all the settings for my blog are still intact. But I have no clue as to what could be broken on the inside, or what issues it'd pose in the future. Hence this question.)
Visit this page: http://unserialize.onlinephpfunctions.com/
On that page you should see this sample serialized string: a:1:{s:4:"Test";s:17:"unserialize here!";}. Take a piece of it-- s:4:"Test";. That means "string", 4 characters, then the actual string. I am pretty sure that what you did caused the numeric character count to be out of sync with the string. Play with the tool on the site mentioned above and you will see that you get an error if you change "Test" to "Tes", for example.
What you need to do is get those character counts to match your new string. If you haven't corrupted any of the other encoding-- removed a colon or something-- that should fix the problem.
I came to this same problem after trying to change the domain from localhost to the real URL. After some searching I found the answer in the WordPress documentation:
https://codex.wordpress.org/Moving_WordPress
I will quote what is written there:
To avoid that serialization issue, you have three options:
- Use the Better Search Replace or Velvet Blues Update URLs plugins if you can access your Dashboard.
- Use WP-CLI's search-replace if your hosting provider (or you) have installed WP-CLI.
- Run a search and replace query manually on your database. Note: Only perform a search and replace on the wp_posts table.
I ended up using WP-CLI which is able to replace things in the database without breaking serialization: http://wp-cli.org/commands/search-replace/
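For illustration, a typical invocation (the URLs here are placeholders) looks like:

wp search-replace 'http://localhost' 'https://example.com' --dry-run

The --dry-run flag reports what would change without writing anything to the database.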
I know this is an old question, but better late than never, I suppose. I ran into this problem recently, after inheriting a database that had had a find/replace executed on serialized data. After many hours of researching, I discovered that this was because the string counts were off. Unfortunately, the data had lots of escaping and newlines, I didn't know how to count the characters in some cases, and there was far too much of it to fix by hand, so I needed something automated.
Along the way, I stumbled across this question and Benubird's post helped put me on the right path. His example code did not work in production use on complex data, containing numerous special characters and HTML, with very deep levels of nesting, and it did not properly handle certain escaped characters and encoding. So I modified it a bit and spent countless hours working through additional bugs to get my version to "fix" the serialized data.
// do some DB query here
while ($res = db_fetch($qry)) {
    $str = $res->data;
    $sCount = 1; // don't try to count manually, which can be inaccurate; let serialize() do its thing
    $newstring = unserialize($str);
    if (!$newstring) {
        preg_match_all('/s:([0-9]+):"(.*?)"(?=;)/su', $str, $m);
        # preg_match_all("/s:([0-9]+):(\"[^\"\\\\]*(?:\\\\.[^\"\\\\]*)*\")(?=;)/u", $str, $m); // alternate: almost works, but leaves quotes in the $m[2] output
        # print_r($m); exit;
        foreach ($m[1] as $k => $len) {
            /*** Possibly specific to my case: Spyropress Builder in WordPress ***/
            $m_clean = str_replace('\"', '"', $m[2][$k]); // convert escaped double quotes so that HTML will render properly
            // if a literal \n is present, it will output directly in the HTML
            // nl2br() won't work here (must match the literal two-character sequence, hence the single quotes)
            $m_clean = str_replace('\n', '<br />', $m_clean);
            $m_clean = nl2br($m_clean); // but we DO need to convert actual newlines as well
            /*********************************************************************/
            if ($sCount) {
                $m_new = $m[0][$k] . ';'; // we must account for the trailing semicolon not captured by the regex
                // NOTE: if we don't flush the buffers, things like <img src="http://whatever" can be
                // replaced with <img src="//whatever" and break the serialize count!
                ob_end_flush(); // not sure why this is necessary, but it cost me 5 hours!
                $m_ser = serialize($m_clean);
                if ($m_new != $m_ser) {
                    print "Replacing: $m_new\n";
                    print "With: $m_ser\n";
                    $str = str_replace($m_new, $m_ser, $str);
                }
            } else {
                $m_len = (strlen($m[2][$k]) - substr_count($m[2][$k], '\n'));
                if ($len != $m_len) {
                    $newstr = 's:' . $m_len . ':"' . $m[2][$k] . '"';
                    echo "Replacing: {$m[0][$k]}\n";
                    echo "With: $newstr\n\n";
                    $str = str_replace($m[0][$k], $newstr, $str); // fixed: was $m_new, which is undefined in this branch
                }
            }
        }
        print_r($str); // this is your FIXED serialized data!! Yay!
    }
}
A little geeky explanation of my changes:

- I found that trying to count with Benubird's code as a base was too inaccurate for large datasets, so I ended up just using serialize() to be sure the count was accurate.
- I avoided the try/catch because, in my case, the try would succeed but just return an empty string. So I check for empty data instead.
- I tried numerous regexes, but only a modification of Benubird's would accurately handle all cases. Specifically, I had to modify the part that checked for the ";" because it would match on CSS like "width:100%; height:25px;" and break the output. So I used a positive lookahead to match only when the ";" was outside of the set of double quotes.
- My case had lots of newlines, HTML, and escaped double quotes, so I had to add a block to clean those up.
- There were a couple of weird situations where data would be replaced incorrectly by the regex and then serialize() would count it incorrectly as well. I found NOTHING anywhere to help with this, eventually suspected it might be related to output buffering, and tried flushing the output buffer (ob_end_flush()), which worked, thank goodness!
Hope this helps someone... Took me almost 20 hours including the research and dealing with weird issues! :)
This script (https://interconnectit.com/products/search-and-replace-for-wordpress-databases/) can help to update an SQL database with proper URLs everywhere, without encountering serialized data issues, because it updates the character counts that would otherwise be thrown out of sync wherever serialized data occurs.
The steps would be:
1. If you have already imported a messed-up database (widgets not working, theme options missing, etc.), just drop that database using phpMyAdmin. That is, remove everything in it. Then export, and keep at hand, an unedited dump of the old database.
2. Import the unedited old database into the newly created one. You can do this via an import, or by copying the DB over from phpMyAdmin. Note that so far we haven't done any search and replace; we just have the old database's content and structure in a new database with its own user and password. Your site will probably be inaccessible at this point.
3. Make sure you have your WordPress files freshly uploaded to the proper folder on the server, and edit your wp-config.php to make it connect to the new database.
4. Upload the script into a "secret" folder (just for security reasons) at the same level as wp-admin, wp-content, and wp-includes. Do not forget to remove it once the search and replace is done, because otherwise you risk exposing your DB details to the whole internet.
5. Now point your browser to the secret folder and use the script's fine interface. It is very self-explanatory. Once it has been used, completely remove it from the server.
This should leave your database properly updated, without any serialized data issues: the new URL will be set everywhere, and the serialized-data character counts will be updated accordingly.
Widgets and theme settings will carry over as well; those are two of the typical places WordPress stores serialized data.
Done and tested solution!
If the error is due to the length of the strings being incorrect (something I have seen frequently), then you should be able to adapt this script to fix it:
foreach ($strings as $key => $str) {
    // unserialize() does not throw an exception on bad data; it raises a
    // notice and returns false, so test the return value instead of try/catch.
    if (@unserialize($str) === false) {
        preg_match_all('#s:([0-9]+):"([^;]+)"#', $str, $m);
        foreach ($m[1] as $k => $len) {
            if ($len != strlen($m[2][$k])) {
                $newstr = 's:' . strlen($m[2][$k]) . ':"' . $m[2][$k] . '"';
                echo "len mismatch: {$m[0][$k]}\n";
                echo "should be: $newstr\n\n";
                $str = str_replace($m[0][$k], $newstr, $str); // accumulate fixes in $str
            }
        }
        $strings[$key] = $str; // write the fully repaired string back
    }
}
I personally don't like working in PHP, or placing my DB credentials in a public file. I created a Ruby script to fix serializations that you can run locally:
https://github.com/wsizoo/wordpress-fix-serialization
Context Edit:
I approached fixing serialization by first identifying serialization via regex, and then recalculating the byte size of the contained data string.
$content_to_fix.gsub!(/s:([0-9]+):\"((.|\n)*?)\";/) {"s:#{$2.bytesize}:\"#{$2}\";"}
I then update the specified data via an escaped sql update query.
escaped_fix_content = client.escape($fixed_content)
query = client.query("UPDATE #{$table} SET #{$column} = '#{escaped_fix_content}' WHERE #{$column_identifier} LIKE '#{$column_identifier_value}'")
I would like to rename a folder with ASP.NET:
string oldFolderTitlePath = ServerPhysicalPath + oldFolderTitle + "/";
string newFolderTitlePath = ServerPhysicalPath + newFolderTitle + "/";
DirectoryInfo diPath = new DirectoryInfo(oldFolderTitlePath);
if (diPath.Exists)
{
    // Now move (rename) the folder on the server
    Directory.Move(oldFolderTitlePath, newFolderTitlePath);
}
I wonder: if the old folder contains a number of files and its size is more than 1 GB, will it take a long time to rename the folder in ASP.NET?
Thanks in advance.
Generally, no, it should not take a lot of time. You're basically changing the name of the directory, not actually moving its contents on the disk.
That said, I'd be very careful with doing what you're doing. I'm always wary of IO operations from ASP.NET, for one reason: many users could potentially be executing this code at the same time. That could lead to all sorts of problems. You need to make sure this operation is thread safe (perhaps by locking on a static object).
http://msdn.microsoft.com/en-us/library/c5kehkcz%28v=vs.71%29.aspx
http://msdn.microsoft.com/en-us/library/system.io.directory.move.aspx
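As a rough sketch of that locking idea (the lock object and method here are illustrative, not from the original answer):

using System.IO;

private static readonly object MoveLock = new object();

protected void RenameFolder(string oldPath, string newPath)
{
    lock (MoveLock) // serialize concurrent rename attempts across requests
    {
        if (Directory.Exists(oldPath) && !Directory.Exists(newPath))
        {
            Directory.Move(oldPath, newPath);
        }
    }
}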
I have a Flex application with multiple modules.
When I redeploy the application I was finding that modules (which are deployed as separate swf files) were being cached in the browser and the new versions weren't being loaded.
So I tried the age-old trick of adding ?version=xxx to all the modules when they are loaded. The value xxx is a global parameter which is actually stored in the host HTML page:
var moduleSection:ModuleLoaderSection;
moduleSection = new ModuleLoaderSection();
moduleSection.visible = false;
moduleSection.moduleName = moduleName + "?version=" + MySite.masterVersion;
In addition I needed to add ?version=xxx to the main .swf that was being loaded. Since that load is initiated from the HTML wrapper, I had to modify my AC_OETags.js file as below:
function AC_FL_RunContent() {
    var ret = AC_GetArgs(
        arguments,
        ".swf?mv=" + getMasterVersion(),
        "movie",
        "clsid:d27cdb6e-ae6d-11cf-96b8-444553540000",
        "application/x-shockwave-flash"
    );
    AC_Generateobj(ret.objAttrs, ret.params, ret.embedAttrs);
}
This is all fine and works great. I just have a hard time believing that Adobe doesn't already have a way to handle this. Given that Flex is being targeted to design modular applications for business I find it especially surprising.
What do other people do? I need to make sure my application reloads correctly even if someone has selected 'once per session' as their browser's cache checking policy.
I had a similar problem, and ended up putting the SWF files in a sub-directory named as the build number. This meant that the URL to the SWF files pointed to a different location each time.
Ideally this should be catered for by the platform, but no joy there. But this works perfectly for us, and integrates very easily into our automated builds with Hudson - no complaints so far.
Flex says:
http://www.adobe.com/livedocs/flex/2/docs/wwhelp/wwhimpl/common/html/wwhelp.htm?context=LiveDocs_Parts&file=00001388.html
What I have done is checksum the SWF file and then add that to its URL. It stays the same until the file is rebuilt/redeployed. Handled automagically by a few lines of server-side PHP script.
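The same idea sketched in C# rather than the author's server-side PHP (the method and paths are illustrative):

using System;
using System.IO;
using System.Security.Cryptography;

static string VersionedSwfUrl(string physicalPath, string virtualPath)
{
    using (MD5 md5 = MD5.Create())
    using (FileStream stream = File.OpenRead(physicalPath))
    {
        // The hash changes only when the SWF is rebuilt, so each build
        // is cached by the browser exactly once.
        string hash = BitConverter.ToString(md5.ComputeHash(stream)).Replace("-", "");
        return virtualPath + "?v=" + hash;
    }
}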
Here is a sample:
function AC_FL_RunContent() {
    var ret = AC_GetArgs(arguments, ".swf?ts=" + getTS(), "movie",
        "clsid:d27cdb6e-ae6d-11cf-96b8-444553540000",
        "application/x-shockwave-flash");
    AC_Generateobj(ret.objAttrs, ret.params, ret.embedAttrs);
}

function getTS() {
    var ts = new Date().getTime();
    return ts;
}
AC_OETags.js is a file that exists in several places under html-template. But as my posting said, I am facing another type of problem.
The caching is not done by Flash Player but by the browser, so it's out of Adobe's control. I think you have found a workable solution. If I want to avoid caching I usually append a random number on the URL.
We're having problems with an ASP.NET application which allows users to upload and crop images. The images are all scaled to fixed sizes afterwards. We basically run out of memory when a large file is processed; it seems that the handling of JPEG is rather inefficient -- we're using System.Drawing.Bitmap. Do you have any general advice, and perhaps some pointers to a more efficient image handling library? What experiences do you have?
I had the same problem; the solution was to use System.Drawing.Graphics to do the transformations and dispose of every bitmap object as soon as I was finished with it. Here's a sample from my library (resizing):
public Bitmap ApplyTo(Bitmap bitmap)
{
    // Requires System.Drawing and System.Drawing.Drawing2D.
    // The using block disposes the source bitmap as soon as we're done with it.
    using (bitmap)
    {
        Bitmap newBitmap = new Bitmap(bitmap, CalculateNewSize(bitmap));
        using (Graphics graphics = Graphics.FromImage(newBitmap))
        {
            graphics.SmoothingMode = SmoothingMode.None;
            graphics.InterpolationMode = InterpolationMode.HighQualityBicubic;
            graphics.CompositingQuality = CompositingQuality.HighQuality;
            graphics.DrawImage(bitmap, new Rectangle(0, 0, newBitmap.Width, newBitmap.Height));
        }
        return newBitmap;
    }
}
I found ImageResizer and it's great, with a good API. It works well.
Downloaded via the Visual Studio 2010 Extension Manager: http://nuget.org/.
Easy steps to get the API in VS 2010:
1. Install the NuGet extension from http://nuget.org/.
2. Find and install ImageResizing.
3. Then code (I'm using cropping here; you can use any command). Documentation is on imageresizing.net.
string uploadFolder = Server.MapPath("~/images/"); // fixed: Request.ApplicationPath + "images/" drops a slash under virtual directories
FileUpload1.SaveAs(uploadFolder + FileUpload1.FileName);

// The resizing settings can specify any of 30 commands. See http://imageresizing.net for details.
ResizeSettings resizeCropSettings = new ResizeSettings("width=200&height=200&format=jpg&crop=auto");

// Generate a filename (GUIDs are safest).
string fileName = Path.Combine(uploadFolder, System.Guid.NewGuid().ToString());

// Let the image builder add the correct extension based on the output file type (which may differ).
fileName = ImageBuilder.Current.Build(uploadFolder + FileUpload1.FileName, fileName, resizeCropSettings, false, true);
Try it! It's very awesome and easy to use. Thanks.
A couple of thoughts spring to mind:

- What size of images do you allow your users to upload, and can you impose restrictions on this?
- When you're using the System.Drawing.Bitmap class, are you remembering to dispose of it correctly? We found one of the primary causes of System.OutOfMemoryException exceptions on our shared hosting platform was users not disposing of Bitmap objects correctly.
Kev
There was an older bug in .NET where all images would default to 32 bits per pixel; at that size you can exhaust your memory pretty fast. Use the PixelFormat enumeration to make sure this is not the cause of your problem.
This link might help: http://msdn.microsoft.com/en-us/library/aa479306.aspx
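For example, when creating a working bitmap you can request a smaller pixel format explicitly (the dimensions here are placeholders; whether 24bpp suits you depends on whether you need alpha):

using System.Drawing;
using System.Drawing.Imaging;

// 24bpp RGB saves a byte per pixel versus the default 32bpp ARGB.
Bitmap working = new Bitmap(800, 600, PixelFormat.Format24bppRgb);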
Do you perhaps have a stack trace to look at?
I have also done some image editing after a user has uploaded an image. The only problems that I ran into were restrictions on file upload size on the browsers and timeouts. But nothing related to .Net's libraries.
Something else to consider: if you are processing multiple images or have some heavy looping somewhere, make sure you Flush() and Dispose() things, as in the sketch below.
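As a small illustration of that point (the folder and sizes are placeholders): dispose of each bitmap inside the loop as soon as you're done with it, rather than letting them pile up until garbage collection.

using System.Drawing;
using System.Drawing.Imaging;
using System.IO;

foreach (string path in Directory.GetFiles(uploadFolder, "*.jpg"))
{
    using (var source = new Bitmap(path))
    using (var thumb = new Bitmap(source, new Size(200, 200)))
    {
        thumb.Save(path + ".thumb.jpg", ImageFormat.Jpeg);
    } // both bitmaps are disposed here, releasing unmanaged GDI+ memory promptly
}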