Difference between GetCacheDir() and the "Clear cache" button - xamarin.forms

I need to programmatically delete all caches of my Xamarin.Forms application called MyXFApp, the same way as the 'Clear cache' button on the Settings/Apps/MyXFApp/Storage page.
I read the official Android and Xamarin.Forms documentation, used the GetCacheDir() and GetExternalCacheDir() methods to retrieve the cache directories, and deleted both of them.
After deleting both directories, I expected the cache size shown on the Settings/Apps/MyXFApp/Storage page to be 0 bytes. The displayed cache size does shrink, but it never reaches 0 bytes, and I do not understand why.
Is there another directory I must delete in order to fully clear my Xamarin.Forms cache? Or is the cache size displayed on the Settings/Apps/MyXFApp/Storage page simply wrong?
More generally, what is the difference between these two methods, and what is the correct way to use them?
I tried deleting the directories returned by GetCacheDir() and GetExternalCacheDir().
I expected to see a cache size of 0 B on the Settings/Apps/MyXFApp/Storage page, but I see 20 KB instead of 0 B.

I put together a sample and used the following code to clear the cache:
// Recursively deletes a directory (Java.IO.File) and all of its contents.
public static bool DeleteCache(File file)
{
    if (file != null && file.IsDirectory)
    {
        string[] children = file.List();
        for (int i = 0; i < children.Length; i++)
        {
            bool success = DeleteCache(new File(file, children[i]));
            if (!success)
            {
                return false;
            }
        }
    }
    // Guard against a null argument before deleting the file/empty directory itself.
    return file != null && file.Delete();
}
And call the method in the MainActivity:
var file = this.CacheDir;
var file1 = this.ExternalCacheDir;
DeleteCache(file);
DeleteCache(file1);
When I called the method, the reported cache size was still 20 KB. I then opened App info and tapped Clear cache, and the cache size showed 0 KB. But when I reloaded the app storage page (by going back and opening it again), the cache size was 20 KB again.
So the 0 KB you saw was not really 0 KB: when you reload the storage page, it shows 20 KB again. It seems this ~20 KB of cache exists for every app on the Android device.
I also tried clearing the app's cache on a real device; when the app restarted, the cache was 28 KB on a Xiaomi device. So when you use code inside your app to clear the cache, your app is still running, and the cache cannot be cleared down to 0 KB.
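If you want to verify what the code actually removed rather than trusting the Settings page, here is a minimal sketch that sums the bytes left under both cache directories (the helper name is mine, not part of the original code):
// Hypothetical helper: returns the total size in bytes of a directory tree (Java.IO.File).
long GetDirectorySize(Java.IO.File dir)
{
    long size = 0;
    if (dir == null || !dir.Exists())
        return 0;
    foreach (var name in dir.List() ?? new string[0])
    {
        var child = new Java.IO.File(dir, name);
        size += child.IsDirectory ? GetDirectorySize(child) : child.Length();
    }
    return size;
}
// After DeleteCache(...) has run:
// var remaining = GetDirectorySize(CacheDir) + GetDirectorySize(ExternalCacheDir);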

Thanks for your answers
My code is:
var cachePath = this.ApplicationContext.CacheDir.Path;
var cachePath2 = this.ApplicationContext.ExternalCacheDir.Path;
// If it exists, delete the cache directory and everything in it recursively
if (System.IO.Directory.Exists(cachePath))
    System.IO.Directory.Delete(cachePath, true);
// If it exists, delete the external cache directory and everything in it recursively
if (System.IO.Directory.Exists(cachePath2))
    System.IO.Directory.Delete(cachePath2, true);
// If it no longer exists, restore just the directory that was deleted
if (!System.IO.Directory.Exists(cachePath))
    System.IO.Directory.CreateDirectory(cachePath);
// If it no longer exists, restore just the directory that was deleted
if (!System.IO.Directory.Exists(cachePath2))
    System.IO.Directory.CreateDirectory(cachePath2);
I tried the response of @Liyun Zhang.

Related

How to enable LocalStorage in Qt 5.3

I tried the method:
QWebSettings* settings = QWebSettings::globalSettings();
settings->setAttribute(QWebSettings::LocalStorageEnabled, true);
auto path = QStandardPaths::writableLocation(QStandardPaths::GenericDataLocation);
settings->setOfflineStoragePath(path);
window.localStorage is truthy (not null or undefined), but when I insert an item into localStorage:
localStorage.setItem("b","isaac");
alert(localStorage["b"]);
An error occurs, and the error message in the WebKit inspector console is:
QuotaExceededError: DOM Exception 22: An attempt was made to add something to storage that exceeded the quota.
I struggled with this all day because it stopped working after the app was restarted.
So I think this will be helpful for someone:
QWebSettings* settings = QWebSettings::globalSettings();
settings->setAttribute(QWebSettings::LocalStorageEnabled, true);
auto path = QStandardPaths::writableLocation(QStandardPaths::GenericDataLocation);
settings->setOfflineStoragePath(path);
settings->enablePersistentStorage(path);
Note the call to enablePersistentStorage.
I had forgotten that I had enabled a very important switch:
settings->setAttribute(QWebSettings::PrivateBrowsingEnabled,true);
This puts the browser in private mode and prevents you from inserting values into localStorage, but the official API docs don't mention it.
Simply disabling the switch solves the problem:
settings->setAttribute(QWebSettings::PrivateBrowsingEnabled,false);

Understanding the JIT; slow website

First off, this question has been covered a few times (I've done my research); for example, the right side of the SO page lists related items, and I have been through them all (or as many as I could find).
When I publish my pre-compiled .NET web application, it is very slow to load the first time.
I've read up on this, it's the JIT which I understand (sort of).
The problem is, after the home page loads (up to 20 seconds), many other pages load very fast.
It would appear that they only load quickly because the resources have already been loaded (or because they share the same compiled DLLs). However, some pages still take a long time.
Does this indicate that the JIT needs to compile different pages separately? If so, taking a contact form as an example (where the Thank You page still needs to be JIT-compiled and the first hit is slow), the user may press the send button multiple times whilst waiting for the page to be shown.
After I load all these pages which use different models or different shared HTML content, the site loads quickly as expected. I assume this issue is a common problem?
Please note I'm using .NET 4.0, but there is no database, no XML files, etc. The only IO is when an email doesn't send and the error is written to a log.
So, assuming my understanding is correct, what is the approach to not have to manually go through the website and load every page?
If the above is a little too broad, then can this be resolved in the settings/configuration in Visual Studio (2012) or the web.config file (excluding adding compilation debug=false)?
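For reference, the only workaround I can think of is warming the pages up myself when the application starts; a rough sketch of what I mean (the URL list and the fire-and-forget approach are just assumptions, not something I have in place):
void Application_Start(object sender, EventArgs e)
{
    // Fire-and-forget warm-up so the JIT/page compilation cost is paid at startup
    // rather than by the first visitor. The paths below are placeholders.
    string[] pages = { "/", "/Contact", "/Contact/ThankYou" };
    System.Threading.ThreadPool.QueueUserWorkItem(_ =>
    {
        foreach (string page in pages)
        {
            try
            {
                using (var client = new System.Net.WebClient())
                {
                    client.DownloadString("http://localhost" + page); // assumes the site answers on localhost
                }
            }
            catch (System.Net.WebException)
            {
                // Ignore warm-up failures; the page will simply JIT on its first real hit.
            }
        }
    });
}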
In this case, there were two problems.
As per rene's comments, review this: http://msdn.microsoft.com/en-us/library/ms972959.aspx. The helpful part was adding the following code to the Global.asax file:
// Requires System.Diagnostics (EventLog) and System.Text (StringBuilder).
const string sourceName = ".NET Runtime";
const string serverName = ".";
const string logName = "Application";
const string uriFormat = "\r\n\r\nURI: {0}\r\n\r\n";
const string exceptionFormat = "{0}: \"{1}\"\r\n{2}\r\n\r\n";

void Application_Error(Object sender, EventArgs ea)
{
    StringBuilder message = new StringBuilder();
    if (Request != null)
    {
        message.AppendFormat(uriFormat, Request.Path);
    }
    if (Server != null)
    {
        // Walk the full exception chain so inner exceptions are logged too.
        Exception e;
        for (e = Server.GetLastError(); e != null; e = e.InnerException)
        {
            message.AppendFormat(exceptionFormat,
                e.GetType().Name,
                e.Message,
                e.StackTrace);
        }
    }
    if (!EventLog.SourceExists(sourceName))
    {
        EventLog.CreateEventSource(sourceName, logName);
    }
    EventLog log = new EventLog(logName, serverName, sourceName);
    log.WriteEntry(message.ToString(), EventLogEntryType.Error);
    //Server.ClearError(); // uncomment this to cancel the error
}
The server was maxing out while sending the email! My code was fine, but Task Manager showed it hitting 100% memory...
The solution was to monitor the errors surfaced by point 1 and fix them, and then find out why the server was being throttled when sending an email!

Please suggest a way to store a temp file in Windows Azure

I have a simple feature in an ASP.NET MVC 3 application hosted on Azure.
1st step: the user uploads a picture.
2nd step: the user crops the uploaded picture.
3rd step: the system saves the cropped picture and deletes the temp file, which is the uploaded original picture.
Here is the problem I am facing now: where to store the temp file?
I tried storing it somewhere on the Windows file system, or in LocalResources; the problem is that these resources are per instance, so there is no guarantee that the instance showing the picture to crop will be the same instance that saved the temp file.
Do you have any idea on this temp file issue?
Normally the file exists just for a short while before it is deleted.
The temp file needs to be instance-independent.
Ideally the file would have some expiry setting (for example, 1 hour) so it deletes itself in case the code crashes somewhere.
OK. So what you're after is basically something that is shared storage but expires. Amazon have just announced a rather nice setting called object expiration (https://forums.aws.amazon.com/ann.jspa?annID=1303). Nothing like this exists for Windows Azure storage yet, unfortunately, but that doesn't mean we can't come up with some other approach; indeed, we may even come up with a better (more cost-effective) approach.
You say that it needs to be instance independent, which means using a local temp drive is out of the picture. As others have said, my initial leaning would be towards Blob storage, but you will have some cleanup effort there. If you are working with large images (>1MB) or low throughput (<100rps) then I think Blob storage is the only option. If you are working with smaller images AND high throughput then the transaction costs for blob storage will start to really add up (I have a white paper coming out soon which shows some modelling of this, but some quick thoughts are below).
For a scenario with small images and high throughput, a better option might be to use the Windows Azure Cache as your temporary storage area. At first glance it will be eye-wateringly expensive on a per-GB basis (110GB/month for Cache, 12c/GB for Storage). But with Storage you pay per transaction, whereas with Cache transactions are 'free'. (Quotas are here: http://msdn.microsoft.com/en-us/library/hh697522.aspx#C_BKMK_FAQ8) This can really add up; e.g. using 100kb temp files held for 20 minutes with a system throughput of 1500rps, using Cache is about $1000 per month vs $15000 per month for storage transactions.
The Azure Cache approach is well worth considering, but to be sure it is the 'best' approach I'd really want to know:
Size of images
Throughput per hour
A bit more detail on the actual client interaction with the server during the crop process: is it an interactive process where the user pulls the image into their browser and crops visually, or is it just a simple crop?
Here is what I see as a possible approach:
1. The user uploads the picture.
2. Your code saves it to a blob and records in some data backend the relation between the user session and the uploaded image, marking it as a temp image (see the sketch after this list).
3. Display the image in the cropping user interface.
4. When the user is done cropping on the client:
4.1. retrieve the original from the blob
4.2. crop it according to the data sent from the user
4.3. delete the original from the blob and the record in the data backend used in step 2
4.4. save the final result to another blob (the final blob).
And have one background process checking for "expired" temp images in the data backend (used in step 2), deleting both the images and their records.
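As a minimal sketch of step 2, blob metadata may already be enough of a "data backend" for the session-to-image relation (the container, variable, and metadata names here are assumptions, not part of an actual implementation):
// Hypothetical step 2: upload the original and tag it so step 4.3 and the cleanup
// job can find it later. Assumes 'container' is an initialized CloudBlobContainer.
CloudBlockBlob temp = container.GetBlockBlobReference("temp/" + Guid.NewGuid() + ".jpg");
temp.Metadata["sessionId"] = sessionId;                       // relation to the user session
temp.Metadata["uploadedUtc"] = DateTime.UtcNow.ToString("o"); // lets the cleanup job expire it
temp.UploadFromStream(uploadStream);                          // metadata is stored with the blob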
Please note that even in a WebRole you still have a RoleEntryPoint descendant, and you can still override the Run method. By implementing an infinite loop in Run() (that method must never return!), you can check every N seconds (depending on your Thread.Sleep() in Run()) whether there is anything to delete.
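A rough sketch of that Run() loop, assuming the cleanup deletes temp blobs older than one hour (the container name, connection-string setting, and expiry are all assumptions):
// Requires Microsoft.WindowsAzure.ServiceRuntime and the same storage client library as the sample below.
public class WebRole : RoleEntryPoint
{
    public override void Run()
    {
        // This method must never return, otherwise the role instance is recycled.
        while (true)
        {
            try
            {
                DeleteExpiredTempBlobs();
            }
            catch (Exception)
            {
                // Swallow and retry on the next pass; a cleanup failure should not kill the role.
            }
            Thread.Sleep(TimeSpan.FromMinutes(5)); // how often to check for expired temp files
        }
    }

    private void DeleteExpiredTempBlobs()
    {
        var account = CloudStorageAccount.Parse(
            RoleEnvironment.GetConfigurationSettingValue("StorageConnectionString"));
        var container = account.CreateCloudBlobClient().GetContainerReference("tempuploads");
        foreach (var blob in container.ListBlobs().OfType<CloudBlob>())
        {
            // ListBlobs populates Properties, so LastModified is available here.
            if (blob.Properties.LastModified < DateTimeOffset.UtcNow.AddHours(-1))
                blob.DeleteIfExists();
        }
    }
}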
You can use Azure blob storage. Have a look at this tutorial.
The sample below will help you.
https://code.msdn.microsoft.com/How-to-store-temp-files-in-d33bbb10
You have two ways of handling temp files in Azure:
1. You can use the Path.GetTempPath() and Path.GetTempFileName() functions for the temp file name.
2. You can use an Azure blob to simulate it, as in the sample below.
// Maximum total size allowed for all temp blobs (100 MB).
private long TotalLimitSizeOfTempFiles = 100 * 1024 * 1024;

// 'container' is assumed to be a CloudBlobContainer field initialized elsewhere.
private async Task SaveTempFile(string fileName, long contentLength, Stream inputStream)
{
    try
    {
        // Firstly, check whether the container exists; if not, create it.
        await container.CreateIfNotExistsAsync();
        // Init a blob reference.
        CloudBlockBlob tempFileBlob = container.GetBlockBlobReference(fileName);
        // If the blob already exists, delete the old blob.
        tempFileBlob.DeleteIfExists();
        // Check whether the total size of the blobs is over the limit; if so, clear some.
        await CleanStorageIfReachLimit(contentLength);
        // And upload the new file.
        tempFileBlob.UploadFromStream(inputStream);
    }
    catch (Exception ex)
    {
        if (ex.InnerException != null)
        {
            throw ex.InnerException;
        }
        else
        {
            throw;
        }
    }
}

// Check whether the total size of the blobs is over the limit; if so, delete the oldest ones.
private async Task CleanStorageIfReachLimit(long newFileLength)
{
    List<CloudBlob> blobs = container.ListBlobs()
        .OfType<CloudBlob>()
        .OrderBy(m => m.Properties.LastModified)
        .ToList();
    // Get the total size of all blobs.
    long totalSize = blobs.Sum(m => m.Properties.Length);
    // Calculate the space actually available before this upload.
    long realLimitSize = TotalLimitSizeOfTempFiles - newFileLength;
    // Delete oldest blobs first; stop as soon as enough space has been freed.
    foreach (CloudBlob item in blobs)
    {
        if (totalSize <= realLimitSize)
        {
            break;
        }
        await item.DeleteIfExistsAsync();
        totalSize -= item.Properties.Length;
    }
}
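For context, a hypothetical call site for the helper above (the MVC action and parameter names are mine, not part of the original sample, and they assume SaveTempFile lives in the controller):
[HttpPost]
public async Task<ActionResult> Upload(HttpPostedFileBase upload)
{
    // Store the raw upload as a temp blob; CleanStorageIfReachLimit trims the oldest
    // blobs whenever the total size would exceed TotalLimitSizeOfTempFiles.
    await SaveTempFile(Guid.NewGuid() + "_" + upload.FileName, upload.ContentLength, upload.InputStream);
    return RedirectToAction("Crop");
}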

Silverlight MediaElement refusing to play audio

I am having the hardest time figuring this problem out. I have a Silverlight 4 application that loads audio and video files from URLs. The URLs are the same domain as the application is hosted on and it works great for video.
The URLs are actually ASP.NET MVC controllers that are responsible for reading the file from a shared location on the server and serving back a file stream. The URLs look something like this:
http://localhost:31479/CourseMedia?path=\omnisandbox1\ILMSShare2\Demo-Fire+Behavior\media\Disclaim.wma&encrypted=False&id=00000000-0000-0000-0000-000000000000
If I put the URL directly into the browser the file loads and plays in windows media player just fine, and if I use a separate test silverlight project to load the url it also works, but for the life of me I can not get it to work properly in my main project.
This is the routine I use to actually do the source setting:
protected void SetPlayerURL(MediaElement player, string url)
{
    if (player != null && url.Length > 0)
    {
        player.ClearValue(MediaElement.SourceProperty);
        player.Source = new Uri(this.Packet.GetMediaUrl(url, false, Guid.Empty));
    }
}
and the GetMediaURL function simply builds the URL format seen above:
public string GetMediaUrl(string path, bool encrypted, Guid key)
{
    StringBuilder builder = new StringBuilder();
    builder.AppendFormat("http://{0}/CourseMedia?path={1}&encrypted={2}&id={3}",
        this.Host,
        System.Windows.Browser.HttpUtility.UrlEncode(path),
        encrypted,
        key);
    return builder.ToString();
}
The request to the controller is never made for the media when it is audio. That seems odd to me, as this exact code works fine for video. The MediaElement state never leaves "Closed", and the CurrentStateChanged, MediaOpened, and MediaFailed events are never raised.
I am at a loss!
Try setting ScrubbingEnabled of the MediaElement to false; there were some problems with audio in Framework version 3.5, and the workaround was setting it to false. Might be worth trying.
Also try capturing BufferingStarted, BufferingEnded, MediaEnded along with your MediaFailed and MediaOpened events. I'm curious if it is a buffering issue.
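In code, the suggestion boils down to something like this (just a sketch wiring up the property and the events mentioned above against your player instance; adjust names to your project):
// 'player' is the MediaElement from SetPlayerURL above.
player.ScrubbingEnabled = false;  // the workaround suggested for the 3.5-era audio problem

player.CurrentStateChanged += (s, e) =>
    System.Diagnostics.Debug.WriteLine("State: " + player.CurrentState);
player.MediaOpened += (s, e) => System.Diagnostics.Debug.WriteLine("MediaOpened");
player.MediaEnded  += (s, e) => System.Diagnostics.Debug.WriteLine("MediaEnded");
player.MediaFailed += (s, e) =>
    System.Diagnostics.Debug.WriteLine("MediaFailed: " + e.ErrorException.Message);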

Loading Sound gives exception on Sound.id3

When loading an MP3 into a flash.media.Sound object, the id3 property gives an error:
SecurityError: Error #2000: No active security context.
Of course, like many errors in Flex, the documentation doesn't mention a thing about this, except that the error exists...
The MP3 is valid (I've checked it with Media Player and iTunes), and the Sound object is in a good state (bytesTotal and bytesLoaded both reflect the correct number of bytes).
Has anyone had this problem too? Any solutions or suggestions?
Your MP3 should be fine.
If you want to access more data about your MP3 file, rather than just play it, you will need a policy file that allows it. It's similar to loading an image: if you just add it to the display list and don't access the pixels, it's all good, but if you want to access the pixels you need permission (a crossdomain.xml).
For images, when you load the image, you can pass a LoaderContext in which you explicitly say you want to check for a crossdomain.xml file and get access to the content.
Similarly, you should create a SoundLoaderContext with the second parameter set to true (to check the policy file) and use it in the Sound.load() call.
e.g.
var snd:Sound = new Sound();
var req:URLRequest = new URLRequest("yourSound.mp3");
var context:SoundLoaderContext = new SoundLoaderContext(0, true);
snd.load(req, context);
snd.play();
For ID3 data you should listen for the ID3 event:
sound.addEventListener(Event.ID3, onID3);
function onID3(event:Event):void {
    for (var i:String in sound.id3)
        trace('prop: ' + i + ' value: ' + sound.id3[i]);
}
For more info, you might find the mp3infoutil library handy.
HTH,
George
