FLEX: getting a folder size - apache-flex

I'm trying to get a folder's size by doing:
var FolderFile:File = new File("file:///SomePath/Folder");
var FolderSize: FolderFile.size;
But this gives me a value of 0. How can I get the folder size? Is there any way to do this?
Thanks

No, there's no way to do it automagically. Getting the size of a directory is a complex and potentially painfully slow operation: there could be tens of thousands of files in a directory, or the directory could be located on a (slow?) network, not to mention tape storage and similar scenarios.
File systems themselves don't store directory size information, and the only way to know it is to calculate it file by file; there's no quick or easy shortcut. So you will have to rely on the recursive solution you posted, and, yes, it is going to be slow.

I want to know the size of the folder (like 10 MB). Sorry for the second line, I wrote it wrong; it's:
var FolderSize:Number = FolderFile.size;
I just made a new class which executes this function:
public function GetFolderSize(Source:Array):Number
{
    var TotalSizeInteger:Number = 0;
    for (var i:int = 0; i < Source.length; i++) {
        if (Source[i].isDirectory) {
            // Recurse into subdirectories; AS3 is case-sensitive, so the
            // call must match the function name exactly.
            TotalSizeInteger += this.GetFolderSize(Source[i].getDirectoryListing());
        }
        else {
            TotalSizeInteger += Source[i].size;
        }
    }
    return TotalSizeInteger;
}
In "Source" you pass the FolderFile.getDirectoryListing(), something like this:
var CC:CustomClass = new CustomClass();
var FolderSize:Number = CustomClass.GetFolderSize(FolderFile.getDirectoryListing());
But this is a very slow method. Is there a quicker, easier way to get the folder size?
Sorry for my grammar, I'm just learning English.
Thanks

Related

Google Apps Script To Copy Entire Google Drive File Structure; How To Avoid Timeouts?

My organization is switching to a Google Business account, and everyone needs to transfer their Drive files to their new accounts. Drive will not allow transfer of ownership between these accounts, so I've created a script to copy files and folders from the old account to the new account. (The old account's contents have been moved into a folder shared with the new account.)
Here's what I have so far:
function copyDrive() {
  // originFolderID and destinationID are defined elsewhere in the project.
  var originFolder = DriveApp.getFolderById(originFolderID);
  var destination = DriveApp.getFolderById(destinationID);
  copyFiles(originFolder, destination);
}
function copyFiles(passedFolder, targetFolder) {
  // Copy every file in this folder, then recurse into its subfolders.
  var fileContents = passedFolder.getFiles();
  var file;
  var fileName;
  while (fileContents.hasNext()) {
    file = fileContents.next();
    fileName = file.getName();
    file.makeCopy(fileName, targetFolder);
  }
  copySubFolders(passedFolder, targetFolder);
}
function copySubFolders(passedFolder, targetFolder) {
  // Recreate each subfolder in the target, then copy its contents.
  var folderContents = passedFolder.getFolders();
  var folder;
  var folderName;
  while (folderContents.hasNext()) {
    folder = folderContents.next();
    folderName = folder.getName();
    var subFolderCopy = targetFolder.createFolder(folderName);
    copyFiles(folder, subFolderCopy);
  }
}
Please pardon any inelegance; I am new at this. The script actually works great and preserves the folder structure, but it times out after copying ~150 files and folders. I've been looking into how to use continuation tokens, and I've read this post closely. I think I'm stuck on a conceptual level, because I'm not sure how the continuation tokens will interact with the recursive functions I've set up. It seems like I will end up with a stack of copySubFolders calls, each needing its own continuation token. Of course, they all use the same variable name for their iterators, so I really have no idea how to set that up.
Any thoughts? Sorry for posting such a helpless newbie question; I hope it will at least be an interesting problem for someone.
I think I have solved the conceptual problem, though I am getting
We're sorry, a server error occurred. Please wait a bit and try again. (line 9, file "Code")
when I try to execute it.
Basically, I set it up to only try to copy one top-level folder at a time, and for each one of those it uses the recursive functions I had before. It should save continuation tokens for that first level of folders and any files in the root folder so it can pick up in the next execution where it left off. This way, the tokens are not involved in my recursive stack of functions.
function copyDrive() {
  var originFolder = DriveApp.getFolderById(originFolderID);
  var destination = DriveApp.getFolderById(destinationID);
  var scriptProperties = PropertiesService.getScriptProperties();
  var fileContinuationToken = scriptProperties.getProperty('FILE_CONTINUATION_TOKEN');
  var fileIterator = fileContinuationToken == null ?
      originFolder.getFiles() : DriveApp.continueFileIterator(fileContinuationToken);
  var folderContinuationToken = scriptProperties.getProperty('FOLDER_CONTINUATION_TOKEN');
  var folderIterator = folderContinuationToken == null ?
      originFolder.getFolders() : DriveApp.continueFolderIterator(folderContinuationToken);
  try {
    // Copy any files sitting in the root of the origin folder first.
    var rootFileName;
    while (fileIterator.hasNext()) {
      var rootFile = fileIterator.next();
      rootFileName = rootFile.getName();
      rootFile.makeCopy(rootFileName, destination);
    }
    // Copy one top-level folder per execution; note that Folder has no
    // makeCopy() method, so create the folder and let the recursive
    // helpers below fill it.
    var folder = folderIterator.next();
    var folderName = folder.getName();
    var folderCopy = destination.createFolder(folderName);
    copyFiles(folder, folderCopy);
  } catch (err) {
    Logger.log(err);
  }
  if (fileIterator.hasNext()) {
    scriptProperties.setProperty('FILE_CONTINUATION_TOKEN', fileIterator.getContinuationToken());
  } else {
    scriptProperties.deleteProperty('FILE_CONTINUATION_TOKEN');
  }
  if (folderIterator.hasNext()) {
    scriptProperties.setProperty('FOLDER_CONTINUATION_TOKEN', folderIterator.getContinuationToken());
  } else {
    scriptProperties.deleteProperty('FOLDER_CONTINUATION_TOKEN');
  }
}
function copyFiles(passedFolder, targetFolder) {
  var fileContents = passedFolder.getFiles();
  var file;
  var fileName;
  while (fileContents.hasNext()) {
    file = fileContents.next();
    fileName = file.getName();
    file.makeCopy(fileName, targetFolder);
  }
  copySubFolders(passedFolder, targetFolder);
}
function copySubFolders(passedFolder, targetFolder) {
  var subFolderContents = passedFolder.getFolders();
  var subFolder;
  var subFolderName;
  while (subFolderContents.hasNext()) {
    subFolder = subFolderContents.next();
    subFolderName = subFolder.getName();
    var subFolderCopy = targetFolder.createFolder(subFolderName);
    copyFiles(subFolder, subFolderCopy);
  }
}
I know you would like an easy, programmatic way to do this, but it may be easiest to install Google Drive for Desktop and have everyone right-click, copy, and paste.
The idea:
Create a single folder in which the user puts every item of their Drive. (I see you have already done that.)
Share that folder with their new account. (I see you have already done that, as well.)
Sign into their new account with Drive for Desktop.
Copy the folder in Drive for Desktop and paste it right back in. Ownership gets transferred to the new account.
Just a thought.
You're going to need to store an array of folder iterators and file iterators, since each folder could have a nested array of folders. If you reuse the same folder iterator as in the accepted solution, you won't be able to resume on more top-level folders.
Take a look at my answer here for a template that you can use to recursively iterate over all the files in a drive with resume functionality built-in.
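For illustration, here's a minimal sketch of that idea in Apps Script: persist one continuation token per folder you're still working on, under a script property (FOLDER_TOKENS is just a name I've made up for this example).
// Sketch only: save/restore a stack of folder continuation tokens so a
// later execution can resume a depth-first traversal where it left off.
function saveFolderStack(folderIterators) {
  var tokens = folderIterators.map(function (it) {
    return it.getContinuationToken();
  });
  PropertiesService.getScriptProperties()
      .setProperty('FOLDER_TOKENS', JSON.stringify(tokens));
}
function loadFolderStack() {
  var raw = PropertiesService.getScriptProperties().getProperty('FOLDER_TOKENS');
  if (raw == null) return [];
  // Rebuild one iterator per saved token, deepest folder last.
  return JSON.parse(raw).map(function (token) {
    return DriveApp.continueFolderIterator(token);
  });
}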

Get size of Site directory in Orchard

crosspost: https://orchard.codeplex.com/discussions/456226
In Orchard, each site (whether or not you enable multitenancy) seems to have its own folder within Media (the main file folder for Orchard). I want to get the total file size of the current site (ergo, the folder under Media).
I've dug into the framework and got to FileSystemStorageProvider, which seems promising with its FileSystemStorageFolder class and GetSize() method.
However, I was wondering if anyone else has checked this out before I go experimenting with that class.
Any piece of information or advice would be highly appreciated. Thanks!
I didn't really find an easy way to do it, but copied mostly from the Orchard framework. You will need the following:
private FileSystemStorageProvider _filesystemProvider;
private ShellSettings _settings;
private string storagePath; // the tenant's media folder, set below
And then you need to define the Site Storage Path:
var mediaPath = HostingEnvironment.IsHosted
    ? HostingEnvironment.MapPath("~/Media/") ?? ""
    : Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "Media");
storagePath = Path.Combine(mediaPath, _settings.Name);
Finally, here is my function to compute the storage for a specific folder (in this case, the tenant's/site's root media folder):
public double GetSiteStorage()
{
    var folders = _filesystemProvider.ListFolders(storagePath);
    long totalSize = 0;
    foreach (var folder in folders)
    {
        totalSize += folder.GetSize();
    }
    // Divide as doubles so fractional megabytes aren't truncated
    // by integer division.
    return totalSize / 1024d / 1024d;
}
This returns a double for the MB used. Hope this helps someone :)
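If you'd rather not go through the storage provider at all, the same total can be computed with plain System.IO; here's a minimal sketch (not Orchard-specific, and GetFolderSizeMb is a name I made up):
using System.IO;
using System.Linq;
// Sketch: recursively sum file sizes with plain System.IO, similar in
// spirit to what FileSystemStorageFolder.GetSize() does for one folder.
static double GetFolderSizeMb(string path)
{
    long totalBytes = new DirectoryInfo(path)
        .EnumerateFiles("*", SearchOption.AllDirectories)
        .Sum(f => f.Length);
    return totalBytes / 1024d / 1024d;
}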

Adobe Flex ActionScript prevent model tainting

I have some values stored in my model. I need to create a copy of those values, make some changes, and then output those changes without affecting the model values.
var my_source:Array = model.something.source;
var output:Array = new Array();
for each (var vo:my_vo in my_source) {
    if (vo.id == 1) {
        vo.name = 'Foo';
        output.push(vo);
    }
    else if (vo.id == 21) {
        vo.name = 'Bar';
        output.push(vo);
    }
}
return output;
So, this works fine, except that any changes made when looping through my_source also seem to affect model.something. Why do changes to the my_source array affect the model? How do I prevent this from happening?
I've mentioned how to do this on my blog, but the short answer is: use ObjectUtil.copy(). What you're trying to do isn't copying, since Flash uses reference-based objects; you're only copying the reference to the other array. By using ObjectUtil.copy(), you're doing what's called a 'deep copy', which actually recreates the object in a new memory location.
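For example, here's a minimal sketch of that approach, reusing my_source and output from the question. Note the registerClassAlias() call, which I'm assuming is needed so the deep copy comes back as a typed my_vo rather than a plain Object:
import flash.net.registerClassAlias;
import mx.utils.ObjectUtil;
// Register the class once (e.g. at startup) so ObjectUtil.copy() can
// restore a typed my_vo; the alias string just has to be unique.
registerClassAlias("my_vo", my_vo);
for each (var vo:my_vo in my_source) {
    // Deep copy: edits to `copy` no longer touch the model's instance.
    var copy:my_vo = ObjectUtil.copy(vo) as my_vo;
    if (copy.id == 1) {
        copy.name = 'Foo';
        output.push(copy);
    }
}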
You are dealing with references to data, not copies of data. This is how ActionScript 3 (and many other languages) works.
When you create the my_source variable, you are creating a reference to model.something.source, which also includes all of the references to your model objects. Further, when you loop through the my_vo objects, you are also getting a reference to these objects. This means that if you make changes to the object in this loop, you are making changes to the objects in the model.
How do you fix this? Inside your loop, you will need to make a copy of your object. I don't know what my_vo looks like, but if you have any other objects in that object tree, they would be references as well, which would probably require a "deep copy" to achieve what you want.
The easiest way (but usually not the most efficient way) to achieve a "deep copy" is to serialize and de-serialize. One way to achieve this:
import flash.utils.ByteArray;
function deepCopy(source:Object):* {
    // Round-trip through a ByteArray (AMF serialization) to clone the
    // whole object graph, not just the top-level reference.
    var serializer:ByteArray = new ByteArray();
    serializer.writeObject(source);
    serializer.position = 0;
    return serializer.readObject();
}
Then, in your loop, you can make your copy of the data:
for each (var vo:my_vo in my_source) {
    // For the result to come back as a typed my_vo rather than a plain
    // Object, register the class first with registerClassAlias().
    var copy:my_vo = deepCopy(vo);
    // act on copy instead of vo
}

Finding duration of a video using directshowlib-2005

My ASP.NET (C#) method looks as follows:
static public bool GetVideoLength(string fileName, out double length)
{
    DirectShowLib.FilterGraph graphFilter = new DirectShowLib.FilterGraph();
    DirectShowLib.IGraphBuilder graphBuilder;
    DirectShowLib.IMediaPosition mediaPos;
    length = 0.0;
    try
    {
        graphBuilder = (DirectShowLib.IGraphBuilder)graphFilter;
        graphBuilder.RenderFile(fileName, null);
        mediaPos = (DirectShowLib.IMediaPosition)graphBuilder;
        mediaPos.get_Duration(out length);
        return true;
    }
    catch
    {
        return false;
    }
    finally
    {
        mediaPos = null;
        graphBuilder = null;
        graphFilter = null;
    }
}
I get the duration with the above method, but my problem is that I can't delete the physical file afterwards. I used
File.Delete(FilePath);
While performing this action I got the following exception:
"The process cannot access the file because it is being used by another process."
My OS is Windows 7 (IIS 7).
Can anyone please help me sort this out?
I've got no experience coding DirectShow apps in C#, but plenty of experience in C++.
DirectShow is based on a technology called COM, which uses reference counting to tell it when an object is in use.
It would use a COM object to represent the IGraphBuilder, for example.
In C++, we would have to deconstruct the graph by removing all its filters, then release the graph.
I understand that C# has its own garbage collection etc., but unless you explicitly release the objects you use, they'll remain in memory.
It seems from the code you've quoted that the graph is still open, even though playback may have finished. In that case, it'll hold a reference to the file you've played back, which would explain why you can't delete it: there's a read lock on the file.
Hope this points you in the right direction!
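I don't know DirectShowLib's exact API surface, but in C# the usual way to force that release is Marshal.ReleaseComObject() in the finally block, rather than just nulling the variables. Here is a sketch of the posted method along those lines (untested, so treat it as a starting point):
using System.Runtime.InteropServices;
static public bool GetVideoLength(string fileName, out double length)
{
    length = 0.0;
    var graphFilter = new DirectShowLib.FilterGraph();
    try
    {
        var graphBuilder = (DirectShowLib.IGraphBuilder)graphFilter;
        graphBuilder.RenderFile(fileName, null);
        var mediaPos = (DirectShowLib.IMediaPosition)graphBuilder;
        mediaPos.get_Duration(out length);
        return true;
    }
    catch
    {
        return false;
    }
    finally
    {
        // Release the COM object so its read lock on the file is dropped
        // now, rather than whenever the garbage collector finalizes it.
        Marshal.ReleaseComObject(graphFilter);
    }
}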

Flex 3 & Air 2: Automatically detect when files on a directory are updated

A crazy idea just dropped from the sky and hit me in the head xD. I was wondering if it is possible to make an app capable of listening for when the user adds new files to a directory.
Example:
The user opens up our application.
The user adds new files to the desktop (using Windows Explorer).
Our application automatically detects that new files have been added and executes a function or whatever.
Sounds interesting, right?
Maybe this could be done using a programming language like Visual Basic: open the executable with the NativeProcess API and listen for a stdout event... (:
Anyone got an idea to share with us? :)
Thanks
Lombardi
AIR can handle this natively...
The FileSystemList class fires a directoryChange event whenever a file in the watched directory changes.
You can even use it to watch for drives being mounted (I think Christian Cantrell showed that one off).
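A minimal sketch of what that could look like, assuming the Flex 3 AIR FileSystemList control and its directoryChange event behave as described above:
<?xml version="1.0" encoding="utf-8"?>
<!-- Sketch: watch the desktop with the Flex 3 AIR FileSystemList control. -->
<mx:WindowedApplication xmlns:mx="http://www.adobe.com/2006/mxml">
    <mx:Script>
        <![CDATA[
            import flash.filesystem.File;
            import mx.events.FileEvent;
            private function onDirectoryChange(event:FileEvent):void {
                // React here; event.file is the directory being watched.
                trace("Directory changed:", event.file.nativePath);
            }
        ]]>
    </mx:Script>
    <mx:FileSystemList id="watcher"
        directory="{File.desktopDirectory}"
        directoryChange="onDirectoryChange(event)"/>
</mx:WindowedApplication>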
Ok, I think I'm getting closer, check out this solution! :)
private var CheckDelay:Timer = new Timer(5000, 0);
private function InitApp():void
{
    CheckDelay.addEventListener(TimerEvent.TIMER, CheckForNewFiles, false, 0, true);
    CheckDelay.start();
}
private function CheckForNewFiles(event:TimerEvent):void
{
    // Write a batch file that dumps the desktop's file list to FileList.txt.
    var FS:FileStream = new FileStream();
    var Buffer:File = File.applicationStorageDirectory.resolvePath("FilesBuffer.cmd");
    FS.open(Buffer, FileMode.WRITE);
    FS.writeUTFBytes("cd " + File.desktopDirectory.nativePath + "\r\n" +
        "dir /on /b > " + File.applicationStorageDirectory.resolvePath("FileList.txt").nativePath);
    FS.close();
    var Process:NativeProcess = new NativeProcess();
    var NPI:NativeProcessStartupInfo = new NativeProcessStartupInfo(); // What a large name! xD
    NPI.executable = Buffer;
    Process.addEventListener(NativeProcessExitEvent.EXIT, ReadFileList, false, 0, true);
    Process.start(NPI);
}
private function ReadFileList(event:Event):void
{
    // Read the file list the batch file produced.
    var FS:FileStream = new FileStream();
    var Buffer:File = File.applicationStorageDirectory.resolvePath("FileList.txt");
    FS.open(Buffer, FileMode.READ);
    var FileData:String = FS.readUTFBytes(FS.bytesAvailable);
    FS.close();
    var FileArray:Array = FileData.split("\r\n");
    var TempArray:ArrayCollection = new ArrayCollection();
    var TempFile:File;
    for (var i:int = 0; i < FileArray.length; i++) {
        // dir /b emits bare file names, so resolve them against the desktop.
        TempFile = File.desktopDirectory.resolvePath(FileArray[i]);
        TempArray.addItem(TempFile);
    }
}
At the end we've got an array (TempArray) that we could use in a DataGrid (for example) with columns like "Extension", "File Name", "File Path", etc.
The file list is updated every 5 seconds.
And why use all that code instead of a simple File.getDirectoryListing()? Because we are polling every 5 seconds; with getDirectoryListing(), our application would use much more RAM, and the cmd command is much faster... :)
If you have a better idea, please share it with us! Thank you! :D
One excellent solution for Windows: use Visual Studio to build the .NET FileSystemWatcher app described here: http://msdn.microsoft.com/en-us/library/system.io.filesystemwatcher.aspx
In Adobe AIR, use the NativeProcess API to listen for the change events dispatched by .NET.
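For reference, here's a minimal sketch of such a watcher (my own example, not code from the MSDN page): it prints one line per file event to stdout, which the AIR side can read through NativeProcess standardOutput.
using System;
using System.IO;
// Sketch: a tiny console app built around System.IO.FileSystemWatcher.
// An AIR app that launches it via NativeProcess can parse these lines.
class Watcher
{
    static void Main()
    {
        var watcher = new FileSystemWatcher(
            Environment.GetFolderPath(Environment.SpecialFolder.Desktop));
        watcher.Created += (s, e) => Console.WriteLine("CREATED\t" + e.FullPath);
        watcher.Deleted += (s, e) => Console.WriteLine("DELETED\t" + e.FullPath);
        watcher.Changed += (s, e) => Console.WriteLine("CHANGED\t" + e.FullPath);
        watcher.EnableRaisingEvents = true;
        Console.ReadLine(); // keep the process alive until stdin closes
    }
}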
