Currently I'm working on a script that will go through a given folder and search for all files with a specific extension. It will then print out each file's name and sum the file sizes. I believe I have most of the issues sorted out, so this question isn't about how to do that.
Instead, I want to know the best practice for using FileSystemObject instances in a recursive function. Should I use a single instance for all calls (either global, or passed in), or should I create a new one for each recursive step?
For extra fun, I'm planning on having this access multiple PCs over UNC paths. And yes, I expect there's a better way of doing this, but I'm relatively new to VBS.
Current code:
'Recursive function: handles the search for files in a folder and its subfolders.
Function UNCSearchFolder(strUNCRootPath, strUNCNextFolder)
    Dim objFSOUNCSearchFolder, objFSOUNCSearchFolder2, colFolderList, objFolder, strCurrentFolder, strSubFolder
    'Get list of subfolders in folder: <Rootpath>\<Nextfolder>
    'No trailing backslash, so recursive calls don't end up doubling it.
    strCurrentFolder = strUNCRootPath & "\" & strUNCNextFolder
    Set objFSOUNCSearchFolder = CreateObject("Scripting.FileSystemObject")
    Set objFSOUNCSearchFolder2 = objFSOUNCSearchFolder.GetFolder(strCurrentFolder)
    Set colFolderList = objFSOUNCSearchFolder2.SubFolders
    'Subfolder dive
    For Each objFolder In colFolderList
        strSubFolder = objFolder.Name
        'REMOVE THIS ECHO LATER
        WScript.Echo strSubFolder
        'Called as a statement: no parentheses around the two arguments,
        'otherwise VBScript raises "Cannot use parentheses when calling a Sub".
        UNCSearchFolder strCurrentFolder, strSubFolder
    Next
    'Search for files here
    'Release the FSO references
    Set objFSOUNCSearchFolder2 = Nothing
    Set objFSOUNCSearchFolder = Nothing
End Function
So, should one FSO instance be used for all accesses, or should each step use its own? Is it a moot point? Will this cause multiple connections to each system, or should it only use one? Basically, I want the script to work without disrupting users or causing weird behavior (e.g., running out of active connections). The script will only be used a couple of times for an audit we're doing, but it may eventually be repurposed for future audits.
Let me know what you think. Thanks for any help.
If you create the FSO reference inside your function, then each recursive call will create a new FSO object. A single FSO object (either global or passed in) is quite enough; at least, I don't know of any benefit to using multiple FSO instances.
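For illustration, a minimal sketch of the passed-in approach (WalkFolder and the UNC path are placeholders, not names from the question's code):
Dim oFSO
Set oFSO = CreateObject("Scripting.FileSystemObject")
WalkFolder oFSO, "\\server\share\rootfolder"

Sub WalkFolder(oFSO, sPath)
    Dim oFolder, oSub
    Set oFolder = oFSO.GetFolder(sPath)
    '...process oFolder.Files here...
    For Each oSub In oFolder.SubFolders
        'Reuse the same FSO instance at every level of the recursion.
        WalkFolder oFSO, oSub.Path
    Next
End Sub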
[EDIT] I appreciate @AnsgarWiechers' comment, and to make the code ready for reuse while keeping the FSO out of the function, we can wrap our function in a class.
With New FileInfo
    WScript.Echo .FileSize("C:\temp", "txt", True)
End With

Class FileInfo
    Private m_oFSO

    'Returns the total size (in bytes) of all files with the given
    'extension under sRootDir; descends into subfolders if bRecursive.
    Public Function FileSize(sRootDir, sExtension, bRecursive)
        Dim oFolder, oFile, sFExt
        sFExt = LCase(sExtension)
        Set oFolder = m_oFSO.GetFolder(sRootDir)
        For Each oFile In oFolder.Files
            If LCase(m_oFSO.GetExtensionName(oFile.Name)) = sFExt Then
                FileSize = FileSize + oFile.Size
            End If
        Next
        If bRecursive Then
            Dim oSubFolder
            For Each oSubFolder In oFolder.SubFolders
                FileSize = FileSize + FileSize(oSubFolder.Path, sExtension, True)
            Next
        End If
    End Function

    Private Sub Class_Initialize
        Set m_oFSO = CreateObject("Scripting.FileSystemObject")
    End Sub

    Private Sub Class_Terminate
        Set m_oFSO = Nothing
    End Sub
End Class
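Note that Class_Initialize runs once per object, so all recursive FileSize calls share the single m_oFSO instance: the FSO is created and released exactly once no matter how deep the folder tree goes.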
In VB6, I am using a WebBrowser control on a form to display all files and subfolders as icons, in the style of Windows Explorer. On the form, I display a count of all files in the main folder and all subfolders. This is my code so far, though I haven't been able to find any information about counting files in subfolders.
Private Sub Load_Form()
    Dim myPathName As String
    Dim iTotalCount As Long
    '... some stuff here ...
    WebBrowser1.Navigate myPathName
    WebBrowser1.Document.CurrentViewMode = 5 'medium icons
    iTotalCount = FileCount(myPathName)
    lblLabel.Caption = "Total Files = " & iTotalCount
    '... more stuff here ...
End Sub

Public Function FileCount(myPathName As String) As Long
    Dim FSO As New FileSystemObject
    Dim fld As Folder
    If FSO.FolderExists(myPathName) Then
        Set fld = FSO.GetFolder(myPathName)
        FileCount = fld.Files.Count
    End If
End Function
This Stack Overflow question is similar, though it's for VB.NET (which I don't know). I'd appreciate someone pointing me in the right direction. Many thanks.
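For reference, a recursive variant of FileCount might look like the untested sketch below. FileCountRecursive is a hypothetical name, and it is written late-bound (plain Dim and CreateObject) so the same code runs in both VB6 and VBScript:
Public Function FileCountRecursive(myPathName)
    Dim FSO, fld, subFld
    Set FSO = CreateObject("Scripting.FileSystemObject")
    FileCountRecursive = 0
    If FSO.FolderExists(myPathName) Then
        Set fld = FSO.GetFolder(myPathName)
        'Count the files in this folder...
        FileCountRecursive = fld.Files.Count
        '...then add each subfolder's total, recursively.
        For Each subFld In fld.SubFolders
            FileCountRecursive = FileCountRecursive + FileCountRecursive(subFld.Path)
        Next
    End If
End Function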
I am very new to VBScript, and I am trying to write a simple script that extracts a zip file from one directory into another directory. So far this is what I have (and it works well):
'USER VAR REPRESENTS WINDOWS USERNAME
Set oShell = CreateObject("WScript.Shell")
user = oShell.ExpandEnvironmentStrings("%UserName%")
'FOLDER TO BE EXTRACTED
ZipFile = "C:\Users\" & user & "\Downloads\Test.zip"
'LOCATION TO EXTRACT FILES
ExtractTo = "C:\Users\" & user & "\Desktop"
'EXTRACT ZIP FILE
Set objShell = CreateObject("Shell.Application")
Set FilesInZip = objShell.NameSpace(ZipFile).Items
objShell.NameSpace(ExtractTo).CopyHere FilesInZip
Set objShell = Nothing
Set oShell = Nothing
Now, if possible: if the "Desktop" folder or the "Test.zip" file cannot be found, I would like to search the C: drive for them and then proceed with extracting, etc. I have seen some examples, but I cannot understand how to replicate them. How can I search the entire C: drive and its subfolders for these files?
Help would be appreciated, thanks in advance!
In general a recursive search can be done like this:
Function SearchFolder(fldr, name)
    Set SearchFolder = Nothing  'default return value: not found
    'Check the files in the current folder first.
    For Each f In fldr.Files
        If LCase(f.Name) = LCase(name) Then
            Set SearchFolder = f
            Exit Function
        End If
    Next
    'Not here: recurse into each subfolder until a match turns up.
    For Each sf In fldr.SubFolders
        Set result = SearchFolder(sf, name)
        If Not result Is Nothing Then
            Set SearchFolder = result
            Exit Function
        End If
    Next
End Function

Set fso = CreateObject("Scripting.FileSystemObject")
Set f = SearchFolder(fso.GetFolder("C:\"), "Test.zip")
However, searching a whole drive that way will take quite some time. Also, there are several folders that users don't have access to, so you'll have to account for that if you want to implement a search like this (see the sketch below).
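One way to account for it is to swallow the access errors and skip whatever can't be read. A sketch under that assumption (SearchFolderSafe is a hypothetical name; On Error Resume Next is scoped to each procedure call, so every recursion level handles its own failures):
Function SearchFolderSafe(fldr, name)
    Dim f, sf, result
    Set SearchFolderSafe = Nothing
    On Error Resume Next  '"Permission denied" now skips a folder instead of aborting
    For Each f In fldr.Files
        If LCase(f.Name) = LCase(name) Then
            Set SearchFolderSafe = f
            Exit Function
        End If
    Next
    For Each sf In fldr.SubFolders
        Set result = SearchFolderSafe(sf, name)
        If Not result Is Nothing Then
            Set SearchFolderSafe = result
            Exit Function
        End If
    Next
End Function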
I've got various web apps (containing WCF services) in IIS under the default website. As long as they are all running in the same app pool, they can access a shared isolated storage file with no problem.
However, once I move them to different app pools, I get "System.IO.IsolatedStorage.IsolatedStorageException: Unable to create mutex" when one tries to access a file created by another. They are all running under the NetworkService user. I tried GetUserStoreForAssembly and GetMachineStoreForAssembly, both with the same result. Any ideas why they can't use a shared file?
I made sure to close the stream, and even dispose of it in case one was holding onto it, but I am running a simple test where one service writes the file and another tries to read it later, and it always fails.
Also, I am accessing the isolated store from a signed assembly.
Does anybody have any ideas?
Here is the code:
Private Sub LoadData()
    Dim filename = FullFilePath(_fileName)
    Dim isoStorage As IsolatedStorageFile = IsolatedStorageFile.GetUserStoreForAssembly()
    ' Tried GetMachineStoreForAssembly, same failure
    isoStorage.CreateDirectory(ROOT_DIRECTORY)
    If (isoStorage.GetFileNames(filename).Length = 0) Then
        Return
    End If
    Dim stream As Stream = New IsolatedStorageFileStream(filename, FileMode.OpenOrCreate, isoStorage)
    If stream IsNot Nothing Then
        Try
            Dim formatter As IFormatter = New BinaryFormatter()
            Dim appData As Hashtable = DirectCast(formatter.Deserialize(stream), Hashtable)
            Dim enumerator As IDictionaryEnumerator = appData.GetEnumerator()
            While enumerator.MoveNext()
                Me(enumerator.Key) = enumerator.Value
            End While
        Finally
            stream.Close()
            stream.Dispose()
            stream = Nothing
        End Try
    End If
End Sub

Public Sub Save()
    Dim filename = FullFilePath(_fileName)
    ' Open the stream from the IsolatedStorage.
    Dim isoFile As IsolatedStorageFile = IsolatedStorageFile.GetUserStoreForAssembly()
    ' Tried GetMachineStoreForAssembly, same failure
    Dim stream As Stream = New IsolatedStorageFileStream(filename, FileMode.Create, isoFile)
    If stream IsNot Nothing Then
        Try
            Dim formatter As IFormatter = New BinaryFormatter()
            formatter.Serialize(stream, DirectCast(Me, Hashtable))
        Finally
            stream.Close()
            stream.Dispose()
            stream = Nothing
        End Try
    End If
End Sub
Looks like it was a trust issue.
After adding the assembly that accesses the isolated storage file to the GAC, it magically worked, since everything in the GAC is automatically granted full trust.
This works for me, but putting assemblies in the GAC might not always be an option in other solutions. Check out the .NET Framework caspol utility if that is the case.
Hope this helps somebody! It was a huge PITA for me.
I'm trying to use a loop to check whether images exist; however, it always returns false. I am sure I am doing something simple and stupid, but here is the code:
dim fs, sql_except
set fs = Server.CreateObject("Scripting.FileSystemObject")
if Not rs.eof then
    arrRS = rs.GetRows(30, 0)
    set rs = nothing
    If IsArray(arrRS) Then
        For i = LBound(arrRS, 2) to UBound(arrRS, 2)
            sku = arrRS(0, i)
            if (fs.FileExists("../i/"&sku&".gif")=false) Then
                response.write sku&"does not exist<br>"
            end if
        next
    end if
    erase arrRS
end if
set fs = nothing
You appear to be operating under the impression that the current folder context assumed by your call to FileExists will be the physical folder containing the ASP script being executed. This is not so; it will most likely be "C:\windows\system32\inetsrv". You are also using the URL path separator / where FileExists expects the Windows physical path separator \.
You need to use Server.MapPath to resolve the path. This may work:
if Not fs.FileExists(Server.MapPath("../i/"&sku&".gif")) then
However, you may run into trouble with the parent path ".."; it may not be allowed for security reasons. This might be a better approach:
Dim path : path = Server.MapPath("/parentFolder/i") & "\"
For i = LBound(arrRS, 2) to UBound(arrRS, 2)
    sku = arrRS(0, i)
    if Not fs.FileExists(path & sku & ".gif") Then
        response.write Server.HTMLEncode(sku) & " does not exist<br>"
    end if
next
Where "parentFolder" is the absolute path from the site root.
This is a method in Classic ASP that saves a file to disk. It takes a very long time, but I'm not sure why. Normally I wouldn't mind so much, but the files it handles are pretty large, so it needs to save faster than the roughly 100 kB per second it manages now. Seriously slow. (It's an old legacy system; this is a band-aid fix until it gets replaced...)
Public Sub SaveToDisk(sPath)
    Dim oFS, oFile
    Dim nIndex
    If sPath = "" Or FileName = "" Then Exit Sub
    If Mid(sPath, Len(sPath)) <> "\" Then sPath = sPath & "\"
    Set oFS = Server.CreateObject("Scripting.FileSystemObject")
    If Not oFS.FolderExists(sPath) Then Exit Sub
    Set oFile = oFS.CreateTextFile(sPath & FileName, True)
    For nIndex = 1 To LenB(FileData)
        oFile.Write Chr(AscB(MidB(FileData, nIndex, 1)))
    Next
    oFile.Close
End Sub
I'm asking because there are plenty of WTFs in this code, so I'm fighting those fires while getting some help with this one.
I don't see the definition of FileData anywhere in your code; where is it coming from? Is there a reason you're writing it to disk a single character at a time? I'd suspect this is your problem: writing 100K of data takes 100K trips through this loop, which could be the reason for your slowdown. Why can't you replace the write loop at the bottom:
For nIndex = 1 to LenB(FileData)
    oFile.Write Chr(AscB(MidB(FileData,nIndex,1)))
Next
with a single statement to write the file all at once?
oFile.Write FileData
What you should do is read the binary request into an ADODB.Stream object and convert it to plain ASCII text in a single fast step.
Set objStream = Server.CreateObject("ADODB.Stream")
objStream.Type = 1                  'adTypeBinary
objStream.Open
objStream.Write Request.BinaryRead(Request.TotalBytes)
objStream.Position = 0              'rewind before switching modes
objStream.Type = 2                  'adTypeText
objStream.Charset = "ISO-8859-1"    'single-byte charset: one character per byte
FormData = objStream.ReadText
objStream.Close
Set objStream = Nothing
Notice how the variable FormData now contains the form data as text. You then parse this text to locate the start and length of each file, and use the ADODB.Stream CopyTo method to extract that specific portion of the data and save it to disk.
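A sketch of that last step, under these assumptions: RequestBytes holds the bytes kept from the single Request.BinaryRead call (it can only be called once per request), and nFileStart/nFileLength are hypothetical offsets found by parsing FormData for the multipart boundaries:
'Hypothetical continuation: RequestBytes, nFileStart, and nFileLength
'are assumed to exist as described above.
Dim objBinary, objTarget
Set objBinary = Server.CreateObject("ADODB.Stream")
objBinary.Type = 1                  'adTypeBinary
objBinary.Open
objBinary.Write RequestBytes
objBinary.Position = nFileStart     'seek to the first byte of the file part

Set objTarget = Server.CreateObject("ADODB.Stream")
objTarget.Type = 1                  'adTypeBinary
objTarget.Open
objBinary.CopyTo objTarget, nFileLength   'copy only the file's bytes

objTarget.SaveToFile sPath & FileName, 2  '2 = adSaveCreateOverWrite
objTarget.Close
objBinary.Close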