I have an application that downloads files from within a gridview. When a single file is attached to a row I can easily download the files but when I try to have multiple files within a row my application can no longer download. Could anybody possibly suggest a way I could accomplish this?
Thanks in advance
When the user clicks to download a single file, you can issue this JavaScript command:
window.location = "FileForDownload.jpg";
If you want to give the user the ability to download more than one file, it is more complicated.
First, you need to know that the browser will flag that action and block it until the user accepts the multiple downloads.
You might think you can simply send many files to window.location one after another, but without synchronization one download can cancel the other. The synchronization can be done with a cookie trick: issue the command for the second download only after the first one has started (or finished).
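For reference, here is a minimal sketch of that cookie trick. The server-side part is an assumption: the download handler must echo the downloadToken query value back as a cookie as soon as it starts sending the file, and all names used here are made up.
function downloadSequentially(files) {
    if (files.length === 0) return;

    var token = new Date().getTime();                  // unique token for this download
    var file = files.shift();                          // take the next file from the queue
    window.location = file + "?downloadToken=" + token;

    var poll = setInterval(function () {
        // Assumes the server sets a "downloadToken" cookie when the download starts.
        if (document.cookie.indexOf("downloadToken=" + token) !== -1) {
            clearInterval(poll);
            downloadSequentially(files);               // previous download started, queue the next
        }
    }, 500);
}

// example of call
downloadSequentially(["File1.zip", "File2.zip", "File3.zip"]);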
Another solution, which does not need synchronization, is to call window.open("FileForDownload.jpg"); several times (with a timer delay, of course).
This is the general idea in JavaScript, but it still needs some improvement to open the files with a delay (see the sketch after the example call below).
function DownloadAllFiles()
{
    // each argument is a file name to open for download
    for (var i = 0; i < arguments.length; i++) {
        window.open("http://www.domain.com/downloadpath/" + arguments[i]);
    }
}
// example of call
DownloadAllFiles("File1.zip","File2.zip","File3.zip");
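A possible improvement along those lines, opening one file per timer tick; the 1000 ms interval and the downloadpath URL are placeholders:
function DownloadAllFilesDelayed()
{
    var files = Array.prototype.slice.call(arguments);   // copy the argument list
    var i = 0;
    var timer = setInterval(function () {
        if (i >= files.length) {
            clearInterval(timer);                        // all files have been opened
            return;
        }
        window.open("http://www.domain.com/downloadpath/" + files[i]);
        i++;
    }, 1000);
}

// example of call
DownloadAllFilesDelayed("File1.zip", "File2.zip", "File3.zip");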
In an ASP.NET application, is there a way to get the NLog output to go into an in-memory buffer?
I'd like to make a circular buffer that would display the log's content on a web page.
What about writing to the memory target?
Then you could read the entries as follows:
var target = LogManager.Configuration.FindTargetByName<MemoryTarget>("target1");
var logs = target.Logs;
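If the target is not already declared in your NLog.config, here is a rough sketch of wiring it up in code; the target name "target1", the layout, and the level range are assumptions:
using NLog;
using NLog.Config;
using NLog.Targets;

// Sketch: register an in-memory target programmatically (adjust names/levels to your setup).
var config = LogManager.Configuration ?? new LoggingConfiguration();
var memoryTarget = new MemoryTarget("target1")
{
    Layout = "${longdate}|${level}|${message}"
};
config.AddTarget(memoryTarget);
config.AddRule(LogLevel.Debug, LogLevel.Fatal, memoryTarget);  // capture everything from Debug up
LogManager.Configuration = config;                             // (re)apply the configuration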
You can specify custom targets with NLog: https://github.com/nlog/NLog/wiki/Targets
It does not look like one exists that does exactly what you need, but you can write your own!
https://github.com/NLog/NLog/wiki/Extending-NLog
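For example, a rough sketch of a custom target that keeps only the last N rendered lines; the target name CircularMemory and the default limit are made up:
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;
using NLog;
using NLog.Targets;

// Sketch of a circular-buffer target: keeps only the most recent messages in memory.
[Target("CircularMemory")]
public sealed class CircularMemoryTarget : TargetWithLayout
{
    private readonly ConcurrentQueue<string> _lines = new ConcurrentQueue<string>();

    public int Limit { get; set; } = 1000;              // assumed default size of the buffer

    // Snapshot of the buffered lines, e.g. for rendering on a status page.
    public IList<string> Lines => _lines.ToList();

    protected override void Write(LogEventInfo logEvent)
    {
        _lines.Enqueue(Layout.Render(logEvent));
        while (_lines.Count > Limit && _lines.TryDequeue(out _)) { }
    }
}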
Why not read the last n lines from the actual log file and display those? Or log to a database and display the table? An in-memory buffer seems like it will add more overhead and memory constraints to the ASP.NET site.
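A rough sketch of the tail approach; the path and line count are placeholders, and FileShare.ReadWrite is used so the file can be read while NLog still has it open:
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

public static class LogTail
{
    // Sketch: return the last 'count' lines of the log file at 'path'.
    public static IEnumerable<string> Read(string path, int count)
    {
        using var stream = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.ReadWrite);
        using var reader = new StreamReader(stream);
        var lines = new List<string>();
        string line;
        while ((line = reader.ReadLine()) != null)
            lines.Add(line);
        return lines.Skip(Math.Max(0, lines.Count - count)).ToList();
    }
}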
I have made a script that takes files from a directory and sends them to Backburner for network rendering. When I run the script it renders fine, but without the render elements: they don't show up in the Backburner monitor, nor do they save.
If I open some of the files manually and send them to render with Backburner it works fine, but not with the script.
The render element is VRayAlpha, but I don't think that matters.
This is the code I'm using:
on btnRender pressed do
(
    outputFilesDir = textModelsOut.text + "*.max"
    toRender = getFiles outputFilesDir
    man = NetRender.GetManager()
    man.connect #automatic "255.255.255.0"
    man.GetControl()
    for s in toRender do
    (
        renderModelPath = getFilenamePath s + filenameFromPath s
        job = man.newJob file:renderModelPath
        job.Submit()
    )
    man.Disconnect()
)
And this is a quote from the MAXScript documentation; it says that render element data will not be available, but the elements will still be processed:
Jobs can not have maps included, and render element data will not be available for the submitted job but render elements will process correctly. These problems are present when submitting a job from a file, but not when submitting the current scene.
Anyway, my solution was to open each scene and submit it as the current scene with man.newJob() (no file: argument).
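For completeness, a rough sketch of that workaround (untested here, and it assumes the same manager connection as in the code above):
-- Sketch: open each file and submit it as the current scene so the
-- render elements come through; 'toRender' and 'man' are as above.
for s in toRender do
(
    loadMaxFile s useFileUnits:true quiet:true
    job = man.newJob()        -- no file: argument = submit the current scene
    job.Submit()
)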
You should always include your code (or at least some of it) so that we can check it for issues and test it ourselves.
However, I usually use a struct called NetRenderAutomation, developed by Gravey.
You can find it here:
http://forums.cgsociety.org/showthread.php?f=98&t=1059510&page=1&pp=15
I haven't had any problems with it; it is fairly easy to use, and you are even allowed to modify it if you need some special features for yourself.
I hope you can use the answer.
Otherwise, feel free to post some code and I'll look into it.
I'm using IMFSequencerSource to create a cut-list of media files. I'm also handling the MENewPresentation event to queue the next file. All return values are S_OK. But when the first file ends, there is no output on the screen. I can see the HDD is still active and reading is still in progress, but not a single frame appears on the screen.
Here are a few events from the Invoke() method:
MEEndOfPresentationSegment
MF_TOPOSTATUS_ENDED
MESessionNotifyPresentationTime
MF_TOPOSTATUS_SINK_SWITCHED
MF_TOPOSTATUS_READY
MF_TOPOSTATUS_STARTED_SOURCE
So it seems like the sink is switched and the new source started, but why are there no frames on the screen?
The problem occurs ONLY when I put two files with the same format back to back. So I guess it's a bug in Microsoft's super-super new technology.
I'm trying to build a website (I'm learning this whole subject now), and maybe the answer is very simple.
I am developing in ASPX/C#, and I want a form with a select field (<select>)
for the number of files to upload; the maximum number of files to upload is 4.
After I select the number of files, I want that many upload fields to appear (the number I already chose).
My question is: how can I do that? (Maybe with JavaScript or AJAX? I have no idea how.)
Hoping for help, thanks.
I am not sure if this is what you are looking for, but give it a try:
http://jsfiddle.net/2bZwD/
$('#select1').change(function(){
    var count = $(this).val();
    var uploadcount = 0;
    $('.upload').each(function(){
        if (count > uploadcount)
        {
            $(this).show('slow');
            uploadcount++;
        }
        else
        {
            $(this).hide('slow');
        }
    });
});
There are two approaches:
1) JavaScript: Using JavaScript you can read the selected number and add the upload HTML tags to the document. Since you are using ASPX Web Forms this will not work, because those fields were not part of the form when it was built and the ViewState was generated. If you use ASP.NET MVC it will work, and you can do it easily with jQuery.
2) If you want to stay with ASP.NET Web Forms, set AutoPostBack to true on the dropdown list, read its value in the SelectedIndexChanged event on the server, and show the file upload controls server-side (see the sketch below). The drawback is that this requires a full postback; you can use an UpdatePanel to do a partial postback and get the file controls into the page.
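A rough sketch of approach 2; it assumes a DropDownList named ddlFileCount with AutoPostBack="true" and four FileUpload controls named FileUpload1 through FileUpload4 already on the page (all of these names are made up):
// Code-behind sketch: show as many FileUpload controls as the user selected.
protected void ddlFileCount_SelectedIndexChanged(object sender, EventArgs e)
{
    int count = int.Parse(ddlFileCount.SelectedValue);
    FileUpload1.Visible = count >= 1;
    FileUpload2.Visible = count >= 2;
    FileUpload3.Visible = count >= 3;
    FileUpload4.Visible = count >= 4;
}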
I have a Flex application with multiple modules.
When I redeploy the application I was finding that modules (which are deployed as separate swf files) were being cached in the browser and the new versions weren't being loaded.
So I tried the age-old trick of adding ?version=xxx to all the modules when they are loaded. The value xxx is a global parameter which is actually stored in the host HTML page:
var moduleSection:ModuleLoaderSection;
moduleSection = new ModuleLoaderSection();
moduleSection.visible = false;
moduleSection.moduleName = moduleName + "?version=" + MySite.masterVersion;
In addition I needed to add ?version=xxx to the main .swf that was being loaded. Since this is done from the HTML, I had to modify my AC_OETags.js file as below:
function AC_FL_RunContent() {
    var ret = AC_GetArgs(arguments, ".swf?mv=" + getMasterVersion(), "movie",
        "clsid:d27cdb6e-ae6d-11cf-96b8-444553540000",
        "application/x-shockwave-flash");
    AC_Generateobj(ret.objAttrs, ret.params, ret.embedAttrs);
}
This is all fine and works great. I just have a hard time believing that Adobe doesn't already have a way to handle this. Given that Flex is targeted at building modular applications for business, I find it especially surprising.
What do other people do? I need to make sure my application reloads correctly even if someone has selected 'once per session' for their browser's cache-checking policy.
I had a similar problem, and ended up putting the SWF files in a sub-directory named after the build number. This meant that the URL to the SWF files pointed to a different location each time.
Ideally this should be catered for by the platform, but no joy there. But this works perfectly for us, and integrates very easily into our automated builds with Hudson - no complaints so far.
Flex says:
http://www.adobe.com/livedocs/flex/2/docs/wwhelp/wwhimpl/common/html/wwhelp.htm?context=LiveDocs_Parts&file=00001388.html
What I have done is checksum the SWF file and then add that to its URL. It stays the same until the file is rebuilt/redeployed, and it is handled automagically by a few lines of server-side PHP script.
Here is a sample:
function AC_FL_RunContent(){
var ret = AC_GetArgs(arguments, ".swf?ts=" + getTS(), "movie",
"clsid:d27cdb6e-ae6d-11cf-96b8-444553540000",
"application/x-shockwave-flash");
AC_Generateobj(ret.objAttrs, ret.params, ret.embedAttrs);
}
function getTS() {
var ts = new Date().getTime();
return ts;
}
AC_OETags.js is a file that exists in several places under html-template.
But as my posting said, I am facing another type of problem.
The caching is not done by Flash Player but by the browser, so it's out of Adobe's control. I think you have found a workable solution. If I want to avoid caching I usually append a random number to the URL.