I am running a server with ownCloud for a bunch of users.
However, I strictly forbid the use of this cloud for illegal content: movies, audio, albums, or zip archives containing TV shows, etc.
How can I be sure that users won't store files downloaded from torrent websites?
Is there a Unix binary that can grab some screenshots from an MKV/AVI file and check whether they contain a watermark or a known logo (20th Century Fox, Warner, etc.)?
I cannot search for telltale names like 'DVDRIP', since the filenames are encrypted.
I have a file store in which files are saved under a numeric ID (a SQL index), but my VM's disk is full and I can't move the files to a different cloud or anywhere else.
My file URLs look like this:
https://example.com/5374/randomstring.jpg
Here 5374 is the file number stored in the SQL database, and the random string is generated per file.
What I'm planning to do is redirect with nginx: right now I have files up to 56770 on one VM; if a user uploads a new file, it will be saved on a different VM, and if a user wants to access 56771, nginx should point to that VM.
You will make your life easier by choosing the cutoff point yourself; it's not essential, but it will make the regular expression a lot more concise.
If you said 56000 and above was on VM2, then your regex is as simple as /(5[6-9][0-9]{3}|[6-9][0-9]{4})/ (the first alternative matches 56000-59999 and the second matches 60000-99999).
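As a rough illustration, here is a minimal nginx sketch of that routing. This is a hypothetical configuration: the upstream hostnames and the 56000 cutoff are placeholders, not the asker's actual setup.

# Route file IDs 56000-99999 to VM2; everything else stays on VM1.
server {
    listen 80;
    server_name example.com;

    # 5[6-9][0-9]{3} covers 56000-59999, [6-9][0-9]{4} covers 60000-99999.
    location ~ ^/(5[6-9][0-9]{3}|[6-9][0-9]{4})/ {
        proxy_pass http://vm2.internal.example.com;
    }

    location / {
        proxy_pass http://vm1.internal.example.com;
    }
}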
I have mp3 players set up on my site to play mp3s. At the moment, users can easily look through the source, run a search for "mp3" and download all of the music on my site. I know it's virtually impossible to completely prevent a determined user from downloading the music but I want to make it harder for the average user. Is there any way I can obfuscate the links to the mp3s?
Relevant site: http://boyceavenue.com/music
You did not specify the language you are using. To expand upon what Marc B wrote, I would recommend using PHP's http_send_file() (from the pecl_http extension) along with a checksum of the file.
To send the file, use the following:
$filename  = "/absolute/or/relative/path/to/file.ext";
$mime_type = "audio/mpeg"; // See note below

http_send_content_disposition($filename, true); // Set the Content-Disposition header
http_send_content_type($mime_type);             // Set the Content-Type header
http_throttle(0.1, 2048);                       // Sleep 0.1 s after every 2048 bytes sent
http_send_file($filename);                      // Stream the file with the above settings
If you are serving up multiple types of files using PHP 5.3.0 or later, you could determine the MIME type this way:
$filename  = "/absolute/or/relative/path/to/file.ext";
$finfo     = finfo_open(FILEINFO_MIME_TYPE);   // Open a fileinfo resource
$mime_type = finfo_file($finfo, $filename);    // e.g. "audio/mpeg"
finfo_close($finfo);
Calculating the checksum is simple enough. Just use md5_file.
So, what can you do with the checksum?
Well, you could create an array of checksums and filenames that cross-reference each other. That is, you include the checksum in the links, and have a little routine that looks up the checksum and delivers the corresponding mp3 file. You could also do this in a database, or do what some apps do and store files in a directory structure based on their checksums (music/3/3a/song.mp3 for a checksum of 3a62f6 or whatever).
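As a rough sketch of that lookup routine (the script name, array contents, and paths are made up for the example; the http_* calls are the same pecl_http functions as above):

// serve.php?id=<checksum> -- hypothetical lookup of a checksum in an array
$files = array(
    "3a62f6..." => "/var/media/music/song.mp3",   // checksums truncated for readability
    "9b01c2..." => "/var/media/music/other.mp3",
);

$id = isset($_GET['id']) ? $_GET['id'] : '';
if (!isset($files[$id])) {
    header("HTTP/1.0 404 Not Found");   // Unknown checksum: no file served
    exit;
}

http_send_content_type("audio/mpeg");
http_throttle(0.1, 2048);
http_send_file($files[$id]);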
If you don't care about the filenames being mangled, you could save the files with a checksum for the filename. That could be done at upload time (if your files are being uploaded) or through a batch script (using the CLI).
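A one-off batch rename from the CLI could look something like this (a sketch; the directory is a placeholder):

// rename_to_checksum.php -- run once: php rename_to_checksum.php
$dir = "/var/media/music"; // placeholder path
foreach (glob($dir . "/*.mp3") as $path) {
    $sum = md5_file($path);                     // the checksum becomes the new name
    rename($path, $dir . "/" . $sum . ".mp3");
}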
Another thing you should do is put a default document (index.php or whatever) in the directory that tells people to look elsewhere, and disable directory browsing. If only a very small number of people need access, you could also put a password on the directory, requiring a login to reach the files.
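The default document can be as minimal as this (a sketch; adjust the response to taste):

<?php
// index.php -- dropped into the mp3 directory so nothing gets listed
header("HTTP/1.0 404 Not Found");
echo "Nothing to see here.";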
In an ASP.NET application, users are having a hard time uploading files.
The application is accessed by about 30 people through a 1 Mb dedicated fiber-optic line to the server.
The file upload implementation doesn't seem to be at fault, since the problem occurs only occasionally. Also, I have already tried uploading larger files (up to 50 MB) without any problem.
So I assume the problem must be in the network connection, which on some days must have some kind of bandwidth problem.
How can I effectively diagnose the root cause of the problem?
Thanks.
Do you log your exceptions? There are logging modules for ASP.NET:
http://code.google.com/p/elmah/
http://www.codeproject.com/Articles/14819/How-to-use-log4net
If you don't, you can look at the web server's event log.
The error may come from the web server's upload limitations or from security restrictions.
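For instance, ASP.NET's default request size limit is only 4 MB, so larger uploads fail unless the limit is raised in web.config. A sketch (the numbers are illustrative; maxRequestLength is in KB):

<system.web>
  <!-- Allow uploads up to ~50 MB and give slow uploads more time -->
  <httpRuntime maxRequestLength="51200" executionTimeout="300" />
</system.web>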
The file upload module was filled with lots of try/catch blocks, so the message showing up in the log4net files was the error message written by the developer. After trying step by step to reproduce the problem, we got to a point where the error message came from the system: "the file cannot be saved because it exceeds 50 chars", even though the filename was only about 40 chars. The developer was using Server.HtmlEncode on the filename to store the name in the database, and when the filename contained special characters the encoded name sometimes grew beyond 50 chars. The solution was to enlarge the filename column by a few characters and, when validating the filename, to check the HtmlEncode'd length and warn the user to shorten the name by the exceeding number of characters for the file to be valid.
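A minimal sketch of the pitfall (the filename and the 50-char limit are illustrative; HttpUtility.HtmlEncode behaves like Server.HtmlEncode):

Imports System
Imports System.Web

Module EncodedLengthCheck
    Const MaxStoredLength As Integer = 50

    Sub Main()
        ' 39 chars raw, but encoding expands "&" to "&amp;" and "<"/">" to "&lt;"/"&gt;".
        Dim fileName As String = "Fax & scan <urgent> client & vendor.pdf"
        Dim encoded As String = HttpUtility.HtmlEncode(fileName)

        Console.WriteLine("Raw length:     " & fileName.Length)   ' 39
        Console.WriteLine("Encoded length: " & encoded.Length)    ' 53 -- over the limit

        ' Validate against the length that will actually be stored.
        If encoded.Length > MaxStoredLength Then
            Console.WriteLine("Shorten the name by " & (encoded.Length - MaxStoredLength) & " chars.")
        End If
    End Sub
End Module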
We have one SAP system in the US (let's call it TKIJVPL1); this system has an SAP client 241. We have another SAP system in Germany (let's call it Lockweiler).
We need to move client 241 from our TKIJVPL1 server to this new server.
Can I simply use transaction SCC8? It says client export, but when I look at the options, it shows:
Source client: 241 (which is good),
Profile name: SAP_ALL (which is also good, as I need all data),
but for Target System, all that comes up is PL1 / QL1.
What is the easiest way to export a client from one SAP system to another?
Alternatively, I would rather export it to a hard drive, put the files on a DVD, and mail them to Germany. But I do not see a transaction for exporting to local disk.
I got it working.
You still need to use transaction SCC8; a client export can potentially take 8-20+ hours. As it exports, watch the folder /usr/sap/trans/data; in there SAP will create some files with the system name as the extension. In my case it created files like RT03292.PL1 and a few others.
Now one needs to take these files to the target system and do an import...
Here are some good links pertaining to this:
http://basissap.blogspot.com/2008/05/what-is-client-copy.html
http://forums.sdn.sap.com/thread.jspa?threadID=1713405&tstart=0
Make sure you look in /usr/sap/trans/data for the data files (their names usually start with R).
Also make sure you look in /usr/sap/trans/cofiles for the control files (their names usually start with K).
We have a vendor that sends CSV files as email attachments. These CSV files contain statuses that are imported into our application. I'm trying to automate the process end to end, but it currently depends on someone opening an email and saving the attachment to a server share so the application can use the file.
Since I cannot convince the vendor to change their process, such as offering an FTP location or a Web Service, I'm stuck with trying to automate the existing process.
Does anyone know of a way to programmatically open an email from a POP3 account and extract an attachment? The preferred solution would reside on a Windows 2003 server, be written in VB.NET, and be secure. The application can reside on the same server as the POP3 server; for example, we could set up the free POP3 service that comes with Windows Server and pull from the mail files stored on the file system.
BTW, we are willing to pay for an off-the-shelf solution, if one exists.
Note: I did look at this question but the answer points to a CodeProject solution that doesn't deal with attachments.
Try the Mail.dll email component; it's very affordable, supports attachments and national characters, is easy to use, and also supports SSL:
Using pop3 As New Pop3()
    pop3.Connect("mail.server.com")
    pop3.Login("user", "password")

    Dim builder As New MailBuilder()
    For Each uid As String In pop3.GetAll()
        ' Receive the email message
        Dim mail As IMail = builder.CreateFromEml(pop3.GetMessageByUID(uid))

        ' Write out the received message's subject
        Console.WriteLine(mail.Subject)

        ' Here you can use the mail.Attachments collection
        For Each attachment As MimeData In mail.Attachments
            Console.WriteLine(attachment.FileName)
            attachment.Save("c:\" + attachment.FileName)
            ' You can also use attachment.Data here
        Next attachment
    Next
    pop3.Close(True)
End Using
You can download it here: http://www.lesnikowski.com/mail.
Possible duplicate of Reading Email using Pop3 in C#.
At least there's a load of suggestions there that you may find useful.
I'll throw in a late suggestion for a more generalized "download POP3 messages and extract attachments" solution using existing software and minimal programming. I needed to do this for a client who switched to receiving faxes via email and was not pleased with manually saving the attachments to a location where they could be imported into an application.
For downloading messages on *nix systems fetchmail seems to be the standard and is very capable, but I chose mpop for both simplicity and Windows compatibility (but it is cross-platform). If mpop hadn't done the trick for me, I probably would have ended up doing something with the Python-based getmail, which was created when fetchmail's development stalled for a time (it's since resumed).
Mpop is controlled either via command line or configuration file, so I simply created multiple configuration files and specify via command line which file to load. I'm using it in "Exchange pickup directory" mode, which means it simply downloads the messages and drops them as text (.eml) files in a specified directory.
For extraction of the message attachments, UUDeview appears to be the standard (I'm using the Windows port of UUDeview) across just about any system you could want with just about any features you could want. My main alternative to this was a much-less-capable Python script that I'd developed for a different client back in 2007, but I'm happy to go with a precompiled executable over either installing Python or packaging with any of the Python-to-exe options.
Finally, there's the configuration. Along with the two mpop configuration files mentioned above (which I could do away with by using command-line options), I also have two 2-line .cmd files launched every 10 minutes by a scheduled task: the first line launches mpop to download into a working directory, and the second line launches UUDeview to extract attachments of specified types (.pdf or .tif) and then delete each file from which it extracted attachments. Output is sent to another directory from which staff can directly attach files as needed.
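One of those .cmd files might look roughly like this. This is only a sketch: the paths, config-file name, and exact option spellings are my assumptions, not the originals.

:: fax-pickup.cmd -- run every 10 minutes by the scheduled task (illustrative)
:: Line 1: mpop downloads new messages as .eml files into the working directory.
:: Line 2: UUDeview extracts the attachments from them into the output directory.
mpop --file=C:\jobs\mpop-fax1.conf
uudeview -i -p C:\faxes\out C:\faxes\work\*.eml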
This is overall not the most elegant way to reach these ends, but it was quick, simple, functional, and reasonably robust: at each stage, if something goes wrong, it fails such that no data is lost. The only places where data could be lost are any non-attachment messages sent to the dedicated fax email addresses, and even those will sit in the processing directory and be caught eventually.