Java: read from a specific volume address

I have to read some bytes from a specific address on a volume of an external memory card.
I do not have a file to read, just a drive address; how can I read it with Java? I'm used to reading by file path, I have never read from a volume address...

This is the solution:
File diskRoot = new File("\\\\.\\PhysicalDrive0");  // opens the raw drive rather than a file
RandomAccessFile diskAccess = new RandomAccessFile(diskRoot, "r");
long address = 512L;  // example value: the byte offset you want to read from
diskAccess.seek(address);
byte[] content = new byte[1024];
diskAccess.readFully(content);  // reads 1024 bytes starting at that offset
diskAccess.close();
However, it needs to be run as Administrator. Note that \\.\PhysicalDrive0 is the whole physical disk; to open a specific volume by drive letter, use \\.\X: instead.

Related

Guidance on uploading a dynamically created XML file using SFTP

Currently I am generating an XML file for download using posted fields, with the following code:
string attachment = "attachment; filename=" + FileName + ".xml";
Response.ClearContent();
Response.ContentType = "application/xml";
Response.AddHeader("content-disposition", attachment);
Response.Write(Session["FileForDownload"]);
Response.End();
This is working fine.
However, I want to upload the generated file via SFTP to a specified directory on a server.
I have had success connecting using SSH.NET and have been able to create a new directory, etc.
My question is: how can I generate the file and then SFTP it using SSH.NET?
I've tried using a file stream with no success. I'm guessing the file needs to be stored temporarily and then retrieved for upload.
This is my current code segment for the specified problem:
SftpClient sftp = new SftpClient("host", "user", "pwd");
sftp.Connect();
sftp.ChangeDirectory("directory/");
Stream fs = File.OpenRead(Server.MapPath(@"filetobeuploaded"));
sftp.UploadFile(fs, Session["FileName"].ToString());
sftp.Disconnect();
I recognize that there won't already be a file on the server to upload.
Any help would be much appreciated as this is the final piece of the puzzle in my application.
Cheers
Fixed: I found a solution by generating a temporary XML file on the server, uploading it, and then deleting it. Thanks for your reply anyway.
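For completeness, a minimal sketch of that temp-file approach, assuming the XML content is in Session["FileForDownload"] and that the host, credentials, and directory are placeholders:
string tempPath = Path.GetTempFileName();  // scratch file for the generated XML
File.WriteAllText(tempPath, Session["FileForDownload"].ToString());
using (SftpClient sftp = new SftpClient("host", "user", "pwd"))
{
    sftp.Connect();
    sftp.ChangeDirectory("directory/");
    using (Stream fs = File.OpenRead(tempPath))
    {
        sftp.UploadFile(fs, Session["FileName"].ToString() + ".xml");  // upload under the intended name
    }
    sftp.Disconnect();
}
File.Delete(tempPath);  // clean up the scratch file
Since UploadFile accepts any Stream, writing the XML into a MemoryStream and passing that would avoid the temp file altogether.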

How to identify already uploaded files if they have been renamed

Is there a way to know if a user is uploading a file that has already been uploaded before? This is without comparing file names, in case the user has renamed the file.
Scenario:
User uploads a file via their web browser.
User renames the file locally and uploads it again.
Web server detects the renamed file and saves it under the new name while removing the older file.
You could compute a checksum of the file when it is first submitted and store that checksum in a database table together with the filename. When the user submits the renamed file, you calculate the checksum again and search the database to see whether it is already present.
The weakness of this solution lies in the uniqueness of the checksum.
With this example I think you have a good chance of getting a unique checksum:
(Expecting to be disowned)
public string GetChecksum(string filePath, HashAlgorithm algorithm)
{
    // buffer the reads so large files hash efficiently
    using (var stream = new BufferedStream(File.OpenRead(filePath), 100000))
    {
        byte[] hash = algorithm.ComputeHash(stream);  // use the algorithm passed in instead of ignoring it
        return BitConverter.ToString(hash).Replace("-", String.Empty);
    }
}
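A rough usage sketch to go with it (the names here are made up: uploadedFilePath is the incoming file, and storedChecksums stands in for whatever table you keep the hashes in):
var storedChecksums = new HashSet<string>();  // stand-in for your database table
using (HashAlgorithm algorithm = SHA512.Create())
{
    string checksum = GetChecksum(uploadedFilePath, algorithm);
    if (storedChecksums.Contains(checksum))
    {
        // same content seen before: treat it as a rename and replace the old file/record
    }
    else
    {
        storedChecksums.Add(checksum);  // first time this content is uploaded
    }
}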

File upload in servlet corrupts the file

I am uploading a file using the Valums uploader, with a servlet on the server side. The file type is application/pdf. The code is:
String filename= request.getHeader("X-File-Name");
InputStream is = request.getInputStream();
File tmp = File.createTempFile(filename, "");
tmp.deleteOnExit();
FileOutputStream fos = new FileOutputStream(tmp);
IOUtils.copy(is, fos);
byte[] bytes = new byte[(int) tmp.length()];
is.read(bytes);
Now these bytes are stored in the database as a LONGBLOB. But it seems that the InputStream in the code above is adding some extra data, which is why the file is getting corrupted. I downloaded the same data as a PDF file and found that the originally uploaded file and the downloaded file have the same size, but when the downloaded file is opened in Acrobat, it reports "File is corrupted". The upload request contains only the file input, so there is no chance of other form parameters ending up in the InputStream. The bytes array above is also passed as-is for download. Why is the data getting corrupted?
Your problem is most likely that the request's InputStream has already been consumed: IOUtils.copy(is, fos) reads it to the end, so the later is.read(bytes) returns -1 and leaves the byte array zero-filled. The array was sized from tmp.length(), which is why the stored data has the right length but the wrong content. Read the bytes back from the temp file instead (for example with Files.readAllBytes(tmp.toPath()), after closing fos). I had a similar problem and posted about it here:
Java: Binary File Upload using Restlet + Apache Commons FileUpload
Hope this helps

Unsuccessful upload to an FTP server on Linux using Qt [file has zero size at destination]

I am trying to upload a file to an FTP server, which is a local server created by vsftpd. I have set the parameters needed for connecting and transferring files in the vsftpd.conf file. My requirement is to upload a file to this server. When I logged the stateChanged messages, the HostLookup, Connecting, Connected, LoggedIn, Closing, and Unconnected messages were all emitted by my ftp object. But when I check the destination directory, the file is there but has zero size... What could be wrong? The following is the code I used:
QImage img("./Sample.jpg");
QBuffer* buf = new QBuffer();
buf->open(QBuffer::ReadWrite);
buf->seek(0);
img.save(buf, "jpg");
connection = new QFtp();
connection->connectToHost("localhost");
connection->login();
connection->cd("ftpshare/");
connection->put(buf, "Sample.jpg", QFtp::Binary);
qDebug(QString::number(connection->error()).toLatin1());
qDebug(connection->errorString().toLatin1());
connect(connection,SIGNAL(stateChanged(int)),this,SLOT(ftpstatechanged(int)));
connection->close();
Are you sure the first line finds Sample.jpg in the folder it is looking in? Maybe the working directory is not what you think it is. Otherwise this should work just fine.
The problem was with the usage of the buffer: img.save(buf, "jpg") leaves the buffer's position at the end of the written data, so put() had nothing left to read (seeking to 0 after saving, instead of before, would also have worked). It got solved when I used a QByteArray instead:
QImage img("./Sample.png");
QByteArray ba;
QBuffer buffer(&ba);
buffer.open(QIODevice::WriteOnly);
img.save(&buffer, "PNG");  // encodes the image into ba
connection = new QFtp();
connection->connectToHost("localhost");
connection->login();
connection->cd("ftpshare/");
connection->put(ba, "Sample.png", QFtp::Binary);  // the QByteArray overload uploads the whole array

Split zip file using DotNetZip Library

I'm using the DotNetZip library to create a zip file of about 100 MB. I'm saving the zip file directly to Response.OutputStream:
Response.Clear();
// no buffering - allows large zip files to download as they are zipped
Response.BufferOutput = false;
String ReadmeText = "Dynamic content for a readme file...\n" +
    DateTime.Now.ToString("G");
string archiveName = String.Format("archive-{0}.zip",
    DateTime.Now.ToString("yyyy-MMM-dd-HHmmss"));
Response.ContentType = "application/zip";
Response.AddHeader("content-disposition", "attachment; filename=" + archiveName);
using (ZipFile zip = new ZipFile())
{
    // add a file entry into the zip, using content from a string
    zip.AddFileFromString("Readme.txt", "", ReadmeText);
    // add the set of files to the zip
    zip.AddFiles(filesToInclude, "files");
    // compress and write the output to OutputStream
    zip.Save(Response.OutputStream);
}
Response.Close();
What I need is to split this 100 MB file into sections of about 20 MB each and provide a download facility to the user. How can I achieve this?
Your question is sort of independent of the ZIP aspect. Basically you want to make a large file of 100 MB or more available for download, and you want to do it in parts. Some options:
Save it to a regular file, then transmit it in parts. The client would have to make a distinct download request for each of the N parts, selecting the appropriate section of the file via the HTTP Range header. The server would have to be set up to serve ZIP files with the appropriate MIME type, etc.
Save it to a split (spanned) zip file, which implies N different files; see the sketch below. The client would then make an HTTP GET for each of the distinct files. The server would have to be set up to serve .zip, .z00, .z01, etc. I'm not sure if built-in OS tools handle split zip files appropriately.
Save the file as one large blob, and have the client use BITS or some other restartable download facility.
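For the second option, DotNetZip can produce the split archive itself via its MaxOutputSegmentSize property. One caveat, as far as I know: segmented archives have to be saved to a file path rather than streamed to Response.OutputStream, so you would save to disk first and then serve the segments:
using (ZipFile zip = new ZipFile())
{
    zip.AddFileFromString("Readme.txt", "", ReadmeText);
    zip.AddFiles(filesToInclude, "files");
    zip.MaxOutputSegmentSize = 20 * 1024 * 1024;  // split into ~20 MB segments
    // writes archive.z01, archive.z02, ... plus the final archive.zip
    zip.Save(Server.MapPath("~/downloads/" + archiveName));  // hypothetical target folder
}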
