asp.net System.IO.IOException

I am trying to programmatically build a list of files in a folder, with certain attributes like file size and modified date.
I can return the file name, but any other attribute throws an error: System.IO.IOException: The filename, directory name, or volume label syntax is incorrect.
What am I missing here?
private void BuildDocList()
{
    var files = Directory.GetFiles(Server.MapPath(FilePath));
    foreach (var f in files)
    {
        var file = new FileInfo(FilePath + f);
        var fileItem = new ListItem();
        // this line works fine
        fileItem.Text = file.Name.Split('.')[0] + ", ";
        // this line causes the runtime error
        fileItem.Text = file.CreationTime.ToShortDateString();
        FileList.Items.Add(fileItem);
    }
}

You're trying to use the wrong filename for the FileInfo - you're using the unmapped path. You should use something like this:
string directory = Server.MapPath(FilePath);
string[] files = Directory.GetFiles(directory);
foreach (string f in files)
{
    FileInfo file = new FileInfo(Path.Combine(directory, f));
    // Now the properties should work.
    // ... build the ListItem and add it to FileList as before ...
}
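Since Directory.GetFiles is called with the already-mapped directory, each returned entry includes that directory, so the entry can also be passed straight to FileInfo. A minimal sketch of the whole method under that reading (not the exact original code):

private void BuildDocList()
{
    // Map the virtual path once; GetFiles then yields fully qualified physical paths.
    var files = Directory.GetFiles(Server.MapPath(FilePath));
    foreach (var f in files)
    {
        var file = new FileInfo(f);
        var fileItem = new ListItem
        {
            // File name without extension, followed by the creation date.
            Text = file.Name.Split('.')[0] + ", " + file.CreationTime.ToShortDateString()
        };
        FileList.Items.Add(fileItem);
    }
}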

Related

How do I read and write a 15 GB txt file with 50 million records in ASP.NET Core 6?

I want to read 50 million records from a 15 GB txt file and write them into Elasticsearch.
if (file.Length > 0)
{
    string wwroot = _he.WebRootPath;
    string contentpath = _he.ContentRootPath;
    string path = Path.Combine(wwroot, "file/" + foldername);
    if (!Directory.Exists(path))
    {
        var rcheck = Directory.CreateDirectory(path);
    }
    var filename = file.FileName;
    var filepath = Path.Combine(path, filename);
    if (filepath.Any())
    {
        using (FileStream stream = new FileStream(Path.Combine(path, filename), FileMode.Create))
        {
            file.CopyTo(stream);
        }
    }
    string[] lines = System.IO.File.ReadAllLines(filepath);
    var Plist = new List<Person>();
    int i = 0;
    foreach (var line in lines)
    {
        var newperson = new Person();
        string[] sub = line.Split(":");
        newperson.PId = sub[1];
        newperson.FirstName = sub[2];
        newperson.LastName = sub[3];
        newperson.Gender = sub[4];
        Plist.Add(newperson);
    }
    return View();
}
I can read and upload the file, but when I want to add the records to the list I get an error: it only reads about 16,000 items and then my application shuts down.
You need to read the file using a buffer. With proper buffered reading logic, you'll be able to read a file of any size.
This line here:
System.IO.File.ReadAllLines(filepath);
reads ALL the content of the 15 GB file at once and attempts to put it all into memory. I don't know how your code managed to get past that line without throwing an OutOfMemoryException (reading an "only" 4.62 GB file ate 19.2 GB of my memory while debugging).
Instead, use a buffer of a single line:
using var streamReader = File.OpenText(bigFilePath);
var fileLine = string.Empty;
while ((fileLine = streamReader.ReadLine()) != null)
{
    // Your line-by-line reading logic.
}
You will most probably not be able to keep all the records in memory (depending on the memory available), and sending them one by one to Elasticsearch would be the opposite of efficient... so you'll need to find a middle ground between those limitations. I would suggest batching, that is, sending records in fixed-size groups. The size is for you to pick, but note that it shouldn't be extremely large or tiny, otherwise the benefits of batching shrink.
Full code:
static void Main()
{
    string wwroot = _he.WebRootPath;
    string contentpath = _he.ContentRootPath;
    // Note: this should point at the uploaded text file itself, not just the folder it was saved to.
    string path = Path.Combine(wwroot, "file/" + foldername);

    var peopleListBatch = new List<Person>();
    const int BatchSize = 1024;

    using var streamReader = File.OpenText(path);
    var fileLine = string.Empty;
    while ((fileLine = streamReader.ReadLine()) != null)
    {
        var lineParts = fileLine.Split(":");
        var newperson = new Person
        {
            PId = lineParts[1],
            FirstName = lineParts[2],
            LastName = lineParts[3],
            Gender = lineParts[4],
        };
        peopleListBatch.Add(newperson);

        // Add to Elastic, but only when the batch is full.
        if (peopleListBatch.Count == BatchSize)
        {
            AddPersonsToElasticSearch(peopleListBatch);
            peopleListBatch.Clear();
        }
    }

    // Add remaining people, if any.
    if (peopleListBatch.Count > 0)
    {
        AddPersonsToElasticSearch(peopleListBatch);
        peopleListBatch.Clear();
    }
}
Inserting to Elasticsearch is another story, and I leave that task to you:
static void AddPersonsToElasticSearch(List<Person> people)
{
    // TODO: Add your inserting logic here.
}
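If you go with the official NEST client, a bulk index per batch is the usual approach. A minimal sketch, assuming NEST is installed; the node URL and the "people" index name are placeholders, not from the original post:

static void AddPersonsToElasticSearch(List<Person> people)
{
    // Placeholder connection settings; in a real app the client would be created once and reused.
    var client = new ElasticClient(
        new ConnectionSettings(new Uri("http://localhost:9200")).DefaultIndex("people"));

    // IndexMany sends the whole batch in a single bulk request.
    var response = client.IndexMany(people);

    if (!response.IsValid)
    {
        // response.ItemsWithErrors and response.DebugInformation describe what failed.
        throw new Exception("Bulk insert failed: " + response.DebugInformation);
    }
}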

System.IO.File.ReadAllBytes Access to the path denied

I am running the project in Visual Studio 2015. When I try to read the PDF, it gives me the following error:
Access to the path 'E:\FILE\FILEUPLOAD\InnerFile\File' is denied.
Function definition:
var cd = new System.Net.Mime.ContentDisposition { FileName = "PDF.pdf", Inline = true };
string contentType = MimeMapping.GetMimeMapping("PDF.pdf");
Response.AppendHeader("Content-Disposition", cd.ToString());
var innerPath = "InnerFile/File";
FileInfo fi = new FileInfo(PDFUploadRootPath + innerPath + "/PDF.pdf");
byte[] bytes = System.IO.File.ReadAllBytes(PDFUploadRootPath + innerPath);
return File(bytes, contentType);
NOTE:
The user has been given full permissions.
The file physically exists.
I don't understand what to do now, please help!
Your FileInfo instance indeed references 'E:\FILE\FILEUPLOAD\InnerFile\File\PDF.pdf':
FileInfo fi = new FileInfo(PDFUploadRootPath + innerPath + "/PDF.pdf");
but when trying to read the file contents you forgot the file name and only used the path 'E:\FILE\FILEUPLOAD\InnerFile\File':
byte[] bytes = System.IO.File.ReadAllBytes(PDFUploadRootPath + innerPath);
Thus, also add the file name for reading all file bytes:
byte[] bytes = System.IO.File.ReadAllBytes(PDFUploadRootPath + innerPath + "/PDF.pdf");
Furthermore, as others have mentioned in comments, you should really use Path.Combine to glue path parts together, not simple string concatenation...
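For illustration, a minimal sketch of that read using Path.Combine, assuming PDFUploadRootPath is the upload root from the original code (e.g. 'E:\FILE\FILEUPLOAD'):

// Build the full file path from its parts instead of concatenating strings.
string fullPath = Path.Combine(PDFUploadRootPath, "InnerFile", "File", "PDF.pdf");
byte[] bytes = System.IO.File.ReadAllBytes(fullPath);
return File(bytes, contentType);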
Try using a FileStream instead of a byte array for reading the PDF file.
FileStream templateFileStream = File.OpenRead(filePath);
return templateFileStream;
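If this runs inside an MVC action, the stream can be handed straight back to the framework, which disposes it after writing the response. A small sketch; the action name and parameter are assumptions, not from the original answer:

// Hypothetical controller action returning the PDF as a stream.
public ActionResult DownloadPdf(string filePath)
{
    FileStream templateFileStream = System.IO.File.OpenRead(filePath);
    return File(templateFileStream, "application/pdf", "PDF.pdf");
}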
Also check (through code) whether the user has write permission to the directory or path:
public static bool HasUserWritePermission(String path, String NtAccountName)
{
    DirectoryInfo di = new DirectoryInfo(path);
    DirectorySecurity acl = di.GetAccessControl(AccessControlSections.All);
    AuthorizationRuleCollection rules = acl.GetAccessRules(true, true, typeof(NTAccount));
    Boolean hasPermission = false;

    // Go through the rules returned from the DirectorySecurity
    foreach (AuthorizationRule rule in rules)
    {
        // If we find one that matches the identity we are looking for
        if (rule.IdentityReference.Value.Equals(NtAccountName, StringComparison.CurrentCultureIgnoreCase))
        {
            // Cast to a FileSystemAccessRule to check for access rights
            if ((((FileSystemAccessRule)rule).FileSystemRights & FileSystemRights.WriteData) > 0)
            {
                hasPermission = true;
            }
            else
            {
                hasPermission = false;
            }
        }
    }
    return hasPermission;
}
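A possible call site (the account name is only a placeholder):

// Substitute the identity your site actually runs under.
bool canWrite = HasUserWritePermission(@"E:\FILE\FILEUPLOAD\InnerFile\File", @"DOMAIN\AppPoolUser");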

Retrieving object model is blank?

I have a foreach loop which cycles through a bunch of images, and uploads them using this code:
foreach (var image in fetchimages)
{
    string fileName = "https://www.mywebsite.co.uk" + image.ImageOriginalURL;
    var uploadParams = new ImageUploadParams()
    {
        File = new FileDescription(fileName)
    };
    var uploadResult = cloudinary.Upload(uploadParams);
    var mytest = new ImageUploadResult();
    myurl = mytest.SecureUri;
    db.Execute("UPDATE Property_Images SET NewURL = @0 WHERE ImageID = 145", myurl);
}
However, each time the myurl variable is empty. I'm thinking I possibly have the ImageUploadResult() in the wrong place in the foreach loop?
You are creating an empty upload result and checking the value there. Try uploadResult.SecureUri instead, which is the result actually returned by the Upload call.
Also, it's best to check uploadResult.Error to see if an error occurred in the upload.
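Put together, the loop body might look like this (SecureUri and Error follow the property names already used above; the error branch is only a placeholder):

var uploadResult = cloudinary.Upload(uploadParams);
if (uploadResult.Error != null)
{
    // Placeholder: log or otherwise handle the failed upload here.
}
else
{
    // Use the URL from the result that Upload actually returned.
    myurl = uploadResult.SecureUri;
    db.Execute("UPDATE Property_Images SET NewURL = @0 WHERE ImageID = 145", myurl);
}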

How do I write to a log file from the start instead of appending to the end of the file?

I have the following code for a log file, which creates the file and writes log entries into it. But I want to write the log into the file in descending order, so that the most recent log text comes first.
string FilePath = Server.MapPath("MYLOG.txt");
if (!File.Exists(FilePath))
{
    byte[] fileBytes = null;
    fileBytes = Encoding.GetEncoding(1252).GetBytes("My Log -\n");
    using (Stream streamToWrite = File.Create(FilePath))
    {
        streamToWrite.Write(fileBytes, 0, fileBytes.Length);
        streamToWrite.Flush();
    }
}
I mean I want to write the new content at the start of the file, not at the end.
I think this will do:
string currentContent = String.Empty;
if (File.Exists(filePath))
{
    currentContent = File.ReadAllText(filePath);
}
File.WriteAllText(filePath, newContent + currentContent);
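Wrapped up as a small helper (the method name is only a placeholder for illustration):

// Hypothetical helper based on the snippet above: reads the existing log (if any)
// and rewrites the file with the new entry in front.
static void PrependToLog(string filePath, string newContent)
{
    string currentContent = File.Exists(filePath)
        ? File.ReadAllText(filePath)
        : string.Empty;

    File.WriteAllText(filePath, newContent + currentContent);
}

Note that this rereads and rewrites the whole file for every entry, so it is only practical for small log files.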

Get files with DirectoryInfo in a virtual directory

I have a string array like this:
string[] imgList = new[] { "/Images/10000489Back20130827.jpg", "/Images/2F.jpg", "/Images/10000489Front20130827.jpg" };
which contains the names of files located in a virtual directory. If I assign one of these values to an ImageUrl, the image is displayed; inspecting the page shows the property like this:
src="/Images/1F.jpg"
But when I look for the files in a specific directory and assign them to the ImageUrl property, the images are not displayed. I noticed that the complete physical path is retrieved, not a reference to the virtual directory:
string path = "/Images"; ///Obtener el path virtual
DirectoryInfo directory = new DirectoryInfo(path);
FileInfo[] files = directory.GetFiles("*.jpg");
imgList = files.Where(x => x.FullName.Contains(clientNumber)).Select(x => x.FullName).ToList().ToArray();
I retrieve this path:
src="C:/Images/1F.jpg"
How can I get only the virtual path with the name of the file using the DirectoryInfo class?
try this:
string path = "/Images";
DirectoryInfo directoryInfo = new DirectoryInfo(Server.MapPath(path));
I resolved it:
Instead of passing the full path:
x.FullName
I concatenate the virtual path with the file name:
path + x.Name
Example:
imgList = files.Where(x => x.FullName.Contains(clientNumber)).Select(x => path + x.Name).ToList().ToArray();
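Putting the two answers together, a minimal sketch (clientNumber is assumed to be defined elsewhere, and an explicit "/" separator is added when building the URL):

DirectoryInfo directory = new DirectoryInfo(Server.MapPath(path));
imgList = directory.GetFiles("*.jpg")
    .Where(x => x.FullName.Contains(clientNumber))
    // Build the virtual URL from the virtual folder and the file name, not from the physical FullName.
    .Select(x => path + "/" + x.Name)
    .ToArray();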
Showing an image from a physical path like C:\Users\User\Desktop\Signature.png: this works. I used try/catch to avoid errors.
Client side:
<asp:Image runat="server" ID="Image1" />
Server side:
try
{
    byte[] bytes = System.IO.File.ReadAllBytes(@"C:\Users\User\Desktop\Signature.png");
    Image1.ImageUrl = "data:image/png;base64," + Convert.ToBase64String(bytes, 0, bytes.Length);
    Image1.Visible = true;
}
catch (Exception e)
{
    // Errors are intentionally swallowed here, as described above.
}
