How to fix java.lang.OutOfMemoryError in an Android PDF view?

Friends, I have created a PDF view in my app using a WebView successfully. But sometimes I get crashes when I load PDF files with many pages (above 40 or so). It would be helpful if any of you could tell me why I am getting this message and what I have to do to avoid this error. Below is the logcat error message I got.
Logcat error:
Out of memory on a 1322872-byte allocation.
FATAL EXCEPTION: AsyncTask #2
java.lang.RuntimeException: An error occured while executing doInBackground()
at android.os.AsyncTask$3.done(AsyncTask.java:300)
at java.util.concurrent.FutureTask.finishCompletion(FutureTask.java:355)
at java.util.concurrent.FutureTask.setException(FutureTask.java:222)
at java.util.concurrent.FutureTask.run(FutureTask.java:242)
at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:231)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1112)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:587)
at java.lang.Thread.run(Thread.java:841)
Caused by: java.lang.OutOfMemoryError
at android.graphics.Bitmap.nativeCreate(Native Method)
at android.graphics.Bitmap.createBitmap(Bitmap.java:836)
at android.graphics.Bitmap.createBitmap(Bitmap.java:813)
at android.graphics.Bitmap.createBitmap(Bitmap.java:780)
at com.sun.pdfview.PDFPage.getImage(PDFPage.java:219)
at android.os.AsyncTask$2.call(AsyncTask.java:288)
at java.util.concurrent.FutureTask.run(FutureTask.java:237)
Below is my code:
RandomAccessFile f = new RandomAccessFile(file, "r");
byte[] data1 = new byte[(int) f.length()];
f.readFully(data1);
f.close();
// Wrap the raw bytes for the pdf-renderer library.
ByteBuffer bb = ByteBuffer.wrap(data1);
PDFFile pdf = new PDFFile(bb);
PDFPage PDFpage = pdf.getPage(1, true);
// ViewSize is the target display width in pixels.
final float scale = ViewSize / PDFpage.getWidth() * 0.95f;
float pdfWid = PDFpage.getWidth() * scale;
float pdfH = PDFpage.getHeight() * scale;
// Render the first page to a bitmap at the scaled size.
Bitmap page = PDFpage.getImage((int) pdfWid, (int) pdfH, null, true, true);
ByteArrayOutputStream stream = new ByteArrayOutputStream();
page.compress(Bitmap.CompressFormat.PNG, 100, stream);
byte[] byteArray = stream.toByteArray();
stream.reset();
page.recycle(); // free the first page's bitmap before the loop re-renders
String base64 = Base64.encodeToString(byteArray, Base64.NO_WRAP);
String html = "<!DOCTYPE html><html><body bgcolor=\"#b4b4b4\"><img src=\"data:image/png;base64,"+base64+"\" hspace=10 vspace=10><br>";
int size = pdf.getNumPages();
try{
for(int i = 2; i <= size; i++)
{
PDFpage = pdf.getPage(i, true);
page = PDFpage.getImage((int)pdfWid, (int)pdfH, null, true, true);
page.compress(Bitmap.CompressFormat.PNG, 100, stream);
byteArray = stream.toByteArray();
stream.reset();
page.recycle();
base64 = Base64.encodeToString(byteArray, Base64.NO_WRAP);
html += "<img src=\"data:image/png;base64,"+base64+"\" hspace=10 vspace=10><br>";
}
}catch (Exception e){
Log.d("error", e.toString());
}
stream.close();
html += "</body></html>";
wv.loadDataWithBaseURL("", html, "text/html","UTF-8", "");

Well, I found the problem I was facing in my code: it was the heap size that caused the OutOfMemoryError. I increased my app's heap size by declaring android:largeHeap="true" in the manifest, which solved the issue.
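For reference, the attribute goes on the <application> element in AndroidManifest.xml. A minimal sketch (the other attributes are placeholders for whatever the app already declares):

<application
    android:label="@string/app_name"
    android:largeHeap="true">
    ...
</application>

Note that largeHeap only raises the per-process limit; recycling each page bitmap as soon as it has been encoded, as the loop above does, keeps memory usage down regardless of the limit.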

Related

Uploading large pictures to FTP freezes the app

I am trying to upload some pictures to my FTP server from a form. It's working great on my Huawei P20, but it has been reported to me that on phones with less RAM the app freezes when uploading larger pictures.
After the picture selection (a maximum of 4), I resize and compress the images to reduce their size, but with no luck.
Code:
public static byte[] RotateImage(string path)
{
byte[] imageBytes;
var originalImage = BitmapFactory.DecodeFile(path);
var rotation = GetRotation(path);
//Width 3000 Height 4000
var width = (originalImage.Width * 0.25);
var height = (originalImage.Height * 0.25);
if(originalImage.Height>2400)
{
width = (originalImage.Width * 0.20);
height = (originalImage.Height * 0.20);
}
if (originalImage.Height < 600)
{
width = (originalImage.Width * 0.80);
height = (originalImage.Height * 0.80);
}
var scaledImage = Bitmap.CreateScaledBitmap(originalImage, (int)width, (int)height, true);
Bitmap rotatedImage = scaledImage;
if (rotation != 0)
{
var matrix = new Matrix();
matrix.PostRotate(rotation);
rotatedImage = Bitmap.CreateBitmap(scaledImage, 0, 0, scaledImage.Width, scaledImage.Height, matrix, true);
scaledImage.Recycle();
scaledImage.Dispose();
}
using (var ms = new MemoryStream())
{
if (rotatedImage.Width > 1000 || rotatedImage.Height > 1000)
{
rotatedImage.Compress(Bitmap.CompressFormat.Jpeg, 30, ms);
}
if (rotatedImage.Width < 500 || rotatedImage.Height < 500)
{
rotatedImage.Compress(Bitmap.CompressFormat.Jpeg, 60, ms);
}
if (rotatedImage.Width <= 1000 && rotatedImage.Width >= 500)
{
rotatedImage.Compress(Bitmap.CompressFormat.Jpeg, 45, ms);
}
imageBytes = ms.ToArray();
}
originalImage.Recycle();
rotatedImage.Recycle();
originalImage.Dispose();
rotatedImage.Dispose();
GC.Collect();
return imageBytes;
}
Then I send them through the MessagingCenter and retrieve them in the PCL.
The application freezes when I try to upload them to FTP.
Code in PCL:
for (int i = 0; i < _images.Count; i++)
{
DependencyService.Get<IFtpWebRequest>().upload("FTP", _images[i], "SITE", "PASSWORD", "DIRECTORY");
}
and the platform specific code I am calling is:
public string upload(string FtpUrl, string fileName, string userName, string password, string UploadDirectory = "")
{
try
{
WebRequest request = WebRequest.Create(FtpUrl+UploadDirectory);
request.Method = WebRequestMethods.Ftp.MakeDirectory;
request.Credentials = new NetworkCredential(userName, password);
using (var resp = (FtpWebResponse)request.GetResponse())
{
}
}
catch (Exception e) { } // ignored: the directory may already exist
try
{
string PureFileName = new FileInfo(fileName).Name;
String uploadUrl = String.Format("{0}{1}/{2}", FtpUrl, UploadDirectory, PureFileName);
FtpWebRequest req = (FtpWebRequest)FtpWebRequest.Create(uploadUrl);
req.Proxy = null;
req.Method = WebRequestMethods.Ftp.UploadFile;
req.Credentials = new NetworkCredential(userName, password);
req.UseBinary = true;
req.UsePassive = true;
byte[] data = File.ReadAllBytes(fileName);
req.ContentLength = data.Length;
Stream stream = req.GetRequestStream();
stream.Write(data, 0, data.Length);
stream.Close();
FtpWebResponse res = (FtpWebResponse)req.GetResponse();
return res.StatusDescription;
}
catch (Exception err)
{
return err.ToString();
}
}
The expected result is the app not freezing on any phone. What can I do to prevent this?
Further increasing the compression isn't the best solution either, as some phones upload with no problem, so on those I could keep higher quality.
EDIT: When I upload a large picture to FTP and check it on the server, it's like 1/10 of the picture is uploaded and the rest is blank.
EDIT2: Moving the function to a different thread no longer freezes the application, but still only part of the image is uploaded on devices with less memory. How do I force the whole image to be uploaded?
Uploading a very large image is a time-consuming operation. If it is placed on the main UI thread, it will consume a lot of memory and time. A phone with plenty of memory may manage the task, but on a phone with less memory the problem will arise.
You need to move the upload method to a background thread so that it does not block the UI thread.
On Android, have a try with Task:
public async Task<string> upload(string FtpUrl, string fileName, string userName, string password, string UploadDirectory = "")
{
await Task.Run(() =>
{
try
{
WebRequest request = WebRequest.Create(FtpUrl+UploadDirectory);
request.Method = WebRequestMethods.Ftp.MakeDirectory;
request.Credentials = new NetworkCredential(userName, password);
using (var resp = (FtpWebResponse)request.GetResponse())
{
}
}
catch(Exception e) { }
...
});
}

Get image from URL and upload to Amazon S3

I'd like to load an image directly from a URL but without saving it on the server, I want to upload it directly from memory to Amazon S3 server.
This is my code:
Dim wc As New WebClient
Dim fileStream As IO.Stream = wc.OpenRead("http://www.domain.com/image.jpg")
Dim request As New PutObjectRequest()
request.BucketName = "mybucket"
request.Key = "file.jpg"
request.InputStream = fileStream
client.PutObject(request)
The Amazon API gives me the error "Could not determine content length". The stream fileStream ends up as "System.Net.ConnectStream" which I'm not sure if it's correct.
The exact same code works with files from the HttpPostedFile but I need to use it in this way now.
Any ideas how I can convert the stream to become what Amazon API is expecting (with the length intact)?
I had the same problem when using the GetObjectResponse() method and its property ResponseStream to copy a file from one folder to another in the same bucket. I noticed that the AWS SDK (2.3.45) has some faults, like another method on GetObjectResponse() called WriteResponseStreamToFile that simply doesn't work. These gaps need workarounds.
I solved the problem by reading the file into an array of bytes and wrapping it in a MemoryStream object.
Try this (C# code):
WebClient wc = new WebClient();
Stream fileStream = wc.OpenRead("http://www.domain.com/image.jpg");
byte[] fileBytes = fileStream.ToArrayBytes();
PutObjectRequest request = new PutObjectRequest();
request.BucketName = "mybucket";
request.Key = "file.jpg";
request.InputStream = new MemoryStream(fileBytes);
client.PutObject(request);
The extension method:
public static byte[] ToArrayBytes(this Stream input)
{
byte[] buffer = new byte[16 * 1024];
using (MemoryStream ms = new MemoryStream())
{
int read;
while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
{
ms.Write(buffer, 0, read);
}
return ms.ToArray();
}
}
You can also create a MemoryStream without an intermediate array of bytes. But after the first PutObject to S3, the MemoryStream will be discarded. If you need to put other objects, I recommend the first option.
WebClient wc = new WebClient();
Stream fileStream = wc.OpenRead("http://www.domain.com/image.jpg");
MemoryStream fileMemoryStream = fileStream.ToMemoryStream();
PutObjectRequest request = new PutObjectRequest();
request.BucketName = "mybucket";
request.Key = "file.jpg";
request.InputStream = fileMemoryStream ;
client.PutObject(request);
The extension method:
public static MemoryStream ToMemoryStream(this Stream input)
{
byte[] buffer = new byte[16 * 1024];
int read;
MemoryStream ms = new MemoryStream();
while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
{
ms.Write(buffer, 0, read);
}
ms.Position = 0; // rewind so the upload reads from the start of the stream
return ms;
}
I had the same problem in a similar scenario.
The reason for the error is that to upload an object the SDK needs to know the whole content length that is going to be uploaded. To be able to obtain stream length it must be seekable, but the stream returned from WebClient is not. To indicate the expected length set Headers.ContentLength in PutObjectRequest. The SDK will use this value if it cannot determine length from the stream object.
To make your code work, obtain the content length from the response headers returned by the call made by WebClient, then set PutObjectRequest.Headers.ContentLength. Of course this relies on the server returning a content-length value.
Dim wc As New WebClient
Dim fileStream As IO.Stream = wc.OpenRead("http://www.example.com/image.jpg")
Dim contentLength As Long = Long.Parse(wc.ResponseHeaders("Content-Length"))
Dim request As New PutObjectRequest()
request.BucketName = "mybucket"
request.Key = "file.jpg"
request.InputStream = fileStream
request.Headers.ContentLength = contentLength
client.PutObject(request)
I came up with a solution that uses UploadPart when the length is not available by any other means, plus this does not load the entire file into memory.
if (args.DocumentContents.CanSeek)
{
PutObjectRequest r = new PutObjectRequest();
r.InputStream = args.DocumentContents;
r.BucketName = s3Id.BucketName;
r.Key = s3Id.ObjectKey;
foreach (var item in args.CustomData)
{
r.Metadata[item.Key] = item.Value;
}
await S3Client.PutObjectAsync(r);
}
else
{
// if stream does not allow seeking, S3 client will throw error:
// Amazon.S3.AmazonS3Exception : Could not determine content length
// as a work around, if cannot use length property, will chunk
// file into sections and use UploadPart, so do not have to load
// entire file into memory as a single MemoryStream.
var r = new InitiateMultipartUploadRequest();
r.BucketName = s3Id.BucketName;
r.Key = s3Id.ObjectKey;
foreach (var item in args.CustomData)
{
r.Metadata[item.Key] = item.Value;
}
var multipartResponse = await S3Client.InitiateMultipartUploadAsync(r);
try
{
var completeRequest = new CompleteMultipartUploadRequest
{
UploadId = multipartResponse.UploadId,
BucketName = s3Id.BucketName,
Key = s3Id.ObjectKey,
};
// just using this size, because it is the max for Azure File Share, but it could be any size
// for S3, even a configured value
const int blockSize = 4194304;
// BinaryReader gives us access to ReadBytes
using (var reader = new BinaryReader(args.DocumentContents))
{
var partCounter = 1;
while (true)
{
byte[] buffer = reader.ReadBytes(blockSize);
if (buffer.Length == 0)
break;
using (MemoryStream uploadChunk = new MemoryStream(buffer))
{
uploadChunk.Position = 0;
var uploadRequest = new UploadPartRequest
{
BucketName = s3Id.BucketName,
Key = s3Id.ObjectKey,
UploadId = multipartResponse.UploadId,
PartNumber = partCounter,
InputStream = uploadChunk,
};
// could call UploadPart on multiple threads, instead of using await, but that would
// cause more data to be loaded into memory, which might be too much
var part2Task = await S3Client.UploadPartAsync(uploadRequest);
completeRequest.AddPartETags(part2Task);
}
partCounter++;
}
var completeResponse = await S3Client.CompleteMultipartUploadAsync(completeRequest);
}
}
catch
{
await S3Client.AbortMultipartUploadAsync(s3Id.BucketName, s3Id.ObjectKey
, multipartResponse.UploadId);
throw;
}
}

Content service returns old content for some time

I'm using following snippet for saving content:
private void writeToFile(NodeRef nodeRef, String content) throws IOException {
ContentWriter writer = contentService.getWriter(nodeRef, ContentModel.PROP_CONTENT, true);
InputStream contentStream = new ByteArrayInputStream(content.getBytes(encoding));
writer.setMimetype(mimeType);
writer.setEncoding(encoding);
writer.putContent(contentStream);
Map<QName, Serializable> repoProps = nodeService.getProperties(nodeRef);
ContentData contentData = (ContentData) repoProps.get(ContentModel.PROP_CONTENT);
if(contentData == null)
contentData = writer.getContentData();
contentData = ContentData.setEncoding(contentData, encoding);
contentData = ContentData.setMimetype(contentData, mimeType);
repoProps.put(ContentModel.PROP_CONTENT, contentData);
contentStream.close();
nodeService.setProperties(nodeRef, repoProps);
}
When I read content written this way within a short period of time (depending on server load) somewhere else, the old content is returned. So it looks like indexing may be in progress, and before the final commit the old content is returned; is that possible? If so, is it possible to override this behavior and access the newest content, via contentUrl?
To avoid this behavior I'm using a thread for each read request, which sleeps for some time at the beginning, but I really dislike this "solution".
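For concreteness, that workaround might look roughly like the sketch below (readFileContent is the reading method shown further down; the delay value is arbitrary and would need tuning to the observed commit lag):

// Sketch of the sleep-then-read workaround described above.
private static final long DELAY_MS = 500; // arbitrary; tune to the observed lag

private void readAfterDelay(final NodeRef nodeRef) {
    new Thread(new Runnable() {
        public void run() {
            try {
                Thread.sleep(DELAY_MS); // wait out the pending commit
                byte[] content = readFileContent(nodeRef);
                // ... use the up-to-date content ...
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            } catch (IOException e) {
                // log and give up
            }
        }
    }).start();
}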
Edit: I built from the newest SVN source, running on Tomcat 6.0.35 on Linux (CentOS and Ubuntu); by system load I mean hundreds of files changing every few seconds.
Edit: reading looks like this:
private byte[] readFileContent(NodeRef nodeRef) throws IOException {
ContentReader reader = contentService.getReader(nodeRef, ContentModel.PROP_CONTENT);
if(reader == null)
return null;
InputStream originalInputStream = reader.getContentInputStream();
ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
final int BUF_SIZE = 1 << 8; // 256-byte buffer
byte[] buffer = new byte[BUF_SIZE];
int bytesRead = -1;
while ((bytesRead = originalInputStream.read(buffer)) > -1) {
outputStream.write(buffer, 0, bytesRead);
}
originalInputStream.close();
return outputStream.toByteArray();
}
OK, solved it with simpler saving like this:
ContentWriter writer = contentService.getWriter(nodeRef, ContentModel.PROP_CONTENT, true);
InputStream contentStream = new ByteArrayInputStream(content.getBytes(encoding));
writer.setMimetype(mimeType);
writer.setEncoding(encoding);
writer.putContent(contentStream);
contentStream.close();
The previous saving code was in place because of some content-encoding problems, so testing will show whether this simpler version works.

File not found exception once deployed to Server

I am using the code below to upload an image file to a SharePoint document library. The code works fine locally, but once deployed to the server I get a FileNotFoundException.
String fileToUpload = FlUpldImage.PostedFile.FileName; // e.g. @"C:\Users\admin.RSS\Desktop\Photos\me_skype.jpg"
String documentLibraryName = "SiteAssets";
if (!System.IO.File.Exists(fileToUpload))
throw new FileNotFoundException("File not found.", fileToUpload);
SPFolder myLibrary = web.Folders[documentLibraryName];
// Prepare to upload
Boolean replaceExistingFiles = true;
String fileName = CheckStringNull(txtFirstName.Text) + CheckStringNull(txtLastName.Text) + CheckDateNull(txtDOB) + System.IO.Path.GetFileName(fileToUpload);
if (fileName.Contains('/'))
{
fileName = fileName.Replace("/", "");
}
if (fileName.Contains(':'))
{
fileName = fileName.Replace(":", "");
}
FileStream fileStream = File.OpenRead(fileToUpload);
//Upload document
SPFile spfile = myLibrary.Files.Add(fileName, fileStream, replaceExistingFiles);
string url = site.ToString() + "/" + spfile.ToString();
if (url.Contains("="))
{
url = url.Split('=')[1];
}
//Commit
myLibrary.Update();
The string fileToUpload contains the path C:\Users\admin.RSS\Desktop\Photos\me.jpg. This path is on the client machine, so the server-side code throws a file-not-found exception. How do I handle this issue?
UPDATE:
I removed the lines of code that check whether the file exists, and now I get the exception on FileStream fileStream = File.OpenRead(fileToUpload); saying that c:\windows\system32\inetsrv\20120605_133145.jpg could not be found.
Kindly help. Thank you.
if (this.fuAvatarUpload.HasFile && this.fuAvatarUpload.PostedFile.FileName.Length > 0)
{
HttpPostedFile file = this.fuAvatarUpload.PostedFile; // the uploaded file
string extension = Path.GetExtension(file.FileName).ToLower();
string mimetype;
switch (extension)
{
case ".png":
case ".jpg":
case ".gif":
mimetype = file.ContentType;
break;
default:
_model.ShowMessage("We only accept .png, .jpg, and .gif!");
return;
}
if (file.ContentLength / 1000 < 1000)
{
Image image = Image.FromStream(file.InputStream);
Bitmap resized = new Bitmap(image, 150, 150);
byte[] byteArr;
using (MemoryStream stream = new MemoryStream())
{
resized.Save(stream, System.Drawing.Imaging.ImageFormat.Png);
// Use the resized PNG bytes; re-reading file.InputStream here would overwrite them.
byteArr = stream.ToArray();
}
profile.ImageUrl = byteArr;
profile.UseGravatar = false;
profileService.UpdateProfile(profile);
this._model.ShowApprovePanel();
}
else
{
_model.ShowMessage("The file you uploaded is larger than the 1mb limit. Please reduce the size of your file and try again.");
}
}
Saving the file physically onto the server, and then working with that copy, helped me resolve my issue.

Retrieving images from servlet to Midlet(J2ME)

This is my current Midlet
if (display.getCurrent() == mainform) {
selectedparam = activity.getString(activity.getSelectedIndex());
url = "http://localhost:8080/TOMCATServer/RetrieveServlet?";
parameter = "activity=" + selectedparam;
System.out.println(url + parameter);
try {
hc = (HttpConnection) Connector.open(url + parameter);
hc.setRequestMethod(HttpConnection.POST);
hc.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
hc.setRequestProperty("User-Agent", "Profile/MIDP-2.0 Configuration/CLDC-1.0");
out = hc.openOutputStream();
byte[] postmsg = parameter.getBytes();
for (int i = 0; i < postmsg.length; i++) {
out.write(postmsg[i]);
}
out.flush();
in = hc.openInputStream();
StringBuffer b = new StringBuffer(); // accumulates the servlet's text response
int ch;
while ((ch = in.read()) != -1) {
b.append((char) ch);
}
String result = b.toString().trim();
System.out.println(result);
activitiesform.deleteAll();
activitiesform.append(result);
display.setCurrent(activitiesform);
} catch (Exception c) {
Alert alert = new Alert("Error", "The connection has failed. Please try again.", null, AlertType.ERROR);
display.setCurrent(alert);
}
}
And this is my current Servlet
public void doPost(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
PrintWriter out = response.getWriter();
String activityparam = request.getParameter("activity");
String[] sports = new String[5];
sports[0] = ("Football competition");
if (activityparam.equals("Sports")) {
out.println("These are the list of sporting activities \n");
for (int i = 0; i < 5; i++) {
out.println(sports[i]);
// Wanted to output the images of different sports here
}
}
}
What I want to achieve is for the servlet to send an image back to the MIDlet along with the string sports[i] after the query request is sent. Right now it only handles text, using PrintWriter. The image file is stored locally, so an absolute path should be fine. Please help me. Thanks.
I think you should go with two servlets: one for strings and another for images (one image at a time).
The MIDlet could retrieve an image with:
Image img = Image.createImage(in);
If you need to cache image data, you can use:
ByteArrayOutputStream baos = new ByteArrayOutputStream();
byte buff[] = new byte[1024];
int len = in.read(buff);
while (len > 0) {
baos.write(buff, 0, len);
len = in.read(buff);
}
baos.close();
buff = baos.toByteArray();
// write to RecordStore
Image img = Image.createImage(new ByteArrayInputStream(buff));
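On the servlet side, a minimal sketch of the dedicated image servlet might look like this (the file path, the PNG content type, and serving via doGet are assumptions; adjust them to where and how the images are actually stored):

// Hypothetical image servlet: streams one image file back to the MIDlet.
public void doGet(HttpServletRequest request, HttpServletResponse response)
        throws ServletException, IOException {
    File imageFile = new File("/opt/images/football.png"); // assumed path
    response.setContentType("image/png");
    response.setContentLength((int) imageFile.length());
    InputStream in = new FileInputStream(imageFile);
    try {
        OutputStream out = response.getOutputStream();
        byte[] buff = new byte[1024];
        int len;
        while ((len = in.read(buff)) > 0) {
            out.write(buff, 0, len);
        }
        out.flush();
    } finally {
        in.close();
    }
}

The MIDlet then opens a connection to this servlet and passes the resulting input stream straight to Image.createImage(in), as shown above.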
