I'm generating a 32x32 heatmap (tiles, each smaller than a quarter inch by a quarter inch) in ggplot2 on my MacBook Pro; this is relatively simple stuff. However, the PDF output is huge (something like 7 MB), and when I load it in pdflatex, loading and changing pages in the document becomes very slow. What are my options? Is there a better way to save a PDF in R that plays nicely with ggplot2 and pdflatex?
A common source of PDFs that are way too big is specifying dimensions when saving, thinking you're working in pixels, when in fact the default is in inches.
Try changing either the units (in ggsave) or the sizes in pdf.
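A minimal sketch with the base pdf() device (the matrix here is an assumed stand-in for the 32x32 heatmap data; ggsave() takes the same width/height arguments plus an explicit units argument):

```r
# width/height for pdf() are in inches, not pixels
m <- matrix(runif(32 * 32), 32, 32)   # stand-in for the 32x32 heatmap values

f <- tempfile(fileext = ".pdf")
pdf(f, width = 4, height = 4)         # a 4 x 4 inch page, as intended
image(m, axes = FALSE)
dev.off()

# with ggplot2 (assuming it is installed), make the units explicit:
# ggsave("heatmap.pdf", width = 10, height = 10, units = "cm")
```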
Related
I have created a figure for my scientific work using ggplot2 (an R package for plotting data). It's a scatterplot that contains ~25,000 data points in a normal x-y plot. Each data point has a border and a colour fill. The output vector PDF is 1.3 MB in size. Now I would like to make some final adjustments regarding font size and text position and merge it with other panels into a bigger figure, which I normally do in Illustrator. So I add/embed the scatterplot into the rest of my figure, which loads all elements correctly. However, when I then simply save this file as .ai or .pdf, the output is more than ~30 MB. How is it possible that all elements are preserved in the original (small) PDF, but after Illustrator it is inflated so much? It is critical for me to keep the file size small.
I tried many things, including different PDF export options in Illustrator and macOS Preview's PDF file compression, but nothing worked. I even tried merging all those ~25,000 overlapping dots into one or at least a few shapes, but either Illustrator crashes in the process (Illustrator > Pathfinder unite/merge) or the resulting PDF shows erratic behaviour, i.e. becomes black/white in Word (Illustrator > Flatten Transparency). What am I missing here?
Any help is appreciated!
When saving, make sure you're not enabling Illustrator editing capabilities. Leaving Illustrator editing capabilities enabled will essentially cause a copy of the Illustrator file (as an AI version) to be written into the PDF that's being saved. This often causes the PDF to increase dramatically in size, especially for files with many vector or path elements.
I had the same issue. What worked for me was this:
Export as EPS instead of PDF from ggplot. You may need to pass device=cairo_ps as an option (I did).
In Adobe Illustrator, create a new document and select the Web option.
Combine all your figures into this new figure by dragging and dropping them there. Use "Embed" to embed those figures into the new one.
Make all changes you need
Save as PDF with default options (I used the preset "Smallest File Size (PDF 1.6)").
This preserved the small file size for me. I think the only thing that really matters here is using EPS instead of PDF when exporting from ggplot.
I am preparing a LaTeX document and a slide show for my Bayesian analysis results. Trace plots generated by the "coda" package in R are very large, in both file size (kilobytes) and loading time. When I scroll through the PDF files on a slow computer or an iPad, it takes quite a lot of time to load the pages that involve trace plots. Is there any way to "lighten" those plots so that the scrolling time decreases substantially (such as converting to another format without losing much detail)?
Note: I am using RStudio and knitr to produce the LaTeX documents.
For example, I generated a plot using the following code. If I export it to a single-page PDF document, the size of the PDF is 439 KB (compared to 7 KB for basic plots).
library(coda)
temp <- mcmc(matrix(rnorm(100000),ncol=1))
traceplot(temp)
I would recommend you dump the images not as PDF, but as PNG. If you ensure that the PNG has a high enough resolution, it will be hard to see the difference between the PDF and the PNG. The PNG will render much faster than the PDF, speeding up scrolling.
PDF has the advantage of scaling, but the disadvantage of having to render large amounts of vector data.
To keep the scalability, you can flatten and simplify the plot output (the curves are almost certainly split up into hundreds of minuscule straight lines). There should be tools out there that can do this (if needed, open the PDF in Illustrator and do it there).
But even with simplification you may eventually get beyond tolerable limits, and in that case rasterizing the plot is the way to go. PNG has been suggested as a format; TIFF would work as well. However, NEVER EVER use JPEG for plots; the quality becomes horrendously bad.
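A sketch of the rasterizing suggestion (the trace here is simulated rather than coda output, and the call is guarded because png() support depends on how R was built):

```r
set.seed(1)
trace <- cumsum(rnorm(1e5))            # stand-in for one MCMC trace

if (capabilities("png")) {
  f <- tempfile(fileext = ".png")
  png(f, width = 7, height = 3, units = "in", res = 300)
  plot(trace, type = "l", xlab = "Iteration", ylab = "Value")
  dev.off()
}
```

With knitr you can get the same effect per chunk via the chunk options dev = "png" and dpi = 300.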
While producing scatter plots of many points in R (using ggplot() for example), there might be many points that are behind the others and not visible at all. For instance, see the plot below:
This is a scatter plot of several hundred thousand points, but most of them are behind the other points. The problem is that when casting the output to a vector file (a PDF, for example), the invisible points make the file size very big and increase memory and CPU usage while viewing the file.
A simple solution is to export a bitmap picture (TIFF or PNG, for example), but that loses the vector quality and can be even larger in size. I tried some online PDF compressors, but the result was the same size as my original file.
Is there any good solution? For example, some way to filter out the points that are not visible, either while generating the plot or afterwards by editing the PDF file?
As a start you can do something like this:
set.seed(42)
x <- runif(1e6)
DF <- data.frame(x = x, y = x + rnorm(1e6, sd = 0.1))
plot(y ~ x, data = DF, pch = ".", cex = 4)
PDF size: 6334 KB
DF2 <- data.frame(x=round(DF$x,3),y=round(DF$y,3))
DF2 <- DF[!duplicated(DF2),]
nrow(DF2)
#[1] 373429
plot(y~x,data=DF2,pch=".",cex=4)
PDF size: 2373 KB
With the rounding you can control how many points you remove. You would only need to extend this to handle the different colours.
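A sketch of that extension (the colour column and its levels are made up for illustration): round the coordinates together with the colour, so only points identical in position and colour collapse into one.

```r
set.seed(42)
x <- runif(1e5)
DF <- data.frame(x   = x,
                 y   = x + rnorm(1e5, sd = 0.1),
                 col = sample(c("red", "blue"), 1e5, replace = TRUE))

# build the dedup key from rounded coordinates plus the colour column
key <- data.frame(x = round(DF$x, 3), y = round(DF$y, 3), col = DF$col)
DF2 <- DF[!duplicated(key), ]
```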
Simply saving the plot as a high-resolution PNG file will drastically cut the size while keeping the quality more than good enough. At least I've never had journals complain about any of the PNGs I sent them; just be sure to use > 600 dpi.
I think it can be done with some post-processing of the PDF file. On Linux, if I have to reduce a PDF, I would do
pdf2ps input.pdf output.ps
ps2pdf output.ps output.pdf
which for some reason works quite efficiently.
You can see some discussion at https://askubuntu.com/questions/113544/how-to-reduce-pdf-filesize.
First a caveat: I posted this question here on SuperUser, but it is clearly the wrong place to ask R questions. I recognize that it is not directly a programming question, but I believe it can be solved by changing how plots are produced (i.e. by coding appropriately). So I hope readers find this appropriate for the forum.
R plots usually consist entirely of vector graphics elements (i.e. points, lines, polygons, text). R permits you to save your figure (or copy-paste) in various formats including various raster formats, as a PDF, or as a Windows meta-file.
I usually save my images as PDFs and print them. This renders the images exactly as I intended them on paper, in the highest quality. I avoid raster formats (e.g. JPG, TIFF) for printing, as the quality is inevitably poorer and publishers prefer vector formats.
However, I need to make a large multi-page desktop-published document using Microsoft Word 2007, and therefore using PDFs is not an option. When I import my figures from metafiles, or copy and paste directly from R into Word, both the screen and print rendering of the image change slightly (e.g. polygons and their fills become slightly misaligned).
Given that I want to retain high vector quality (and not use raster formats), what can I do to make R vector graphics work with Word? (Of course Sweave and LaTeX would be nice, but again, not a realistic option).
Consider this example:
plot(c(1:100), c(1:100), pch=20)
## Copy and paste to Word 2007 as Windows metafile
## Print
## Quality is poorer (e.g. dot fills misaligned with borders)
pdf("printsPerfectly.pdf")
plot(c(1:100), c(1:100), pch=20)
dev.off()
## Now print PDF
## Quality is as expected
EDIT: Following suggestions by @John, I produced it as an EPS postscript file (see below) and inserted it as a picture into Word. Because ultimately it will be printed from a PDF created from Word, I converted it to a PDF using default Word 2007 settings, printed it on my HP LaserJet P1606dn laser printer, and then took a photograph to illustrate the issue of polygon borders and fills misaligning (image on left, below). I also produced it directly as a PDF from R using pdf(), printed that PDF, and took a photograph (image on right, below).
It may seem like small potatoes! But when you have gone to a lot of trouble to achieve high quality, it is disappointing to be thwarted at the end. In addition, although it is not really obvious here, the numerals are not as high-quality (left) as in the PDF (right), disregarding differences in focus in the photographs.
The accepted answer is not acceptable to me, since if one goes to the trouble of making a nice vector-based figure, the last thing one would like to do is rasterize it to a bitmap... Unless it's an incredibly complex graph that takes ages to render in vector format, or something like that, but for most graphs that's not the case.
The best solution is to export to Word directly in native Office vector format. I just made a new package, export, that allows one to do exactly that: it can export either graphs or statistical tables to Word and PowerPoint. See
https://cran.r-project.org/web/packages/export/index.html and, for a demo,
https://github.com/tomwenseleers/export
For example:
library(devtools)
devtools::install_github("tomwenseleers/export")
library(export)
?graph2ppt
?graph2doc
?table2ppt
?table2doc
## export of ggplot2 plot
library(ggplot2)
qplot(Sepal.Length, Petal.Length, data = iris, color = Species,
      size = Petal.Width, alpha = I(0.7))
# export to Word
graph2doc(file="ggplot2_plot.docx", width=7, height=5)
# export to Powerpoint
graph2ppt(file="ggplot2_plot.pptx", width=7, height=5)
You can also export to enhanced metafile using the function
graph2emf(file="ggplot2_plot.emf", width=7, height=5)
but the quality of the native Office format is better.
For final production you can also readily print it to PDF from PowerPoint if need be, and it will then stay nicely in vector format.
Your only option is to use high-resolution raster graphics. Once you're over 300 dpi it will be completely indistinguishable from printed vector output; it will just make larger files. Your copy-and-paste method comes in at 72 dpi and will look terrible. If you import from a file, you can set the resolution in the file and things will be much better. Fortunately, Office 2007 is supposed to handle PNG images, which have the best compression for typical graphs. Let's say you wanted the image 4" wide and 6" high...
png('printsGreat.png', width = 4, height = 6, units = 'in', res = 300)
plot(c(1:100), c(1:100), pch=20)
dev.off()
Also, Office 2007 is supposed to be able to handle EPS files, and R postscript files are EPS-compatible by default when you print one page.
postscript("printsPerfectly.eps", width = 4, height = 6, horizontal = FALSE, onefile = FALSE)
plot(c(1:100), c(1:100), pch=20)
dev.off()
But if you don't have luck with them, go back to the high-resolution image.
My preferred solution is to use the windows metafile device for plotting, e.g.:
win.metafile("mygraph.wmf")
print(gg1)
dev.off()
This produces a *.wmf file that can be copy-pasted into the word file.
The devEMF package seems to produce graphics that look nicer than the default wmf when pasted into PowerPoint.
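A guarded sketch, assuming devEMF is installed from CRAN (the plot itself is just a placeholder):

```r
# devEMF::emf() opens an enhanced-metafile device; the call is guarded so
# this is a no-op when the package is not installed
f <- tempfile(fileext = ".emf")
if (requireNamespace("devEMF", quietly = TRUE)) {
  devEMF::emf(f, width = 6, height = 4)
  plot(1:10, (1:10)^2, type = "b")
  dev.off()
}
```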
Since I tried to produce high-resolution PNGs in R and it didn't seem to work on my PC (if I set the resolution higher than, say, 300 dpi, R would produce an error like "cannot start png device"), what I did was save the figure using postscript() and then use GSview to convert the PS file into a PNG at 600 dpi. MS Word consumes the PNGs happily and the print quality seems to be perfect.
What @Tom Wenseleers said:
The current best answer above to me is not acceptable, since if one goes to the trouble of making a nice vector-based figure, the last thing one would like to do is rasterize it to a bitmap... Unless it's an incredibly complex graph that takes ages to render in vector format, or something like that, but for most graphs that's not the case.
For me, there is a new best answer to this question, since graph2ppt and graph2doc tend to move axis labels around (which apparently cannot be fixed; see https://github.com/davidgohel/rvg/blob/master/R/body_add_vg.R and "export::graph2office moves axis labels around").
I think .svg is the most appropriate vector format for publication graphics. The only drawback is that older versions of e.g. MS Word cannot handle it. In R, you could use the native graphics::svg device. However, I'd recommend using CairoSVG from the Cairo package, especially when you are working with non-native fonts (e.g. via the extrafont package), because in contrast to graphics::svg, Cairo::CairoSVG embeds fonts quite nicely (without relying on Ghostscript, if I am right).
If you are working with an older version of MS Word, you could use Inkscape (a free vector graphics editor) and convert your graph to .wmf, for example (which might be better than printing to .wmf directly, because R rasterizes points when exporting .wmf files).
An example:
## create plot
library(ggplot2)
library(extrafont)
# note: if you want to use fonts other than the standard ones - in this
# example "ChantillyLH" - you must register your fonts via
# font_import()               # run only once (type "y" in the console)
# and
# loadfonts(device = "win")   # run only once.
# Otherwise, the extrafont package is not needed.

beautiful_plot <-
  ggplot(data = iris, mapping = aes(x = Sepal.Length, y = Petal.Length)) +
  geom_point() +
  theme(text = element_text(size = 18,
                            family = "ChantillyLH"))

# export SVG
library(Cairo)
CairoSVG("My_Path/My_Plot.svg", width = 6, height = 6)
print(beautiful_plot)
dev.off()
# the resulting SVG file is in the "My_Path" folder.
In Inkscape, it looks like this:
Newer versions of Word can import vector graphics from SVG files. R 3.6.2 has built-in support for creating SVG files with the svg function - no extra packages needed.
Your example then becomes
svg("printsPerfectly.svg", width=4, height=4)
plot(c(1:100), c(1:100), pch=20)
dev.off()
Note that there is a known issue when you try to create PDF files from Word documents that contain embedded SVG files with thin lines. If you are using thin lines, e.g. with lwd=0.7 somewhere, you need to apply this workaround.
I am plotting some data in R using the following commands:
jj = ts(read.table("overlap.txt"))
pdf(file = "plot.pdf")
plot(jj, ylab="", main="")
dev.off()
The result looks like this:
The problem I have is that the PDF file I get is quite big (25 MB). Is there a way to reduce the file size? JPEG is not an option because I need a vector graphic.
Take a look at tools::compactPDF - you need to have either qpdf or Ghostscript installed, but it can make a huge difference to PDF file size.
If reading a PDF file from disk, there are 3 options for Ghostscript quality (gs_quality), as indicated in the R help file:
printer (300dpi)
ebook (150dpi)
screen (72dpi)
The default is none. For example to convert all PDFs in folder mypdfs/ to ebook quality, use the command
tools::compactPDF('mypdfs/', gs_quality='ebook')
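An end-to-end sketch (a simulated dense plot stands in for the real document; compactPDF() quietly does nothing when qpdf is not on the PATH, so this is safe to try anywhere):

```r
f <- tempfile(fileext = ".pdf")
set.seed(1)
pdf(f)
plot(rnorm(2e4), type = "l")   # a deliberately dense vector plot
dev.off()

before <- file.size(f)
tools::compactPDF(f)                          # uses qpdf, if available
# tools::compactPDF(f, gs_quality = "ebook")  # or Ghostscript at 150 dpi
```

compactPDF() only rewrites the file when the compacted version is meaningfully smaller, so the file never grows.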
You're drawing a LOT of lines or points. Vector image formats such as PDF, PS, EPS, and SVG maintain logical information about all of those points, lines, or other items, so complexity - which translates to size and drawing time - grows with the number of points. Generally, vector images are the best in a number of ways: most compact, best scaling, and highest-quality reproduction. But if the number of graphical elements becomes very large, it's often best to switch to a raster image format such as PNG. When you switch to raster, it's best to have a good idea what size image you want, both in pixels and in print measurements, in order to produce the best image.
For information from the other direction, too large a raster image, see this answer.
One way of reducing the file size is to reduce the number of values that you have. Assuming you have a dataframe called df:
# take a random sample of rows from the dataframe
sampleNo <- 10000
sampleData <- df[sample(nrow(df), sampleNo), ]
I think the only other alternative within R is to produce a non-vector format. Outside of R, you could use Acrobat Professional (which is not free) to optimize the PDF. This can reduce the file size enormously.
Which version of R are you using? In R 2.14.0, pdf() gained an argument compress to support compression. I'm not sure how much it will help you, but there are also other tools to compress PDF files, such as pdftk and qpdf. I have wrappers for both in the animation package, but you may want to use the command line directly.
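A quick way to see what the compress argument buys, using a simulated dense line plot as an assumed stand-in for the real data:

```r
f_on  <- tempfile(fileext = ".pdf")
f_off <- tempfile(fileext = ".pdf")
set.seed(1)
y <- rnorm(5e4)

pdf(f_on, compress = TRUE)    # the default since R 2.14.0
plot(y, type = "l")
dev.off()

pdf(f_off, compress = FALSE)  # uncompressed streams, for comparison
plot(y, type = "l")
dev.off()

file.size(f_on) < file.size(f_off)   # the compressed file is smaller
```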
Hard to tell without seeing what the plot looks like - could you post a screenshot?
I suspect it's a lot of very detailed lines, and most of the information probably isn't visible - lots of overlapping or very, very small detail. Try thinning your data in one dimension or another. I doubt you'll lose visible information.