Convert jpg images to webp - wordpress

I'm trying to convert .jpg images to WebP format.
I have a site in WordPress and I need to serve WebP versions of my images. I read https://developers.google.com/speed/webp/ but it did not work for me.
I also tried https://wordpress.org/plugins/wp-webp/, but that did not work either.
And I tried this article, also with no luck: http://www.wpexplorer.com/webp-files-wordpress/
Can anyone help me?

apt-get install webp
apt-get install parallel
parallel --eta cwebp {} -o {.}.webp ::: *.jpg
(cwebp comes from the webp package, and the ::: syntax requires GNU parallel.)
This works with *.png too, on 100,000 images at a time if you want.
A single image can be converted with cwebp -q 100 image1.jpg -o image1.webp
Sorry, I got this from another thread, but the original missed a dot before webp.
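If GNU parallel is not available, a plain shell loop does the same conversion one file at a time (a minimal sketch; the -q 80 quality setting is just an example):
for f in *.jpg; do
  # strip the .jpg suffix and write name.webp next to the original
  cwebp -q 80 "$f" -o "${f%.jpg}.webp"
done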

To convert images the other way, from WebP format into JPG and PNG, you can use WebPconv 6.0 (Windows only). It's a free app.
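If you would rather stay on the command line (and cross-platform), the dwebp decoder from the same libwebp package handles the reverse direction; it writes PNG by default, and ImageMagick can take that to JPG (a minimal sketch):
# decode WebP to PNG, then convert the PNG to JPG with ImageMagick
dwebp image.webp -o image.png
convert image.png image.jpg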

How to get image width and height in pixels without loading file in memory

My question is very similar to this one, but I have thousands of images on disk and I want to read their width and height in pixels quickly, without loading each file into memory.
On my Linux machine, I could do something like this for each file:
path_to_file <- 'img/13600061.jpg'
# shell out to `file` and capture its one-line description of the image
system(sprintf("file %s", path_to_file), intern = TRUE)
But the output of file differs between jpg, jpeg and png files, so I would need to extract the pixel info differently depending on the file type. I was wondering if there is a general, fast solution out there already.
I think exiftool fits the bill nicely here. It runs on all platforms, is very controllable and, crucially, it can recurse on its own, so it doesn't incur the overhead of being started once per file.
As a rough first attempt, you'd want something like this to process PNGs and JPEGs, recursing down from the current directory (.):
exiftool -csv -ImageHeight -ImageWidth -r -ext jpg -ext jpeg -ext png .
Sample Output
SourceFile,ImageHeight,ImageWidth
black.png,80,80
blue.png,80,80
c.jpg,1,1
deskew/skew40.png,800,800
deskew/gradient.png,800,800
You may want to add -q to exclude the summary if you are parsing the output.
As a rough guide, the above command runs in 9 seconds on a directory containing 10,000 images on my Mac.
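If you go on to parse that CSV, a minimal sketch that prints every image wider than 2000 px could look like this (assuming none of your file names contain commas; the 2000 threshold is just an example):
# columns are SourceFile,ImageHeight,ImageWidth; NR > 1 skips the header row
exiftool -csv -q -ImageHeight -ImageWidth -r -ext jpg -ext jpeg -ext png . |
awk -F, 'NR > 1 && $3 > 2000 { print $1 }'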

Is there a way to crop multiple NetCDF files using CDO?

I have multiple global climate model (GCM) data files. I successfully cropped one file, but cropping over a thousand data files one by one takes too long. How can I crop them all in one go? Thanks
What you need is a loop combined with some patience... ;-)
for file in *.nc ; do
  cdo sellonlatbox,lon1,lon2,lat1,lat2 "$file" "${file%???}_crop.nc"
done
the %??? chops off the ".nc" and then I add "_crop" to the output file name...
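As a concrete example (the coordinates are placeholders for your region of interest), cropping a single file to 20°W-60°E, 40°S-40°N would be:
cdo sellonlatbox,-20,60,-40,40 input.nc input_crop.nc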
I know I am a little late to add to the answer, but I still wanted to add my knowledge for those who pass through this question later.
I followed the code given by Adrian Tompkins and it works exceptionally well, but there are some things to be considered which I'd like to highlight. Because I am still a novice at programming, please forgive my limited answer. Here are my findings for making the code above work:
1. The code calls CDO (Climate Data Operators), a non-GUI standalone program that can be used from a Linux terminal. In my case, I used it on Windows 10 through WSL (Ubuntu 20.04 LTS). There are good videos on YouTube about setting up WSL.
2. The code initially did not work for me until I made a slight change. The code
for file in *.nc ; do
  cdo sellonlatbox,lon1,lon2,lat1,lat2 "$file" "${file%???}_crop.nc"
done
worked for me when I wrote it as
for file in *.nc ; do
  cdo sellonlatbox,lon1,lon2,lat1,lat2 "$file" "${file%???}_crop.nc";
done
Note the ; at the end of the second line of the loop.
3. The entire script was put in a text file (it can be saved as a script in other formats as well) and executed in the Linux terminal. The procedure for executing the script file is as follows:
3.a) Create the .txt file containing the script above. Do note that the working directory should be checked in all steps.
3.b) Make the file executable by running chmod +x <name_of_the_textfile_with_extension> in the terminal.
3.c) Run the script by executing ./<name_of_the_textfile_with_extension>
This procedure will give you a cropped NetCDF file for each NetCDF file in the same folder.
cheers!
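As a side note, if the sequential loop is slow over thousands of files, GNU parallel (which also appears earlier on this page) can run the crops concurrently; a sketch, with the lon/lat placeholders as before ({.} strips the .nc extension, so output files are named like the loop above):
parallel cdo sellonlatbox,lon1,lon2,lat1,lat2 {} {.}_crop.nc ::: *.nc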

mencoder settings for PowerPoint

I have a series of jpg files that I would like to encode into video with mencoder. Currently I'm using the mpeg codec like so:
mencoder mf://*.jpg -mf fps=30:type=jpg -ovc lavc -lavcopts vcodec=mpeg4:vbitrate=8000 -o video.mp4
...but I've been having some problems with mp4 and PowerPoint. Can someone recommend an encoding format that plays nicely with PowerPoint and provide an example? ASF maybe? I am a newbie when it comes to video encoding.
I'm using mencoder version 4.4.6.
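For reference, if ASF/WMV is the route to try, the same command with the WMV2 encoder would look like this (an untested sketch; vcodec=wmv2 is one of libavcodec's encoders, and older PowerPoint versions on Windows tend to be happiest with WMV):
mencoder mf://*.jpg -mf fps=30:type=jpg -ovc lavc -lavcopts vcodec=wmv2:vbitrate=8000 -o video.wmv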

Converting PSD file to JPG using graphic magick

I'm trying to convert PSD files to JPG format using the following commands:
gm convert [input file name].psd -colorspace rgb -resize 150x150 -strip -quality 92 -sharpen 2 [output file name].jpg
gm convert -clip -negate [input file name].psd -thumbnail 150x150 [output file name].jpg
Both commands work fine, but for some PSD files with an RGB colorspace they do not generate a correct rendition.
Any suggestions?
Have you tried ImageMagick? It has a similar command line, and it can also extract the flattened composite of a PSD (the first sub-image), like this:
convert a.psd[0] b.jpg
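One small usage note: in a shell it is safer to quote the indexed filename so the brackets are not treated as a glob pattern:
convert 'a.psd[0]' b.jpg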
The PSD reader in GraphicsMagick is not very robust, and PSD is a very complex and poorly documented format. The best free implementation I have seen is in The GIMP.
I have just created a Node module that handles PSD files from the CLI,
for all the non-Windows users out there who don't want to crack Photoshop just to see a properly rendered file (GIMP is not really a solution and has a poor understanding of modern PSD files).
Instructions from the GitHub repo:
Install it:
npm install -g psd-cli
Convert myfile.psd to a new myfile.png:
psd myfile.psd
Hope you find this useful! Feature requests are most welcome; I have a ton in mind, and help with improving the code is appreciated :)
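If you need to convert a whole folder, a simple shell loop over the command shown above should do it (a sketch; assumes psd-cli writes each PNG next to its source file as described):
for f in *.psd; do
  psd "$f"
done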

wget or lynx not able to download some webpages due to frames contained in them

I want to download this webpage on UNIX: http://www.wordwebonline.com/search.pl?w=humane
I tried wget and lynx, but the page does not download. Instead, the following text appears in the result:
FRAME: [2]fr_top
FRAME: [3]fr_bottom
Your browser doesn't support frames: Click on the link below to
proceed to the
I also tried the wget -U Mozilla option, but got the same result.
How can I overcome this? How can I get the data within the frames using either wget or lynx, or any other command-line tool?
Not a SO question, but check out HTTrack.
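One wget-based approach (a sketch, untested against that site): wget follows frame sources during recursive retrieval, so fetching the page with one level of recursion should pull in the frame contents as well:
wget -r -l 1 -U Mozilla 'http://www.wordwebonline.com/search.pl?w=humane'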
