I have generated a pixel-based image by encoding each input character as a certain color in the image. For example, for the input txt <- "ABACDAAFFEDDADFAFAED" I plotted 'A' as a red pixel, 'B' as purple, 'C' as blue and 'D' as some other color. I used R for it. Here is the answer I took help from:
Generate pixel based image in R from character array
Now I want to extend this to handle the case where a character appears two or three times consecutively, and to give those runs a different color. For example, for txt <- "ABBACDAABBBEDDADCACABBDB" I want:
A - red, AA - maroon, AAA - dark red;
B - green, BB - pink, BBB - yellow;
C - light brown, CC - brown, CCC - dark brown; and so on.
I still want to give one pixel to each character, but for two or three consecutive appearances, color those two or three pixels with the corresponding run color. I am unable to code a reasonable solution for this in R. Your help will be appreciated. Thanks.
I changed the function to support multi-character keys:
library(png)
library(tiff)
library(abind)
# function which plots the image
createImage <- function(txt, charToColorMap, destinationFile, format=c('png','tiff'), debugPlot=FALSE, unused.char='#'){
  if(nchar(unused.char) != 1){
    stop('unused.char must be a single character, and you should be sure that it will never be present in your text')
  }
  # helper function which finds all the divisors of a number
  divisors <- function(x){
    y <- seq_len(x)
    y[x %% y == 0]
  }
  # split the string into characters
  chars <- strsplit(txt,'')[[1]]
  # find the most "squared" rectangle that contains all the characters without padding
  d <- divisors(length(chars))
  y <- d[length(d) %/% 2]
  x <- length(chars) / y
  # RGBA values (0-1) for each colour in the map
  RGBAmx <- col2rgb(charToColorMap, alpha=TRUE) / 255
  colorIndexes <- match(chars, names(charToColorMap))
  ######################################
  # MULTIPLE CHAR
  ######################################
  # check if the color map contains multi-character names (e.g. 'AAA')
  multiple <- names(charToColorMap)[nchar(names(charToColorMap)) > 1]
  multiple <- multiple[order(nchar(multiple), decreasing=TRUE)]
  txtForMultiple <- txt
  for(m in multiple){
    idxs <- gregexpr(pattern=m, text=txtForMultiple, fixed=TRUE)[[1]]
    if(idxs[1] > 0){ # skip keys that are not present in the (remaining) text
      charRanges <- unlist(lapply(idxs, seq, length.out=nchar(m)))
      colorIndexes[charRanges] <- which(names(charToColorMap) == m)[1]
      tmp <- strsplit(txtForMultiple,'')[[1]]
      tmp[charRanges] <- unused.char
      txtForMultiple <- paste(tmp, collapse='')
    }
  }
  #########################################################
  # create an array with 4 matrices (planes), one for each RGBA channel
  colorIndexesR <- matrix(RGBAmx['red',   colorIndexes], nrow=y, ncol=x, byrow=TRUE)
  colorIndexesG <- matrix(RGBAmx['green', colorIndexes], nrow=y, ncol=x, byrow=TRUE)
  colorIndexesB <- matrix(RGBAmx['blue',  colorIndexes], nrow=y, ncol=x, byrow=TRUE)
  colorIndexesA <- matrix(RGBAmx['alpha', colorIndexes], nrow=y, ncol=x, byrow=TRUE)
  planes <- abind(colorIndexesR, colorIndexesG, colorIndexesB, colorIndexesA, along=3)
  # write the image in the requested format
  if(format[1] == 'png'){
    writePNG(planes, destinationFile)
  } else if(format[1] == 'tiff'){
    writeTIFF(planes, destinationFile)
  } else {
    stop('unsupported format')
  }
  # for debug purposes only, plot the image
  if(debugPlot){
    mx <- matrix(colorIndexes, nrow=y, ncol=x, byrow=TRUE)
    image(z=t(mx[nrow(mx):1,]), col=charToColorMap)
  }
  invisible()
}
Usage example ('AAA' set to white):
charToColorMap <- c(A='red',B='blue',C='green',D='black',E='yellow',F='orange',AAA='white')
txt <- "ABACAAAFFEDDADFAFAED"
# please note that unused.char will be used to mark the characters of txt already analyzed
# during the multi-char handling, so it must not be present in txt
createImage(txt,charToColorMap,destinationFile = "test.png",debugPlot=TRUE,unused.char='#')
Result (zoom 800%):
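For reference, a shorter way to detect the consecutive runs is rle(); the sketch below is not part of the answer above, and the runColorMap object is only illustrative:
# Illustrative sketch: map each character to a small palette indexed by run length
runColorMap <- list(A=c('red','maroon','darkred'), B=c('green','pink','yellow'))
chars <- strsplit("ABBACDAABBB", '')[[1]]
runs  <- rle(chars)
# one colour per run: pick the palette entry matching the run length (capped at the palette size)
runCols <- mapply(function(ch, len){
  pal <- runColorMap[[ch]]
  if(is.null(pal)) 'grey' else pal[min(len, length(pal))]
}, runs$values, runs$lengths)
# expand back to one colour per character, i.e. one colour per pixel
pixelCols <- rep(runCols, runs$lengths)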
I would like to remove the pixels that form a large cluster and keep only the small cluster to analyse (i.e. get the pixel count and locations). First I apply a filter to set to white all pixels that have a value lower than 0.66. Then I use the clump() function in R. The model works, but I cannot remove only the large cluster. I do not understand how the clump() function works.
Initial image:
Results image: plot_r is the image where the pixels with a value < 0.66 are changed to 0. plot_rc is the result after the clump() function. As observed, I cannot remove only the large cluster of pixels (at the top of the image plot_r). I changed the value (700 in the code) but it did not help; what should I do?
Here is the code:
library(magick)
library(pixmap)
library(raster)
library(igraph)
f <- "https://i.stack.imgur.com/2CjCh.jpg"
x <- image_read(f)
x <- image_convert(x, format = "pgm", depth = 8)
# Save the PGM file
f <- tempfile(fileext = ".pgm")
image_write(x, path = f, format = "pgm")
# Read in the PGM file
picture <- read.pnm(file = f, cellres = 1)
str(picture)
picture@size
mat <- picture@grey
mat[mat < 0.66] <- 0
##############################################################
##Remove clumps of pixels in R using package Raster and igraph
#Detect clumps (patches) of connected cells
r <- raster(mat)
rc <- clump(r)
#extract IDs of clumps according to some criteria
clump9 = data.frame(freq(rc))
#remove clump observations with frequency smaller/larger than N
clump9 = clump9[ ! clump9$count > 700, ]
# record IDs from clumps which met the criteria in previous step
clump9 = as.vector(clump9$value)
#replace cells with IDs which do not belong to the group of interest
rc[rc != clump9[1] & rc != clump9[2]] = NA
# converting rasterlayer to matrix
n <- as.matrix(r)
m <- as.matrix(rc)
Perhaps something like this
library(raster)
library(igraph)
Short-cutting your approach a bit
f <- "https://i.stack.imgur.com/2CjCh.jpg"
b <- brick(f)
x <- sum(b)
r <- x > 450
rc <- clump(r)
f <- freq(rc, useNA="no")
Replace the clumps with the number of cells they consist of, then set the larger ones (here, more than 100 cells) to NA, and use the result to mask the original raster:
rs <- subs(rc, data.frame(f))
rsc <- reclassify(rs, cbind(100,Inf,NA))
m <- mask(b, rsc)
plotRGB(m)
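If you also need the pixel count and locations of the clusters that remain (as the question asks), something along these lines should work with the objects above (a sketch, not tested on the actual image):
keep <- which(!is.na(values(rsc)))  # cell indices of the remaining (small) clumps
length(keep)                        # number of pixels kept
xyFromCell(rsc, keep)               # x/y coordinates of those pixels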
I'd like to merge two datasets based on a common column. Dataset A is a geoTIFF image, representing RGB values of an area. Dataset B is a point cloud with xyz values of the same area.
I want to merge the RGB info in the image with the 3D data. I thought of using the x and y coordinates of the two datasets (which are in the same coordinate system).
I wrote a script inspired by code snippets found on Stack Overflow, but I still need to put my whole code together (sources are 1, 2 and 3).
The issue is that the x and y coordinates in the two files have different precision (numbers of decimal digits). Dataset A has 0 to 2 digits; dataset B has many more. I rounded dataset B to 2 decimal digits. Now I'd like to pad with zeros when dataset A has fewer than 2 decimal digits, so that the final merge will hopefully work.
Would a simple if statement be fine considering that my dataset has >280000 rows? Or should I go for indexing? Anyway, I'm fairly new to R, so I hope posters would help me with a code example. Below is my code:
require(raster)
require(rgl)
setwd("C:/my/folder")
# Read tiff file
img <- stack("image.tif")
vals <- extract(img, 1:ncell(img))
coord <- xyFromCell(img, 1:ncell(img))
combine <- data.frame(coord, vals)  # data frame (not matrix) so that $x and $y work below
remove(vals)
remove(coord)
# read POINTCLOUD and assign names
lidar <- read.table("lidardata.txt")
names(lidar) <- c("x","y","z")
decimalplaces <- function(x) {
  if ((x %% 1) != 0) {
    nchar(strsplit(sub('0+$', '', as.character(x)), ".", fixed=TRUE)[[1]][[2]])
  } else {
    return(0)
  }
}
# HERE I SHOULD PAD THE LIDAR VARIABLE WITH ZEROS IN DECIMAL POSITIONS WHEN THE DIGITS ARE LESS THAN 2!!!
lidar$xy <- do.call(paste0,lidar[,1:2])
combine$x <- round(combine$x, digits = 2)
combine$y <- round(combine$y, digits = 2)
combine$xy <- do.call(paste0,combine[1:2])
finaldata <- merge(combine,lidar,by = 'xy', all = FALSE)
EDIT 1
As suggested by @Heroka, here is also an example of what the lidar data (dataset B) looks like, and how it should look after padding with zeros.
LIDAR (original)
x y z
12 9 87
11 23.4 100
LIDAR (altered, and with 'xy' column added for joining)
x y z xy
12.00 9.00 87 12.009.00
11.00 23.40 100 11.0023.40
EDIT 2
I somehow managed to retrieve the number of decimal digits of all x and y values in my 'lidar' variable (dataset B) with counting <- sapply(lidar$x, decimalplaces)
In the example above (LIDAR original), this would give [0 0] for the first (x) column and [0 1] for the second (y) column. I should be able to find each row in my x/y dataset with 0 or 1 decimal digits (rather than 2) and pad it with zeros as in LIDAR altered above.
I do not understand why you need to pad with zeros. If the coordinates are of class numeric and both were rounded using round() (which should avoid floating-point precision issues), you should be able to merge by them directly. Something like this:
lidar$x <- round(lidar$x, 2)
lidar$y <- round(lidar$y, 2)
combine$x <- round(combine$x, digits = 2)
combine$y <- round(combine$y, digits = 2)
finaldata <- merge(combine, lidar, by = c("x", "y") , all = FALSE)
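If a fixed-width string key is still wanted (as in the EDIT), formatting with sprintf() would be one option, since "%.2f" always prints two decimals; a sketch only (the "_" separator is my addition, to avoid ambiguous concatenations):
lidar$xy   <- paste0(sprintf("%.2f", lidar$x), "_", sprintf("%.2f", lidar$y))
combine$xy <- paste0(sprintf("%.2f", combine$x), "_", sprintf("%.2f", combine$y))
finaldata  <- merge(combine, lidar, by = "xy", all = FALSE)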
I have six files, each containing about 6 million entries, in space-delimited ASCII format. I am using the maptools package to read in the ASCII files (readAsciiGrid). Each entry represents a pixel in an image. I need to sum over each individual pixel position (data point 1 in table 1 + data point 1 in table 2 + ... + data point 1 in table 6). I have created a program that can pull and sum the i-th pixel in the image. I am, however, having issues figuring out how to write these summations to one ASCII file. Any ideas?
My code:
library(maptools)
#Variable Declaration
num <- 6210775
i <- 1
#Open the 6 Factor files
tablex <- data.frame(readAsciiGrid("E:/KOC/Satellite/Daytime/PCA_R_CART/PSPP_PCA_01.asc"))
tabley <- data.frame(readAsciiGrid("E:/KOC/Satellite/Daytime/PCA_R_CART/PSPP_PCA_02.asc"))
tablez <- data.frame(readAsciiGrid("E:/KOC/Satellite/Daytime/PCA_R_CART/PSPP_PCA_03.asc"))
tablea <- data.frame(readAsciiGrid("E:/KOC/Satellite/Daytime/PCA_R_CART/PSPP_PCA_04.asc"))
tableb <- data.frame(readAsciiGrid("E:/KOC/Satellite/Daytime/PCA_R_CART/PSPP_PCA_05.asc"))
tabled <- data.frame(readAsciiGrid("E:/KOC/Satellite/Daytime/PCA_R_CART/PSPP_PCA_06.asc"))
repeat{
  # Variable declaration for position within data frame
  x <- tablex[i,1]
  y <- tabley[i,1]
  z <- tablez[i,1]
  a <- tablea[i,1]
  b <- tableb[i,1]
  d <- tabled[i,1]
  # Adding up ALL six factors
  ALL <- x+y+z+a+b+d
  # Write to file--This is my issue...
  print(ALL)
  # Iterative variable
  i <- i+1
  # Condition to break if i is GT the number of preset lines
  if(i > num){
    break
  }
}
I haven't tested this, as you haven't provided sample data, but I think you can simplify and shorten your code considerably. In this version, you get rid of the repeat loop, do all the sums first, and then write to a file only once.
# Read the 6 factor files and store them in a list
tables = lapply(1:6, function(x) {
  # wrap in data.frame() so that tables[[x]][row, 1] returns a plain numeric value
  data.frame(readAsciiGrid(paste0("E:/KOC/Satellite/Daytime/PCA_R_CART/PSPP_PCA_0", x, ".asc")))
})
# Instead of hard-coding num, you can also do, for example: num=nrow(tables[[1]])
num = 6210775
# Function to sum one set of values
oneSum = function(row) {
  sum(sapply(1:length(tables), function(x) {
    tables[[x]][row,1]
  }))
}
# Run the oneSum function on every row of the ascii grids and store the results
# in allSums
allSums = sapply(1:num, oneSum)
# Write the data to a file
write.table(allSums, file="output.file.txt")
UPDATE: I changed the code to use sapply instead of lapply, which simplifies things a bit.
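With roughly 6 million rows per file, the per-row sapply() can still be slow; a fully vectorised sum over the tables list built above would be one alternative (a sketch, assuming all six grids have identical dimensions):
# sum the first data column of every grid at once instead of looping row by row
allSums = Reduce(`+`, lapply(tables, function(tb) tb[[1]]))
write.table(allSums, file="output.file.txt", row.names=FALSE, col.names=FALSE)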
In my problem there are subregions of a larger region that can be classified as positive or negative. I have several files with different classifications, in the following format:
start | end
10 | 20
60 | 120
178 | 220
They are sorted and contain only the positive subregions; the rest are assumed negative.
I would like to represent this data in a 2D graphic in R, but I don't know what type of graph I should use. It's something like this:
http://i.imgur.com/VaSvEKr.jpg
That kind of chart is called a "Gantt chart"; here's a possible way to draw it in base R:
# input example
DF <-
read.csv(text=
'"file","start","end"
"file1",10,20
"file1",60,120
"file1",178,220
"file2",10,20
"file2",25,100
"file2",130,140
"file2",190,210
"file3",0,50
"file3",55,400',stringsAsFactors=F)
minval <- min(DF$start) # or different if you know the limits
maxval <- max(DF$end) # or different if you know the limits
files <- rev(unique(DF$file))
nfiles <- length(files)
# empty plot to make space for everything
filehigh <- 1.0
plot(c(minval,maxval),c(filehigh/2,nfiles+filehigh/2),type='n', xlab='Time',ylab=NA,yaxt='n' )
# add y labels
axis(side=2,at=1:nfiles,labels=files,las=1)
# plot the rectangles
negcolor <- 'red'
poscolor <- 'green'
for(i in 1:nfiles){
  file <- files[i]
  subDF <- DF[DF$file == file,]
  lastend <- minval
  for(r in 1:nrow(subDF)){
    yTop <- i+(filehigh/2)
    yBottom <- i-(filehigh/2)
    start <- subDF[r,'start']
    end <- subDF[r,'end']
    if(start > lastend){
      rect(lastend,yBottom,start,yTop,col=negcolor)
    }
    rect(start,yBottom,end,yTop,col=poscolor)
    lastend <- end
  }
  if(lastend < maxval){
    rect(lastend,yBottom,maxval,yTop,col=negcolor)
  }
}
Result:
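For completeness, a ggplot2 sketch of the same idea (it draws only the positive intervals from the DF above; the red gap-filling is left out):
library(ggplot2)
# one row of green rectangles per file; the negatives are simply the white gaps here
DF$filenum <- as.numeric(factor(DF$file))
ggplot(DF) +
  geom_rect(aes(xmin=start, xmax=end, ymin=filenum-0.4, ymax=filenum+0.4),
            fill='green', colour='black') +
  scale_y_continuous(breaks=sort(unique(DF$filenum)), labels=levels(factor(DF$file))) +
  labs(x='Time', y=NULL)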
require("EBImage")
I essentially want to loop this set of commands in R for generating masks of images
a <- readImage('BF_Position004_time021.tif') # load rgb image
g <- channel(a, "asgreen") #take green channel from image
writeImage(g,"/Users/...path.../GFP_Position004_time021.tif") #save green channel image
r <- channel(a, "asred") #take red channel from image
writeImage(r,"/Users/...path.../RFP_Position004_time021.tif")#save red
b <- channel(r, "gray") #makes red channel into gray scale
#the following creates a white mask over pixels with high intensity
nmask2 = thresh(sqrt(b), 15, 15, .001)
# nmask2 = fillHull(nmask2)
mk3 = makeBrush(3, shape = 'diamond')
nmask3 = opening(nmask2, mk3)
nseg = bwlabel(nmask3)
nf = computeFeatures.shape(nseg)
nr = which(nf[,'s.area'] < 150)
nseg = rmObjects(nseg,nr) #resulting image called nseg
writeImage(nseg,"/Users/...path.../BF_Position004_time021.tif") #save nseg with the following name
I want to do this across multiple positions from 000 to 100 and for time points 001 to 100. I have annotated the above code for clarity.
Thanks for any and all help.
Put the commands above into a function that takes the common part of the file name and the directory you'd like to write to:
fun <- function(fl, dirpath) {
  a <- readImage(sprintf("BF_%s", fl))
  ...
  writeImage(g, file.path(dirpath, sprintf("GFP_%s", fl)))
  ...
}
Create a vector of file names
fls <- sprintf("Position%03d_time%03d.tif", rep(0:100, each=100), 1:100)
and go!
for (fl in fls)
  fun(fl, "/Users/...path...")
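If some position/time combinations do not exist on disk, a small guard avoids readImage() stopping the loop (a sketch, assuming the BF_ input files sit in the working directory):
for (fl in fls) {
  if (file.exists(sprintf("BF_%s", fl))) {
    fun(fl, "/Users/...path...")
  }
}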