Get component of Date / ISODate in mongo

How do I get a component, such as the minute, from an ISODate stored in a MongoDB collection?

Since you don't specify a language, I'm going to assume you mean JavaScript, as in the mongo shell.
One of the nice features of the shell is that it has tab completion, so you can do something like this:
> db.test.insert({x:new Date()});
> var doc = db.test.findOne();
> doc
{
"_id" : ObjectId("4fa131851932655dc45027a9"),
"x" : ISODate("2012-05-02T13:07:17.012Z")
}
> doc.x
ISODate("2012-05-02T13:07:17.012Z")
> doc.x.<TAB><TAB>
doc.x.constructor doc.x.getSeconds( doc.x.getUTCMinutes( doc.x.setHours( doc.x.setUTCHours( doc.x.toLocaleDateString(
doc.x.getDate( doc.x.getTime( doc.x.getUTCMonth( doc.x.setMilliseconds( doc.x.setUTCMilliseconds( doc.x.toLocaleString(
doc.x.getDay( doc.x.getTimezoneOffset( doc.x.getUTCSeconds( doc.x.setMinutes( doc.x.setUTCMinutes( doc.x.toLocaleTimeString(
doc.x.getFullYear( doc.x.getUTCDate( doc.x.getYear( doc.x.setMonth( doc.x.setUTCMonth( doc.x.toString(
doc.x.getHours( doc.x.getUTCDay( doc.x.hasOwnProperty( doc.x.setSeconds( doc.x.setUTCSeconds( doc.x.toTimeString(
doc.x.getMilliseconds( doc.x.getUTCFullYear( doc.x.propertyIsEnumerable( doc.x.setTime( doc.x.setYear( doc.x.toUTCString(
doc.x.getMinutes( doc.x.getUTCHours( doc.x.setDate( doc.x.setUTCDate( doc.x.toDateString( doc.x.tojson(
doc.x.getMonth( doc.x.getUTCMilliseconds( doc.x.setFullYear( doc.x.setUTCFullYear( doc.x.toGMTString( doc.x.valueOf(
What you want is probably one of the following. Note that getMonth() is zero-based (May is 4), and that the non-UTC getters use the shell's local timezone, which is why the 13:07 UTC timestamp reports 9 for getHours():
> doc.x.getSeconds();
17
> doc.x.getMinutes();
7
> doc.x.getHours();
9
> doc.x.getDate();
2
> doc.x.getMonth();
4
> doc.x.getFullYear();
2012

Related

Best R regex for intercepting a fragment in the middle of a string?

How do I construct a regex to detect the entire middle chunk of a string, so I can gsub it out? Here's an example of what I'm aiming to do.
MATCH: enter > shop.christopherspenn.com/test > convert
MATCH: enter > shop.christopherspenn.com/page5 > convert
MATCH: enter > shop.christopherspenn.com/ > convert
NO MATCH: enter > christopherspenn.com/test > convert
The goal is to find something like shop.christopherspenn.com/test > and be able to delete it from the string.
I've tried gsub("(shop.christopherspenn.com.*)/ > ","",string) as my call but it's not able to grab the appropriate chunks.
Thanks in advance for any advice!
You can use
string <- c("enter > shop.christopherspenn.com/test > convert","enter > shop.christopherspenn.com/page5 > convert","enter > shop.christopherspenn.com/ > convert","enter > christopherspenn.com/test > convert")
sub('\\bshop\\.christopherspenn\\.com[^>]*>\\s*', '', string)
See the online R demo and the regex demo. Output:
[1] "enter > convert"
[2] "enter > convert"
[3] "enter > convert"
[4] "enter > christopherspenn.com/test > convert"
Details:
\b - a word boundary
shop\.christopherspenn\.com - a shop.christopherspenn.com string
[^>]* - zero or more chars other than >
> - a > char
\s* - zero or more whitespace characters.
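If the step you are removing could occur more than once in a single string, gsub() with the same pattern strips every occurrence. A quick sketch (the second test string here is an invented example, not from the question):
string2 <- c("enter > shop.christopherspenn.com/test > convert",
             "enter > shop.christopherspenn.com/page5 > shop.christopherspenn.com/ > convert")
gsub('\\bshop\\.christopherspenn\\.com[^>]*>\\s*', '', string2)
[1] "enter > convert" "enter > convert"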

r mongolite - date query

Question
Using the mongolite package in R, how do you query a database for a given date?
Example Data
Consider a test collection with two entries
library(mongolite)
## create dummy data
df <- data.frame(id = c(1,2),
                 dte = as.POSIXct(c("2015-01-01","2015-01-02")))
> df
  id        dte
1  1 2015-01-01
2  2 2015-01-02
## insert into database
mong <- mongo(collection = "test", db = "test", url = "mongodb://localhost")
mong$insert(df)
Mongo shell query
To find the entries after a given date I would use
db.test.find({"dte" : {"$gt" : new ISODate("2015-01-01")}})
How can I reproduce this query in R using mongolite?
R attempts
So far I have tried
qry <- paste0('{"dte" : {"$gt" : new ISODate("2015-01-01")}}')
mong$find(qry)
Error: Invalid JSON object: {"dte" : {"$gt" : new ISODate("2015-01-01")}}
qry <- paste0('{"dte" : {"$gt" : "2015-01-01"}}')
mong$find(qry)
Imported 0 records. Simplifying into dataframe...
data frame with 0 columns and 0 rows
qry <- paste0('{"dte" : {"gt" : ', as.POSIXct("2015-01-01"), '}}')
mong$find(qry)
Error: Invalid JSON object: {"dte" : {"gt" : 2015-01-01}}
qry <- paste0('{"dte" : {"gt" : new ISODate("', as.POSIXct("2015-01-01"), '")}}')
mong$find(qry)
Error: Invalid JSON object: {"dte" : {"gt" : new ISODate("2015-01-01")}}
@user2754799 has the correct method, but I've made a couple of small changes so that it answers my question. If they want to edit their answer with this solution, I'll accept it.
d <- as.integer(as.POSIXct(strptime("2015-01-01","%Y-%m-%d"))) * 1000
## or more concisely
## d <- as.integer(as.POSIXct("2015-01-01")) * 1000
data <- mong$find(paste0('{"dte":{"$gt": { "$date" : { "$numberLong" : "', d, '" } } } }'))
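For reference, d is just the Unix epoch time in milliseconds, so you can sanity-check it by converting back (a sketch; the exact instant depends on the timezone your session used when building d):
as.POSIXct(d / 1000, origin = "1970-01-01", tz = "UTC")
## [1] "2015-01-01 UTC"  (when d was built in a UTC session)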
As this question keeps showing up at the top of my Google results whenever I forget (again) how to query dates in mongolite and am too lazy to go find the documentation:
the above Mongodb shell query,
db.test.find({"dte" : {"$gt" : new ISODate("2015-01-01")}})
now translates to
mong$find('{"dte":{"$gt":{"$date":"2015-01-01T00:00:00Z"}}}')
Optionally, you can include the milliseconds:
mong$find('{"dte":{"$gt":{"$date":"2015-01-01T00:00:00.000Z"}}}')
If you use the wrong datetime format, you get a helpful error message pointing you to the correct one: ISO 8601 format yyyy-mm-ddThh:mm plus a timezone, either "Z" or an offset such as "+0500".
Of course, this is also documented in the mongolite manual.
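Building on this, a closed date range can be written the same way (a sketch; mong and the dte field are the ones from the question):
mong$find('{"dte": {"$gte": {"$date": "2015-01-01T00:00:00Z"}, "$lt": {"$date": "2015-01-02T00:00:00Z"}}}')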
Try mattjmorris's answer from GitHub:
library(GetoptLong)
datemillis <- as.integer(as.POSIXct("2015-01-01")) * 1000
data <- data_collection$find(qq('{"createdAt":{"$gt": { "$date" : { "$numberLong" : "#{datemillis}" } } } }'))
reference: https://github.com/jeroenooms/mongolite/issues/5#issuecomment-160996514
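If you would rather not add GetoptLong just for string interpolation, sprintf() does the same job (a sketch; data_collection and the createdAt field are carried over from the snippet above, and the "%.0f" format keeps the millisecond value out of scientific notation):
datemillis <- as.integer(as.POSIXct("2015-01-01")) * 1000
data <- data_collection$find(sprintf('{"createdAt":{"$gt":{"$date":{"$numberLong":"%.0f"}}}}', datemillis))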
Before converting your date by multiplying it by 1000, run options(scipen=1000); without this workaround, the value can end up pasted in scientific notation for certain dates, which breaks the query string.

Error in lis[[i]] : attempt to select less than one element

This code is meant to compute the total distance of some given coordinates, but I don't know why it's not working.
The error is: Error in lis[[i]] : attempt to select less than one element.
Here is the code:
distant <- function(a, b)
{
    return(sqrt((a[1] - b[1])^2 + (a[2] - b[2])^2))
}
totdistance <- function(lis)
{
    totdis = 0
    for(i in 1:length(lis)-1)
    {
        totdis = totdis + distant(lis[[i]], lis[[i+1]])
    }
    totdis = totdis + distant(lis[[1]], lis[[length(lis)]])
    return(totdis)
}
liss1<-list()
liss1[[1]]<-c(12,12)
liss1[[2]]<-c(18,23)
liss1[[4]]<-c(29,25)
liss1[[5]]<-c(31,52)
liss1[[3]]<-c(24,21)
liss1[[6]]<-c(36,43)
liss1[[7]]<-c(37,14)
liss1[[8]]<-c(42,8)
liss1[[9]]<-c(51,47)
liss1[[10]]<-c(62,53)
liss1[[11]]<-c(63,19)
liss1[[12]]<-c(69,39)
liss1[[13]]<-c(81,7)
liss1[[14]]<-c(82,18)
liss1[[15]]<-c(83,40)
liss1[[16]]<-c(88,30)
Output:
> totdistance(liss1)
Error in lis[[i]] : attempt to select less than one element
> distant(liss1[[2]],liss1[[3]])
[1] 6.324555
Let me reproduce your error in a simple way:
> list1 = list()
> list1[[0]] = list(a = c("a"))
Error in list1[[0]] = list(a = c("a")) :
  attempt to select less than one element
So the next question is: where are you accessing index 0 of a list? (List indexing starts at 1 in R.)
As Molx indicated in a previous post, "The : operator is evaluated before the subtraction". This is what causes the 0-indexed list access.
For example:
> 1:10-1
[1] 0 1 2 3 4 5 6 7 8 9
> 1:(10-1)
[1] 1 2 3 4 5 6 7 8 9
So replace the loop in your code with:
for(i in 1:(length(lis)-1))
{
    totdis = totdis + distant(lis[[i]], lis[[i+1]])
}
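Putting it together, a corrected version of the function might look like this (a sketch; seq_len() is an alternative idiom that also behaves sensibly for a list of length 1):
totdistance <- function(lis)
{
    totdis <- 0
    for (i in seq_len(length(lis) - 1))
    {
        totdis <- totdis + distant(lis[[i]], lis[[i + 1]])
    }
    ## close the tour back to the first point, as in the original code
    totdis <- totdis + distant(lis[[1]], lis[[length(lis)]])
    return(totdis)
}
totdistance(liss1)  ## now returns a single total distance instead of erroring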

neography get actual node or node id from index

I am using the below to get nodes from an index:
neo.get_node_index('nodes_index', 'type', 'repo')
Which works fine. However, the data returned is a Hash object, as below:
> {"indexed"=>"http://localhost:7474/db/data/index/node/nodes_index/type/repo/12", "outgoing_relationships"=>"http://localhost:7474/db/data/node/12/relationships/out",
> "data"=>{"name"=>"irc-logs"},
> "traverse"=>"http://localhost:7474/db/data/node/12/traverse/{returnType}",
> "all_typed_relationships"=>"http://localhost:7474/db/data/node/12/relationships/all/{-list|&|types}",
> "property"=>"http://localhost:7474/db/data/node/12/properties/{key}",
> "self"=>"http://localhost:7474/db/data/node/12",
> "properties"=>"http://localhost:7474/db/data/node/12/properties",
> "outgoing_typed_relationships"=>"http://localhost:7474/db/data/node/12/relationships/out/{-list|&|types}",
> "incoming_relationships"=>"http://localhost:7474/db/data/node/12/relationships/in",
> "extensions"=>{},
> "create_relationship"=>"http://localhost:7474/db/data/node/12/relationships", "paged_traverse"=>"http://localhost:7474/db/data/node/12/paged/traverse/{returnType}{?pageSize,leaseTime}",
> "all_relationships"=>"http://localhost:7474/db/data/node/12/relationships/all",
> "incoming_typed_relationships"=>"http://localhost:7474/db/data/node/12/relationships/in/{-list|&|types}"}
I would like either the actual node object to be returned, or to be able to retrieve the id easily. By id, I mean the integer at the end of http://localhost:7474/db/data/node/12.
I could get it with a regex, but surely that isn't the best way?
You could use the 'Phase 2' API to find it, as below:
n = Neography::Node.find('nodes_index', 'type', 'repo')
n.neo_id # 12

Memory limit for running external executables within ASP.NET

I am using wkhtmltopdf in my C# web application (running on .NET 4.0) to generate PDFs from HTML files. In general everything works fine as long as the HTML file is below about 250 KB. Once the HTML file size grows beyond that, the process that runs wkhtmltopdf.exe throws the exception shown below. In Task Manager I have seen that the Memory value for the wkhtmltopdf.exe process never rises above 40,096 K, which I believe is why the process is abandoned partway through.
How can I increase the memory limit for external executables? Is there any other way of solving this issue?
More info:
When I run the conversion directly from the command line, the PDF is generated fine, so it's unlikely to be a problem with wkhtmltopdf itself.
The error occurs on localhost; I have tried the same on the DEV server, with the same result.
EDIT:
More specific exception message: for the MainModule property of the Process object, the error says {"Only part of a ReadProcessMemory or WriteProcessMemory request was completed"}, with a NativeErrorCode value of 299.
Exception:
[Exception: Loading pages (1/6)
[============================================================] 100%
Counting pages (2/6)
[============================================================] Object 1 of 1
Resolving links (4/6)
[============================================================] Object 1 of 1
Loading headers and footers (5/6)
Printing pages (6/6)
... Page 27 of 49 [==================================>
(intermediate progress lines omitted; the output ends abruptly at page 27 of 49)
Code that I use:
var fileName = " - ";
var wkhtmlDir = ConfigurationManager.AppSettings[Constants.AppSettings.ExportToPdfExecutablePath];
var wkhtml = ConfigurationManager.AppSettings[Constants.AppSettings.ExportToPdfExecutablePath] + "\\wkhtmltopdf.exe";
var p = new Process();

string switches = "";
switches += "--print-media-type ";
switches += "--margin-top 10mm --margin-bottom 10mm --margin-right 5mm --margin-left 5mm ";
switches += "--page-size A4 ";
switches += "--disable-smart-shrinking ";

var startInfo = new ProcessStartInfo
{
    CreateNoWindow = true,
    FileName = wkhtml,
    Arguments = switches + " " + url + " " + fileName,
    UseShellExecute = false,
    RedirectStandardOutput = true,
    RedirectStandardError = true,
    RedirectStandardInput = true,
    WorkingDirectory = wkhtmlDir
};

p.StartInfo = startInfo;
p.Start();
Debugger Screenshot of WkHtmlToPdf.exe Process:
This is what you're looking for:
http://jobobjectwrapper.codeplex.com/
I couldn't find anything else that pertains to "increasing" the memory limit for a process, although I've heard of people limiting process memory with MaxWorkingSet; I believe that only applies to virtual memory after the application has used all it can.
A Job Object is a good place to start; it is simply a collection of processes that can be controlled as a group.
"With this library you can create job objects, create and assign a
process to the job, control process and job limits, and register for
the various process- and job-related notification events."
This might be of use too:
Calling wkhtmltopdf to generate PDF from HTML
Have a look at this to see whether the Threads Per Processor Limit may be coming into play here. It's a long shot (and I'm not aware of any memory limit imposed by IIS on external processes), but note this from the documentation:
Because this property defines the maximum number of ASP requests that
can execute simultaneously, this setting should remain at the default
value unless your ASP applications are making extended calls to
external components. In this case, you may increase the value of
Threads Per Processor Limit. Doing so allows the server to create more
threads to handle more concurrent requests.
