I'm currently using the following Lua code to log nginx request details within OpenResty:
local json = require("cjson")
local data = {request={}, response={}}
local req = data["request"]
local resp = data["response"]
req["method"] = ngx.req.get_method()
req["uri"] = ngx.var.uri
local postArgs = ngx.req.get_post_args()
local postArgs = "\"" .. postArgs .. "\""
req["post_args"] = postArgs
resp["status"] = ngx.status
resp["time"] = ngx.now()
resp["body"] = ngx.var.response_body
ngx.log(ngx.CRIT, json.encode(data));
However, I run into the following error:
lua:22: attempt to concatenate local 'postArgs' (a table value)
I've tried different variations of this, but none work, because one value is a table and the other a string. Is there an easy way to do what I'm trying to achieve, enclosing a particular value in quotation marks before it's added to the table, as in this example?
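(A minimal sketch of one way around this, assuming cjson is acceptable for the nested value: ngx.req.get_post_args() returns a table, so it can either be stored directly and serialized by cjson, or encoded on its own first if a single quoted string is wanted:

ngx.req.read_body() -- must be called before get_post_args()
local postArgs = ngx.req.get_post_args()
-- option 1: store the table itself; cjson serializes it as a nested object
req["post_args"] = postArgs
-- option 2: encode the table on its own, so the final JSON carries it as
-- one quoted string value
req["post_args"] = json.encode(postArgs)
)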
I'm currently making a Roblox whitelist system and it's almost finished, but I need one more thing. I scripted it and it's not working (code below), and I haven't found anything that fixes what I have (script and screenshot of the error below). Thanks.
local key = 1
local HttpService = game:GetService("HttpService")
local r = HttpService:RequestAsync({
    Url = "https://MyWebsiteUrl.com/check.php?key="..key,
    Method = "GET"
})
local i = HttpService:JSONDecode(r.Body)
for n, v in pairs(i) do
    print(tostring(n)..", "..tostring(v))
end
I assume the website you are using to validate the key returns the response raw. If so, then:
local key = 1
local HttpService = game:GetService("HttpService")
-- GetAsync returns the raw response body as a string
local r = HttpService:GetAsync("https://MyWebsiteUrl.com/check.php?key="..key)
local response = HttpService:JSONDecode(r)
print(response)
I think this is because you tried to concatenate a string (the URL) with a number (the key variable). Try making the key a string.
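(A minimal sketch of that suggestion; worth noting, though, that Lua's .. operator coerces numbers to strings automatically, so the concatenation itself is unlikely to be the root cause:

local key = tostring(1) -- or simply: local key = "1"
local HttpService = game:GetService("HttpService")
local r = HttpService:GetAsync("https://MyWebsiteUrl.com/check.php?key=" .. key)
)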
I have a table in Cosmos DB where I would like to store some query results coming in from Azure Data Explorer.
In Data Explorer (using the Python SDK) I run a query to filter and aggregate my data, and I would like to push the query result into Cosmos DB.
I set up all the configuration and the connection, but when I run the command upsert_item or create_item
I get the following error:
azure.cosmos.exceptions.CosmosHttpResponseError: Status code: 400
{"odata.error":{"code":"InvalidInput","message":{"lang":"en-us","value":"Request url is invalid.\r\nActivityId: d8994113-504e-4198-9976-41316aaafb5f, documentdb-dotnet-sdk/2.11.0 Host/64-bit MicrosoftWindowsNT/6.2.9200.0\nRequestID:d8994113-504e-4198-9976-41316aaafb5f\n"}}}
This is how I configured azure-cosmos:
url = "url"
key = {"masterkey": "my-key"}
client = CosmosClient(url,key)
database_name = "database"
container_name = "table"
database = client.get_database_client(database_name)
container = database.get_container_client(container_name)
for i in df:
    container.upsert_item(i)
df is the result of my query from Azure Data Explorer.
I believe the error is due to how I am passing the body of the query to upsert_item.
Any advice about this?
UPDATE:
So I tried to follow the sample. I updated my Cosmos DB to the SQL API.
My current code now looks like this:
url = "url"
key = {"masterkey": "my-key"}
client = CosmosClient(url,key)
database_name = "database"
container_name = "table"
database = client.get_database_client(database_name)
container = database.get_container_client(container_name)
data = json.dumps(test)
data_dict = json.loads(data)
for i in data_dict:
    container.create_item(body=i)
I converted the data frame to JSON, but when I run the for loop I get this error:
AttributeError: 'str' object has no attribute 'get'
And I have no idea what I am doing wrong.
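(A note on that traceback: if data_dict is a dict, iterating over it yields its keys, which are strings, so each i handed to create_item would be a string rather than a document. A tiny illustration of the behavior:

# iterating over a dict yields its keys (strings), not the values
for i in {"name": "smith", "id": "1"}:
    print(type(i))  # <class 'str'>
)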
Thanks for your reply; I think we need more contact to debug this.
Here's my code, and it worked well:
from azure.cosmos import exceptions, CosmosClient, PartitionKey, ProxyConfiguration
import family
# Initialize the Cosmos client
endpoint = "https://cosmosdb_name.documents.azure.com:443/"
key = 'primary_key_in_keys_blade'
client = CosmosClient(endpoint, key)
database_name = 'AzureSampleFamilyDatabase'
database = client.create_database_if_not_exists(id=database_name)
# </create_database_if_not_exists>
# <create_container_if_not_exists>
container_name = 'FamilyContainer'
container = database.create_container_if_not_exists(
    id=container_name,
    partition_key=PartitionKey(path="/lastName"),
    offer_throughput=400
)

family_items_to_create = [
    {
        "id": "112233",
        "name": "andersen"
    },
    {
        "id": "455646",
        "name": "johnson"
    },
    {
        "id": "999999",
        "name": "smith"
    }
]
# <create_item>
for family_item in family_items_to_create:
    container.upsert_item(body=family_item)
Per your error, I think you need to check the data format of data_dict, or you can use my sample data for testing to rule out other problems. For my part, I don't know why you'd want to use json.loads() here.
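For what it's worth, a sketch of how a pandas DataFrame could be passed to the container instead, assuming df is the DataFrame coming back from Data Explorer (the id handling below is an assumption; Cosmos DB requires each item to carry a string "id"):

# to_dict(orient="records") yields one plain dict per row, which is the
# shape upsert_item expects
for row in df.to_dict(orient="records"):
    row["id"] = str(row["id"])  # assumption: the frame already has an id column
    container.upsert_item(body=row)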
I'm using bigrquery to upload data from a Google bucket to BigQuery, and I would like to set the maximum number of errors I will allow (the default is 0).
For that I'm using bq_perform_load:
bq_job <- bq_perform_load(
  bq_table(destination_project, destination_dataset, destination_table_temp),
  file_name,
  source_format = "CSV",
  nskip = "1",
  create_disposition = "CREATE_IF_NEEDED",
  write_disposition = "WRITE_TRUNCATE",
  fields = bq_fields,
  billing = destination_project
)
As part of the documentation, it looks like there is the ability to add additional arguments:
...
Additional arguments passed on to the underlying API call. snake_case names are automatically converted to camelCase.
but I'm not sure how to use it.
I've tried something like this:
bq_perform_load(
  bq_table(destination_project, destination_dataset, destination_table_temp),
  file_name,
  source_format = "CSV",
  nskip = "1",
  create_disposition = "CREATE_IF_NEEDED",
  write_disposition = "WRITE_TRUNCATE",
  fields = bq_fields,
  maxBadRecords = "500",
  billing = destination_project
)
but it didn't work.
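(One guess, going by the docs' note that snake_case names in ... are converted to camelCase: pass the option as max_bad_records, and as a number rather than a string. Both the spelling and the placement here are my assumptions, not something confirmed against the API:

bq_job <- bq_perform_load(
  bq_table(destination_project, destination_dataset, destination_table_temp),
  file_name,
  source_format = "CSV",
  nskip = 1,
  create_disposition = "CREATE_IF_NEEDED",
  write_disposition = "WRITE_TRUNCATE",
  fields = bq_fields,
  max_bad_records = 500L,  # assumed to be passed through ... as maxBadRecords
  billing = destination_project
)
)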
I'm currently trying to connect a Lua script with a GS WebApp. The connection is working, but due to my lack of knowledge in GScripting I'm not sure why it isn't saving my data correctly.
On the Lua side I'm just hard-coding a random name and a simple numerical userid.
local HttpService = game:GetService("HttpService")
local scriptID = scriptlink
local WebApp
local function updateSpreadSheet()
    local playerData = (scriptID .. "?userid=123&name:Jhon Smith")
    WebApp = HttpService:GetAsync(playerData)
end

do
    updateSpreadSheet()
end
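(An aside on this snippet: the query string uses name: rather than name=, and the space in the value is never URL-encoded; a sketch with both corrected, using Roblox's HttpService:UrlEncode:

local function updateSpreadSheet()
    -- use = for the parameter and percent-encode the value
    local playerData = scriptID .. "?userid=123&name=" .. HttpService:UrlEncode("Jhon Smith")
    WebApp = HttpService:GetAsync(playerData)
end
)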
On the Google Script side I'm only saving the data on the last row, adding the value of the userid and the name.
function doGet(e) {
  console.log(e)
  // console.log(f)
  callName(e.parameter.userid, e.parameter.name);
}

function callName(userid, name) {
  // Get the last row and add the name provided
  var sheet = SpreadsheetApp.getActiveSheet();
  sheet.getRange(sheet.getLastRow() + 1, 1).setValues([userid], [name]);
}
However, the only data the script is saving is the name, bypassing the userid for reasons I have yet to discover.
setValues() requires a 2D array, and the range dimensions should correspond to that array. The script is only getting a 1 x 1 range, and the setValues argument is not a 2D array. Fix the syntax or use appendRow:
sheet.getRange(sheet.getLastRow() + 1,1,1,2).setValues([[userid,name]]);
//or
sheet.appendRow([userid,name])
References:
appendRow
I am trying to run scollector from Bosun.
After I run scollector, it cannot show me the memory information, but the CPU information was right.
This is the config:
Host = "http://localhost:8070"
DisableSelf = true
Freq = 60
Filter = ["snmp-generic", "snmp-ifaces"]
[[SNMP]]
Community = "test"
Host = "name"
MIBs = [ "devicename"]
[Tags]
product = "fw"
[MIBs]
[MIBs.fw]
BaseOid = ".1.3.6.1.4.1.2620"
[[MIBs.fw.Metrics]]
Metric = "os.cpu"
Oid = ".1.6.7.2.4.0"
Unit = "percent"
RateType = "gauge"
[[MIBs.fw.Metrics]]
Metric = "os.mem.used"
Oid = ".1.6.7.4.5.0"
Unit = "bytes"
RateType = "gauge"
This is the log:
2016/11/07 17:24:42 error: interval.go:64: snmp-generic-name-fw: asn1: structure error: tags don't match (2 vs {class:0 tag:4 length:11 isCompound:false}) {optional:false explicit:false application:false defaultValue:<nil> tag:<nil> stringType:0 timeType:0 set:false omitEmpty:false} #2
2016/11/07 17:24:43 info: queue.go:90: {"metric":"os.cpu","timestamp":1478539482,"value":2,"tags":{"host":"name","product":"fw"}}
It looks to me like this is an issue converting data types. The error is from deep in the bowels of the asn1 library we are using but I think it boils down to: cpu is represented as an integer, while memory is a string.
Our snmp collector attempts to parse all values into a big.Int, but apparently the string value is not able to be coerced into that by our asn1 library.
Unfortunately I don't see a good way to make this work, except perhaps to look for an OID that returns an integer type. Without knowing what device you are using, that is as good an answer as I can offer, I'm afraid.
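If it helps to narrow that down, one way to see what type a given OID actually returns is to query it directly, e.g. with net-snmp's snmpget (community and host here mirror the config above; the OID is the BaseOid plus the metric's Oid suffix):

# shows the ASN.1 type (INTEGER, STRING, ...) alongside the value
snmpget -v2c -c test name .1.3.6.1.4.1.2620.1.6.7.4.5.0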