HTTPoison - insert body parameters in Elixir - http

I'm trying to make an HTTP request:
def getPage() do
  url = "http://myurl"
  body = '{
    "call": "MyCall",
    "app_key": "3347249693",
    "param": [
      {
        "page" : 1,
        "registres" : 100,
        "filter" : "N"
      }
    ]
  }'
  headers = [{"Content-type", "application/json"}]
  HTTPoison.post(url, body, headers, [])
end
This works well for me.
My question is: how can I insert variables into the request body?
For example:
def getPage(key, page, registers, filter) do
  url = "http://myurl"
  body = '{
    "call": "MyCall",
    "app_key": key,
    "param": [
      {
        "page" : page,
        "registres" : registers,
        "filter" : filter
      }
    ]
  }'
  headers = [{"Content-type", "application/json"}]
  HTTPoison.post(url, body, headers, [])
end
When I run it I get:
%HTTPoison.Response{body: "\nFatal error: Uncaught exception 'Exception' with message 'Invalid JSON object' in /myurl/www/myurl_app/api/lib/php-wsdl/class.phpwsdl.servers.php:...
Any suggestions?

You really should be using a JSON encoder like Poison for this.
url = "http://myurl"
body = Poison.encode!(%{
  "call": "MyCall",
  "app_key": key,
  "param": [
    %{
      "page": page,
      "registres": registers,
      "filter": filter
    }
  ]
})
headers = [{"Content-type", "application/json"}]
HTTPoison.post(url, body, headers, [])

You need to interpolate the values:
body = '{
  "call": "MyCall",
  "app_key": "#{key}",
  "param": [
    {
      "page" : #{page},
      "registres" : "#{registers}",
      "filter" : "#{filter}"
    }
  ]
}'
If you use a JSON library (Poison is a popular choice), you could do something like this to turn Elixir data structures into a JSON representation:
body = %{
  call: "MyCall",
  app_key: key,
  param: [
    %{
      page: page,
      registres: registers,
      filter: filter
    }
  ]
} |> Poison.encode!()
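For completeness, a minimal sketch of calling the rewritten function and handling the HTTPoison result; the argument values are placeholders copied from the original hard-coded body:
# Hypothetical call; the key, page, page size and filter values are placeholders.
case getPage("3347249693", 1, 100, "N") do
  {:ok, %HTTPoison.Response{status_code: 200, body: body}} ->
    Poison.decode!(body)

  {:ok, %HTTPoison.Response{status_code: code}} ->
    {:error, {:unexpected_status, code}}

  {:error, %HTTPoison.Error{reason: reason}} ->
    {:error, reason}
end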

Related

Getting error "method not allowed" and Content-Type: text/plain

All my routes are working perfectly except CreateGoal. Whenever I add data using the POST method it gives me the message "method not allowed". When I checked the GET request headers, the Content-Type is application/json, but when I checked the POST request headers, the Content-Type is text/plain; charset=utf-8. So I think there must be a problem with the Content-Type, but I don't understand how to solve it. I have attached screenshots for reference.
Routes:
func Setup(app *fiber.App) {
    app.Get("/goals", controllers.GetGoals)
    app.Get("/goals/:id", controllers.GetGoal)
    app.Post("/goals/add", controllers.CreateGoal)
    app.Put("/goals/:id", controllers.UpdateGoal)
    app.Delete("/goals/:id", controllers.DeleteGoal)
}
Controllers:
import (
    "strconv"

    "github.com/gofiber/fiber/v2"
)

type Goal struct {
    Id     int    `json:"id"`
    Title  string `json:"title"`
    Status bool   `json:"status"`
}

var goals = []*Goal{
    {
        Id:     1,
        Title:  "Read about Promises",
        Status: true,
    },
    {
        Id:     2,
        Title:  "Read about Closures",
        Status: false,
    },
}

func GetGoals(c *fiber.Ctx) error {
    return c.Status(fiber.StatusOK).JSON(
        // "success": true,
        // "data": fiber.Map{
        //     "goals": goals,
        // },
        goals,
    )
}

func CreateGoal(c *fiber.Ctx) error {
    type Request struct {
        Title string `json:"title"`
    }

    var body Request
    err := c.BodyParser(&body)
    if err != nil {
        return c.Status(fiber.StatusBadRequest).JSON(fiber.Map{
            "success": false,
            "message": "Cannot parse JSON",
            "error":   err,
        })
    }

    goal := &Goal{
        Id:     len(goals) + 1,
        Title:  body.Title,
        Status: false,
    }
    goals = append(goals, goal)

    return c.Status(fiber.StatusCreated).JSON(fiber.Map{
        "success": true,
        "data": fiber.Map{
            "goal": goal,
        },
    })
}
Your application registers the POST handler at /goals/add, but in Postman you called /goals. In the application, /goals only accepts GET requests, which is why you get "method not allowed".
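As a quick check, a minimal sketch of a client request against the routes above; the port and the title value are assumptions. Posting to /goals/add with a Content-Type of application/json also lets Fiber's BodyParser pick the JSON decoder:
package main

import (
    "bytes"
    "fmt"
    "io"
    "net/http"
)

func main() {
    // Hypothetical payload; the title is a placeholder.
    payload := bytes.NewBufferString(`{"title": "Read about Goroutines"}`)

    // POST goes to /goals/add (not /goals) with a JSON Content-Type.
    resp, err := http.Post("http://localhost:3000/goals/add", "application/json", payload)
    if err != nil {
        panic(err)
    }
    defer resp.Body.Close()

    body, _ := io.ReadAll(resp.Body)
    fmt.Println(resp.Status, string(body))
}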

Insert date as epoch_seconds, output as formatted date

I have a set of timestamps formatted as seconds since the epoch. I'd like to insert them into Elasticsearch as epoch_second, but when querying I would like to see the output as a pretty date, e.g. strict_date_optional_time.
The mapping below preserves whatever format the input came in. Is there any way to normalize the output to just one format via the mapping API?
Current Mapping:
PUT example
{
  "mappings": {
    "time": {
      "properties": {
        "time_stamp": {
          "type": "date",
          "format": "strict_date_optional_time||epoch_second"
        }
      }
    }
  }
}
Example docs
POST example/time
{
  "time_stamp": "2018-03-18T00:00:00.000Z"
}

POST example/time
{
  "time_stamp": "1521389162" // Would like this to output as: 2018-03-18T16:05:50.000Z
}
GET example/_search output:
{
  "total": 2,
  "max_score": 1,
  "hits": [
    {
      "_source": {
        "time_stamp": "1521389162" // Stayed as epoch_second
      }
    },
    {
      "_source": {
        "time_stamp": "2018-03-18T00:00:00.000Z"
      }
    }
  ]
}
Elasticsearch differentiates between the _source and the so-called stored fields. The first one is supposed to represent your input.
If you actually use stored fields (by specifying store=true in your mapping) and specify multiple date formats, this is easy (emphasis mine):
Multiple formats can be specified by separating them with || as a separator. Each format will be tried in turn until a matching format is found. The first format will be used to convert the milliseconds-since-the-epoch value back into a string.
I have tested this with Elasticsearch 5.6.4 and it works fine:
PUT /test -d '{
  "mappings": {
    "doc": {
      "properties": {
        "post_date": {
          "type": "date",
          "format": "basic_date_time||epoch_millis",
          "store": true
        }
      }
    }
  }
}'

PUT /test/doc/2 -d '{
  "user": "test1",
  "post_date": "20150101T121030.000+01:00"
}'

PUT /test/doc/1 -d '{
  "user": "test2",
  "post_date": 1525167490500
}'
Note how two different input formats result in the same output format when using GET /test/_search?stored_fields=post_date&pretty=1:
{
  "hits" : [
    {
      "_index" : "test",
      "_type" : "doc",
      "_id" : "2",
      "_score" : 1.0,
      "fields" : {
        "post_date" : [
          "20150101T111030.000Z"
        ]
      }
    },
    {
      "_index" : "test",
      "_type" : "doc",
      "_id" : "1",
      "_score" : 1.0,
      "fields" : {
        "post_date" : [
          "20180501T093810.500Z"
        ]
      }
    }
  ]
}
If you want to change the input (in _source) you're out of luck: the mapping-transform feature has been removed:
This was deprecated in 2.0.0 because it made debugging very difficult. As of now there really isn’t a feature to use in its place other than transforming the document in the client application.
If, instead of changing the stored data, you are interested in formatting the output, have a look at this answer to "Format date in elasticsearch query (during retrieval)".
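As an illustration of that output-side approach, a sketch that assumes a more recent Elasticsearch version (6.4 or later), where docvalue_fields accepts a per-field format; the index and field names follow the example above:
GET /test/_search
{
  "docvalue_fields": [
    {
      "field": "post_date",
      "format": "strict_date_optional_time"
    }
  ]
}
The formatted value comes back under fields in each hit, while _source stays untouched.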

How can I verify that a map's values are not empty

Suppose I have a contract like this specified in Groovy:
org.springframework.cloud.contract.spec.Contract.make {
    request {
        method "GET"
        url "/api/profiles"
        headers {
            header('Accept': 'application/json;charset=UTF-8')
            header('Content-Type': 'application/json;charset=UTF-8')
        }
    }
    response {
        status 200
        headers {
            header('Content-Type': 'application/json;charset=UTF-8')
        }
        body(
            value(
                stub(
                    '''\
                    [
                        {
                            "profile": "profile1",
                            "myMap": {}
                        },
                        {
                            "profile": "profile2",
                            "myMap": {
                                "12345": "FOO",
                                "asdf": "BAR"
                            }
                        }
                    ]
                    '''
                ),
                test(
                    [
                        [
                            "profile": regex(nonEmpty()),
                            "myMap": [
                                [
                                    ??
                                ]
                            ]
                        ]
                    ]
                )
            )
        )
    }
}
Now I want to test that the map contains String-to-String entries where the values must not be empty. The map itself may be empty.
How can I test for dynamic key names?
On the response side of the contract you have to choose whether you're using the map notation or the string notation. If you want to do assertions on pieces of the response, you have to embed those assertions inside the body or use the test matchers.
You can put the body as a multiline string and then write the testMatchers section:
testMatchers {
    jsonPath('$.[*].myMap', byCommand('assertKeys($it)'))
}
Then it's enough for you to provide the assertion in the assertKeys method.
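A minimal sketch of such a helper, assuming it lives on the base class configured for the generated tests; the method name has to match the one referenced in byCommand, and the exact parameter type depends on what the jsonPath resolves to:
// Hypothetical helper on the test base class: every value in the (possibly
// empty) map must be a non-empty String keyed by a String.
void assertKeys(Object value) {
    if (value instanceof Map) {
        value.each { k, v ->
            assert k instanceof String
            assert v instanceof String
            assert !v.isEmpty()
        }
    } else if (value instanceof Iterable) {
        // '$.[*].myMap' may resolve to a list of maps
        value.each { assertKeys(it) }
    } else {
        assert false : "Expected a map of String to String, got: ${value}"
    }
}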

API Gateway and DynamoDB PutItem for String Set

I can't seem to find how to correctly call PutItem for a StringSet in DynamoDB through API Gateway. If I call it like I would for a List of Maps, then I get objects returned. Example data is below.
{
  "eventId": "Lorem",
  "eventName": "Lorem",
  "companies": [
    {
      "companyId": "Lorem",
      "companyName": "Lorem"
    }
  ],
  "eventTags": [
    "Lorem",
    "Lorem"
  ]
}
And my example template call for companies:
"companies" : {
"L": [
#foreach($elem in $inputRoot.companies) {
"M": {
"companyId": {
"S": "$elem.companyId"
},
"companyName": {
"S": "$elem.companyName"
}
}
} #if($foreach.hasNext),#end
#end
]
}
I've tried to call it with String Set listed, but it still errors out and tells me "Start of structure or map found where not expected" or that serialization failed.
"eventTags" : {
"SS": [
#foreach($elem in $inputRoot.eventTags) {
"S":"$elem"
} #if($foreach.hasNext),#end
#end
]
}
What is the proper way to call PutItem for converting an array of strings to a String Set?
If you are using the JavaScript AWS SDK, you can use the Document Client API (docClient.createSet) to store the SET data type.
docClient.createSet converts an array into the SET data type:
var docClient = new AWS.DynamoDB.DocumentClient();

var params = {
    TableName: table,
    Item: {
        "yearkey": year,
        "title": title,
        "product": docClient.createSet(['milk', 'veg'])
    }
};
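If the write has to go through the API Gateway mapping template instead, note that in DynamoDB JSON the SS type takes a plain array of strings with no inner "S" wrappers, which is what the "Start of structure or map found where not expected" error points at. A sketch based on the template in the question:
"eventTags" : {
    "SS": [
        #foreach($elem in $inputRoot.eventTags)
        "$elem"#if($foreach.hasNext),#end
        #end
    ]
}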

How to get the response of a data webscript in a Share webscript JS file

I have a data webscript on the Alfresco side which returns a JSON response.
I want to get this JSON response in a Share webscript so I can display that data in Share.
The following is my code, written in the getLocation.get.js file in Share.
var result1 = new Array();
var connector = remote.connect("alfresco");
var data = connector.get("/com/portfolio/ds/getlocation");

// create json object from data
if (data.status == 200) {
    var result = jsonUtils.toJSONString(eval(data.response));
    model.docprop = result;
} else {
    model.docprop = "Failed";
}
The following is the output from the Alfresco side:
{
  "subgroups": [
    {
      "name": "grp_pf_india_user",
      "label": "INDIA"
    },
    {
      "name": "grp_pf_israil_user",
      "label": "ISRAIL"
    },
    {
      "name": "grp_pf_usa_user",
      "label": "USA"
    }
  ]
}
Use this code to call repository webscripts from the Share side through the Share proxy (Alfresco.constants.PROXY_URI = http://host:port/share/proxy/alfresco/).
var xurl = Alfresco.constants.PROXY_URI + "HR-webscripts/createHRDocument/" + JSON.stringify(o);
//alert(xurl);

var request = $.ajax({
    url: xurl,
    type: "POST",
    //data: { "groupname" : o },
    beforeSend: function (xhr) {
        /*
        Alfresco.util.Ajax & alfresco/core/CoreXhr – will automatically take the token from the cookie and add it as a request header for every request.
        Alfresco.forms.Form – will automatically take the token from the cookie and add it as a url parameter to when submitting an multipart/form-data request.
        (When submitting a form as JSON the Alfresco.util.Ajax will be used internally)
        */
        if (Alfresco.util.CSRFPolicy && Alfresco.util.CSRFPolicy.isFilterEnabled()) {
            xhr.setRequestHeader(Alfresco.util.CSRFPolicy.getHeader(), Alfresco.util.CSRFPolicy.getToken());
        }
    },
    dataType: "html"
});

request.done(function (msg) {
    //alert( "Request OK: " + msg );
    $("#res").html(msg);
});

request.fail(function (jqXHR, textStatus) {
    alert("Request failed: " + textStatus);
});
