I am developing a Xamarin.Forms application with Syncfusion controls, and I couldn't come up with a solution for the JSON file below.
The page consists of 3 columns, and the column headers come from the label field (this is the same for all properties).
The cell content comes from the display field.
I can't do normal mapping here; a different perspective is required.
I would be glad if you could help me. Thank you.
{
"serviceItems": [
{
"id": "79471",
"ref": "7",
"properties": [
{
"label": "Text Field",
"display": "sample text",
"value": "sample text",
"accessibility": {
"visible": false,
"editable": false,
"required": false
},
"dataType": "string",
"objectType": 0
},
{
"label": "Datetime Field",
"display": "19.03.2020 12:14:00",
"value": "3/19/2020 12:14:00 PM",
"accessibility": {
"visible": false,
"editable": false,
"required": false
},
"dataType": "datetime",
"objectType": 0
},
{
"label": "Date Field",
"display": "19.03.2020",
"value": "3/19/2020 12:00:00 AM",
"accessibility": {
"visible": false,
"editable": false,
"required": false
},
"dataType": "date",
"objectType": 0
}
],
"swipeItems": [
{
"type": "action",
"icon": "approve",
"url": null,
"event": {
"eventId": "5",
"eventText": "Onayla",
"reasonRequired": "false",
"showHistory": false
}
},
{
"type": "action",
"icon": "reject",
"url": null,
"event": {
"eventId": "6",
"eventText": "Reddet",
"reasonRequired": "true",
"showHistory": false
}
},
{
"type": "delete",
"icon": "delete",
"url": null,
"event": null
}
],
"leftSideIcon": "green",
"rightSideIcon": "attachment",
"abortHistory": false
},
{
"id": "79597",
"ref": "7",
"properties": [
{
"label": "Text Field",
"display": "sample text",
"value": "sample text",
"accessibility": {
"visible": false,
"editable": false,
"required": false
},
"dataType": "string",
"objectType": 0
},
{
"label": "Date Field",
"display": "26.03.2020 19:00:36",
"value": "3/26/2020 7:00:36 PM",
"accessibility": {
"visible": false,
"editable": false,
"required": false
},
"dataType": "datetime",
"objectType": 0
},
{
"label": "Datetime Field",
"display": "26.03.2020 19:00:36",
"value": "3/26/2020 7:00:36 PM",
"accessibility": {
"visible": false,
"editable": false,
"required": false
},
"dataType": "date",
"objectType": 0
}
],
"swipeItems": [
{
"type": "action",
"icon": "reject",
"url": null,
"event": {
"eventId": "6",
"eventText": "Reddet",
"reasonRequired": "true",
"showHistory": false
}
},
{
"type": "edit",
"icon": "edit",
"url": null,
"event": null
},
{
"type": "action",
"icon": "approve",
"url": null,
"event": {
"eventId": "5",
"eventText": "Onayla",
"reasonRequired": "false",
"showHistory": false
}
}
],
"leftSideIcon": "red",
"rightSideIcon": "attachment",
"abortHistory": false
}
]
}
We would like to note that this is not the proper way to populate the JSON data in SfDataGrid. Since you have initialized Value for all three properties, the text is taken from the Value field. We have already published a KB article on populating JSON data in SfDataGrid without POCO classes.
KB link : https://www.syncfusion.com/kb/7828/how-to-load-sfdatagrid-dynamically-with-json-data-without-poco-classes
Regards,
Karthik Raja
The input file has 3 fields, each separated by a | (PIPE).
The first field is the key field and the file is sorted on it. Each key in the first field may occur once or twice.
If the same key exists twice in the first field, remove the line of
the first occurrence and keep the line of the second occurrence.
If a key occurs only once, keep the line.
The data in the third field is unique throughout the file.
I tried the command below, which keeps the first duplicate line and removes the rest of the duplicate lines. Is there an option in awk to remove the first matched duplicate line and keep the second matched line? A command other than awk is also fine. The input file can be up to 50 GB; I am currently testing on a 12 GB file.
awk -F'|' '!a[$1]++'
Input File Content:
1|xxx|{name: "xyz"}
2|xxx|{name: "abcfgs"}
3|xxx|{name: "egg"}
4|xxx|{name: "eggrgg"}
5|xxx|{name: "xsdsyzsgngn"}
5|xxx|{name: "gbgnfxyz"}
6|xxx|{name: "xyz"}
7|xxx|{name: "xynfnfnnnz"}
7|xxx|{name: "bvbv"}
8|xxx|{name: "xyz"}
9|xxx|{name: "xyz"}
....
Output expected after processing the input file:
1|xxx|{name: "xyz"}
2|xxx|{name: "abcfgs"}
3|xxx|{name: "egg"}
4|xxx|{name: "eggrgg"}
5|xxx|{name: "gbgnfxyz"}
6|xxx|{name: "xyz"}
7|xxx|{name: "bvbv"}
8|xxx|{name: "xyz"}
9|xxx|{name: "xyz"}
....
EDIT
I tried the solutions below, provided by @RavinderSingh13 and @RomanPerekhrest respectively.
For the 12 GB input file, the solution below took 1 minute 20 seconds on the first run and 1 minute 46 seconds on the second run:
awk '
BEGIN{
FS="|"
}
!a[$1]++{
b[++count]=$1
}
{
c[$1]=$0
}
END{
for(i=1;i<=count;i++){
print c[b[i]]
}
}
' Inputfile > testawk.txt
For the 12 GB input file, the solution below took 2 minutes 31 seconds on the first run, 4 minutes 43 seconds on the second run and 2 minutes on the third run:
awk -F'|' 'prev && $1 != prev{ print row }{ prev=$1; row=$0 }END{ print row }' Inputfile > testawk2.txt
Both solutions work as expected. I will use one of them after doing a few more performance tests.
Efficiently, with an awk expression:
awk -F'|' 'prev && $1 != prev{ print row }{ prev=$1; row=$0 }END{ print row }' file
The "magic" is based on capturing each current record (efficiently overwriting it without constant accumulation) and performing analysis on next row.
Sample output:
1|xxx|{name: "xyz"}
2|xxx|{name: "abcfgs"}
3|xxx|{name: "egg"}
4|xxx|{name: "eggrgg"}
5|xxx|{name: "gbgnfxyz"}
6|xxx|{name: "xyz"}
7|xxx|{name: "bvbv"}
8|xxx|{name: "xyz"}
9|xxx|{name: "xyz"}
1st solution: If you are not at all worried about the order of the lines in the output, simply do:
awk 'BEGIN{FS="|"} {a[$1]=$0} END{for(i in a){print a[i]}}' Input_file
2nd solution: Adding one more solution with awk and sort, in case you are worried about the order:
awk 'BEGIN{FS="|"} {a[$1]=$0} END{for(i in a){print a[i]}}' Input_file | sort -t'|' -k1
3rd solution: Could you please try the following, if the order of your output should be the same as in the Input_file:
awk '
BEGIN{
FS="|"
}
!a[$1]++{
b[++count]=$1
}
{
c[$1]=$0
}
END{
for(i=1;i<=count;i++){
print c[b[i]]
}
}
' Input_file
Output will be as follows.
1|xxx|{name: "xyz"}
2|xxx|{name: "abcfgs"}
3|xxx|{name: "egg"}
4|xxx|{name: "eggrgg"}
5|xxx|{name: "gbgnfxyz"}
6|xxx|{name: "xyz"}
7|xxx|{name: "bvbv"}
8|xxx|{name: "xyz"}
9|xxx|{name: "xyz"}
This one-liner will remove only the first duplicate (the 2nd occurrence) of each key from your file.
awk 'a[$1]++ !=1' file
Let's see an example:
kent$ cat f
1
2
3
2 <- should be removed
4
3 <- should be removed
5
6
7
8
9
2 <- should be kept
3 <- should be kept
10
kent$ awk 'a[$1]++ !=1' f
1
2
3
4
5
6
7
8
9
2
3
10
Reverse the file, then do a stable unique sort:
cat <<EOF |
1|xxx|{name: "xyz"}
2|xxx|{name: "abcfgs"}
3|xxx|{name: "egg"}
4|xxx|{name: "eggrgg"}
5|xxx|{name: "xsdsyzsgngn"}
5|xxx|{name: "gbgnfxyz"}
6|xxx|{name: "xyz"}
7|xxx|{name: "xynfnfnnnz"}
7|xxx|{name: "bvbv"}
8|xxx|{name: "xyz"}
9|xxx|{name: "xyz"}
EOF
tac | sort -s -t'|' -k1,1 -u
would output:
1|xxx|{name: "xyz"}
2|xxx|{name: "abcfgs"}
3|xxx|{name: "egg"}
4|xxx|{name: "eggrgg"}
5|xxx|{name: "gbgnfxyz"}
6|xxx|{name: "xyz"}
7|xxx|{name: "bvbv"}
8|xxx|{name: "xyz"}
9|xxx|{name: "xyz"}
tac is a GNU utility. Because your file is big, pass the filename to tac directly so it can read the file from the back, and use the -T, --temporary-directory=DIR option of sort to let it sort such big files (or don't, if you have enough RAM).
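For example, a minimal sketch of such an invocation (the temporary directory path and output name are placeholders):
tac Inputfile | sort -s -t'|' -k1,1 -u -T /mnt/bigdisk/tmp > deduped.txt
Because the input is reversed, the second occurrence of each key reaches sort first, and the stable unique sort (-s with -u) keeps exactly that one.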
Below is the sample content of the hb_20190930103450.log file
<------some lines------->
[2019-09-30 19:55:59] [MERGE] : ####### BEGIN - claim_response - '2016-01-15' - #######
<------some lines--------->
[2019-09-30 20:17:11] [MERGE] : ####### BEGIN - compound_ingred - '2016-01-15' - #######
<-------some lines---------->
Here $1 is [2019-09-30, $2 is 20:17:11], and $8 is compound_ingred.
I am using the command below on a bunch of similar files (hb_2019*.log, like hb_20190930103450.log) to pick the lines that include BEGIN, print $1, $2 and $8 from each such line along with the filename, and add
them to the sdate.txt file.
awk '/BEGIN/ {print FILENAME,$1,$2,$8}' hb_2019*.log > sdate.txt
Below is the outcome of the above command, which lists the matching lines from all the log files:
hb_20190927121800.log [2019-09-27 20:45:56] ser_message1
hb_20190927121800.log [2019-09-27 20:45:58] claim_response
hb_20190927121800.log [2019-09-27 20:46:00] compound_ingred
hb_20190927121800.log [2019-09-27 20:47:36] pha_ree
hb_20190930103448.log [2019-09-29 10:34:48] ser_message1
hb_20190930103448.log [2019-09-29 11:58:22] claim_response
hb_20190930103448.log [2019-09-29 14:17:28] mcompound_ingred
hb_20190930103448.log [2019-09-29 15:05:48] pha_ree
hb_20190930103450.log [2019-09-30 19:11:25] ser_message1
hb_20190930103450.log [2019-09-30 19:55:59] claim_response
hb_20190930103450.log [2019-09-30 20:17:11] compound_ingred
hb_20190930103450.log [2019-09-30 20:17:13] pha_ree
Below is what I am trying, but with no luck:
awk '/BEGIN/ {print FILENAME,$1,$2,$8}' hb_2019*.log |sort|uniq > sdate.txt
Has anyone faced this and had success?
The expected output should look like below, with only the latest entry per table:
hb_20190930103450.log [2019-09-30 19:11:25] ser_message1
hb_20190930103450.log [2019-09-30 19:55:59] claim_response
hb_20190930103450.log [2019-09-30 20:17:11] compound_ingred
hb_20190930103450.log [2019-09-30 20:17:13] pha_ree
Pipe the output to this instead:
$ ... | sort -k4 -k2,3r | uniq -f3 | sort -k2,3
hb_20190930103448.log [2019-09-29 14:17:28] mcompound_ingred
hb_20190930103450.log [2019-09-30 19:11:25] ser_message1
hb_20190930103450.log [2019-09-30 19:55:59] claim_response
hb_20190930103450.log [2019-09-30 20:17:11] compound_ingred
hb_20190930103450.log [2019-09-30 20:17:13] pha_ree
Sort by name and timestamp (in descending order), pick the first entry per name with uniq, then sort again by time.
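For reference, here is the full pipeline including the producing awk stage from the question (a sketch; behavior is the same as the one-liner above):
# sort -k4 -k2,3r : group by table name ($4); within a table, newest timestamp first
# uniq -f3        : skip the first 3 fields when comparing, keeping the first (newest) line per table
# sort -k2,3      : restore chronological order for the final output
awk '/BEGIN/ {print FILENAME,$1,$2,$8}' hb_2019*.log | sort -k4 -k2,3r | uniq -f3 | sort -k2,3 > sdate.txt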
What I understand is that you have various keys (ser_message1, claim_response, compound_ingred, ...) for which you want the youngest entry per key across all files hb_2019*.log. We can do this easily by keeping track of the time per key. I will make the assumption that the full concatenated set of files is unordered in time:
$ awk '!/BEGIN/ { next }
{ key=$8; timestring=$1$2 }
(! (key in time)) || (timestring > time[key]) {
time[key]=timestring
msg[key] = FILENAME OFS $1 OFS $2 OFS $8
}
END { for(key in time) print msg[key] }
' hb_2019*.log | sort -k2,3
Another shot in the dark:
$ awk '
FNR==1 { b="" }
/BEGIN/ { b=b (b==""?"":ORS) FILENAME OFS $1 OFS $2 OFS $8 }
END { print b }' hb_2019*.log
Update: A special version for your sample data set; use the version above for your actual data (i.e. the version below uses fields FILENAME, $2, $3, $4 instead of FILENAME, $1, $2, $8):
$ awk 'FNR==1{b=""}{b=b (b==""?"":ORS) FILENAME OFS $2 OFS $3 OFS $4}END{print b}' hb_201909*
hb_20190930103450.log [2019-09-30 19:11:25] ser_message1
hb_20190930103450.log [2019-09-30 19:55:59] claim_response
hb_20190930103450.log [2019-09-30 20:17:11] compound_ingred
hb_20190930103450.log [2019-09-30 20:17:13] pha_ree
I am using Grails 2.5.6 and I am trying to configure SAML with the plugin.
I can reach the SAML login, but after logging in I get a blank page with a redirect loop.
BuildConfig:
dependencies{
/*...*/
compile('org.springframework.security.extensions:spring-security-saml2-core:1.0.2.RELEASE'){
export = false
}
compile('org.springframework.security:spring-security-core:3.2.9.RELEASE')
compile('org.springframework.security:spring-security-web:3.2.9.RELEASE')
}
plugins{
/*...*/
compile ":spring-security-core:2.0.0"
compile ":spring-security-saml:2.0.0"
}
Config:
grails.plugin.springsecurity.userLookup.userDomainClassName = "de.streit.user.User"
grails.plugin.springsecurity.userLookup.authorityJoinClassName = "de.streit.security.UserRole"
grails.plugin.springsecurity.authority.className = "de.streit.security.Role"
grails.plugin.springsecurity.requestMap.className = 'de.streit.security.Requestmap'
grails.plugin.springsecurity.securityConfigType = 'Requestmap'
grails.plugin.springsecurity.authenticationFailureUrl = '/login/authfail?login_error=1'
// Define the authentication providers
grails.plugin.springsecurity.providerNames = ["samlAuthenticationProvider"]
grails.plugin.springsecurity.useSwitchUserFilter = true
//SAML
grails.plugin.springsecurity.saml.active = true
grails.plugin.springsecurity.saml.metadata.providers = [idp: 'security/idp.xml']
grails.plugin.springsecurity.saml.metadata.defaultIdp = 'idp'
grails.plugin.springsecurity.saml.metadata.sp.defaults = [
signingKey: 'estar',
encryptionKey: 'estar',
tlsKey: 'estar',
alias : 'http://localhost:8080/Organisationsportal'
]
SP.XML:
<?xml version="1.0" encoding="UTF-8"?>
<md:EntityDescriptor entityID="http://localhost:8080/Organisationsportal" xmlns:md="urn:oasis:names:tc:SAML:2.0:metadata">
<md:SPSSODescriptor AuthnRequestsSigned="true" WantAssertionsSigned="false" protocolSupportEnumeration="urn:oasis:names:tc:SAML:2.0:protocol">
<md:Extensions>
<idpdisco:DiscoveryResponse xmlns:idpdisco="urn:oasis:names:tc:SAML:profiles:SSO:idp-discovery-protocol" Binding="urn:oasis:names:tc:SAML:profiles:SSO:idp-discovery-protocol"
Location="http://localhost:8080/Organisationsportal/spring-security-saml/login/auth"/>
</md:Extensions>
<md:KeyDescriptor use="signing">
<ds:KeyInfo xmlns:ds="http://www.w3.org/2000/09/xmldsig#">
<ds:X509Data>
<ds:X509Certificate>
MIIC9jCCArSgAwIBAgIETo67pDALBgcqhkjOOAQDBQAwXjELMAkGA1UEBhMCVUsxEDAOBgNVBAgT
B1Vua25vd24xDzANBgNVBAcTBmxvbmRvbjENMAsGA1UEChMEYnVyYjENMAsGA1UECxMEYnVyYjEO
MAwGA1UEAxMFZmVyb3owHhcNMTExMDA3MDg0MzE2WhcNMTIwMTA1MDg0MzE2WjBeMQswCQYDVQQG
EwJVSzEQMA4GA1UECBMHVW5rbm93bjEPMA0GA1UEBxMGbG9uZG9uMQ0wCwYDVQQKEwRidXJiMQ0w
CwYDVQQLEwRidXJiMQ4wDAYDVQQDEwVmZXJvejCCAbgwggEsBgcqhkjOOAQBMIIBHwKBgQD9f1OB
HXUSKVLfSpwu7OTn9hG3UjzvRADDHj+AtlEmaUVdQCJR+1k9jVj6v8X1ujD2y5tVbNeBO4AdNG/y
ZmC3a5lQpaSfn+gEexAiwk+7qdf+t8Yb+DtX58aophUPBPuD9tPFHsMCNVQTWhaRMvZ1864rYdcq
7/IiAxmd0UgBxwIVAJdgUI8VIwvMspK5gqLrhAvwWBz1AoGBAPfhoIXWmz3ey7yrXDa4V7l5lK+7
+jrqgvlXTAs9B4JnUVlXjrrUWU/mcQcQgYC0SRZxI+hMKBYTt88JMozIpuE8FnqLVHyNKOCjrh4r
s6Z1kW6jfwv6ITVi8ftiegEkO8yk8b6oUZCJqIPf4VrlnwaSi2ZegHtVJWQBTDv+z0kqA4GFAAKB
gQDKBDz1DFPPmmWp9n1FskJOev7CnnVFsKji1NLUDdifvS+uW+cnvnDfD3yPdxzUeknCrPTBRp+B
IvYUvLQ57LMIuLgKQ12RujGl0Oz9JbFMAHuBV2I/7ZykzGQPysSEqKCqG+kDc8VZ4AfIf/S8YnQk
xqdWQ5jLTIzXvcWd0WEYbDALBgcqhkjOOAQDBQADLwAwLAIUGP/oZpi79ZM1793XzZvnmrnmz5gC
FBm4bDN8h/0hAa83jaD8joLr098I
</ds:X509Certificate>
</ds:X509Data>
</ds:KeyInfo>
</md:KeyDescriptor>
<md:KeyDescriptor use="encryption">
<ds:KeyInfo xmlns:ds="http://www.w3.org/2000/09/xmldsig#">
<ds:X509Data>
<ds:X509Certificate>
MIIC9jCCArSgAwIBAgIETo67pDALBgcqhkjOOAQDBQAwXjELMAkGA1UEBhMCVUsxEDAOBgNVBAgT
B1Vua25vd24xDzANBgNVBAcTBmxvbmRvbjENMAsGA1UEChMEYnVyYjENMAsGA1UECxMEYnVyYjEO
MAwGA1UEAxMFZmVyb3owHhcNMTExMDA3MDg0MzE2WhcNMTIwMTA1MDg0MzE2WjBeMQswCQYDVQQG
EwJVSzEQMA4GA1UECBMHVW5rbm93bjEPMA0GA1UEBxMGbG9uZG9uMQ0wCwYDVQQKEwRidXJiMQ0w
CwYDVQQLEwRidXJiMQ4wDAYDVQQDEwVmZXJvejCCAbgwggEsBgcqhkjOOAQBMIIBHwKBgQD9f1OB
HXUSKVLfSpwu7OTn9hG3UjzvRADDHj+AtlEmaUVdQCJR+1k9jVj6v8X1ujD2y5tVbNeBO4AdNG/y
ZmC3a5lQpaSfn+gEexAiwk+7qdf+t8Yb+DtX58aophUPBPuD9tPFHsMCNVQTWhaRMvZ1864rYdcq
7/IiAxmd0UgBxwIVAJdgUI8VIwvMspK5gqLrhAvwWBz1AoGBAPfhoIXWmz3ey7yrXDa4V7l5lK+7
+jrqgvlXTAs9B4JnUVlXjrrUWU/mcQcQgYC0SRZxI+hMKBYTt88JMozIpuE8FnqLVHyNKOCjrh4r
s6Z1kW6jfwv6ITVi8ftiegEkO8yk8b6oUZCJqIPf4VrlnwaSi2ZegHtVJWQBTDv+z0kqA4GFAAKB
gQDKBDz1DFPPmmWp9n1FskJOev7CnnVFsKji1NLUDdifvS+uW+cnvnDfD3yPdxzUeknCrPTBRp+B
IvYUvLQ57LMIuLgKQ12RujGl0Oz9JbFMAHuBV2I/7ZykzGQPysSEqKCqG+kDc8VZ4AfIf/S8YnQk
xqdWQ5jLTIzXvcWd0WEYbDALBgcqhkjOOAQDBQADLwAwLAIUGP/oZpi79ZM1793XzZvnmrnmz5gC
FBm4bDN8h/0hAa83jaD8joLr098I
</ds:X509Certificate>
</ds:X509Data>
</ds:KeyInfo>
</md:KeyDescriptor>
<md:SingleLogoutService Binding="urn:oasis:names:tc:SAML:2.0:bindings:HTTP-POST" Location="http://localhost:8080/Organisationsportal/spring-security-saml/saml/SingleLogout"/>
<md:SingleLogoutService Binding="urn:oasis:names:tc:SAML:2.0:bindings:HTTP-Redirect" Location="http://localhost:8080/Organisationsportal/spring-security-saml/saml/SingleLogout"/>
<md:SingleLogoutService Binding="urn:oasis:names:tc:SAML:2.0:bindings:SOAP" Location="http://localhost:8080/Organisationsportal/spring-security-saml/saml/SingleLogout"/>
<md:NameIDFormat>urn:oasis:names:tc:SAML:1.1:nameid-format:emailAddress</md:NameIDFormat>
<md:NameIDFormat>urn:oasis:names:tc:SAML:2.0:nameid-format:transient</md:NameIDFormat>
<md:NameIDFormat>urn:oasis:names:tc:SAML:2.0:nameid-format:persistent</md:NameIDFormat>
<md:NameIDFormat>urn:oasis:names:tc:SAML:1.1:nameid-format:unspecified</md:NameIDFormat>
<md:NameIDFormat>urn:oasis:names:tc:SAML:1.1:nameid-format:X509SubjectName</md:NameIDFormat>
<md:AssertionConsumerService Binding="urn:oasis:names:tc:SAML:2.0:bindings:HTTP-POST" Location="http://localhost:8080/Organisationsportal/spring-security-saml/saml/SSO" index="0" isDefault="true"/>
<md:AssertionConsumerService Binding="urn:oasis:names:tc:SAML:2.0:bindings:HTTP-Artifact" Location="http://localhost:8080/Organisationsportal/spring-security-saml/saml/SSO" index="1" isDefault="false"/>
<md:AssertionConsumerService Binding="urn:oasis:names:tc:SAML:2.0:bindings:PAOS" Location="http://localhost:8080/Organisationsportal/spring-security-saml/saml/SSO" index="2" isDefault="false"/>
</md:SPSSODescriptor>
</md:EntityDescriptor>
idp.xml:
<?xml version="1.0"?>
<md:EntityDescriptor xmlns:md="urn:oasis:names:tc:SAML:2.0:metadata" entityID="http://localhost:8080/Organisationsportal" cacheDuration="PT1440M" ID="XpK4KzotwbSFUKx.-NtBzfGDWti">
<md:IDPSSODescriptor protocolSupportEnumeration="urn:oasis:names:tc:SAML:2.0:protocol">
<md:KeyDescriptor use="signing">
<ds:KeyInfo xmlns:ds="http://www.w3.org/2000/09/xmldsig#">
<ds:X509Data>
<ds:X509Certificate>
MIICRTCCAa6gAwIBAgIGAR0gYMbwMA0GCSqGSIb3DQEBBQUAMGYxCzAJBgNVBAYTAlVTMQswCQYD
VQQIEwJDTzEPMA0GA1UEBxMGRGVudmVyMQwwCgYDVQQKEwNEZXYxDTALBgNVBAsTBFBpbmcxHDAa
BgNVBAMTE0NvbmZpZyBTaWduaW5nIENlcnQwHhcNMDgxMDIxMTcwODEyWhcNMTMxMDIwMTcwODEy
WjBmMQswCQYDVQQGEwJVUzELMAkGA1UECBMCQ08xDzANBgNVBAcTBkRlbnZlcjEMMAoGA1UEChMD
RGV2MQ0wCwYDVQQLEwRQaW5nMRwwGgYDVQQDExNDb25maWcgU2lnbmluZyBDZXJ0MIGfMA0GCSqG
SIb3DQEBAQUAA4GNADCBiQKBgQDQeOdW6I2hyXCQn0X/+8/BzLfRfdy1kN54lmVauYEpaPHQo7by
gPPRPUTDC3LgJGfk4NWkPaM+EOeLzuVw9rbD3gjfsex6hUElkvUzPqXqNN3sq/2hm+FJup+GakE9
WCoEP5sGvlJshH00a4MSzjGTBBqqjsXaWDZ7Sy9UAGw5BQIDAQABMA0GCSqGSIb3DQEBBQUAA4GB
AKSNMImzVs7L+tfortt7RBFMzc/JLE8qnulY32FrWA3ZLrD+08EBeIp1iwdJ8AGpii3SFV3oV3xu
92Qy2WqsBwj1erYdKW5mrfAbThkwL5N7jRsjJyXnIcx3IBvRD+O+LIDHck0cSgmN14ghleeslx0Q
15kyBdoxbv6pR0k4xOaF
</ds:X509Certificate>
</ds:X509Data>
</ds:KeyInfo>
</md:KeyDescriptor>
<md:SingleSignOnService Location="*1" Binding="urn:oasis:names:tc:SAML:2.0:bindings:HTTP-Redirect"/>
<md:SingleLogoutService Location="*1" Binding="urn:oasis:names:tc:SAML:2.0:bindings:HTTP-Redirect"/>
<md:NameIDFormat>urn:oasis:names:tc:SAML:2.0:nameid-format:transient</md:NameIDFormat>
</md:IDPSSODescriptor>
</md:EntityDescriptor>
*1: I removed this location because it is internal to my company.
I don't know what I'm missing.
I defined a bean in the resources for the userDetailsService, but Spring doesn't know that I'm logged in.
Thanks
Marvin Thör
I was able to solve the problems I had.
I'll write up my solution here.
resources.groovy
userDetailsService(OwnSpringSamlUserDetailsService){
samlUserService = ref("samlUserService") // this is our own service
grailsApplication = ref("grailsApplication")
}
springSecurityService(OwnSpringSecurityService){
config = SpringSecurityUtils.securityConfig
authenticationTrustResolver = ref('authenticationTrustResolver')
grailsApplication = ref('grailsApplication')
passwordEncoder = ref('passwordEncoder')
objectDefinitionSource = ref('objectDefinitionSource')
userDetailsService = ref('userDetailsService')
userCache = ref('userCache')
}
The problem here is that the SAML plugin's Spring service overrides the getCurrentUser method.
BuildConfig.groovy
dependencies{
//SAML
compile('org.springframework.security.extensions:spring-security-saml2-core:1.0.2.RELEASE'){
export = false
}
compile('org.springframework.security:spring-security-core:3.2.9.RELEASE')
compile('org.springframework.security:spring-security-web:3.2.9.RELEASE')
}
plugins{
compile ":spring-security-core:2.0.0"
compile ":spring-security-saml:2.0.0"
}
The SAML plugin used an old version of Spring Security core, so I included version 3.2.9 to work with it.
Config.groovy
grails.plugin.springsecurity.logout.filterProcessesUrl = "/saml/SingleLogout"
// Define the authentication providers
grails.plugin.springsecurity.providerNames = ["samlAuthenticationProvider"]
//SAML
grails.plugin.springsecurity.saml.metadata.sp.defaults = [
alias : 'localhost:dev:YOUR-APPNAME',
entityBaseURL: 'http://localhost:8080/YOUR-APPNAME'
]
grails.plugin.springsecurity.saml.metadata.url = "YOUR-METADATA-URL"
grails.plugin.springsecurity.saml.metadata.providers = ['ping': 'security/idp.xml']
You have to set the alias for the sp.xml. For me, URLs did not work as the alias.
UrlMapping.groovy
//SAML
"/saml/logout"(controller: 'logout', action: 'index')
I used this URL mapping for the logout.
For the sp.xml I used the generated XML, but I changed the entityID to:
localhost:dev:YOUR-APPNAME
Hi, I have a big log file from which I am trying to extract XML data.
The file resembles this:
2016/01/01 bladh bqskjdqskldjqsdlqskdjqlskdj dazihzmkldjkdjqslkjd
2016/01/01: qsdhqsdlkqsmdjqsldjqslkdjqlskdjqslkdjqslkdjqskdjqsd
2016/01/01: qsjdqmlskdmlqskdmcxxxx [qskjd][qsdjqslkdj] Payload :[<LOG><a>a</a>
<b>b</b>
<c>c</c>
<id>XXXXX</id>
<d>d</d>
</LOG>]]
2016/01/01 bladh bqskjdqskldjqsdlqskdjqlskdj dazihzmkldjkdjqslkjd
2016/01/01: qsdhqsdlkqsmdjqsldjqslkdjqlskdjqslkdjqslkdjqskdjqsd
2016/01/01: qsjdqmlskdmlqskdmcxxxx [qskjd][qsdjqslkdj] Payload :[<LOG> <a>a</a>
<b>b</b>
<c>c</c>
<id>YYYYY</id>
<d>d</d>
</LOG>]]
qskdmqlskdqlsdqlskdqlsdk
qsdlkqsdlkqsdmlkqsdlk
For now I am using
sed -n '/<LOG/{:start /<\/LOG>/!{N;b start};/XXXXX/p}' logFile
and I am getting this:
2016/01/01: qsjdqmlskdmlqskdmcxxxx [qskjd][qsdjqslkdj] Payload :[<LOG><a>a</a>
<b>b</b>
<c>c</c>
<id>XXXXX</id>
<d>d</d>
</LOG>]]
I would like to retrieve the whole XML and get:
<LOG>
<a>a</a>
<b>b</b>
<c>c</c>
<id>XXXXX</id>
<d>d</d>
</LOG>
Thanks in advance
Solution in TXR:
#(repeat)
# (skip)Payload :[<#tag>#preamble
# (collect)
#middle
# (last)
</#tag>]]
# (end)
# (output)
<#tag>
#(trim-str preamble)
# (repeat)
#middle
# (end)
</#tag>
# (end)
#(end)
Run:
$ txr extract.txr data
<LOG>
<a>a</a>
<b>b</b>
<c>c</c>
<id>XXXXX</id>
<d>d</d>
</LOG>
<LOG>
<a>a</a>
<b>b</b>
<c>c</c>
<id>YYYYY</id>
<d>d</d>
</LOG>
Try this:
sed -n '/<LOG/{:a;/<\/LOG/!{N;ba};s/.*\(<LOG>\)\(.*XXXXX.*<\/LOG>\).*/\1\n\2/p}' logFile
It should do the job, but keep in mind that sed is not the right tool for parsing XML. When you have to parse valid XML files, you should consider using xmlstarlet or xmllint.
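For instance, assuming one extracted block were saved to a standalone file (log.xml is a hypothetical name), either tool could query it directly:
# print the <id> value with xmlstarlet
xmlstarlet sel -t -v '/LOG/id' -n log.xml
# the same with xmllint
xmllint --xpath '/LOG/id/text()' log.xml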
This might work for you (GNU sed):
sed -nr '/<LOG>/,/<\/LOG>/{s/.*(<LOG>)\s*/\1\n/;s/(<\/LOG>).*/\1/;p}' file
Use sed's grep-like option (-n) to inhibit printing unless explicitly requested, and utilise the range feature /.../,/.../, topping and tailing the string produced.
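For clarity, here is the same command spread over several lines with comments (GNU sed; behavior unchanged):
sed -nr '
  # operate only on lines between <LOG> and </LOG> (inclusive)
  /<LOG>/,/<\/LOG>/ {
    # strip the log prefix before <LOG> and put the tag on its own line
    s/.*(<LOG>)\s*/\1\n/
    # strip the trailing ]] after </LOG>
    s/(<\/LOG>).*/\1/
    p
  }' file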
I have to repeat the key value for every 13-character field in a line.
My file looks like this:
KEYVALUE1 201604141111I201604141111C201604141111D201604141111E201604141111F
KEYVALUE1 201604141111G
KEYVALUE2 201604141111I201604141111C201604141111D201604141111E201604141111F
KEYVALUE2 201604141111G201604141111F
KEYVALUE3 201604141111I
KEYVALUE4 201604141111G201604141111I
My output should look like this:
KEYVALUE1 201604141111I
KEYVALUE1 201604141111C
KEYVALUE1 201604141111D
KEYVALUE1 201604141111E
KEYVALUE1 201604141111F
KEYVALUE1 201604141111G
KEYVALUE2 201604141111I
KEYVALUE2 201604141111C
KEYVALUE2 201604141111D
KEYVALUE2 201604141111E
KEYVALUE2 201604141111F
KEYVALUE2 201604141111G
KEYVALUE2 201604141111F
KEYVALUE3 201604141111I
KEYVALUE4 201604141111G
KEYVALUE4 201604141111I
Please help.
sed is for simple substitutions on individual lines, that is all. For anything else you should be using awk for simplicity, clarity, robustness, portability, efficiency, and most other desirable qualities of software:
$ awk '{ while ($2!="") { print $1, substr($2,1,13); $2=substr($2,14) } }' file
KEYVALUE1 201604141111I
KEYVALUE1 201604141111C
KEYVALUE1 201604141111D
KEYVALUE1 201604141111E
KEYVALUE1 201604141111F
KEYVALUE1 201604141111G
KEYVALUE2 201604141111I
KEYVALUE2 201604141111C
KEYVALUE2 201604141111D
KEYVALUE2 201604141111E
KEYVALUE2 201604141111F
KEYVALUE2 201604141111G
KEYVALUE2 201604141111F
KEYVALUE3 201604141111I
KEYVALUE4 201604141111G
KEYVALUE4 201604141111I
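For readers new to awk, here is the same one-liner unrolled with comments (identical output; 13 is the fixed width of each date-plus-letter value):
awk '{
  # $2 is a run of fixed-width 13-character values glued together
  while ($2 != "") {
    # emit the key plus the next 13-character chunk
    print $1, substr($2, 1, 13)
    # chop the consumed chunk off the front of $2
    $2 = substr($2, 14)
  }
}' file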
If you're open to perl:
perl -lane 'print "$F[0] $_" for $F[1] =~/.{13}/g' file
You can try this sed:
sed -n '/^\([^ ]* \)\(.\{13\}\)/{s/^\([^ ]* \)\(.\{13\}\)/&\n\1/g;P;D;}' file
Or this one:
sed '{
:j
/^[^ ]* .\{13\}.\{13\}/ {
h
s/^\([^ ]*\) \(.\{13\}\)\(.*\)$/\1 \2/
p
x
s/^\([^ ]*\) \(.\{13\}\)\(.*\)$/\1 \3/
h
b j
}
}' file