I am using a combination of the prepackaged WSO2 Identity Server and WSO2 API Manager.
I recently configured a new environment: I set up the custom domain name mylotsawso2dev.com and installed SSL.
Sometimes the error below is printed in the log:
[2017-05-15 02:47:42,525] ERROR - DataEndpoint Unable to send events to the endpoint.
org.wso2.carbon.databridge.agent.exception.DataEndpointException: Error while trying to publish events to data receiver :mylotsawso2dev.com/30.100.209.68:9612
at org.wso2.carbon.databridge.agent.endpoint.binary.BinaryDataEndpoint.send(BinaryDataEndpoint.java:81)
at org.wso2.carbon.databridge.agent.endpoint.DataEndpoint$EventPublisher.publish(DataEndpoint.java:330)
at org.wso2.carbon.databridge.agent.endpoint.DataEndpoint$EventPublisher.run(DataEndpoint.java:283)
at java.lang.Thread.run(Thread.java:745)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.wso2.carbon.databridge.commons.exception.SessionTimeoutException: 6e9a4d58-3c3b-4e59-a7ff-060322cd7d76 expired
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.wso2.carbon.databridge.agent.endpoint.binary.BinaryEventSender.processResponse(BinaryEventSender.java:164)
at org.wso2.carbon.databridge.agent.endpoint.binary.BinaryDataEndpoint.send(BinaryDataEndpoint.java:76)
... 8 more
What could be the reason for this?
This is a harmless error that is printed when a session timeout occurs in the event publisher. Ideally, it should be an INFO log; a public JIRA issue is open to track this.
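Until that changes, one workaround is to silence that particular logger so the stack trace stops filling wso2carbon.log. A minimal sketch, assuming the standard log4j 1.x configuration file shipped with these packs (adjust the path and logger name for your distribution):

# <PRODUCT_HOME>/repository/conf/log4j.properties
# Suppress the harmless session-timeout ERROR logged by the data-bridge agent endpoint
log4j.logger.org.wso2.carbon.databridge.agent.endpoint.DataEndpoint=OFF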
I am working with WSO2 APIM 2.6.0 and trying to integrate it with the WSO2 APIM Analytics server as described in [https://docs.wso2.com/display/AM260/Configuring+APIM+Analytics#MSSQL-AM_USAGE_UPLOADED_FILES].
We already have a working setup of WSO2 APIM 2.5.0 with Analytics attached to it, and the data is generated as it should be. However, due to a technical roadblock in APIM 2.5.0 (adding certificates via the REST APIs), I am trying to migrate from APIM 2.5.0 to 2.6.0.
APIM was migrated as described in the documentation [https://docs.wso2.com/display/AM260/Upgrading+from+the+Previous+Release#code],
but when I try to integrate with Analytics, it constantly throws errors like the ones below:
[2019-09-09 10:03:17,367] INFO {org.wso2.carbon.databridge.core.DataBridge} - user admin connected
[2019-09-09 10:03:17,368] ERROR {org.wso2.carbon.databridge.core.internal.queue.QueueWorker} - Dropping wrongly formatted event sent
org.wso2.carbon.databridge.core.exception.EventConversionException: Error when converting loganalyzer:1.0.0 of event bundle with events 1
at org.wso2.carbon.databridge.receiver.thrift.converter.ThriftEventConverter.createEventList(ThriftEventConverter.java:188)
at org.wso2.carbon.databridge.receiver.thrift.converter.ThriftEventConverter.toEventList(ThriftEventConverter.java:90)
at org.wso2.carbon.databridge.core.internal.queue.QueueWorker.run(QueueWorker.java:72)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.wso2.carbon.databridge.core.exception.EventConversionException: No StreamDefinition for streamId loganalyzer:1.0.0 present in cache
at org.wso2.carbon.databridge.receiver.thrift.converter.ThriftEventConverter.createEventList(ThriftEventConverter.java:171)
... 7 more
[2019-09-09 10:14:02,374] INFO {org.wso2.extension.siddhi.io.mgwfile.task.MGWFileCleanUpTask} - Uploaded API Usage data in the db will be cleaned up to : 2019-09-04 10:14:02.374
[2019-09-09 10:18:17,343] ERROR {org.wso2.carbon.databridge.core.internal.queue.QueueWorker} - Dropping wrongly formatted event sent org.wso2.carbon.databridge.core.exception.EventConversionException: Error when converting loganalyzer:1.0.0 of event bundle with events 1
at org.wso2.carbon.databridge.receiver.thrift.converter.ThriftEventConverter.createEventList(ThriftEventConverter.java:188)
at org.wso2.carbon.databridge.receiver.thrift.converter.ThriftEventConverter.toEventList(ThriftEventConverter.java:90)
at org.wso2.carbon.databridge.core.internal.queue.QueueWorker.run(QueueWorker.java:72)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.wso2.carbon.databridge.core.exception.EventConversionException: No StreamDefinition for streamId loganalyzer:1.0.0 present in cache
at org.wso2.carbon.databridge.receiver.thrift.converter.ThriftEventConverter.createEventList(ThriftEventConverter.java:171)
... 7 more
Can someone please let me know whether compatibility exists between APIM 2.6.0 and the Analytics server? While going through the migration process, I read the following note about migrating the Analytics part of APIM [https://docs.wso2.com/display/AM260/Upgrading+from+the+Previous+Release#code]:
Step 3.1 - Note that it is mandatory to use a WUM updated WSO2 API Manager Analytics 2.6.0 pack when migrating the configurations for WSO2 API-M Analytics.
Can someone also please let me know why these errors are constantly thrown on the Analytics server? I have started the worker node of the Stream Processor and, as I understand it, there should be a carbon app that can receive the LogAnalyzer events from APIM.
Thanks
Log analyzer analytics is deprecated in APIM Analytics 2.6.0. Remove the log analyzer publisher from APIM 2.6.0.
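For example, if the loganalyzer events are being published through a log4j appender wired into the root logger in <APIM_HOME>/repository/conf/log4j.properties, removing that appender reference stops the loganalyzer:1.0.0 stream from being sent to Analytics. A sketch only; the appender name LOGEVENT and the other appender names below are illustrative, so check your own file for whichever appender targets the Analytics receiver:

# <APIM_HOME>/repository/conf/log4j.properties
# Before (illustrative): LOGEVENT publishes log events to the Analytics receiver
# log4j.rootLogger=INFO, CARBON_CONSOLE, CARBON_LOGFILE, CARBON_MEMORY, LOGEVENT
# After: drop the Analytics-bound appender so the loganalyzer stream is no longer published
log4j.rootLogger=INFO, CARBON_CONSOLE, CARBON_LOGFILE, CARBON_MEMORY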
I'm trying to connect my WSO2 API Manager to an external LDAP/Active Directory. The LDAP connection is fine, but I'm getting this error while starting the server (9714 is the SSL port):
[2018-07-13 14:52:03,250] ERROR - JMSListener Unable to continue server startup as it seems the JMS Provider is not yet started. Please start the JMS provider now.
[2018-07-13 14:52:03,251] ERROR - JMSListener Connection attempt : 5 for JMS Provider failed. Next retry in 320 seconds
[2018-07-13 14:52:27,889] WARN - DataEndpointGroup No receiver is reachable at reconnection, will try to reconnect every 30 sec
[2018-07-13 14:52:27,906] INFO - DataBridge user admin connected
[2018-07-13 14:52:27,914] ERROR - AuthenticationServiceImpl Invalid User : admin
[2018-07-13 14:52:27,915] ERROR - DataEndpointConnectionWorker Error while trying to connect to the endpoint. Cannot borrow client for ssl://10.10.183.27:9714.
org.wso2.carbon.databridge.agent.exception.DataEndpointLoginException: Cannot borrow client for ssl://10.10.183.27:9714.
at org.wso2.carbon.databridge.agent.endpoint.DataEndpointConnectionWorker.connect(DataEndpointConnectionWorker.java:134)
at org.wso2.carbon.databridge.agent.endpoint.DataEndpointConnectionWorker.run(DataEndpointConnectionWorker.java:59)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.wso2.carbon.databridge.agent.exception.DataEndpointLoginException: Error while trying to login to data receiver :/10.10.183.27:9714
at org.wso2.carbon.databridge.agent.endpoint.binary.BinaryDataEndpoint.login(BinaryDataEndpoint.java:50)
at org.wso2.carbon.databridge.agent.endpoint.DataEndpointConnectionWorker.connect(DataEndpointConnectionWorker.java:128)
... 6 more
This error is usually due to the Analytics configuration or the Throttling configuration.
Try disabling the Analytics publisher in the api-manager.xml file, if it is enabled, or review the connection details under the DASServerURL element:
<Analytics>
<!-- Enable Analytics for API Manager -->
<Enabled>false</Enabled>
<!-- Server URL of the remote DAS/CEP server used to collect statistics. Must
be specified in protocol://hostname:port/ format.
An event can also be published to multiple Receiver Groups each having 1 or more receivers. Receiver
Groups are delimited by curly braces whereas receivers are delimited by commas.
Ex - Multiple Receivers within a single group
tcp://localhost:7612/,tcp://localhost:7613/,tcp://localhost:7614/
Ex - Multiple Receiver Groups with two receivers each
{tcp://localhost:7612/,tcp://localhost:7613},{tcp://localhost:7712/,tcp://localhost:7713/} -->
<DASServerURL>{tcp://localhost:7612}</DASServerURL>
<!--DASAuthServerURL>{ssl://localhost:7712}</DASAuthServerURL-->
<!-- Administrator username to login to the remote DAS server. -->
<DASUsername>${admin.username}</DASUsername>
<!-- Administrator password to login to the remote DAS server. -->
<DASPassword>${admin.password}</DASPassword>
<!-- ... -->
</Analytics>
You can also try disabling the Advanced Throttling feature, or review its configuration in the ReceiverUrlGroup and AuthUrlGroup elements:
<ThrottlingConfigurations>
<EnableAdvanceThrottling>false</EnableAdvanceThrottling>
<DataPublisher>
<Enabled>true</Enabled>
<Type>Binary</Type>
<ReceiverUrlGroup>tcp://${carbon.local.ip}:${receiver.url.port}</ReceiverUrlGroup>
<AuthUrlGroup>ssl://${carbon.local.ip}:${auth.url.port}</AuthUrlGroup>
<Username>${admin.username}</Username>
<Password>${admin.password}</Password>
</DataPublisher>
<!-- ... -->
</ThrottlingConfigurations>
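If you do keep advanced throttling enabled, make sure these URL groups point at a Traffic Manager that is actually running and reachable. A sketch with the default binary receiver/auth ports (9611 and 9711, plus whatever port offset you use); the hostname is a placeholder:

<DataPublisher>
    <Enabled>true</Enabled>
    <Type>Binary</Type>
    <!-- tm.example.com is a placeholder; 9611/9711 are the default binary receiver and auth ports -->
    <ReceiverUrlGroup>tcp://tm.example.com:9611</ReceiverUrlGroup>
    <AuthUrlGroup>ssl://tm.example.com:9711</AuthUrlGroup>
    <Username>${admin.username}</Username>
    <Password>${admin.password}</Password>
</DataPublisher>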
I faced the same issue. Check the Traffic Manager log to see whether there are any exceptions. Killing and restarting the Traffic Manager helped in my case.
We have an OAuth server set up using Spring OAuth2 (version 1.0.4).
When trying to retrieve an access token for the client_credentials grant type, we get a NullPointerException when multiple concurrent requests are made.
A snippet of the stack trace is included below:
java.lang.NullPointerException
org.springframework.security.oauth2.provider.token.DefaultAuthenticationKeyGenerator.extractKey(DefaultAuthenticationKeyGenerator.java:43)
org.springframework.security.oauth2.provider.token.JdbcTokenStore.getAccessToken(JdbcTokenStore.java:121)
org.springframework.security.oauth2.provider.token.DefaultTokenServices.createAccessToken(DefaultTokenServices.java:75)
com.marketo.identity.data.impl.IdentityDefaultTokenServices.createAccessToken(IdentityDefaultTokenServices.java:45)
org.springframework.security.oauth2.provider.token.AbstractTokenGranter.getAccessToken(AbstractTokenGranter.java:68)
org.springframework.security.oauth2.provider.token.AbstractTokenGranter.grant(AbstractTokenGranter.java:60)
org.springframework.security.oauth2.provider.client.ClientCredentialsTokenGranter.grant(ClientCredentialsTokenGranter.java:41)
org.springframework.security.oauth2.provider.CompositeTokenGranter.grant(CompositeTokenGranter.java:38)
org.springframework.security.oauth2.provider.endpoint.TokenEndpoint.getAccessToken(TokenEndpoint.java:100)
sun.reflect.GeneratedMethodAccessor167.invoke(Unknown Source)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:601)
The request is something like this:
http://oauth-server-name/oauth/token?client_id={client_id}&client_secret={client_secret}&grant_type=client_credentials
Again, this issue does not occur when a single request is made (or when a low number of concurrent requests is made).
Is this some kind of race condition?
Here's the example I was looking for (in XML):
<tx:advice id="tokenAdvice">
<tx:attributes>
<tx:method name="*" isolation="REPEATABLE_READ" />
</tx:attributes>
</tx:advice>
<aop:config>
<aop:pointcut id="tokenServicesExecutions" expression="execution(* org.springframework.security.oauth2.provider.token.AuthorizationServerTokenServices.*(..))" />
<aop:advisor advice-ref="tokenAdvice" pointcut-ref="tokenServicesExecutions"/>
</aop:config>
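For completeness, a sketch of the surrounding context the snippet above assumes: the tx and aop namespaces must be declared, and <tx:advice> resolves a PlatformTransactionManager bean named "transactionManager" by default. The dataSource bean referenced here is assumed to be the same one backing the JdbcTokenStore:

<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:tx="http://www.springframework.org/schema/tx"
       xmlns:aop="http://www.springframework.org/schema/aop"
       xsi:schemaLocation="
           http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
           http://www.springframework.org/schema/tx http://www.springframework.org/schema/tx/spring-tx.xsd
           http://www.springframework.org/schema/aop http://www.springframework.org/schema/aop/spring-aop.xsd">

    <!-- Wraps calls to AuthorizationServerTokenServices in a REPEATABLE_READ transaction
         so concurrent client_credentials requests do not race inside JdbcTokenStore -->
    <bean id="transactionManager"
          class="org.springframework.jdbc.datasource.DataSourceTransactionManager">
        <property name="dataSource" ref="dataSource"/>
    </bean>

</beans>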
I'm trying to run a load test against an HTTPS page, but I keep getting an error even on a simple GET request. I've tried opening, for example, https://www.google.com, and that works fine. I've tried capturing the request, but nothing seems to be sent. I also get the same error when I try to record the HTTPS page with the JMeter certificate.
Opening the same page over HTTP works fine. The page I'm trying to open is a .NET page. Does anyone know what is wrong? I've been banging my head against this wall long enough now.
java.net.SocketException: Software caused connection abort: recv failed
at java.net.SocketInputStream.socketRead0(Native Method)
at java.net.SocketInputStream.read(Unknown Source)
at java.net.SocketInputStream.read(Unknown Source)
at sun.security.ssl.InputRecord.readFully(Unknown Source)
at sun.security.ssl.InputRecord.read(Unknown Source)
at sun.security.ssl.SSLSocketImpl.readRecord(Unknown Source)
at sun.security.ssl.SSLSocketImpl.performInitialHandshake(Unknown Source)
at sun.security.ssl.SSLSocketImpl.startHandshake(Unknown Source)
at sun.security.ssl.SSLSocketImpl.startHandshake(Unknown Source)
at org.apache.http.conn.ssl.SSLSocketFactory.connectSocket(SSLSocketFactory.java:436)
at org.apache.http.impl.conn.DefaultClientConnectionOperator.openConnection(DefaultClientConnectionOperator.java:180)
at org.apache.http.impl.conn.ManagedClientConnectionImpl.open(ManagedClientConnectionImpl.java:294)
at org.apache.http.impl.client.DefaultRequestDirector.tryConnect(DefaultRequestDirector.java:643)
at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:479)
at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:906)
at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:805)
at org.apache.jmeter.protocol.http.sampler.HTTPHC4Impl.executeRequest(HTTPHC4Impl.java:481)
at org.apache.jmeter.protocol.http.sampler.HTTPHC4Impl.sample(HTTPHC4Impl.java:298)
at org.apache.jmeter.protocol.http.sampler.HTTPSamplerProxy.sample(HTTPSamplerProxy.java:74)
at org.apache.jmeter.protocol.http.sampler.HTTPSamplerBase.sample(HTTPSamplerBase.java:1105)
at org.apache.jmeter.protocol.http.sampler.HTTPSamplerBase.sample(HTTPSamplerBase.java:1094)
at org.apache.jmeter.threads.JMeterThread.process_sampler(JMeterThread.java:429)
at org.apache.jmeter.threads.JMeterThread.run(JMeterThread.java:257)
at java.lang.Thread.run(Unknown Source)
After some further investigation, I found out that apparently the server does not support socket version negotiation (see the comments in the JMeter properties file). After setting the property https.socket.protocols=SSLv3 in the properties file, it worked!
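For reference, that property can go into JMeter's bin/user.properties (or jmeter.properties), or be passed on the command line with -J; only the file locations here are standard JMeter paths, and the test plan name is a placeholder:

# bin/user.properties
# Restrict the HTTPS sampler to SSLv3 during the TLS handshake
https.socket.protocols=SSLv3

or, equivalently: jmeter -n -t testplan.jmx -Jhttps.socket.protocols=SSLv3. (On servers where SSLv3 is disabled, the same property accepts other values such as TLSv1.)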
I have two separate servers for Alfresco and Share. I can successfully log in from Share to Alfresco, and I can also change my password. But when I open the Repository Browser page, I get this error message:
11210005 Failed to execute script 'classpath*:alfresco/site-webscripts/org/alfresco/components/documentlibrary/data/surf-doclist.get.js': 11210004 11210008 Failed to run action evaluator: 11210007 Failed processing dictionary information from Alfresco: 11210006 Unable to retrieve dictionary information from Alfresco: 500
The full stack trace is:
Caused by: org.alfresco.error.AlfrescoRuntimeException: 11210008 Failed to run action evaluator: 11210007 Failed processing dictionary information from Alfresco: 11210006 Unable to retrieve dictionary information from Alfresco: 500
at org.alfresco.web.evaluator.NodeTypeEvaluator.evaluate(NodeTypeEvaluator.java:98)
at org.alfresco.web.evaluator.BaseEvaluator.evaluate(BaseEvaluator.java:131)
at sun.reflect.GeneratedMethodAccessor97.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.mozilla.javascript.MemberBox.invoke(MemberBox.java:155)
at org.mozilla.javascript.NativeJavaMethod.call(NativeJavaMethod.java:243)
at org.mozilla.javascript.optimizer.OptRuntime.callN(OptRuntime.java:86)
at org.mozilla.javascript.gen.c12._c4(file:/D:/alfrescoplatform/development/apache-tomcat-share_ref/webapps/share_ref1/WEB-INF/classes/alfresco/site-webscripts/org/alfresco/components/documentlibrary/data/surf-doclist.get.js:260)
at org.mozilla.javascript.gen.c12.call(file:/D:/alfrescoplatform/development/apache-tomcat-share_ref/webapps/share_ref1/WEB-INF/classes/alfresco/site-webscripts/org/alfresco/components/documentlibrary/data/surf-doclist.get.js)
at org.mozilla.javascript.optimizer.OptRuntime.callName(OptRuntime.java:97)
at org.mozilla.javascript.gen.c12._c2(file:/D:/alfrescoplatform/development/apache-tomcat-share_ref/webapps/share_ref1/WEB-INF/classes/alfresco/site-webscripts/org/alfresco/components/documentlibrary/data/surf-doclist.get.js:318)
at org.mozilla.javascript.gen.c12.call(file:/D:/alfrescoplatform/development/apache-tomcat-share_ref/webapps/share_ref1/WEB-INF/classes/alfresco/site-webscripts/org/alfresco/components/documentlibrary/data/surf-doclist.get.js)
at org.mozilla.javascript.optimizer.OptRuntime.call2(OptRuntime.java:76)
at org.mozilla.javascript.gen.c12._c20(file:/D:/alfrescoplatform/development/apache-tomcat-share_ref/webapps/share_ref1/WEB-INF/classes/alfresco/site-webscripts/org/alfresco/components/documentlibrary/data/surf-doclist.get.js:838)
at org.mozilla.javascript.gen.c12.call(file:/D:/alfrescoplatform/development/apache-tomcat-share_ref/webapps/share_ref1/WEB-INF/classes/alfresco/site-webscripts/org/alfresco/components/documentlibrary/data/surf-doclist.get.js)
at org.mozilla.javascript.optimizer.OptRuntime.callName0(OptRuntime.java:108)
at org.mozilla.javascript.gen.c12._c0(file:/D:/alfrescoplatform/development/apache-tomcat-share_ref/webapps/share_ref1/WEB-INF/classes/alfresco/site-webscripts/org/alfresco/components/documentlibrary/data/surf-doclist.get.js:854)
at org.mozilla.javascript.gen.c12.call(file:/D:/alfrescoplatform/development/apache-tomcat-share_ref/webapps/share_ref1/WEB-INF/classes/alfresco/site-webscripts/org/alfresco/components/documentlibrary/data/surf-doclist.get.js)
at org.mozilla.javascript.ContextFactory.doTopCall(ContextFactory.java:393)
at org.mozilla.javascript.ScriptRuntime.doTopCall(ScriptRuntime.java:2834)
at org.mozilla.javascript.gen.c12.call(file:/D:/alfrescoplatform/development/apache-tomcat-share_ref/webapps/share_ref1/WEB-INF/classes/alfresco/site-webscripts/org/alfresco/components/documentlibrary/data/surf-doclist.get.js)
at org.mozilla.javascript.gen.c12.exec(file:/D:/alfrescoplatform/development/apache-tomcat-share_ref/webapps/share_ref1/WEB-INF/classes/alfresco/site-webscripts/org/alfresco/components/documentlibrary/data/surf-doclist.get.js)
at org.springframework.extensions.webscripts.processor.JSScriptProcessor.executeScriptImpl(JSScriptProcessor.java:318)
... 39 more
So I traced the NodeTypeEvaluator and DictionaryQuery classes in depth and found that Share calls /api/dictionary on Alfresco.
I tried adding the following to webserviceclient.properties on the Share side, but with no success:
repository.location=http://hostname:8080/alfresco/api
Calling http://hostname:8080/alfresco/api/ in a browser shows:
Axis HTTP Servlet
Hi, you have reached the AXIS HTTP Servlet. Normally you would be hitting this URL with a SOAP client rather than a browser.
In case you are interested, my AXIS transport name appears to be 'http
Let me know if you have any idea.
As accepted in the comments, the answer to this question:
I run Alfresco and Share almost always on different application servers; you just need to change share-config-custom.xml and alfresco-global.properties for it. There is a how-to on the Alfresco forums. Have you followed it?
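For reference, a minimal sketch of the relevant remote endpoint section in Share's share-config-custom.xml, assuming the Alfresco repository tier runs on alfresco-host:8080 (replace the host and port with your own):

<config evaluator="string-compare" condition="Remote">
   <remote>
      <endpoint>
         <id>alfresco</id>
         <name>Alfresco - user access</name>
         <connector-id>alfresco</connector-id>
         <!-- Point this at the repository server, not localhost, when Share runs on a separate host -->
         <endpoint-url>http://alfresco-host:8080/alfresco/s</endpoint-url>
         <identity>user</identity>
      </endpoint>
   </remote>
</config>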