org.openqa.selenium SessionNotCreatedException in Flutter automation

I am trying to automate a Flutter APK using ValueKey locators. I used the following code, which relies on Appium and FlutterFinder, to automate the APK.
package io.github.ashwith.flutter.example;
import java.net.MalformedURLException;
import java.net.URL;
import java.time.Duration;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.remote.DesiredCapabilities;
import org.openqa.selenium.remote.RemoteWebDriver;
import io.appium.java_client.android.AndroidDriver;
import io.github.ashwith.flutter.FlutterFinder;
public class Flutter_Finder {
    public static RemoteWebDriver driver;

    public static void main(String[] args) throws MalformedURLException {
        DesiredCapabilities capabilities = new DesiredCapabilities();
        capabilities.setCapability("deviceName", "Android");
        capabilities.setCapability("platformName", "Android");
        capabilities.setCapability("noReset", true);
        capabilities.setCapability("app", "E:\\Testsigma.apk");
        capabilities.setCapability("automationName", "flutter");

        driver = new AndroidDriver(new URL("http://localhost:4723/wd/hub"), capabilities);
        driver.manage().timeouts().implicitlyWait(Duration.ofSeconds(30));

        FlutterFinder finder = new FlutterFinder(driver);
        WebElement element = finder.byValueKey("incrementButton");
        element.click();
    }
}
When I run the code, I get the following error:
Exception in thread "main" org.openqa.selenium.SessionNotCreatedException:
Could not start a new session.
Response code 500.
Message: An unknown server-side error occurred while processing the command.
Original error: Cannot read property 'match' of undefined
I am using the following Appium Java client dependency for this automation:
<dependency>
    <groupId>io.appium</groupId>
    <artifactId>java-client</artifactId>
    <version>8.3.0</version>
</dependency>
Please help me to resolve this error.
Thank you very much!

Related

How can I use a Webdriver Testcontainer in Bitbucket Pipelines?

When trying to use a Webdriver Testcontainer in Bitbucket Pipelines, I get the following error messages:
[main] WARN 🐳 [selenium/standalone-chrome:4.1.1] - Unable to mount a file from test host into a running container. This may be a misconfiguration or limitation of your Docker environment. Some features might not work.
[main] ERROR 🐳 [selenium/standalone-chrome:4.1.1] - Could not start container
com.github.dockerjava.api.exception.DockerException: Status 403: {"message":"authorization denied by plugin pipelines: -v only supports $BITBUCKET_CLONE_DIR and its subdirectories"}
My Testcontainers version is 1.17.6.
Here is the code I'm using while trying to troubleshoot:
package com.byzpass.demo;

import org.junit.*;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeOptions;
import org.openqa.selenium.remote.RemoteWebDriver;
import org.testcontainers.containers.BrowserWebDriverContainer;

public class SeleniumTest {
    public ChromeOptions chromeOptions = new ChromeOptions()
            .addArguments("--no-sandbox")
            .addArguments("--headless")
            .addArguments("--disable-dev-shm-usage");

    @Rule
    public BrowserWebDriverContainer<?> driverContainer = new BrowserWebDriverContainer<>().withCapabilities(chromeOptions);

    @Test
    public void openWikipedia() {
        WebDriver driver = new RemoteWebDriver(driverContainer.getSeleniumAddress(), chromeOptions);
        driver.navigate().to("https://www.wikipedia.org/");
        String subtitleText = driver.findElement(By.cssSelector("#www-wikipedia-org h1 strong")).getText();
        assert subtitleText.equals("The Free Encyclopedia");
        driver.quit();
        System.out.println("Finished opening wikipedia. 📖 🤓 🔍 👍 ✨");
    }
}
Here is my bitbucket-pipelines.yml:
pipelines:
  default:
    - step:
        image: amazoncorretto:11
        services:
          - docker
        script:
          - export TESTCONTAINERS_RYUK_DISABLED=true
          - cd selenium-test ; bash ./mvnw --no-transfer-progress test

definitions:
  services:
    docker:
      memory: 2048
By setting a breakpoint in my test method and running docker inspect -f '{{ .Mounts }}' on the container, I discovered that the container for the selenium/standalone-chrome:4.1.1 image has [{bind /dev/shm /dev/shm rw true rprivate}].
I thought that the --disable-dev-shm-usage argument in my Chrome options would prevent that, but it didn't. I don't know whether that is what's causing my issue in Bitbucket Pipelines, though.
I found that it worked after setting the shm size to zero. Here's the code that worked:
package com.byzpass.demo;

import org.junit.*;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeOptions;
import org.openqa.selenium.remote.RemoteWebDriver;
import org.testcontainers.containers.BrowserWebDriverContainer;

public class SeleniumTest {
    @Test
    public void openWikipedia() {
        ChromeOptions chromeOptions = new ChromeOptions()
                .addArguments("--no-sandbox")
                .addArguments("--headless")
                .addArguments("--disable-dev-shm-usage");
        BrowserWebDriverContainer<?> driverContainer = new BrowserWebDriverContainer<>()
                .withRecordingMode(BrowserWebDriverContainer.VncRecordingMode.SKIP, null);
        driverContainer.setShmSize(0L);
        driverContainer.start();

        WebDriver driver = new RemoteWebDriver(driverContainer.getSeleniumAddress(), chromeOptions);
        driver.navigate().to("https://www.wikipedia.org/");
        String subtitleText = driver.findElement(By.cssSelector("#www-wikipedia-org h1 strong")).getText();
        assert subtitleText.equals("The Free Encyclopedia");
        driver.quit();
        driverContainer.stop();
        System.out.println("Finished opening wikipedia. 📖 🤓 🔍 👍 ✨");
    }
}

Using Flyway set variable statement throws error

In my DBeaver the SQL is working well. BUT when the Spring Boot Flyway application runs, it throws an error:
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'flywayInitializer' defined in class path resource [org/springframework/boot/autoconfigure/flyway/FlywayAutoConfiguration$FlywayConfiguration.class]: Invocation of init method failed; nested exception is org.flywaydb.core.internal.command.DbMigrate$FlywayMigrateException: Migration V1__obsolete.sql failed
--------------------------------------
SQL State  : 42601
Error Code : 0
Message    : ERROR: syntax error at or near "@"
             Position: 1
Location   : db/migration/pgsql/_env/V1__obsolete.sql (C:\workspace\t-flyway\target\classes\db\migration\pgsql\_env\V1__obsolete.sql)
Line       : 1
Statement  : @set TESTINGE = te

    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1796) ~[spring-beans-5.2.7.RELEASE.jar:5.2.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:595) ~[spring-beans-5.2.7.RELEASE.jar:5.2.7.RELEASE]
    at org.springframe
I also checked different cases with plain JDBC.
CASE 1: I get an error when I execute this code:
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class MysqlCon {
    public static void main(String args[]) throws Exception {
        Class.forName("org.postgresql.Driver");
        Connection con = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/test1?createDatabaseIfNotExist=true&autoReconnect=true&useSSL=false&allowPublicKeyRetrieval=true",
                "postgres", "postgres");
        Statement s = con.createStatement();
        ResultSet rs = s.executeQuery("@set TESTINGE = te");
    }
}
Error:
Exception in thread "main" org.postgresql.util.PSQLException: ERROR: syntax error at or near "@"  Position: 1
    at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2532)
    at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2267)
    at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:312)
    at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:448)
    at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:369)
    at org.postgresql.jdbc.PgStatement.executeWithFlags(PgStatement.java:310)
    at org.postgresql.jdbc.PgStatement.executeCachedSql(PgStatement.java:296)
    at org.postgresql.jdbc.PgStatement.executeWithFlags(PgStatement.java:273)
    at org.postgresql.jdbc.PgStatement.executeQuery(PgStatement.java:226)
    at com.techgeeknext.MysqlCon.main(MysqlCon.java:14)
I also checked it another way.
CASE 2:
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class MysqlCon {
    public static void main(String args[]) throws Exception {
        Class.forName("org.postgresql.Driver");
        Connection con = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/test1?createDatabaseIfNotExist=true&autoReconnect=true&useSSL=false&allowPublicKeyRetrieval=true",
                "postgres", "postgres");
        Statement s = con.createStatement();
        ResultSet rs = s.executeQuery("SET TESTINGE TO te");
    }
}
Error:
Exception in thread "main" org.postgresql.util.PSQLException: ERROR: unrecognized configuration parameter "testinge"
    at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2532)
    at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2267)
    at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:312)
    at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:448)
    at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:369)
    at org.postgresql.jdbc.PgStatement.executeWithFlags(PgStatement.java:310)
    at org.postgresql.jdbc.PgStatement.executeCachedSql(PgStatement.java:296)
    at org.postgresql.jdbc.PgStatement.executeWithFlags(PgStatement.java:273)
    at org.postgresql.jdbc.PgStatement.executeQuery(PgStatement.java:226)
    at com.techgeeknext.MysqlCon.main(MysqlCon.java:14)
Thanks in advance
This is because @set is a DBeaver-specific command. It's not handled on the server but by your DBeaver client.
See the DBeaver handbook (https://dbeaver.com/doc/dbeaver.pdf), section Client Side Commands:
You can use special commands in the SQL scripts. These commands are executed on DBeaver's side, not on the server-side.
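As an illustrative sketch (assuming DBeaver's documented client-side syntax; the variable name just mirrors the question), such a script only works inside DBeaver, which substitutes the variable before the query reaches PostgreSQL:

```sql
-- Handled by the DBeaver client itself; this line is never sent to PostgreSQL.
@set TESTINGE = te

-- DBeaver substitutes ${TESTINGE} client-side before executing the statement.
SELECT '${TESTINGE}' AS value;
```

Running the same script through plain JDBC or Flyway fails, because the server then receives the raw @set line.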
If you want to use variables in Flyway, you may want to have a look at Flyway placeholders: https://flywaydb.org/documentation/configuration/placeholder
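As a hedged sketch of the placeholder approach (the placeholder name here just mirrors the question; Flyway's default placeholder syntax is ${...}), the migration references the placeholder and the value comes from configuration rather than from the script:

```sql
-- V1__obsolete.sql
-- Flyway substitutes ${TESTINGE} before the statement is sent to PostgreSQL.
-- The value is supplied via configuration, e.g. flyway.placeholders.TESTINGE=te
-- (or spring.flyway.placeholders.TESTINGE=te in a Spring Boot application.properties).
SELECT '${TESTINGE}' AS value;
```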

Failed to meta-introspect annotation interface org.springframework.web.bind.annotation

Error message:
21:13:46,666 DEBUG AnnotationUtils:1889 - Failed to meta-introspect annotation interface org.springframework.web.bind.annotation.RequestBody: java.lang.NullPointerException
com.alibaba.dubbo.rpc.RpcException: No provider available from registry 172.16.33.23:2181 for service com.itheima.service.CheckItemService on consumer 172.16.33.29 use dubbo version 2.6.0, may be providers disabled or not registered ?
at com.alibaba.dubbo.registry.integration.RegistryDirectory.doList(RegistryDirectory.java:572)
I have checked my Dubbo annotation package and the controller's imports, and they look right.
import com.alibaba.dubbo.config.annotation.Reference;
import com.itheima.constant.MessageConstant;
import com.itheima.entity.Result;
import com.itheima.pojo.CheckItem;
import com.itheima.service.CheckItemService;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/checkitem")
public class CheckItemController {
    @Reference
    private CheckItemService checkItemService;

    @RequestMapping("/add")
    public Result add(@RequestBody CheckItem checkItem) {
        try {
            checkItemService.add(checkItem);
        } catch (Exception e) {
            e.printStackTrace();
            return new Result(false, MessageConstant.ADD_CHECKITEM_FAIL);
        }
        return new Result(true, MessageConstant.ADD_CHECKITEM_SUCCESS);
    }
}
Also, my ZooKeeper is started:
[root@localhost bin]# ./zkServer.sh start
JMX enabled by default
Using config: /usr/local/zookeeper-3.4.6/bin/../conf/zoo.cfg
Starting zookeeper ... STARTED
And the front-end request is successful.
The registration in ZooKeeper seems to have no problems:
[zk: 127.0.0.1:2181(CONNECTED) 1] ls /dubbo/com.itheima.service.CheckItemService
[consumers, configurators, routers, providers]
Can anyone give me a direction? I can almost understand where the mistake is, but I don't know what to do.
RegistryDirectory.class

Reading from PubsubIO writing to DatastoreIO

Is it possible to create a pipeline that reads data from Pub/Sub and writes to Datastore? In my code I specify PubsubIO as the input and apply windowing to get a bounded PCollection, but it seems that DatastoreIO.writeTo cannot be used with options.setStreaming set to true, while that is required in order to use PubsubIO as input. Is there a way around this? Or is it simply not possible to read from Pub/Sub and write to Datastore?
Here's my code:
DataflowPipelineOptions options = PipelineOptionsFactory.create()
        .as(DataflowPipelineOptions.class);
options.setRunner(DataflowPipelineRunner.class);
options.setProject(projectName);
options.setStagingLocation("gs://my-staging-bucket/staging");
options.setStreaming(true);

Pipeline p = Pipeline.create(options);
PCollection<String> input = p.apply(PubsubIO.Read.topic("projects/" + projectName + "/topics/event-streaming"));
PCollection<String> inputWindow = input.apply(Window.<String>into(FixedWindows.of(Duration.standardSeconds(5)))
        .triggering(AfterPane.elementCountAtLeast(1))
        .discardingFiredPanes()
        .withAllowedLateness(Duration.standardHours(1)));
PCollection<String> inputDecode = inputWindow.apply(ParDo.of(new DoFn<String, String>() {
    private static final long serialVersionUID = 1L;

    public void processElement(ProcessContext c) {
        // The incoming Pub/Sub message is Base64-encoded; decode it back to a plain string.
        String msg = c.element();
        byte[] decoded = Base64.decodeBase64(msg.getBytes());
        String outmsg = new String(decoded);
        c.output(outmsg);
    }
}));
PCollection<DatastoreV1.Entity> inputEntity = inputDecode.apply(ParDo.of(new CreateEntityFn("stream", "events")));
inputEntity.apply(DatastoreIO.writeTo(datasetid));
p.run();
And this is the exception I get:
Exception in thread "main" java.lang.UnsupportedOperationException: The Write transform is not supported by the Dataflow streaming runner.
at com.google.cloud.dataflow.sdk.runners.DataflowPipelineRunner$StreamingWrite.apply(DataflowPipelineRunner.java:488)
at com.google.cloud.dataflow.sdk.runners.DataflowPipelineRunner$StreamingWrite.apply(DataflowPipelineRunner.java:480)
at com.google.cloud.dataflow.sdk.runners.PipelineRunner.apply(PipelineRunner.java:74)
at com.google.cloud.dataflow.sdk.runners.DataflowPipelineRunner.apply(DataflowPipelineRunner.java:314)
at com.google.cloud.dataflow.sdk.Pipeline.applyInternal(Pipeline.java:358)
at com.google.cloud.dataflow.sdk.Pipeline.applyTransform(Pipeline.java:267)
at com.google.cloud.dataflow.sdk.runners.DataflowPipelineRunner.apply(DataflowPipelineRunner.java:312)
at com.google.cloud.dataflow.sdk.Pipeline.applyInternal(Pipeline.java:358)
at com.google.cloud.dataflow.sdk.Pipeline.applyTransform(Pipeline.java:267)
at com.google.cloud.dataflow.sdk.values.PCollection.apply(PCollection.java:159)
at my.own.project.google.dataflow.EventStreamingDataflow.main(EventStreamingDataflow.java:104)
The DatastoreIO sink is not currently supported in the streaming runner. To write to Datastore from a streaming pipeline, you can make direct calls to the Datastore API from a DoFn.
OK, after a lot of banging my head against the wall, I finally got it working. As danielm suggested, I'm making calls to the Datastore API from a ParDo DoFn. One problem was that I didn't realize there is a separate API for using Cloud Datastore outside of AppEngine (com.google.api.services.datastore... vs. com.google.appengine.api.datastore...). Another problem was that apparently there is some kind of bug in the latest version of the Cloud Datastore API (google-api-services-datastore-protobuf v1beta2-rev1-4.0.0; I got an IllegalAccessError); I resolved that by using an older version (v1beta2-rev1-2.1.2).
So, here's my working code:
import com.google.cloud.dataflow.sdk.Pipeline;
import com.google.cloud.dataflow.sdk.io.PubsubIO;
import com.google.cloud.dataflow.sdk.options.DataflowPipelineOptions;
import com.google.cloud.dataflow.sdk.options.PipelineOptionsFactory;
import com.google.cloud.dataflow.sdk.runners.DataflowPipelineRunner;
import com.google.cloud.dataflow.sdk.transforms.DoFn;
import com.google.cloud.dataflow.sdk.transforms.ParDo;
import com.google.cloud.dataflow.sdk.values.PCollection;
import com.google.api.services.datastore.DatastoreV1.*;
import com.google.api.services.datastore.client.Datastore;
import com.google.api.services.datastore.client.DatastoreException;
import com.google.api.services.datastore.client.DatastoreFactory;
import static com.google.api.services.datastore.client.DatastoreHelper.*;
import java.security.GeneralSecurityException;
import java.io.IOException;
import org.json.simple.JSONObject;
import org.json.simple.parser.JSONParser;
import org.json.simple.parser.ParseException;

//--------------------

public static void main(String[] args) {
    DataflowPipelineOptions options = PipelineOptionsFactory.create()
            .as(DataflowPipelineOptions.class);
    options.setRunner(DataflowPipelineRunner.class);
    options.setProject(projectName);
    options.setStagingLocation("gs://my-staging-bucket/staging");
    options.setStreaming(true);

    Pipeline p = Pipeline.create(options);
    PCollection<String> input = p.apply(PubsubIO.Read.topic("projects/" + projectName + "/topics/my-topic-name"));
    input.apply(ParDo.of(new DoFn<String, String>() {
        private static final long serialVersionUID = 1L;

        public void processElement(ProcessContext c) throws ParseException, DatastoreException {
            JSONObject json = (JSONObject) new JSONParser().parse(c.element());

            // Connect to Cloud Datastore via the standalone (non-AppEngine) client.
            Datastore datastore = null;
            try {
                datastore = DatastoreFactory.get().create(getOptionsFromEnv()
                        .dataset(datasetid).build());
            } catch (GeneralSecurityException exception) {
                System.err.println("Security error connecting to the datastore: " + exception.getMessage());
            } catch (IOException exception) {
                System.err.println("I/O error connecting to the datastore: " + exception.getMessage());
            }

            // Build the entity and commit it with a non-transactional auto-ID insert.
            Key.Builder keyBuilder = makeKey("my-kind");
            keyBuilder.getPartitionIdBuilder().setNamespace("my-namespace");
            Entity.Builder event = Entity.newBuilder()
                    .setKey(keyBuilder);
            event.addProperty(makeProperty("my-prop", makeValue((String) json.get("my-prop"))));
            CommitRequest commitRequest = CommitRequest.newBuilder()
                    .setMode(CommitRequest.Mode.NON_TRANSACTIONAL)
                    .setMutation(Mutation.newBuilder().addInsertAutoId(event))
                    .build();
            if (datastore != null) {
                datastore.commit(commitRequest);
            }
        }
    }));
    p.run();
}
And the dependencies in pom.xml:
<dependency>
    <groupId>com.google.cloud.dataflow</groupId>
    <artifactId>google-cloud-dataflow-java-sdk-all</artifactId>
    <version>[1.0.0,2.0.0)</version>
</dependency>
<dependency>
    <groupId>com.google.apis</groupId>
    <artifactId>google-api-services-datastore-protobuf</artifactId>
    <version>v1beta2-rev1-2.1.2</version>
</dependency>
<dependency>
    <groupId>com.google.http-client</groupId>
    <artifactId>google-http-client</artifactId>
    <version>1.17.0-rc</version>
</dependency>
<!-- Some more.. like JUnit etc.. -->

Error while running client code in EJB

I am getting an error while running the client code in EJB:
Exception in thread "Main Thread" java.lang.NoClassDefFoundError: weblogic/kernel/KernelStatus
    at weblogic.jndi.Environment.<init>(Environment.java:78)
    at weblogic.jndi.WLInitialContextFactory.getInitialContext(WLInitialContextFactory.java:117)
    at javax.naming.spi.NamingManager.getInitialContext(NamingManager.java:667)
    at javax.naming.InitialContext.getDefaultInitCtx(InitialContext.java:288)
    at javax.naming.InitialContext.init(InitialContext.java:223)
    at javax.naming.InitialContext.<init>(InitialContext.java:197)
    at User.main(User.java:21)
I added wlfullclient.jar and the server's remote-interface jar; I also saw an article that said to add wlclient.jar and weblogic.jar, but I still got the same error.
Help highly appreciated.
Client code:
import com.amdocs.Stateful.*;
import java.util.Hashtable;
import javax.naming.InitialContext;
import javax.naming.NamingException;

public class User {
    public static void main(String args[]) {
        Hashtable<String, String> ht = new Hashtable<String, String>();
        // You have to start from the root location to search the JNDI tree
        // (the tree is normally maintained as a binary tree with the root node at the top).
        ht.put(InitialContext.INITIAL_CONTEXT_FACTORY, "weblogic.jndi.WLInitialContextFactory");
        // Address of the registry.
        ht.put(InitialContext.PROVIDER_URL, "t3://localhost:7001");
        try {
            System.out.println("Hello");
            // Start searching with what we set in ht; the constructor takes a Hashtable.
            InitialContext ic = new InitialContext(ht);
            System.out.println("Hello2");
            MyCartRemote ref = (MyCartRemote) ic.lookup("MyCartBeanJNDI#com.amdocs.Stateful.MyCartRemote");
            ref.add("Hi");
            ref.add("Jeetendra");
            MyCartRemote ref1 = (MyCartRemote) ic.lookup("MyCartBeanJNDI#com.amdocs.Stateful.MyCartRemote");
            ref1.add("Hi");
            ref1.add("Subhash");
            ref1.add("Ghai");
            System.out.println("Object 1 data: " + ref.show());
            System.out.println("Object 2 data: " + ref1.show());
        } catch (NamingException e) {
            e.printStackTrace();
        }
    }
}
Try removing WebLogic System Libraries from the Bootstrap Entries section. It worked for me when I got the same error.