r8 does not create output dex file - jar

I'm trying to use r8 without much success:
$ wget https://repo1.maven.org/maven2/com/mikepenz/fastadapter/3.2.7/fastadapter-3.2.7.aar
$ wget https://maven.google.com/com/android/support/recyclerview-v7/26.0.1/recyclerview-v7-26.0.1.aar
$ unzip fastadapter-3.2.7.aar && mv classes.jar fastadapter.jar
$ unzip recyclerview-v7-26.0.1.aar && mv classes.jar recycleview.jar
$ echo "-dontwarn java.lang.Object" > proguard.cfg
$ java -jar build/libs/r8.jar --lib android-4.1.1.4.jar --lib recycleview.jar --release --output . --pg-conf proguard.cfg fastadapter.jar
$ dexdump -d classes.dex | grep "insns size" | wc -l
Unable to open 'classes.dex' : No such file or directory
0
But with d8 it works perfectly (even without --lib android.jar and --lib recycleview.jar):
$ java -jar ../build/libs/d8.jar --release --output . fastadapter.jar
$ dexdump -d classes.dex | grep "insns size" | wc -l
390
EDIT
Trying to create the fastadapter proguard keep rules failed with:
$ cmdline-tools/build-tools/30.0.3/aapt2 link --proguard proguard_fastadapter.cfg -o proguard_fastadapter.cfg --manifest AndroidManifest.xml
AndroidManifest.xml:2: error: attribute android:versionCode not found.
AndroidManifest.xml:2: error: attribute android:versionName not found.
AndroidManifest.xml:7: error: attribute android:minSdkVersion not found.
AndroidManifest.xml:7: error: attribute android:targetSdkVersion not found.
error: failed processing manifest.
which is a bit odd because targetSdkVersion in fact does exist:
$ grep "targetSdkVersion" AndroidManifest.xml
android:targetSdkVersion="27" />

The problem is that the only rule you are passing to R8 is -dontwarn java.lang.Object. There is not a single -keep rule, which means that the output will be empty, as no entry points are kept.
What is missing are two sets of keep rules:
General keep rules for all Android apps
Specific rules for the concrete app with all its entry points (e.g. reflective access from the platform for view inflation), which can be generated with aapt2.
For the first set of rules, Android Studio (or more precisely AGP) bundles the rules file below, which is passed to all builds as a result of getDefaultProguardFile('proguard-android-optimize.txt').
# This is a configuration file for ProGuard.
# http://proguard.sourceforge.net/index.html#manual/usage.html
#
# Starting with version 2.2 of the Android plugin for Gradle, this file is distributed together with
# the plugin and unpacked at build-time. The files in $ANDROID_HOME are no longer maintained and
# will be ignored by new version of the Android plugin for Gradle.
# Optimizations: If you don't want to optimize, use the proguard-android.txt configuration file
# instead of this one, which turns off the optimization flags.
# Adding optimization introduces certain risks, since for example not all optimizations performed by
# ProGuard works on all versions of Dalvik. The following flags turn off various optimizations
# known to have issues, but the list may not be complete or up to date. (The "arithmetic"
# optimization can be used if you are only targeting Android 2.0 or later.) Make sure you test
# thoroughly if you go this route.
-optimizations !code/simplification/arithmetic,!code/simplification/cast,!field/*,!class/merging/*
-optimizationpasses 5
-allowaccessmodification
-dontusemixedcaseclassnames
-dontskipnonpubliclibraryclasses
-verbose
# Preserve some attributes that may be required for reflection.
-keepattributes *Annotation*,Signature,InnerClasses,EnclosingMethod
-keep public class com.google.vending.licensing.ILicensingService
-keep public class com.android.vending.licensing.ILicensingService
-keep public class com.google.android.vending.licensing.ILicensingService
-dontnote com.android.vending.licensing.ILicensingService
-dontnote com.google.vending.licensing.ILicensingService
-dontnote com.google.android.vending.licensing.ILicensingService
# For native methods, see http://proguard.sourceforge.net/manual/examples.html#native
-keepclasseswithmembernames,includedescriptorclasses class * {
native <methods>;
}
# Keep setters in Views so that animations can still work.
-keepclassmembers public class * extends android.view.View {
void set*(***);
*** get*();
}
# We want to keep methods in Activity that could be used in the XML attribute onClick.
-keepclassmembers class * extends android.app.Activity {
public void *(android.view.View);
}
# For enumeration classes, see http://proguard.sourceforge.net/manual/examples.html#enumerations
-keepclassmembers enum * {
public static **[] values();
public static ** valueOf(java.lang.String);
}
-keepclassmembers class * implements android.os.Parcelable {
public static final ** CREATOR;
}
# Preserve annotated Javascript interface methods.
-keepclassmembers class * {
@android.webkit.JavascriptInterface <methods>;
}
# The support libraries contains references to newer platform versions.
# Don't warn about those in case this app is linking against an older
# platform version. We know about them, and they are safe.
-dontnote android.support.**
-dontnote androidx.**
-dontwarn android.support.**
-dontwarn androidx.**
# This class is deprecated, but remains for backward compatibility.
-dontwarn android.util.FloatMath
# Understand the @Keep support annotation.
-keep class android.support.annotation.Keep
-keep class androidx.annotation.Keep
-keep @android.support.annotation.Keep class * {*;}
-keep @androidx.annotation.Keep class * {*;}
-keepclasseswithmembers class * {
@android.support.annotation.Keep <methods>;
}
-keepclasseswithmembers class * {
@androidx.annotation.Keep <methods>;
}
-keepclasseswithmembers class * {
@android.support.annotation.Keep <fields>;
}
-keepclasseswithmembers class * {
@androidx.annotation.Keep <fields>;
}
-keepclasseswithmembers class * {
@android.support.annotation.Keep <init>(...);
}
-keepclasseswithmembers class * {
@androidx.annotation.Keep <init>(...);
}
# These classes are duplicated between android.jar and org.apache.http.legacy.jar.
-dontnote org.apache.http.**
-dontnote android.net.http.**
# These classes are duplicated between android.jar and core-lambda-stubs.jar.
-dontnote java.lang.invoke.**
For the second part use aapt2 link with the --proguard option. That will generate additional required rules like:
-keep class com.example.app.MainActivity { <init>(); }
Both sets of rules will have to be passed to R8 (it can take several --pg-conf options) together with additional application specific rules.
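For example, the R8 invocation from the question could then look roughly like this (a sketch; proguard-android-optimize.txt and proguard_fastadapter.cfg are assumed to be the files holding the two rule sets described above):
$ java -jar build/libs/r8.jar --lib android-4.1.1.4.jar --lib recycleview.jar --release --output . --pg-conf proguard-android-optimize.txt --pg-conf proguard_fastadapter.cfg fastadapter.jar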
When using Android Studio (again more precisely AGP) this is handled automatically. For reference these rules can be inspected in app/build/intermediates/proguard-files. See Shrink, obfuscate, and optimize your app for more details.

Related

The class Mock_* was not found in the chain configured namespaces App\Entity [Symfony 5.3][PHPUnit 8.5]

I am trying to write PHPUnit tests for my Symfony 5.3 project, with some repositories mocked and others real.
$ bin/console -v
Symfony 5.3.10 (env: dev, debug: true)
$ bin/phpunit -V
PHPUnit 8.5.19 by Sebastian Bergmann and contributors.
Doctrine\Persistence\Mapping\MappingException: The class 'Mock_SType_b1b7aee4' was not found in the chain configured namespaces App\Entity
/var/www/cir/vendor/doctrine/persistence/lib/Doctrine/Persistence/Mapping/MappingException.php:23
/var/www/cir/vendor/doctrine/persistence/lib/Doctrine/Persistence/Mapping/Driver/MappingDriverChain.php:91
/var/www/cir/vendor/doctrine/doctrine-bundle/Mapping/MappingDriver.php:45
/var/www/cir/vendor/doctrine/orm/lib/Doctrine/ORM/Mapping/ClassMetadataFactory.php:156
/var/www/cir/vendor/doctrine/doctrine-bundle/Mapping/ClassMetadataFactory.php:19
/var/www/cir/vendor/doctrine/persistence/lib/Doctrine/Persistence/Mapping/AbstractClassMetadataFactory.php:382
/var/www/cir/vendor/doctrine/orm/lib/Doctrine/ORM/Mapping/ClassMetadataFactory.php:85
/var/www/cir/vendor/doctrine/persistence/lib/Doctrine/Persistence/Mapping/AbstractClassMetadataFactory.php:251
/var/www/cir/vendor/doctrine/orm/lib/Doctrine/ORM/EntityManager.php:293
/var/www/cir/vendor/doctrine/orm/lib/Doctrine/ORM/UnitOfWork.php:1789
/var/www/cir/vendor/doctrine/orm/lib/Doctrine/ORM/UnitOfWork.php:1764
/var/www/cir/vendor/doctrine/orm/lib/Doctrine/ORM/EntityManager.php:629
/var/www/cir/src/Component/User/UserManager.php:1059
/var/www/cir/src/Component/User/UserManager.php:383
/var/www/cir/tests/Component/User/Manager/Creation/ATCPTest.php:146
I have created a KernelTestCase where:
$sTypeRepository = $this->createStub(STypeRepository::class);
So I tried to use the actual Repository:
$sTypeRepository = static::getContainer()->get('doctrine')->getRepository(SType::class);
and I get error:
TypeError: Argument 2 passed to App\Component\User\UserManager::__construct() must be an instance of App\Repository\STypeRepository, instance of Doctrine\ORM\EntityRepository given, called in /var/www/cir/tests/Component/User/Manager/Creation/ATCPTest.php on line 122
/var/www/cir/src/Component/User/UserManager.php:214
/var/www/cir/tests/Component/User/Manager/Creation/ATCPTest.php:122
Other tests run fine for other mocked classes, like here:
$this->SUserRoleRepository = $this->createStub(SUserRoleRepository::class);
So, why do I get a MappingException or a TypeError, respectively, for some repositories but not others? Could you help me understand the errors better? Thanks!
The issue was that my SType entity did not have an accurate @ORM\Entity(...) annotation. I had:
/**
* SType.
*
* @ORM\Table(name="s_type", indexes={@ORM\Index(name="idx_s_type_1", columns={"type_group"}), @ORM\Index(name="idx_s_type_2", columns={"code"})})
* @ORM\Entity
*/
class SType {}
It needed the repositoryClass mapping:
/**
* SType.
*
* @ORM\Table(name="s_type", indexes={@ORM\Index(name="idx_s_type_1", columns={"type_group"}), @ORM\Index(name="idx_s_type_2", columns={"code"})})
* @ORM\Entity(repositoryClass="App\Repository\STypeRepository")
*/
class SType {}
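With the repositoryClass mapping in place, Doctrine's getRepository() returns the custom repository class, so the constructor type hint from the question is satisfied. A minimal PHP sketch (the other UserManager constructor arguments are hypothetical placeholders):
// In the KernelTestCase:
$sTypeRepository = static::getContainer()->get('doctrine')->getRepository(SType::class);
// $sTypeRepository is now an instance of App\Repository\STypeRepository rather than
// the generic Doctrine\ORM\EntityRepository, so this no longer raises a TypeError:
$userManager = new UserManager($firstDependency, $sTypeRepository /* , ... */);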

QBS: Explicitly setting qbs.profiles inside Products causing build to fail

My use-case is this:
I have a static library which I want to be available for some profiles (e.g. "gcc", "arm-gcc", "mips-gcc").
I also have an application which links to this library, but this applications should only build using a specific profile (e.g. "arm-gcc").
For this I am modifying the app-and-lib QBS example.
The lib.qbs file:
import qbs 1.0
Product {
qbs.profiles: ["gcc", "arm-gcc", "mips-gcc"] //I added only this line
type: "staticlibrary"
name: "mylib"
files: [
"lib.cpp",
"lib.h",
]
Depends { name: 'cpp' }
cpp.defines: ['CRUCIAL_DEFINE']
Export {
Depends { name: "cpp" }
cpp.includePaths: [product.sourceDirectory]
}
}
The app.qbs file:
import qbs 1.0
Product {
qbs.profiles: ["arm-gcc"] //I added only this line
type: "application"
consoleApplication: true
files : [ "main.cpp" ]
Depends { name: "cpp" }
Depends { name: "mylib" }
}
The app build fails. Qbs wrongly tries to link to the "gcc" version of the library instead of the "arm-gcc" version, as you can see in the log:
Build graph does not yet exist for configuration 'default'. Starting from scratch.
Resolving project for configuration default
Setting up build graph for configuration default
Building for configuration default
compiling lib.cpp [mylib {"profile":"gcc"}]
compiling lib.cpp [mylib {"profile":"arm-gcc"}]
compiling lib.cpp [mylib {"profile":"mips-gcc"}]
compiling main.cpp [app]
creating libmylib.a [mylib {"profile":"gcc"}]
creating libmylib.a [mylib {"profile":"mips-gcc"}]
creating libmylib.a [mylib {"profile":"arm-gcc"}]
linking app [app]
ERROR: /usr/bin/arm-linux-gnueabihf-g++ -o /home/user/programs/qbs/usr/local/share/qbs/examples/app-and-lib/default/app.7d104347/app /home/user/programs/qbs/usr/local/share/qbs/examples/app-and-lib/default/app.7d104347/3a52ce780950d4d9/main.cpp.o /home/user/programs/qbs/usr/local/share/qbs/examples/app-and-lib/default/mylib.eyJwcm9maWxlIjoiZ2NjIn0-.792f47ec/libmylib.a
ERROR: /home/user/programs/qbs/usr/local/share/qbs/examples/app-and-lib/default/mylib.eyJwcm9maWxlIjoiZ2NjIn0-.792f47ec/libmylib.a: error adding symbols: File format not recognized
collect2: error: ld returned 1 exit status
ERROR: Process failed with exit code 1.
The following products could not be built for configuration default:
app
The build fails only when a single profile is selected in the app.qbs file, and only if that profile is not the first one listed in the qbs.profiles line of the lib.qbs file.
When selecting two or more profiles - the build succeeds.
My analysis:
I think this problem is related to multiplexing:
The lib.qbs file contains more than one profile. This turns on multiplexing when building the library, which, in turn, adds an additional 'multiplexConfigurationId' to the build-directory name (moduleloader.cpp).
The app.qbs file contains only one profile, so multiplexing is not turned on and the build-directory name does not get the extra string.
The problem can be solved by changing the code (moduleloader.cpp) so that multiplexing is turned on even if there is only one profile, i.e. with the following patch:
--- moduleloader.cpp 2018-10-24 16:17:43.633527397 +0300
+++ moduleloader.cpp.new 2018-10-24 16:18:27.541370544 +0300
@@ -872,7 +872,7 @@
= callWithTemporaryBaseModule<const MultiplexInfo>(dummyContext,
extractMultiplexInfoFromProduct);
- if (multiplexInfo.table.size() > 1)
+ if (multiplexInfo.table.size() > 0)
productItem->setProperty(StringConstants::multiplexedProperty(), VariantValue::trueValue());
VariantValuePtr productNameValue = VariantValue::create(productName);
@@ -891,7 +891,7 @@
const QString multiplexConfigurationId = multiplexInfo.toIdString(row);
const VariantValuePtr multiplexConfigurationIdValue
= VariantValue::create(multiplexConfigurationId);
- if (multiplexInfo.table.size() > 1 || aggregator) {
+ if (multiplexInfo.table.size() > 0 || aggregator) {
multiplexConfigurationIdValues.push_back(multiplexConfigurationIdValue);
item->setProperty(StringConstants::multiplexConfigurationIdProperty(),
multiplexConfigurationIdValue);
This worked for my use case. I don't know if it makes sense in a broader view.
Finally, the questions:
Does it all make sense?
Is this a normal behavior?
Is this use-case simply not supported?
Is there a better solution?
Thanks in advance.
Yes, the default behavior with multiplexing is that a non-multiplexed product depends on all variants of the dependency. In general, there is no way for a user to change that behavior, but there should be.
However, luckily for you, profiles are special:
Depends { name: "mylib"; profiles: "arm-gcc" }
This should fix your problem.
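Applied to the app.qbs from the question, that would look roughly like this (a sketch):
import qbs 1.0
Product {
qbs.profiles: ["arm-gcc"]
type: "application"
consoleApplication: true
files : [ "main.cpp" ]
Depends { name: "cpp" }
Depends { name: "mylib"; profiles: "arm-gcc" } // only link against the arm-gcc variant of the library
}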

Why is the return type of my custom Presto UDAF not found?

The custom function I made did not work in Presto on EMR.
I want to create a simple UDAF that just returns 42.
First, I wrote a simple custom function, but it did not work in Presto.
The error shown in presto-cli is the following:
presto> select answer_to_life('the universe');
Query 20180324_120433_00000_7n6s6 failed: answer_to_life(varchar):bigint not found
com.facebook.presto.spi.PrestoException: answer_to_life(varchar):bigint not found
at com.facebook.presto.metadata.FunctionRegistry.doGetSpecializedFunctionKey(FunctionRegistry.java:972)
at com.google.common.cache.CacheLoader$FunctionToCacheLoader.load(CacheLoader.java:146)
at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3716)
at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2424)
at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2298)
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2211)
at com.google.common.cache.LocalCache.get(LocalCache.java:4154)
at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:4158)
at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:5147)
at com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:5153)
at com.facebook.presto.metadata.FunctionRegistry.getSpecializedFunctionKey(FunctionRegistry.java:898)
at com.facebook.presto.metadata.FunctionRegistry.getAggregateFunctionImplementation(FunctionRegistry.java:875)
at com.facebook.presto.sql.planner.LocalExecutionPlanner$Visitor.buildAccumulatorFactory(LocalExecutionPlanner.java:1973)
at com.facebook.presto.sql.planner.LocalExecutionPlanner$Visitor.planGlobalAggregation(LocalExecutionPlanner.java:1984)
at com.facebook.presto.sql.planner.LocalExecutionPlanner$Visitor.visitAggregation(LocalExecutionPlanner.java:955)
at com.facebook.presto.sql.planner.LocalExecutionPlanner$Visitor.visitAggregation(LocalExecutionPlanner.java:596)
at com.facebook.presto.sql.planner.plan.AggregationNode.accept(AggregationNode.java:167)
at com.facebook.presto.sql.planner.LocalExecutionPlanner$Visitor.visitExchange(LocalExecutionPlanner.java:1919)
at com.facebook.presto.sql.planner.LocalExecutionPlanner$Visitor.visitExchange(LocalExecutionPlanner.java:596)
at com.facebook.presto.sql.planner.plan.ExchangeNode.accept(ExchangeNode.java:196)
at com.facebook.presto.sql.planner.LocalExecutionPlanner$Visitor.visitAggregation(LocalExecutionPlanner.java:952)
at com.facebook.presto.sql.planner.LocalExecutionPlanner$Visitor.visitAggregation(LocalExecutionPlanner.java:596)
at com.facebook.presto.sql.planner.plan.AggregationNode.accept(AggregationNode.java:167)
at com.facebook.presto.sql.planner.LocalExecutionPlanner$Visitor.visitOutput(LocalExecutionPlanner.java:638)
at com.facebook.presto.sql.planner.LocalExecutionPlanner$Visitor.visitOutput(LocalExecutionPlanner.java:596)
at com.facebook.presto.sql.planner.plan.OutputNode.accept(OutputNode.java:82)
at com.facebook.presto.sql.planner.LocalExecutionPlanner.plan(LocalExecutionPlanner.java:393)
at com.facebook.presto.sql.planner.LocalExecutionPlanner.plan(LocalExecutionPlanner.java:324)
at com.facebook.presto.execution.SqlTaskExecution.<init>(SqlTaskExecution.java:161)
at com.facebook.presto.execution.SqlTaskExecution.createSqlTaskExecution(SqlTaskExecution.java:121)
at com.facebook.presto.execution.SqlTaskExecutionFactory.create(SqlTaskExecutionFactory.java:71)
at com.facebook.presto.execution.SqlTask.updateTask(SqlTask.java:340)
at com.facebook.presto.execution.SqlTaskManager.updateTask(SqlTaskManager.java:321)
at com.facebook.presto.server.TaskResource.createOrUpdateTask(TaskResource.java:128)
at sun.reflect.GeneratedMethodAccessor311.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory.lambda$static$0(ResourceMethodInvocationHandlerFactory.java:76)
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher$1.run(AbstractJavaResourceMethodDispatcher.java:148)
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:191)
at org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$ResponseOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:200)
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:103)
at org.glassfish.jersey.server.model.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:493)
at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:415)
at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:104)
at org.glassfish.jersey.server.ServerRuntime$1.run(ServerRuntime.java:277)
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:272)
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:268)
at org.glassfish.jersey.internal.Errors.process(Errors.java:316)
at org.glassfish.jersey.internal.Errors.process(Errors.java:298)
at org.glassfish.jersey.internal.Errors.process(Errors.java:268)
at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:289)
at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:256)
at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:703)
at org.glassfish.jersey.servlet.WebComponent.serviceImpl(WebComponent.java:416)
at org.glassfish.jersey.servlet.WebComponent.service(WebComponent.java:370)
at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:389)
at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:342)
at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:229)
at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:841)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1650)
at io.airlift.http.server.TraceTokenFilter.doFilter(TraceTokenFilter.java:63)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1637)
at io.airlift.http.server.TimingFilter.doFilter(TimingFilter.java:52)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1637)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:533)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:454)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:190)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1253)
at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:168)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:473)
at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:166)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1155)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:126)
at org.eclipse.jetty.server.handler.StatisticsHandler.handle(StatisticsHandler.java:169)
at org.eclipse.jetty.server.handler.HandlerList.handle(HandlerList.java:61)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
at org.eclipse.jetty.server.Server.handle(Server.java:564)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:317)
at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:251)
at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:279)
at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:110)
at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:124)
at org.eclipse.jetty.util.thread.Invocable.invokePreferred(Invocable.java:128)
at org.eclipse.jetty.util.thread.Invocable$InvocableExecutor.invoke(Invocable.java:222)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:294)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:199)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:673)
at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:591)
at java.lang.Thread.run(Thread.java:748)
The code of the aggregation function is the following.
I referenced this Presto code:
package sample;
import com.facebook.presto.spi.block.BlockBuilder;
import com.facebook.presto.spi.function.*;
import com.facebook.presto.spi.type.BigintType;
import com.facebook.presto.spi.type.StandardTypes;
import io.airlift.slice.Slice;
@AggregationFunction("answer_to_life")
public final class AnswerToLife {
private AnswerToLife() {
}
@InputFunction
public static void input(@AggregationState NullState state, @SqlType(StandardTypes.VARCHAR) Slice value) {
}
@CombineFunction
public static void combine(@AggregationState NullState state, @AggregationState NullState other) {
}
@OutputFunction(StandardTypes.BIGINT)
public static void output(@AggregationState NullState state, BlockBuilder out) {
BigintType.BIGINT.writeLong(out, 42);
}
}
Detail code is here (https://github.com/asari-mtr/presto-udaf/tree/stackoverflow).
The structure I deployed is the following:
$ ls -1 /usr/lib/presto/plugin/my-udaf/
commons-codec-1.4.jar
guava-21.0.jar
hive-udf-1.0-SNAPSHOT.jar
presto-array-0.197.jar
stats-0.155.jar
presto-udaf-1.0-SNAPSHOT.jar
I use emr-5.12.0 (Presto 0.188).
Thank you for your time.
Edit 1
The list of files in the jar:
% jar -tf target/presto-udaf-1.0-SNAPSHOT.jar
Picked up _JAVA_OPTIONS: -Dfile.encoding=UTF-8
META-INF/
META-INF/MANIFEST.MF
META-INF/services/
sample/
META-INF/services/com.facebook.presto.spi.Plugin
sample/AnswerToLife.class
sample/AnswerToLifePlugin.class
sample/NullState.class
META-INF/maven/
META-INF/maven/sample/
META-INF/maven/sample/presto-udaf/
META-INF/maven/sample/presto-udaf/pom.xml
META-INF/maven/sample/presto-udaf/pom.properties
server.log:
$ grep answer -i /mnt/var/log/presto/server.log
2018-03-26T06:37:14.213Z INFO main com.facebook.presto.server.PluginManager Installing sample.AnswerToLifePlugin
2018-03-26T06:37:14.214Z INFO main com.facebook.presto.server.PluginManager Registering functions from sample.AnswerToLife
And executing show functions:
presto> show functions;
Function | Return Type | Argument Types
---------------------------------+-----------------------------------------------------+------------------------------------------------------------------------------
ST_Area | double | Geometry
ST_AsText | varchar | Geometry
...
answer_to_life | bigint | varchar
...
Edit 2
Delete src/main/resources/META-INF/services/com.facebook.presto.spi.Plugin
Add <packaging>presto-plugin</packaging> to pom.xml
https://github.com/asari-mtr/presto-udaf/commit/f2a2ddbf0339e08f418b378a7ead511020e98a3b
I deployed the zip made by Maven under /usr/lib/presto/plugin/.
But the error does not change.
Edit 3
I got the source from GitHub (branch 0.188) on my Mac and built Presto.
When I placed the above UDAF in that Presto, it worked perfectly.
Perhaps there is a mistake in my installation procedure for Presto on EMR.
Solved!!
The custom plugin did not fail because of how it was created; the way it was deployed on EMR was wrong.
Since my procedure did not deploy the plugin to all nodes, I fixed it by registering the following shell script as a bootstrap action:
#!/bin/bash
aws s3 cp s3://hogehoge/presto-udaf-1.0-SNAPSHOT.zip /tmp/
sudo mkdir -p /usr/lib/presto/plugin/
sudo unzip -d /usr/lib/presto/plugin/ /tmp/presto-udaf-1.0-SNAPSHOT.zip
10.1. SPI Overview — Presto 0.198 Documentation
Plugins must be installed on all nodes in the Presto cluster
(coordinator and workers).
Create Bootstrap Actions to Install Additional Software - Amazon EMR
You can use a bootstrap action to copy objects from Amazon S3 to each
node in a cluster before your applications are installed. The AWS CLI
is installed on each node of a cluster, so your bootstrap action can
call AWS CLI commands.
From what you wrote, there are two things missing:
Make sure that you used the Maven packaging presto-plugin in your pom.xml.
Also implement com.facebook.presto.spi.Plugin, which in its getFunctions method would return your function.
Take a look at the presto-geospatial Presto module and see com.facebook.presto.plugin.geospatial.GeoPlugin and its pom.xml as a reference.
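For illustration, a minimal sketch of such a plugin class, reusing the sample.AnswerToLifePlugin name from the jar listing above (this assumes a Presto 0.18x SPI where Plugin exposes a getFunctions() method to override):
package sample;
import com.facebook.presto.spi.Plugin;
import com.google.common.collect.ImmutableSet;
import java.util.Set;
public class AnswerToLifePlugin implements Plugin {
@Override
public Set<Class<?>> getFunctions() {
// Hand the aggregation class to Presto so the function registry can pick it up.
return ImmutableSet.<Class<?>>of(AnswerToLife.class);
}
}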

Exclusion by annotation not working for sbt test/testOnly

Consider the following nested annotation marked with org.scalatest.TagAnnotation:
public class TestSizeTags {
/** Tests with crazy long runtimes **/
@org.scalatest.TagAnnotation
@Inherited
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.METHOD, ElementType.TYPE})
public static @interface HugeTestClass {}
}
Let us annotate/tag a class with it:
@HugeTestClass
class ItemsJobTest extends FunSuite with BeforeAndAfterEach with DataFrameSuiteBase {
Now we want a quick "smoke test suite" on the codebase; therefore, let us attempt to exclude the test cases annotated with HugeTestClass:
Command line:
sbt test * -- -l HugeTestClass
OR maybe:
sbt 'testOnly * -- -l HugeTestClass'
Let us also attempt it within sbt itself:
sbt> testOnly * -- -l HugeTestClass
In all cases above we (unfortunately) still see:
[info] ItemsJobTest:
^C[info] - Run Items Pipeline *** FAILED *** (2 seconds, 796 milliseconds)
So the test actually did run, contrary to the intention.
So what is the correct syntax/approach to apply a tag filter (exclusion) via sbt to ScalaTest classes?
You forgot to put the testOnly part in double quotes and to give the full package of the tag annotation to ignore:
sbt "test-only * -- -l full.package.to.HugeTestClass"
Example:
Tag annotation
package tags;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
@org.scalatest.TagAnnotation
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.METHOD, ElementType.TYPE})
public @interface ExcludeMePleaseTag {}
Test to exclude
@tags.ExcludeMePleaseTag
class ExcludeMeSpecs extends FlatSpec with Matchers {
"I" should " not run" in {
888 shouldBe 1
}
}
To exclude the tests:
sbt "test-only * -- -l tags.ExcludeMePleaseTag"
This GitHub issue was helpful: https://github.com/harrah/xsbt/issues/357#issuecomment-44867814
But it does not work with a nested static tag annotation:
public class WrapperClass {
@org.scalatest.TagAnnotation
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.METHOD, ElementType.TYPE})
public static @interface ExcludeMePleaseTag {
}
}
sbt "test-only * -- -l tags.WrapperClass.ExcludeMePleaseTag"

How do you create rules in BJAM?

I would like to compile a file with a specific compiler not supported by Boost.Build. I made a rule:
rule my_rule ( source : target )
{
compile_specially source target ;
}
actions compile_specially
{
my_compile_command $(my_parameters) $(1) -o $(2)
}
Now this code builds the file into the Jamroot directory (obviously). I, however, want it to be built in the regular target path (bin/gcc-4.4/release/threading-multi/...). So how do I get/generate the standard path in my_rule?
Thank you in advance.
Use the "make" target (see BBv2 Custom Commands). Of course this doesn't really get you what I think you are after. What you want is to create a new toolset which is considerably more complicated. You'll want to read through the extender chapter of the documentation. In particular the tools and generators section. And join us in the Boost Build mail list for detailed questions and debugging.
Not exactly what you are asking, but it looks similar enough to help. In the example below I compile D language source files using gdc and eventually link them with C libraries (writing my own C functions would require defining interface D modules, as D name mangling differs from C and would otherwise raise linking issues). I followed what is described in the tools and generators section (see the answer by @GrafikRobot) to achieve this, and it was quite easy.
Here is sample jam files and code.
gdc.jam
import type ;
type.register D : d ;
import generators ;
generators.register-standard gdc.compile : D : OBJ ;
actions compile
{
# "echo" $(>) $(<)
"gdc" -c -o $(<) $(>)
}
Jamroot
import gdc ;
project hello
: requirements
<cflags>-O3
: default-build release
;
lib gphobos2 : : <file>/usr/lib/gcc/x86_64-linux-gnu/4.6/libgphobos2.a <name>gphobos2 ;
lib m : : <name>m ;
lib z : : <name>z ;
lib rt : : <name>rt ;
lib pthread : : <name>pthread <link>shared ;
exe hello
:
hello.d
bye.d
gphobos2 m z rt pthread
:
<link>static
;
Hello.d
import std.stdio;
void main()
{
writeln("Hello World!");
static import bye ;
bye.bye();
}
Bye.d
module bye;
import std.stdio;
void bye()
{
writeln("Good bye");
}
