I'm trying to add SQLite support to my Rust project. I found rusqlite on crates.io, added the version line to Cargo.toml, and added some (unused) imports that I found on the rusqlite docs page. After running cargo build I received an error.
I hadn't implemented anything yet; I just wanted to get the dependency added and compiling. main.rs:
extern crate rusqlite;
use rusqlite::{Connection, Result};
use rusqlite::NO_PARAMS;
Cargo.toml:
[package]
name = "program"
version = "0.1.0"
authors = ["97"]
[dependencies]
argparse = "0.2.2"
rand = "0.4.0"
rusqlite = "0.20.0"
Error received:
$ cargo build
Compiling pkg-config v0.3.16
Compiling fallible-iterator v0.2.0
Compiling memchr v2.2.1
Compiling bitflags v1.2.1
Compiling lru-cache v0.1.2
error[E0432]: unresolved import `std::ops::Bound`
--> /home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/pkg-config-0.3.16/src/lib.rs:72:16
|
72 | use std::ops::{Bound, RangeBounds};
| ^^^^^ no `Bound` in `ops`
error[E0432]: unresolved import `std::ops::RangeBounds`
--> /home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/pkg-config-0.3.16/src/lib.rs:72:23
|
72 | use std::ops::{Bound, RangeBounds};
| ^^^^^^^^^^^ no `RangeBounds` in `ops`
error[E0658]: `dyn Trait` syntax is unstable (see issue #44662)
--> /home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/pkg-config-0.3.16/src/lib.rs:143:32
|
143 | fn cause(&self) -> Option<&dyn error::Error> {
| ^^^^^^^^^^^^^^^^
error: aborting due to 3 previous errors
error: Could not compile `pkg-config`.
warning: build failed, waiting for other jobs to finish...
error[E0658]: `crate` in paths is experimental (see issue #45477)
--> /home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/fallible-iterator-0.2.0/src/lib.rs:98:5
|
98 | use crate::imports::*;
| ^^^^^
error[E0658]: `dyn Trait` syntax is unstable (see issue #44662)
--> /home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/fallible-iterator-0.2.0/src/lib.rs:2606:24
|
2606 | fn _is_object_safe(_: &dyn DoubleEndedFallibleIterator<Item = (), Error = ()>) {}
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
error: aborting due to 2 previous errors
error: Could not compile `fallible-iterator`.
warning: build failed, waiting for other jobs to finish...
error: build failed
Edit:
cargo 0.26.0 (41480f5cc 2018-02-26)
rustc 1.25.0 (84203cac6 2018-03-25)
Most crates only support fairly recent Rust versions. You could try pinning older versions of the crates, but the easiest fix is to update Rust to a currently supported version.
If you're using Rust from a Linux distro, uninstall it and get Rust from https://rustup.rs instead, which can keep it up to date; the Rust ecosystem moves much faster than most distributions.
Then run rustup update.
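A minimal sequence, assuming rustup has already been installed from the page above, might look like this:
rustup update stable   # update to the current stable toolchain
rustc --version        # confirm you are no longer on 1.25.0
cargo build            # pkg-config and fallible-iterator should now compile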
Related
I'm trying to compile the openindoor6-main library after some changes using the Closure Compiler, and I get JSC_JS_MODULE_LOAD_WARNING errors on the controls.js file.
D:\openindoor>java -jar ./node_modules/google-closure-compiler-java/compiler.jar --flagfile ./node_modules/openindoor6-main/flagfile.conf
./node_modules/openindoor6-main/src/controls.js:1:0: ERROR - [JSC_JS_MODULE_LOAD_WARNING] Failed to load module "@maplibre/maplibre-gl-geocoder"
1| import MaplibreGeocoder from '@maplibre/maplibre-gl-geocoder';
^
./node_modules/openindoor6-main/src/controls.js:4:0: ERROR - [JSC_JS_MODULE_LOAD_WARNING] Failed to load module "@turf/center"
4| import center from '@turf/center';
^
./node_modules/openindoor6-main/src/controls.js:5:0: ERROR - [JSC_JS_MODULE_LOAD_WARNING] Failed to load module "@turf/centroid"
5| import centroid from '@turf/centroid';
I have a flagfile.conf file that includes --js directives to load these modules, which were installed using npm install.
Any hint on how to proceed?
I need to use the open-source framework OpenMDAO for my project on Ubuntu. I have successfully installed mpi4py, petsc, and petsc4py in a new Anaconda environment, and I have also installed pyoptsparse and other libraries.
After installation, when I run the test command $ testflo openmdao -n 1
it gives this error:
(omd) mujahed@Lenovo-G50-80:~$ testflo openmdao -n 1
.............................................................................
+------------------------------------------------------------------------------+
| pyOptSparse Error: There was an error importing the compiled snopt module |
+------------------------------------------------------------------------------+
SSS......../home/mujahed/anaconda3/envs/omd/lib/python3.8/site-packages/openmdao/core/group.py:1113: UserWarning:Group (sub): Attempted to connect from 'tgt.x' to 'cmp.x', but 'tgt.x' is an input. All connections must be from an output to an input.
.......................................................................................................................................................................................................................................................................................................................................................S...............................................................................................................................S........................................................................S.SS........................................................................................................................................................................................................................................................................SSS.SSSSSS..S.................................................................................................................................................................................................................................................................................................................S.......
+------------------------------------------------------------------------------+
| pyOptSparse Error: There was an error importing the compiled snopt module |
+------------------------------------------------------------------------------+
.....................................................................(mpi) /home/mujahed/anaconda3/envs/omd/lib/python3.8/site-packages/openmdao/core/tests/test_prob_remote2.py:ProbRemoteTestCase.test_get_remote ... FAIL (00:00:0.02, 203 MB)
Traceback (most recent call last):
File "/home/mujahed/anaconda3/envs/omd/lib/python3.8/site-packages/testflo/test.py", line 425, in _try_call
func()
File "/home/mujahed/anaconda3/envs/omd/lib/python3.8/site-packages/openmdao/core/tests/test_prob_remote2.py", line 200, in test_get_remote
prob.get_val('par.c2.x', get_remote=False)
File "/home/mujahed/anaconda3/envs/omd/lib/python3.8/unittest/case.py", line 227, in __exit__
self._raiseFailure("{} not raised".format(exc_name))
File "/home/mujahed/anaconda3/envs/omd/lib/python3.8/unittest/case.py", line 164, in _raiseFailure
raise self.test_case.failureException(msg)
AssertionError: RuntimeError not raised
...
+------------------------------------------------------------------------------+
| pyOptSparse Error: There was an error importing the compiled snopt module |
+------------------------------------------------------------------------------+
SSS...............................................S.....X......................................................................................................................................................................................
+------------------------------------------------------------------------------+
| pyOptSparse Error: There was an error importing the compiled snopt module |
+------------------------------------------------------------------------------+
..SS............S...capi_return is NULL
Call-back cb_slfunc_in_slsqp__user__routines failed.
.capi_return is NULL
Call-back cb_slfunc_in_slsqp__user__routines failed.
....S..........................SSSSSSS.......
Normal return from subroutine COBYLA
NFVALS = 56 F =-1.080000E+02 MAXCV = 0.000000E+00
X = 3.500001E+00 -3.500001E+00
.................................................
Normal return from subroutine COBYLA
NFVALS = 124 F =-2.733333E+01 MAXCV = 0.000000E+00
X = 6.666667E+00 -7.333332E+00
.....................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................SSSSSSSSSSSSSSSSSSSSSSSS..............
Normal return from subroutine COBYLA
NFVALS = 54 F =-2.700000E+01 MAXCV = 0.000000E+00
X = 6.999999E+00 -6.999999E+00
.............................................SSS..............................................................................................................S...........................................................................................................................................................................................................................................................................................................................................................
The following tests failed:
test_prob_remote2.py:ProbRemoteTestCase.test_get_remote
Passed: 2619
Failed: 1
Skipped: 62
Ran 2682 tests using 1 processes
Wall clock time: 00:22:33.07
I don't have a computer science background, and this is the first time I am dealing with something like this.
I am using:
openMDAO 3.2.1
mpich 3.3.2
mpi4py 3.0.3
petsc 3.13.4
petsc4py 3.13.0
pyoptsparse 2.1.5
python 3.8.5
I built pyoptsparse from https://github.com/OpenMDAO/build_pyoptsparse with the command ./build_pyoptsparse.sh.
I have Ubuntu 20.04.1 LTS and I am using Anaconda3. How can I pass all the tests (without any skipped)?
Can anyone please help me with this?
Apologies for this... it looks like an extraneous file (test_prob_remote2.py) made it into the distribution. It is not a legitimate test and you can ignore it.
The "pyOptSparse Error" messages are just telling you that you don't have the SNOPT optimizer installed, which is fine.
Your installation should be good.
Some of that test output can be a bit confusing. The error you're getting isn't actually about SNOPT. That's a warning spit out by pyoptsparse, but it is not a problem.
The actual failed test is identified by this line:
The following tests failed:
test_prob_remote2.py:ProbRemoteTestCase.test_get_remote
And the error that causes the failure is here:
File "/home/mujahed/anaconda3/envs/omd/lib/python3.8/site-packages/testflo/test.py", line 425, in _try_call
func()
File "/home/mujahed/anaconda3/envs/omd/lib/python3.8/site-packages/openmdao/core/tests/test_prob_remote2.py", line 200, in test_get_remote
prob.get_val('par.c2.x', get_remote=False)
File "/home/mujahed/anaconda3/envs/omd/lib/python3.8/unittest/case.py", line 227, in __exit__
self._raiseFailure("{} not raised".format(exc_name))
File "/home/mujahed/anaconda3/envs/omd/lib/python3.8/unittest/case.py", line 164, in _raiseFailure
raise self.test_case.failureException(msg)
AssertionError: RuntimeError not raised
For some reason your install isn't raising an error when one is expected. I'm not sure why that's going on, but if you're not planning to run with MPI then it's not a problem.
Since you have bothered to install MPI, you are probably planning to run things in parallel... this one failed test is certainly odd. If you had a serious problem with your MPI install, I would expect more failed tests.
You could try running testflo like this:
testflo openmdao -n 1 --show_skipped
to see if more MPI tests are being skipped... that might help you narrow it down.
While I was attempting to use Requests in Julia, the following error was output:
julia> using Requests
INFO: Precompiling module Requests...
ERROR: LoadError: LoadError: error compiling version: could not load library "libz"
libz: cannot open shared object file: No such file or directory
while loading /home/michael/.julia/v0.4/Libz/src/lowlevel.jl, in expression starting on line 110
while loading /home/michael/.julia/v0.4/Libz/src/Libz.jl, in expression starting on line 11
ERROR: LoadError: Failed to precompile Libz to /home/michael/.julia/lib/v0.4/Libz.ji
while loading /home/michael/.julia/v0.4/Requests/src/Requests.jl, in expression starting on line 27
ERROR: Failed to precompile Requests to /home/michael/.julia/lib/v0.4/Requests.ji
in compilecache at ./loading.jl:400
I'm not knowledgeable enough in Julia to discern exactly what is happening, but here is the code from Libz.jl (line 11)...
include("lowlevel.jl")
...from lowlevel.jl (lines 103-110)...
# Functions
# ---------
function version()
return unsafe_string(ccall((:zlibVersion, zlib), Ptr{UInt8}, ()))
end
const zlib_version = version()
...and from Requests.jl (line 27)
using Libz
This problem has persisted after I've removed and then reinstalled Libz, MbedTLS, and Requests, and after I've run Pkg.update() and restarted julia and my computer. Is anyone well enough versed in Julia to know how to fix this?
Per the comment by Gnimuc K and a tiny bit more research:
sudo apt-get install zlib1g-dev
installs zlib, which Julia needed. Once it was installed...
julia> Pkg.update()
julia> Pkg.build("Libz")
worked all the kinks out.
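After that, a quick sanity check (assuming nothing else is missing on the system) is simply to load the package again:
julia> using Requests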
2016-02-01 15:02:19,152 | ERROR | FelixStartLevel | BootFeaturesInstaller | 20 - org.apache.karaf.features.core - 3.0.5 | Error installing boot features
java.lang.Exception: Could not start bundle mvn:com.fasterxml.jackson.module/jackson-module-scala_2.11/2.6.2 in feature(s) de-support-0.0.0, swagger-2.11-6.1.0: Unresolved constraint in bundle com.fasterxml.jackson.module.jackson.module.scala [274]: Unable to resolve 274.0: missing requirement [274.0] osgi.wiring.package; (&(osgi.wiring.package=com.fasterxml.jackson.module.paranamer)(version>=2.6.0)(!(version>=3.0.0)))
I am getting the above exception while starting Karaf, even after providing
<bundle>mvn:com.fasterxml.jackson.module/jackson-module-scala_2.11/2.6.2</bundle>
in the features.xml file.
You're getting this message because one of your bundles imports the package com.fasterxml.jackson.module.paranamer, but you have not included a bundle in your feature that exports this package. A search for com.fasterxml.jackson.module.paranamer brings up the POM for Jackson Paranamer, which shows that it exports the package you need (look at the osgi.export property). So add:
<bundle>mvn:com.fasterxml.jackson.module/jackson-module-paranamer/2.6.2</bundle>
to your feature in your features.xml.
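For reference, a minimal sketch of what the feature could end up looking like, assuming it is the de-support feature named in the error (adjust the name, version, and bundle list to match your actual features.xml):
<feature name="de-support" version="0.0.0">
  <bundle>mvn:com.fasterxml.jackson.module/jackson-module-scala_2.11/2.6.2</bundle>
  <bundle>mvn:com.fasterxml.jackson.module/jackson-module-paranamer/2.6.2</bundle>
</feature>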
set -e pipefail; sbt -Dsbt.log.noformat=true -DchiselVersion="latest.release" "run Parity --genHarness --compile --test --backend c --vcd " | tee Parity.out
Getting org.scala-sbt sbt 0.13.8 ...
problems summary ::
WARNINGS
module not found: org.scala-sbt#sbt;0.13.8
::::::::::::::::::::::::::::::::::::::::::::::
:: UNRESOLVED DEPENDENCIES ::
::::::::::::::::::::::::::::::::::::::::::::::
:: org.scala-sbt#sbt;0.13.8: not found
::::::::::::::::::::::::::::::::::::::::::::::
ERRORS
Server access Error: Connection refused url=https://repo.typesafe.com/typesafe/ivy-releases/org.scala-sbt/sbt/0.13.8/ivys/ivy.xml
Server access Error: Connection refused url=https://repo1.maven.org/maven2/org/scala-sbt/sbt/0.13.8/sbt-0.13.8.pom
Server access Error: Connection refused url=https://repo1.maven.org/maven2/org/scala-sbt/sbt/0.13.8/sbt-0.13.8.jar
:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
unresolved dependency: org.scala-sbt#sbt;0.13.8: not found
Error during sbt execution: Error retrieving required libraries
Error: Could not retrieve sbt 0.13.8
This might be a proxy issue.
Edit the sbtconfig.txt file in the $SBT_HOME/conf directory and add the following entries:
-Dhttp.proxyHost=<proxy server>
-Dhttp.proxyPort=<proxy port>
-Dhttp.proxyUser=<username>
-Dhttp.proxyPassword=<password>
-Dhttps.proxyHost=<proxy server>
-Dhttps.proxyPort=<proxy port>
-Dhttps.proxyUser=<username>
-Dhttps.proxyPassword=<password>
-Dftp.proxyHost=<proxy server>
-Dftp.proxyPort=<proxy port>
-Dftp.proxyUser=<username>
-Dftp.proxyPassword=<password>
Notes:
The https settings are necessary because many of the URLs SBT refers to are https-based.
Don't include "http://" in the values (see the filled-in example below).
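For example, with a hypothetical proxy at proxy.example.com on port 8080 and no authentication, the http and https entries would look like this:
-Dhttp.proxyHost=proxy.example.com
-Dhttp.proxyPort=8080
-Dhttps.proxyHost=proxy.example.com
-Dhttps.proxyPort=8080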
I faced a similar issue. It seems the problem was with the java being used: by mistake my environment was pointing to the JRE rather than the JDK. After pointing JAVA_HOME to the right place as shown below, sbt clean package compile worked fine.
[root@spark-sql-perf]# update-alternatives --config java
There are 2 programs which provide 'java'.
Selection Command
*+ 1 java-1.8.0-openjdk.ppc64le (/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.111-2.b15.el7_3.ppc64le/jre/bin/java)
2 java-1.7.0-openjdk.ppc64le (/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.121-2.6.8.0.el7_3.ppc64le/jre/bin/java)
Enter to keep the current selection[+], or type selection number: q
There are 2 programs which provide 'java'.
Selection Command
*+ 1 java-1.8.0-openjdk.ppc64le (/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.111-2.b15.el7_3.ppc64le/jre/bin/java)
2 java-1.7.0-openjdk.ppc64le (/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.121-2.6.8.0.el7_3.ppc64le/jre/bin/java)
Enter to keep the current selection[+], or type selection number: ^C
[root@spark-sql-perf]# export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.111-2.b15.el7_3.ppc64le/
[root@spark-sql-perf]# export PATH=$JAVA_HOME/bin:$PATH
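To confirm the change took effect, something like the following can be run afterwards (the exact output depends on the system):
[root@spark-sql-perf]# echo $JAVA_HOME
[root@spark-sql-perf]# java -version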