I'm trying to figure out which "safe" ECC curves are supported in Bouncy Castle. I found a few curves in the namespace Org.BouncyCastle.Asn1, but they are hard to find, and I'm sure I'm missing some.
Do any of the following curves exist in Bouncy Castle? (should I use them?)
M-221
E-222
Curve1174
Curve25519
E-382
M-383
Curve383187
Curve41417
Ed448-Goldilocks
M-511
E-521
I found an (apparently) definitive list of the ECC curves supported by Bouncy Castle. It seems to match the named curves defined in the codebase.
None of the curve names match the names you listed.
However, there is nothing preventing you from tracking down1 and using the parameters that define any of the curves you have listed to define an ECParameterSpec ... or an ECNamedCurveParameterSpec.
1 - The parameters are in the paper you linked to. According to @mentalurg, it is not simple to get them into the correct form. However, this is an open-source project, so if >>you<< care about this, there is nothing preventing you from doing the work and submitting a patch. Or, if you don't have the time, sponsoring them to do the work for you.
@Stephen C: "tracking down and using the parameters that define any of the curves" - wrong. The parameters (A and B) are only available for the Weierstrass form. For the Edwards or Montgomery forms, one has to do an (error-prone) coordinate transformation to Weierstrass form, perform the cryptographic operation, then transform the results back to the original coordinate system.
Besides transformation errors, the performance of such a transformed curve might not be optimal.
Both the native Java implementation and Bouncy Castle lack direct support for curve forms other than Weierstrass. And that is the problem.
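To make the transformation above concrete, here is a minimal Python sketch (not Bouncy Castle code) that maps Curve25519's Montgomery form to short-Weierstrass form and checks that the base point lands on the resulting curve. The curve constants are the published Curve25519 parameters; the mapping formulas assume B = 1, as is the case for Curve25519.

```python
# Curve25519 in Montgomery form: v^2 = u^3 + A*u^2 + u over GF(p), B = 1.
p = 2**255 - 19
A = 486662

def sqrt_mod(n, p):
    """Square root mod p for p ≡ 5 (mod 8) (Atkin's method)."""
    c = pow(n, (p + 3) // 8, p)
    if c * c % p != n % p:
        c = c * pow(2, (p - 1) // 4, p) % p
    return c

inv3 = pow(3, p - 2, p)
# Substituting u = x - A/3 gives y^2 = x^3 + a*x + b with:
#   a = (3 - A^2)/3,  b = (2*A^3 - 9*A)/27
a = (3 - A * A) * inv3 % p
b = (2 * A**3 - 9 * A) * pow(27, p - 2, p) % p

# Map the Montgomery base point (u = 9) to Weierstrass coordinates.
u = 9
v = sqrt_mod((u**3 + A * u * u + u) % p, p)
x = (u + A * inv3) % p   # x = u + A/3
y = v                    # y = v, since B = 1

assert y * y % p == (x**3 + a * x + b) % p  # point satisfies the Weierstrass eq.
```

This shows the pure algebra is mechanical; the error-prone part in practice is doing it consistently for point encoding, negative coordinates, and the inverse mapping on every result.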
I've read through most of the questions posted here tagged with OpenMDAO on discrete variables and have reviewed the documentation here, but I cannot find an answer to my question.
Here is a description of my use-case:
Let's start with the circuit example here. Now let's assume that I have a set of R values I would like to use. Perhaps in my box of hardware I have 3 types of resistors available to me that I must take advantage of.
With the resistors available, I would like to find a configuration that constrains the net current to 0 but minimizes the voltages at the nodes. Is OpenMDAO capable of taking in sets of discrete variables to select an optimal design for the other components? What would be the proper methods for this use-case? Is there any documentation or publication that I could use as a reference for this type of effort?
Overall I'm looking to use OpenMDAO to define bespoke hardware requirements in cooperation with available COTS components to meet a performance need. Am I barking up the right tree?
There is some work on discrete optimization in OpenMDAO. There is specifically some support for discrete variables in components. The SimpleGADriver also supports discrete variables. If you are looking for something more advanced than that, you can check out the AMIEGO driver.
I'm not fully sure how to pose the optimization problem you've described, but there is some relevant OpenMDAO code for circuit analysis in the Zappy library that you might want to check out.
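Setting OpenMDAO aside for a moment, the discrete part of the problem can be sketched as a brute-force enumeration of the design space, which is essentially what a driver like SimpleGADriver searches (stochastically rather than exhaustively). The resistor values, supply voltage, and target voltage below are made-up numbers, and the two-resistor divider is a stand-in for the real circuit equations:

```python
from itertools import product

# Hypothetical stock of resistors and operating point (made-up numbers).
stock = [100.0, 470.0, 1000.0]   # available resistor values, ohms
v_in, v_target = 9.0, 3.3        # supply voltage and desired node voltage

def divider_error(r1, r2):
    """Distance of a two-resistor divider's output from the target voltage."""
    return abs(v_in * r2 / (r1 + r2) - v_target)

# Exhaustively enumerate the discrete design space (3^2 = 9 candidates)
# and keep the combination with the smallest error.
r1, r2 = min(product(stock, repeat=2), key=lambda rs: divider_error(*rs))
print(r1, r2, v_in * r2 / (r1 + r2))
```

Exhaustive search is fine for a handful of components, but the candidate count grows exponentially with the number of discrete choices, which is where GA-style drivers or AMIEGO become worthwhile.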
I just want to get the main idea/principle of OpenFOAM and how you create a simulation; please let me know where I go wrong.
So basically you have an object that interacts with a gas or liquid and you want to simulate this. So you create a model of the object, mesh it, specify where the gas will flow in and out and which surfaces are walls, set the other relevant parameters, and then run the program (with the appropriate time step, etc.)?
OpenFOAM is an open source C++ library which implements the finite volume method (FVM), which is widely used in CFD.
What you have explained is a rough understanding of some of the applications of CFD. The things you specified might not always be the case (i.e., the fluid might not necessarily be a gas, and so on).
The main stages of a CFD problem are: geometry creation - mesh generation - preprocessing - solving - postprocessing.
There might be more stages added depending on the resolution and other specifics of the case.
Now OpenFOAM is an open-source (free for all) tool written in C++ which helps solve CFD problems. If the problem is simple and routine and you have access to a commercial solver such as ANSYS Fluent, then you can use that, since it is easier and much less work when the problem is not specific. However, if the problem is specific and there are customized criteria, OpenFOAM is a nice tool.
Since it is written in C++, it is object-oriented, and there are many different solvers already written and available to use, so you will not have to write all the schemes and everything on your own from scratch.
However, my main advice to you is to read more about CFD to get a clear understanding; there are dozens of good books available.
I'd like to write a library that's a thin wrapper around some of the functionality in BTreeMap. I'd prefer not to tightly couple it to that particular data structure though. Strictly speaking, I only need a subset of its functionality, something along the lines of the NavigableMap interface in Java. I was hoping to find an analogous trait I could use. I seem to recall that at some point there were traits like Map and MutableMap in the standard library, but they seem to be absent now.
Is there a crate that defines these? Or will they eventually be re-added to std?
No, right now there's only Iterator. MutableMap and Map were removed somewhere along the road to the stabilization of std for Rust 1.0.
There have been various discussions about re-adding traits to std. See these discussions on Rust internals:
Traits that should be in std, but aren’t
or (less recent but more specifically on collections):
Collection Traits, Take 2
Bottom line: everybody wants some form of those traits in std, but nobody wants to commit to adding and supporting the wrong ones in the standard library until a clearer picture of what is ergonomic emerges.
I have made a few annotators in UIMA and now I want to check their performance. Is there a standardized way to gauge the performance of the annotators?
UIMA itself does not provide immediate support for comparing annotators and evaluating them against a gold standard.
However, there are various tools/implementations out there that provide such functionality on top of UIMA but typically within the confines of the particular tool, e.g.:
U-Compare supports running multiple annotators doing the same thing and comparing their results
WebAnno is an interactive annotation tool that uses UIMA as its backend and that supports comparing annotations from multiple users to each other. There is a class called "CasDiff2" in the code that generates differences and feeds them into DKPro Statistics in the background for the actual agreement calculation. Unfortunately, CasDiff2 cannot really be used separately from WebAnno (yet).
Disclosure: I'm on the WebAnno team and have implemented CasDiff2 in there.
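To illustrate the kind of agreement calculation that a library like DKPro Statistics performs, here is a standalone Python sketch of the simplest such measure, Cohen's kappa, for two annotators assigning one label per item. This is not WebAnno or CasDiff2 code, and the labels are made up; it only shows the underlying idea of correcting observed agreement for chance agreement.

```python
from collections import Counter

def cohen_kappa(a, b):
    """Cohen's kappa for two annotators labelling the same n items.

    Assumes at least two distinct labels occur overall, so chance
    agreement pe < 1 and the denominator is nonzero.
    """
    assert len(a) == len(b) and a
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n                # observed agreement
    ca, cb = Counter(a), Counter(b)
    pe = sum(ca[l] * cb[l] for l in ca.keys() | cb.keys()) / (n * n)  # chance
    return (po - pe) / (1 - pe)

# Two annotators labelling four spans (made-up labels):
k = cohen_kappa(["PER", "PER", "LOC", "LOC"],
                ["PER", "PER", "LOC", "PER"])
# k == 0.5: 75% raw agreement, corrected for 50% chance agreement.
```

For real annotator evaluation you would also want precision/recall/F1 against a gold standard, which is exactly the part UIMA leaves to external tooling.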
I'm currently investigating the use of curve25519 for signing. Original distribution and a C implementation (and a second C implementation).
Bernstein suggests using ECDSA for this, but I could not find any code.
ECDSA is specified by ANSI X9.62. That standard defines the kinds of curves on which ECDSA is defined, including details such as curve equations, key representations, and so on. These do not match Curve25519: part of the optimizations that make Curve25519 faster than standard curves of the same size rely on its special curve equation, which does not fit into the X9.62 formalism. Correspondingly, there cannot be any implementation of ECDSA that both conforms to ANSI X9.62 and uses Curve25519. In practice, I know of no implementation of an ECDSA-like algorithm on Curve25519.
To be brief, you are on your own. You may want to implement ECDSA over the Curve25519 implementation by following X9.62 (there is a draft from 1998 which can be downloaded from several places, e.g. there, or you can spend a hundred bucks and get the genuine 2005 version from Techstreet). But be warned that you are walking outside the carefully trodden paths of analyzed cryptography; in other words, I explicitly deny any kind of guarantee on how secure that kind-of-ECDSA would be.
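To get a feel for what "ECDSA over some curve" involves, here is a toy Python sketch over a deliberately tiny short-Weierstrass curve (GF(97), subgroup order n = 5), with the message hash replaced by a small integer z. It mirrors the shape of the X9.62 sign/verify equations but none of the engineering (hashing, random nonces, parameter validation), and it offers no security whatsoever:

```python
# Toy curve y^2 = x^3 + 2x + 3 over GF(97); G = (3, 6) generates a
# subgroup of order n = 5. Real ECDSA uses a ~256-bit curve.
p, a, b = 97, 2, 3
G, n = (3, 6), 5

def add(P, Q):
    """Affine point addition; None is the point at infinity."""
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None
    if P == Q:
        m = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (m * m - x1 - x2) % p
    return (x3, (m * (x1 - x3) - y1) % p)

def mul(k, P):
    """Double-and-add scalar multiplication."""
    R = None
    while k:
        if k & 1:
            R = add(R, P)
        P = add(P, P)
        k >>= 1
    return R

def sign(d, z, k):
    # z stands in for the message hash; k must be a fresh random nonce.
    r = mul(k, G)[0] % n
    s = pow(k, -1, n) * (z + r * d) % n
    return r, s  # a real implementation retries with new k if r == 0 or s == 0

def verify(Q, z, sig):
    r, s = sig
    w = pow(s, -1, n)
    P = add(mul(z * w % n, G), mul(r * w % n, Q))
    return P is not None and P[0] % n == r

d = 2                      # toy private key
Q = mul(d, G)              # public key Q = dG
sig = sign(d, z=1, k=1)
assert verify(Q, 1, sig)
```

Every step above assumes the generic Weierstrass addition law, which is exactly what the Curve25519 code base does not expose; that is the gap you would have to bridge.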
My advice would be to stick to standard curves (such as NIST P-256). Note that while Curve25519 is faster than most curves of the same size, smaller standard curves will be faster, and yet provide adequate security for most purposes. NIST P-192, for instance, provides "96-bit security", somewhat similar to 1536-bit RSA. Also, standard curves already provide performance on the order of several thousand signatures per second on a small PC, and I have trouble imagining a scenario where more performance is needed.
To use Curve25519 for this, you'd have to implement a lot of functions that AFAIK aren't currently implemented anywhere for this curve, which would mean getting very substantially into the mathematics of elliptic curve cryptography. The reason is that the existing functions throw away the "y" coordinate of the point and work only with the "x" coordinate. Without the "y" coordinate, the points P and -P look the same. That's fine for ECDH which Curve25519 is designed for, because |x(yG)| = |x(-yG)|. But for ECDSA you need to calculate aG + bP, and |aG + bP| does not in general equal |aG - bP|. I've looked into what would be involved in extending curve25519-donna to support such calculations; it's doable, but far from trivial.
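The P versus -P ambiguity is easy to see on any short-Weierstrass curve. This toy Python sketch (a tiny made-up curve, nothing to do with curve25519-donna's internals) shows that the x-coordinate alone cannot distinguish P from -P, while the sums needed for verification do differ:

```python
# Toy curve y^2 = x^3 + 2x + 3 over GF(97), used only for illustration.
p, a, b = 97, 2, 3

def on_curve(P):
    x, y = P
    return y * y % p == (x**3 + a * x + b) % p

def add(P, Q):
    """Affine point addition (no point-at-infinity handling needed here)."""
    (x1, y1), (x2, y2) = P, Q
    if P == Q:
        m = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (m * m - x1 - x2) % p
    return (x3, (m * (x1 - x3) - y1) % p)

G, P = (3, 6), (1, 43)
negP = (P[0], -P[1] % p)               # same x-coordinate as P
assert on_curve(P) and on_curve(negP)  # x alone cannot tell them apart

S1, S2 = add(G, P), add(G, negP)       # G + P  vs  G - P
assert S1[0] != S2[0]                  # the x-coordinates differ
```

This is why x-only (Montgomery ladder) code is sufficient for ECDH but must be extended with full point arithmetic before an ECDSA-style aG + bP can be computed.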
Since what you need most of all is fast verification, I recommend Bernstein's Rabin-Williams scheme.
I recently shared the curve25519 library that I developed a while back. It is hosted at https://github.com/msotoodeh and provides more functionality, higher security, and higher performance than any other portable-C library I have tested. It outperforms curve25519-donna by a factor of almost 2 on 64-bit platforms and a factor of almost 4 on 32-bit targets.
Today, many years after this question was asked, the correct answer is the signature scheme Ed25519.