Can it be proven that no polynomial algorithm exists for an NP-Complete problem?

I can't quite grasp what it means to say a problem is NP-Complete. Could anyone help me with the following question?
An NP-complete problem is a problem for which one can prove that an algorithm for solving it in polynomial time does not exist. Is this statement true?
I want to say the statement isn't true, because can anyone actually prove that such an algorithm doesn't exist for any NP-Complete problem? From looking around various sources, I understand that no polynomial-time algorithm is known for any NP-Complete problem; however, that is not the same as proving one cannot exist.
Any help would be greatly appreciated. Thanks.

It is possible in some situations to prove that no algorithm exists that beats a certain bound.
For example, the Ω(n log n) lower bound for comparison sorting has been proven. No matter how clever we become in the future, we can be sure that no one will ever invent an O(n) comparison sort.
For NP-Complete problems, though, no one has found such a proof. But that doesn't mean it can't be proven.

The statement is wrong in a more fundamental way: there are problems that provably cannot be solved in polynomial time and that are much harder than the problems in NP (for example, by the time hierarchy theorem, EXPTIME-complete problems are provably outside P). The point of NP-completeness is that the existence of a polynomial-time solution is equivalent to P = NP (which also means that proving no such solution exists would prove P ≠ NP).

Related

Solver selection: NonlinearBlockGS vs NewtonSolver

I have two OpenMDAO groups with a cyclic dependency between them. I calculate the derivatives using the complex step method. I have a nonlinear solver for the cycle and use SLSQP to optimize my objective function. The issue is with the choice of the nonlinear solver. When I use NonlinearBlockGS the optimization is successful in 12 iterations. But when I use NewtonSolver with DirectSolver or ScipyKrylov the optimization fails ("Iteration limit exceeded"), even with maxiter=2000. The cyclic connections converge; it is just that the design variables do not reach the optimal values. The difference between the design variables in consecutive iterations is on the order of 1e-5, and this increases the number of iterations needed. Also, when I change the initial guess to a value closer to the optimal value, it works.
To check further, I converted the model into IDF form (by creating copies of the coupling variables and adding consistency constraints), thereby removing the need for a solver. Now the optimization is successful in 5 iterations and the results are similar to the results when NonlinearBlockGS is used.
Why does this happen? Am I missing something? When should I use NewtonSolver over the others? I know that it is difficult to answer without seeing the code, but my code is long, with multiple components, and I couldn't recreate the issue with a toy model. So any general insight is much appreciated.
Without seeing the code, you're right that it's hard to give specifics.
Very broadly speaking, Newton can sometimes have a lot more trouble converging than NLBGS (note: this is not absolutely true, but it is a good rule of thumb). So what I would guess is happening is that on your first or second iteration, the Newton solver isn't actually converging. You can check this by setting newton.options['iprint'] = 2 and looking at the iteration history as the optimizer iterates.
When you have a solver in your optimization, it's critical that you also set it to throw an error on non-convergence. Some optimizers can handle this error and will backtrack on the line search. Others will just die. Either way, it's important. Otherwise, you end up giving the optimizer an unconverged case that it doesn't know is unconverged.
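For concreteness, a minimal self-contained sketch of that setup. The two ExecComps are stand-ins for the real coupled components, not anything from the question; the solver option names are OpenMDAO's own:

```python
import openmdao.api as om

prob = om.Problem()
cycle = prob.model.add_subsystem('cycle', om.Group())
# Two mutually dependent components stand in for the real coupled model.
cycle.add_subsystem('c1', om.ExecComp('y1 = 0.5*y2 + 1.0'), promotes=['*'])
cycle.add_subsystem('c2', om.ExecComp('y2 = 0.3*y1 + 2.0'), promotes=['*'])

newton = cycle.nonlinear_solver = om.NewtonSolver(solve_subsystems=False)
cycle.linear_solver = om.DirectSolver()
newton.options['maxiter'] = 50
newton.options['iprint'] = 2                  # print the residual history of every solve
newton.options['err_on_non_converge'] = True  # raise AnalysisError instead of failing silently

prob.setup()
prob.run_model()
```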
This is bad for two reasons. First, the objective and constraint values it gets are going to be wrong! Second, and perhaps more importantly, the derivatives it computes are going to be wrong! You can read the details in the theory manual, but in summary the analytic derivative methods that OpenMDAO uses assume that the residuals have gone to 0. If that's not the case, the math breaks down. Even if you were doing full-model finite differencing, unconverged models are a problem: you'll just get noisy garbage when you try to FD them.
So, assuming you have set up your model correctly, and that you have the linear solvers set up properly (it sounds like you do, since it works with NLBGS), then it's most likely that the Newton solver isn't converging. Use iprint, possibly combined with the driver debug printing, to check this for yourself. If that's the case, you need to figure out how to get Newton to behave better.
There are some tips here that are pretty general. You could also try using the Armijo line search, which can often stabilize a Newton solve at the cost of some speed, as in the snippet below.
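Continuing the sketch above, a line search can be attached directly to the Newton solver (the maxiter value is just a placeholder):

```python
# Backtracking Armijo-Goldstein line search to damp overly aggressive Newton steps.
newton.linesearch = om.ArmijoGoldsteinLS(bound_enforcement='vector')
newton.linesearch.options['maxiter'] = 5
```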
Finally... Newton isn't the best answer in all situations. If NLBGS is more stable and computationally cheaper, you should use it. I applaud your desire to get it to work with Newton, and you should definitely track down why it's not converging, but if it turns out that Newton just can't solve your coupled problem reliably, that's OK too!
The "set it to throw an error on non-convergence" link in your answer is broken. I have added the link that I think is the right one; please correct it if it is not the one you meant.

Fast integer power of nine

I need to calculate 9^n, where n is a natural number. I used binary exponentiation, but its addition chain is not optimal. An optimal addition chain does exist, but finding it is proven to be NP-complete and is very hard to compute. I cannot use a lookup table in my task. Also, this algorithm doesn't use the fact that I know the base. Maybe there are some papers in number theory, or can you suggest a better solution?
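For reference, a minimal sketch of the binary-exponentiation baseline the question describes; Python's built-in pow(9, n) performs essentially the same square-and-multiply:

```python
def pow9(n: int) -> int:
    """Compute 9**n by binary exponentiation: O(log n) multiplications."""
    result = 1
    base = 9
    while n:
        if n & 1:          # current bit of n is set: fold the factor in
            result *= base
        base *= base       # square for the next bit
        n >>= 1
    return result

assert pow9(13) == 9**13
```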

Is it necessary to compute modulo 1,000,000,007 (10^9+7) in Python 3?

While solving competitive programming questions, you are sometimes asked to compute the final answer as:
"Since this number may be large, compute it modulo 1,000,000,007 (10^9+7)."
It is also a fact that in Python 3 the plain int type is unbounded.
So, is it necessary to compute modulo 10^9+7 if I am solving the problem in Python 3?
It doesn't matter what anyone here believes is fair or warranted. What matters is what the automated system or human in charge expects as an answer: if the expected answer is the value modulo 10^9+7, that is what you have to produce.
Of course, since doing modular operations can slow a solution down within a given time limit, it would be best to find an optimal alternative, if that is permitted.
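To make that concrete, here is a small sketch computing n! modulo 10^9+7. The judge expects the reduced value either way, and reducing at every step also keeps the intermediates small; whether that reduction costs or saves time depends on how large the unreduced numbers would have grown:

```python
MOD = 10**9 + 7  # the modulus the problem statement asks for

def factorial_mod(n: int) -> int:
    """n! modulo 10**9 + 7, reducing at every step so intermediates stay below MOD."""
    result = 1
    for k in range(2, n + 1):
        result = result * k % MOD
    return result

# Python's unbounded ints could compute 100000! exactly, but that value has
# over 450,000 digits, so the exact product is far slower than the reduced one,
# and the judge wants the reduced value anyway.
print(factorial_mod(10))  # 3628800 == 10!, since 10! < MOD
```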

Gauss-Jordan over GF(p) - if there are infinitely many solutions, return any of them

So, for something I'm doing, I need to find a solution to a system of linear equations modulo some prime, which I've implemented with Gauss-Jordan elimination. The thing is, my implementation at the moment only works when a unique solution exists: if there are no solutions, or infinitely many, I'm simply unable to find a pivot at some point and I return a flag. Rather than just saying "infinitely many solutions", I was hoping for a way to return any one solution, ideally staying somewhere near the O(n^3) of Gauss-Jordan.
Thanks, any help is appreciated
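In case a sketch helps (function and variable names are mine, not from the question): run the usual Gauss-Jordan sweep, record which column each pivot lands in, fix every pivot-free variable to 0, and read the pivot variables off the right-hand side. Inconsistency then shows up as an all-zero row with a nonzero right-hand side, and the cost stays O(n^3):

```python
def solve_mod_p(A, b, p):
    """Return one solution of A x = b over GF(p), or None if inconsistent.
    Free variables are set to 0, so a consistent underdetermined system
    still yields a valid solution. p must be prime so nonzero pivots are
    invertible (via Fermat's little theorem)."""
    n, m = len(A), len(A[0])
    M = [[A[i][j] % p for j in range(m)] + [b[i] % p] for i in range(n)]  # augmented
    pivot_col = []  # pivot_col[r] = column holding row r's pivot
    row = 0
    for col in range(m):
        if row == n:
            break
        # Find a row at or below 'row' with a nonzero entry in this column.
        piv = next((r for r in range(row, n) if M[r][col]), None)
        if piv is None:
            continue  # no pivot: 'col' belongs to a free variable
        M[row], M[piv] = M[piv], M[row]
        inv = pow(M[row][col], p - 2, p)  # modular inverse of the pivot
        M[row] = [v * inv % p for v in M[row]]
        for r in range(n):  # eliminate the column everywhere else (full Gauss-Jordan)
            if r != row and M[r][col]:
                f = M[r][col]
                M[r] = [(M[r][c] - f * M[row][c]) % p for c in range(m + 1)]
        pivot_col.append(col)
        row += 1
    # An all-zero row with a nonzero right-hand side means no solution exists.
    if any(M[r][m] for r in range(row, n)):
        return None
    x = [0] * m  # free variables fixed to 0
    for r, col in enumerate(pivot_col):
        x[col] = M[r][m]
    return x

# One equation, two unknowns over GF(7): infinitely many solutions, returns one.
print(solve_mod_p([[1, 2]], [3], 7))  # [3, 0] satisfies x0 + 2*x1 = 3 (mod 7)
```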

Efficient and fast way of solving linear programming problems in R

I am working with a very large dataset, typically involving a few million combinations.
I want to solve the assignment problem (maximize the sum).
I have tried solving it on a small test set using adagio::assignment and clue::solve_LSAP.
I wasn't able to successfully install the "lpSolve" package on my system; it threw a segmentation fault.
I wanted to know which of these is faster, or whether there is any other method that does it faster.
Thanks.
An LP formulation is not a good way to solve the assignment problem, whichever library you use. You should use the Hungarian algorithm instead, and it looks like solve_LSAP does exactly that.
No need to try anything else IMHO.
EDIT: An efficient implementation of the Hungarian method is O(n^3), which is extremely fast for an optimization algorithm. If solve_LSAP is not fast enough for your problem (assuming it is implemented correctly), it is very unlikely that any exact method will be.
You will have to use some sort of heuristic to approximate the solution.
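For a sense of scale, a sketch in Python (used only because the other snippets on this page are Python; in R, clue::solve_LSAP is the direct equivalent): SciPy's linear_sum_assignment solves the same problem with an O(n^3)-class algorithm, and negating the matrix turns the minimization into the maximization you want:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)
score = rng.random((2000, 2000))  # score[i, j]: value of assigning i to j

# The solver minimizes total cost, so negate the scores to maximize the sum.
rows, cols = linear_sum_assignment(-score)
print(score[rows, cols].sum())    # total score of the optimal assignment
```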
