How to solve non-linear optimization problems in Python

Optimization deals with selecting the best option among a number of possible choices that are feasible, i.e. that do not violate the constraints. In Python, optimization is used to tune the parameters of a model to best fit data, to increase the profitability of a potential engineering design, or to meet some other objective that can be described mathematically with variables and equations.

 

pyOpt is a Python-based package for formulating and solving nonlinear constrained optimization problems in an efficient, reusable and portable manner. It uses object-oriented concepts, such as class inheritance and operator overloading, to maintain a clear separation between the problem formulation and the optimization approach used to solve the problem.

 

All optimization solvers inherit from the Optimizer abstract class. The class attributes include the solver name (name), an optimizer type identifier (category), and dictionaries that contain the solver setup parameters (options) and message output settings (informs). The class provides methods to check and change the default solver parameters (getOption, setOption), as well as a method that runs the solver for a given optimization problem (solve).
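For example, a minimal sketch of this shared interface, assuming pyOpt is installed and using SLSQP as a representative Optimizer subclass:

from pyOpt import SLSQP

opt = SLSQP()                     # concrete solver inheriting from Optimizer
print(opt.name)                   # solver name attribute
print(opt.getOption('IPRINT'))    # read a default setup parameter
opt.setOption('IPRINT', -1)       # change it before solving a problem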

 

Optimization solvers

A number of constrained optimization solvers are provided, all designed to solve the general nonlinear optimization problem; they share the same calling interface, as illustrated in the sketch after the list.

  1. PSQP: This optimizer is a preconditioned sequential quadratic programming algorithm with a BFGS variable metric update.
  2. SLSQP: This optimizer is a sequential least squares programming algorithm. SLSQP uses the Han–Powell quasi-Newton method with a BFGS update of the B-matrix and an L1-test function in the step-length algorithm. The optimizer uses a slightly modified version of Lawson and Hanson’s nonnegative least-squares (NNLS) solver.
  3. CONMIN: This optimizer implements the method of feasible directions. CONMIN solves the nonlinear programming problem by moving from one feasible point to an improved one by choosing at each iteration a feasible direction and step size that improves the objective function.
  4. COBYLA: It is an implementation of Powell’s nonlinear derivative–free constrained optimization that uses a linear approximation approach. The algorithm is a sequential trust–region algorithm that employs linear approximations to the objective and constraint functions.
  5. SOLVOPT: SOLVOPT is a modified version of Shor’s r–algorithm with space dilation to find a local minimum of nonlinear and non–smooth problems.
  6. KSOPT: This code reformulates the constrained problem into an unconstrained one using a composite Kreisselmeier–Steinhauser objective function to create an envelope of the objective function and set of constraints. The envelope function is then optimized using a sequential unconstrained minimization technique.
  7. NSGA2: This optimizer is a non-dominated sorting genetic algorithm that solves non-convex and non-smooth single- and multi-objective optimization problems.
  8. ALGENCAN: It solves the general non-linear constrained optimization problem without resorting to the use of matrix manipulations. It uses instead an Augmented Lagrangian approach which is able to solve extremely large problems with moderate computer time.
  9. FILTERSD: This optimizer uses a linear constraint problem solver based on a Ritz-values approach. Second derivatives and storage of an approximate reduced Hessian matrix are avoided by using a limited-memory spectral gradient approach based on Ritz values.
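Whichever algorithm is chosen, the calling pattern is the same: instantiate the optimizer, optionally adjust its options, and call it on an Optimization problem. A minimal sketch, assuming pyOpt is installed and that opt_prob is an Optimization instance built as in the full example further below:

from pyOpt import PSQP, SLSQP

# Solve the same problem with two different gradient-based optimizers;
# each run stores its result on the problem under the next index.
for i, Solver in enumerate((PSQP, SLSQP)):
    opt = Solver()
    opt(opt_prob, sens_type='FD')   # finite-difference gradients
    print(opt_prob.solution(i))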

 

To solve an optimization problem with pyOpt, an optimizer must first be initialized. The initialization of one or more optimizers is independent of the initialization of any number of optimization problems. To initialize SLSQP, an open-source sequential least squares programming algorithm that comes as part of the pyOpt package, use:

>>> slsqp = pyOpt.SLSQP()

This initializes an instance of SLSQP with the default options. The setOption method can be used to change any optimizer specific option, for example the internal output flag of SLSQP:

>>> slsqp.setOption('IPRINT', -1)

Now Schittkowski’s constrained problem can be solved using SLSQP with, for example, pyOpt’s automatic finite-difference approximation of the gradients:

>>> [fstr, xstr, inform] = slsqp(opt_prob, sens_type='FD')
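Other sensitivity types are also supported; for example, the complex-step approximation used with CONMIN in the full example below (assuming the objective function is safe to evaluate with complex arguments):

>>> conmin = pyOpt.CONMIN()
>>> [fstr, xstr, inform] = conmin(opt_prob, sens_type='CS')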

By default, the solution information of an optimizer is also stored in the specific optimization problem. To output the solution to the screen, one can use:

>>> print(opt_prob.solution(0))
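Each solver run appends a further solution to the problem, so the results of later runs are retrieved with a higher index; after a second optimizer has been run, for example:

>>> print(opt_prob.solution(1))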

 

Example:

The problem is taken from the set of nonlinear programming examples by Hock and Schittkowski and it is defined as

      min        −x1·x2·x3
   x1,x2,x3

   subject to    x1 + 2x2 + 2x3 − 72 ≤ 0
                 −x1 − 2x2 − 2x3 ≤ 0

                 0 ≤ x1 ≤ 42
                 0 ≤ x2 ≤ 42
                 0 ≤ x3 ≤ 42

 

The optimum of this problem is at (x1*, x2*, x3*) = (24, 12, 12), with an objective function value of f* = −3456 and constraint values g(x*) = (0, −72).
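As a quick check, substituting the optimum gives f = −(24)(12)(12) = −3456, g1 = 24 + 2·12 + 2·12 − 72 = 0 and g2 = −(24 + 2·12 + 2·12) = −72, matching the values above.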

 

#======================================================================
# Standard Python modules
#======================================================================
import os, sys, time
import pdb

#======================================================================
# Extension modules
#======================================================================
#from pyOpt import *
from pyOpt import Optimization
from pyOpt import PSQP
from pyOpt import SLSQP
from pyOpt import CONMIN
from pyOpt import COBYLA
from pyOpt import SOLVOPT
from pyOpt import KSOPT
from pyOpt import NSGA2
from pyOpt import ALGENCAN
from pyOpt import FILTERSD

#======================================================================
# Objective function: returns the objective value f, the inequality
# constraint values g (formulated as g <= 0) and a fail flag.
#======================================================================
def objfunc(x):

    f = -x[0]*x[1]*x[2]
    g = [0.0]*2
    g[0] = x[0] + 2.*x[1] + 2.*x[2] - 72.0
    g[1] = -x[0] - 2.*x[1] - 2.*x[2]

    fail = 0
    return f, g, fail

 

#======================================================================
# Instantiate Optimization Problem
#======================================================================
opt_prob = Optimization('Hock and Schittkowski Constrained Problem', objfunc)
# Three continuous ('c') design variables, each bounded by 0 <= x <= 42
opt_prob.addVar('x1', 'c', lower=0.0, upper=42.0, value=10.0)
opt_prob.addVar('x2', 'c', lower=0.0, upper=42.0, value=10.0)
opt_prob.addVar('x3', 'c', lower=0.0, upper=42.0, value=10.0)
# One objective and two inequality ('i') constraints
opt_prob.addObj('f')
opt_prob.addCon('g1', 'i')
opt_prob.addCon('g2', 'i')
print(opt_prob)

 

# Instantiate Optimizer (PSQP) & Solve Problem
psqp = PSQP()
psqp.setOption('IPRINT', 0)
psqp(opt_prob, sens_type='FD')
print(opt_prob.solution(0))

# Instantiate Optimizer (SLSQP) & Solve Problem
slsqp = SLSQP()
slsqp.setOption('IPRINT', -1)
slsqp(opt_prob, sens_type='FD')
print(opt_prob.solution(1))

# Instantiate Optimizer (CONMIN) & Solve Problem
conmin = CONMIN()
conmin.setOption('IPRINT', 0)
conmin(opt_prob, sens_type='CS')
print(opt_prob.solution(2))

# Instantiate Optimizer (COBYLA) & Solve Problem
cobyla = COBYLA()
cobyla.setOption('IPRINT', 0)
cobyla(opt_prob)
print(opt_prob.solution(3))

# Instantiate Optimizer (SOLVOPT) & Solve Problem
solvopt = SOLVOPT()
solvopt.setOption('iprint', -1)
solvopt(opt_prob, sens_type='FD')
print(opt_prob.solution(4))

# Instantiate Optimizer (KSOPT) & Solve Problem
ksopt = KSOPT()
ksopt.setOption('IPRINT', 0)
ksopt(opt_prob, sens_type='FD')
print(opt_prob.solution(5))

# Instantiate Optimizer (NSGA2) & Solve Problem
nsga2 = NSGA2()
nsga2.setOption('PrintOut', 0)
nsga2(opt_prob)
print(opt_prob.solution(6))

# Instantiate Optimizer (ALGENCAN) & Solve Problem
algencan = ALGENCAN()
algencan.setOption('iprint', 0)
algencan(opt_prob)
print(opt_prob.solution(7))

# Instantiate Optimizer (FILTERSD) & Solve Problem
filtersd = FILTERSD()
filtersd.setOption('iprint', 0)
filtersd(opt_prob)
print(opt_prob.solution(8))

 

Solving non-linear global optimization problems can be a tedious task. If the problem is not too complex, general-purpose solvers may work well enough. However, as the complexity of the problem increases, general-purpose global optimizers start to take a long time, and that is when the need arises to build a fast, problem-specific global optimizer.

 

We have a specialized team of PhD holders and developers who design and build customized global optimizers. If you need help with one, please feel free to send us your queries.

 

We first seek to understand the problem and the data by visualizing them; after that, we create a solution tailored to your needs.

 

If you want to create your own simple solver, please do read on to understand what a solver is and how it works. This is not exactly how every solver works, but it will give you a solid idea of what a solver is and how it is supposed to work.
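To give a flavour of what a solver does internally, here is a deliberately naive sketch: a random local search with a quadratic penalty on the inequality constraints g(x) ≤ 0. It is not how PSQP, SLSQP or the other pyOpt optimizers work; it simply reuses the objfunc defined in the example above, and the helper name toy_solve is hypothetical.

import random

def toy_solve(objfunc, x0, lower, upper, iters=20000, step=0.5, penalty=1.0e4):
    # Merit function: objective plus a quadratic penalty on violated
    # inequality constraints g(x) <= 0.
    def merit(x):
        f, g, _ = objfunc(x)
        return f + penalty * sum(max(0.0, gi) ** 2 for gi in g)

    best_x, best_m = list(x0), merit(x0)
    for _ in range(iters):
        # Propose a random step and clip it to the variable bounds.
        cand = [min(max(xi + random.uniform(-step, step), lo), up)
                for xi, lo, up in zip(best_x, lower, upper)]
        m = merit(cand)
        if m < best_m:
            best_x, best_m = cand, m
    return best_x

x = toy_solve(objfunc, [10.0, 10.0, 10.0], [0.0] * 3, [42.0] * 3)
print(x)  # should move towards the known optimum (24, 12, 12)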


