
optimization - Minimizing the peak difference of elements of two lists given some constraints

I want to minimize the peak of the differences list1[i] - list2[i] using the scipy.optimize.minimize method. The elements of list1 and list2 are floats.

For example:

list1 = [50, 50.5, 52, 53, 55, 55.5, 56, 57, 60, 61]

How do I minimize list1[i] - list2[i] given that I have two constraints:

1. list2[0] = list1[0]

2. list2[i+1] - list2[i] <= 1.5

Basically, two consecutive elements in list2 cannot be separated by more than 1.5, and the first element of list2 is the first element of list1.

Maybe there is a way other than scipy.optimize.minimize, but I don't know how to do this. I think the optimal values for list2 might be:

list2 = [50, 50.5, 52, 53, 54.5, 55.5, 56, 57, 58.5, 60]

In this case the peak difference is 1.5, but maybe the algorithm can find a better solution where the differences between the elements of list1 and list2 are even smaller.
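
A quick numpy check (the array names here are just for illustration) confirms that this candidate satisfies both constraints and has a peak difference of 1.5:

import numpy as np

list1 = np.array([50, 50.5, 52, 53, 55, 55.5, 56, 57, 60, 61])
guess = np.array([50, 50.5, 52, 53, 54.5, 55.5, 56, 57, 58.5, 60])

print(guess[0] == list1[0])            # True: first elements match
print(np.all(np.diff(guess) <= 1.5))   # True: consecutive steps are at most 1.5
print(np.max(np.abs(list1 - guess)))   # 1.5: the peak difference of this guess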

Here is what I have tried, but it failed:

import numpy as np
from scipy.optimize import minimize

list1 = [50, 50.5, 52, 53, 55, 55.5, 56, 57, 60, 61]
list2 = [list1[0]]

#Define objective
def peakDifference(*args):
    global list2
    peak_error = []
    for list1_i, list2_i in zip(list1, list2):
        peak_error.append(list1_i-list2_i)
    return max(peak_error)

peak_error = peakDifference()

#Define constraints
def constraint1(*args):
    for x in range(len(list2) - 1):
        return list2[x+1] - list2[x] - 1.5

con1 = {'type': 'ineq', 'fun': constraint1}

#Optimize
sol = minimize(peakDifference,list2, constraints=con1)



Traceback (most recent call last):
  File "C:/Users/JumpStart/PycharmProjects/anglesimulation/venv/asdfgh.py", line 27, in <module>
    sol = minimize(peakDifference,list2, constraints=con1)
  File "C:\Users\JumpStart\anaconda3\lib\site-packages\scipy\optimize\_minimize.py", line 625, in minimize
    return _minimize_slsqp(fun, x0, args, jac, bounds,
  File "C:\Users\JumpStart\anaconda3\lib\site-packages\scipy\optimize\slsqp.py", line 412, in _minimize_slsqp
    a = _eval_con_normals(x, cons, la, n, m, meq, mieq)
  File "C:\Users\JumpStart\anaconda3\lib\site-packages\scipy\optimize\slsqp.py", line 486, in _eval_con_normals
    a_ieq = vstack([con['jac'](x, *con['args'])
  File "C:\Users\JumpStart\anaconda3\lib\site-packages\scipy\optimize\slsqp.py", line 486, in <listcomp>
    a_ieq = vstack([con['jac'](x, *con['args'])
  File "C:\Users\JumpStart\anaconda3\lib\site-packages\scipy\optimize\slsqp.py", line 284, in cjac
    return approx_derivative(fun, x, method='2-point',
  File "C:\Users\JumpStart\anaconda3\lib\site-packages\scipy\optimize\_numdiff.py", line 426, in approx_derivative
    return _dense_difference(fun_wrapped, x0, f0, h,
  File "C:\Users\JumpStart\anaconda3\lib\site-packages\scipy\optimize\_numdiff.py", line 497, in _dense_difference
    df = fun(x) - f0
TypeError: unsupported operand type(s) for -: 'NoneType' and 'NoneType'

Process finished with exit code 1
Question from: https://stackoverflow.com/questions/66047873/minimizing-the-peak-difference-of-elements-of-two-lists-given-some-constraints

1 Reply

NLP model

Here is a "working" version of the NLP model. The model cannot be solved reliably this way, as it is non-differentiable.

import numpy as np
from scipy.optimize import minimize

list1 = np.array([50, 50.5, 52, 53, 55, 55.5, 56, 57, 60, 61])
list1_0 = list1[0]
n = len(list1)

# our x variable will have one element less (first element is fixed)
list1x  = np.delete(list1,0)  # version with 1st element dropped
nx = len(list1x)

# objective function
# minimize the maximum difference
# Notes:
#   - x excludes first element (they are the same by definition)
#   - this is non-differentiable so likely to be non-optimal
def maxDifference(x):
    return np.max(np.abs(list1x - x))

# n-1 constraints
def constraint1(x):
    return  1.5 - np.diff(np.insert(x,0,list1_0),1)

con1 = {'type': 'ineq', 'fun': constraint1}

#Optimize
x = 55*np.ones(nx)  # initial value
sol = minimize(maxDifference, x, constraints=con1)
sol

# optimal is: x = [51.25,51.25,52.75,54.25,54.75,56.25,57.75,59.25,60.25]
# obj = 0.75 

The result is:

     fun: 5.0
     jac: array([0., 0., 0., 0., 0., 0., 0., 0., 0.])
 message: 'Optimization terminated successfully'
    nfev: 20
     nit: 2
    njev: 2
  status: 0
 success: True
       x: array([51.5, 53. , 54.5, 55. , 55. , 55. , 55. , 55. , 56. ])

This is non-optimal: the objective is 5 (instead of 0.75).
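
A standard way to make the scipy model differentiable is the epigraph reformulation: add one extra variable t that bounds |list1[i] - x[i]|, minimize t, and push the absolute values into linear inequality constraints. The sketch below assumes this layout (the decision vector [x_1, ..., x_{n-1}, t] and the starting point are illustrative choices, not part of the original model):

import numpy as np
from scipy.optimize import minimize

list1 = np.array([50, 50.5, 52, 53, 55, 55.5, 56, 57, 60, 61])
list1_0 = list1[0]
list1x = list1[1:]            # first element of the solution is fixed to list1[0]
nx = len(list1x)

# decision vector v = [x_1, ..., x_{n-1}, t], where t bounds |list1[i] - x[i]|
def objective(v):
    return v[-1]                              # minimize the bound t

def con_abs(v):
    x, t = v[:-1], v[-1]
    d = list1x - x
    return np.concatenate([t - d, t + d])     # t - d >= 0 and t + d >= 0  <=>  t >= |d|

def con_step(v):
    full = np.insert(v[:-1], 0, list1_0)
    return 1.5 - np.diff(full)                # consecutive increments at most 1.5

cons = [{'type': 'ineq', 'fun': con_abs},
        {'type': 'ineq', 'fun': con_step}]

v0 = np.append(55 * np.ones(nx), 10.0)        # rough starting point (not necessarily feasible)
sol = minimize(objective, v0, constraints=cons)
print(sol.fun)       # should be close to 0.75
print(sol.x[:-1])    # the optimized list2[1:]

Because the objective and all constraints are now linear, SLSQP should be able to recover the 0.75 optimum found by the LP model below.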

LP model

An LP model will find a proven optimal solution, which is much more reliable. For example:

import pulp as lp

list1 = [50, 50.5, 52, 53, 55, 55.5, 56, 57, 60, 61]
n = len(list1)

model = lp.LpProblem("Min_difference", lp.LpMinimize)
x = lp.LpVariable.dicts("x", range(n))
z = lp.LpVariable("z")

# objective
model += z

# constraints
for i in range(n):
    model += z >= x[i]-list1[i]
    model += z >= list1[i]-x[i]
    
for i in range(n-1):
    model += x[i+1] - x[i] <= 1.5
    
model += x[0] == list1[0]

model.solve()
print(lp.LpStatus[model.status])
print("obj:",z.varValue)
print([x[i].varValue for i in range(n)])

This shows:

Optimal
obj: 0.75
[50.0, 51.25, 52.75, 53.75, 55.25, 56.25, 56.75, 57.75, 59.25, 60.25]
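
The same LP can also be solved without pulp using scipy.optimize.linprog; the matrix construction below is a sketch that assumes the variable order [x_0, ..., x_{n-1}, z]:

import numpy as np
from scipy.optimize import linprog

list1 = np.array([50, 50.5, 52, 53, 55, 55.5, 56, 57, 60, 61])
n = len(list1)

# variables: x[0..n-1] followed by z, the peak |difference|
c = np.zeros(n + 1)
c[-1] = 1.0                   # minimize z

A_ub, b_ub = [], []
for i in range(n):
    # z >= x[i] - list1[i]   ->   x[i] - z <= list1[i]
    row = np.zeros(n + 1)
    row[i], row[-1] = 1, -1
    A_ub.append(row)
    b_ub.append(list1[i])
    # z >= list1[i] - x[i]   ->   -x[i] - z <= -list1[i]
    row = np.zeros(n + 1)
    row[i], row[-1] = -1, -1
    A_ub.append(row)
    b_ub.append(-list1[i])
for i in range(n - 1):
    # x[i+1] - x[i] <= 1.5
    row = np.zeros(n + 1)
    row[i], row[i + 1] = -1, 1
    A_ub.append(row)
    b_ub.append(1.5)

# x[0] == list1[0]
A_eq = np.zeros((1, n + 1))
A_eq[0, 0] = 1
b_eq = [list1[0]]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(None, None)] * (n + 1))
print(res.fun)       # should again be 0.75
print(res.x[:n])     # an optimal list2

Both the pulp and linprog versions encode the min-max objective through the auxiliary variable z, so they should agree on the 0.75 objective even if the reported x differs (the LP has alternative optima).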
