I apologize if this has been asked before, but I can't seem to find an answer, or perhaps I'm not searching for it correctly.
I am writing Python code using NumPy, and my function takes a matrix as input. I want to view a 1D array as a (1 by n) 2D array.
Here is a minimal example of my issue. The following function takes two matrices as input and adds the upper-left element of the first matrix to the bottom-right element of the second matrix.
import numpy as np

def add_corners(A, B):
    # upper-left element of A plus bottom-right element of B
    r = A[0, 0] + B[B.shape[0] - 1, B.shape[1] - 1]
    return r

C = np.array([[1, 2, 3], [4, 5, 6]])
D = np.array([[9, 8], [7, 6], [5, 4], [10, 11]])
E = np.array([1, 2, 3, 4, 5])

print(add_corners(C, D))  # 1 + 11 = 12
print(add_corners(C, E))  # raises an IndexError
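Running this, the second call fails (the message comes from indexing the length-1 shape tuple (5,) at position 1):

IndexError: tuple index out of range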
In other words, print(add_corners(C, E)) leads to an error, since E.shape[1] is not well defined for a 1D array. Is there a way to get around this without adding an if statement to check whether the input is a 1D array? That is, I want to refer to the entries of E as E[0, x] as opposed to just E[x].
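For reference, the closest I have gotten is wrapping the inputs in np.atleast_2d, which views a 1D array as a (1, n) 2D array without changing 2D input; a minimal sketch of that workaround (I am not sure it is the idiomatic approach):

import numpy as np

def add_corners(A, B):
    # np.atleast_2d leaves 2D input untouched and views a 1D array
    # as a (1, n) 2D array, so B.shape[1] is always defined
    A = np.atleast_2d(A)
    B = np.atleast_2d(B)
    return A[0, 0] + B[B.shape[0] - 1, B.shape[1] - 1]

C = np.array([[1, 2, 3], [4, 5, 6]])
E = np.array([1, 2, 3, 4, 5])
print(add_corners(C, E))  # 1 + 5 = 6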
Any help is greatly appreciated!