I'm trying to write a simple matrix multiplication method using multidimensional arrays ([2][2]). I'm fairly new at this and I just can't find what I'm doing wrong, so I'd really appreciate any help pointing it out. I'd rather not use libraries or anything like that; I'm mostly doing this to learn how it works. Thank you so much in advance.
I'm declaring my arrays in the main method as follows:
Double[][] A={{4.00,3.00},{2.00,1.00}};
Double[][] B={{-0.500,1.500},{1.000,-2.0000}};
A*B should return the identity matrix. It doesn't.
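Checking by hand with the values above: the top-left entry of A*B should be 4*(-0.5) + 3*1.0 = 1, the top-right 4*1.5 + 3*(-2.0) = 0, the bottom-left 2*(-0.5) + 1*1.0 = 0, and the bottom-right 2*1.5 + 1*(-2.0) = 1, so B really is the inverse of A and the product should come out as the identity.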
public static Double[][] multiplicar(Double[][] A, Double[][] B) {
    // The method runs and returns a matrix of the correct dimensions
    // (I even replaced the .length calls with a hard-coded 2 to rule that
    // out as a possible issue), but not the correct values.
    Double[][] C = new Double[2][2];
    int i, j;

    // I fill the matrix with zeroes; if I don't do this it gives me an error
    // (the Double elements start out as null, so += fails when unboxing them).
    for (i = 0; i < 2; i++) {
        for (j = 0; j < 2; j++) {
            C[i][j] = 0.0;
        }
    }

    // This is where I'm supposed to add up every element of a row of A
    // multiplied by the corresponding element of the corresponding column
    // of B, for all rows of A and all columns of B.
    for (i = 0; i < 2; i++) {
        for (j = 0; j < 2; j++)
            C[i][j] += (A[i][j] * B[j][i]);
    }
    return C;
}
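For reference, the definition I'm trying to implement is C[i][j] = sum over k of A[i][k] * B[k][j], which I think would translate into a triple loop along these lines (just a sketch under the same hard-coded 2x2 assumption, and the name multiplicarReferencia is only a placeholder):

public static Double[][] multiplicarReferencia(Double[][] A, Double[][] B) {
    // Triple-loop form of the definition for 2x2 matrices:
    // C[i][j] = A[i][0]*B[0][j] + A[i][1]*B[1][j]
    Double[][] C = new Double[2][2];
    for (int i = 0; i < 2; i++) {
        for (int j = 0; j < 2; j++) {
            C[i][j] = 0.0;                    // start the running sum at zero
            for (int k = 0; k < 2; k++) {
                C[i][j] += A[i][k] * B[k][j]; // row i of A against column j of B
            }
        }
    }
    return C;
}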