SciPy ODR
ODR stands for Orthogonal Distance Regression, and it is used in regression studies. Basic linear regression estimates the relationship between two variables, y and x, by drawing the line of best fit on a graph. Why, then, is Orthogonal Distance Regression (ODR) needed? Sometimes measurement error occurs in the independent variable (x), not only in the dependent variable (y).
Standard linear regression focuses on predicting the y value from the x value, so it measures only the error in the y values (shown by the dotted black lines in the image below). However, it is often better to account for the error in both x and y (shown by the dotted red lines in the image below).
Orthogonal Distance Regression (ODR) is a method that measures the error perpendicular to the fitted line rather than vertically.
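To make the distinction concrete, here is a minimal sketch (the line, point, and helper function names are illustrative, not from the original) comparing the vertical residual used by ordinary least squares with the perpendicular distance used by ODR, for a line y = m*x + c:

```python
import numpy as np

def vertical_residual(m, c, x, y):
    # OLS-style residual: measured parallel to the y-axis
    return y - (m * x + c)

def orthogonal_distance(m, c, x, y):
    # perpendicular distance from the point (x, y) to the line y = m*x + c
    return np.abs(m * x - y + c) / np.sqrt(m**2 + 1)

# Hypothetical point and line for illustration
x, y = 1.0, 5.0
m, c = 2.0, 0.0
print(vertical_residual(m, c, x, y))    # 3.0
print(orthogonal_distance(m, c, x, y))  # 3/sqrt(5) ≈ 1.3416
```

The orthogonal distance is always less than or equal to the vertical residual's magnitude, since the perpendicular is the shortest path from the point to the line.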
The scipy.odr package wraps ODRPACK, a FORTRAN-77 library for performing ODR with possibly non-linear fitting functions. It can perform explicit or implicit ODR fits, and it can also be used to solve the ordinary least squares (OLS) problem.
Implementation of scipy.odr for Univariate Regression
Univariate regression determines the relationship between one independent variable and one dependent variable. Consider the following example:
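The example code itself is not shown in this copy of the text, so the following is a hedged reconstruction of a typical univariate scipy.odr fit: it generates synthetic noisy data (the slope, intercept, noise level, and seed here are assumed for illustration and will not reproduce the exact numbers in the output below) and fits a linear model with the ODR machinery:

```python
import numpy as np
from scipy import odr

def linear_model(beta, x):
    # beta[0] is the slope, beta[1] the intercept
    return beta[0] * x + beta[1]

# Synthetic data (assumed parameters, for illustration only)
np.random.seed(0)
x = np.linspace(0.0, 5.0, 20)
y = 7.6 * x - 8.5 + np.random.normal(0.0, 1.0, x.size)

# Wrap the model and the data, then run the ODR fit
model = odr.Model(linear_model)
data = odr.RealData(x, y)
odr_fit = odr.ODR(data, model, beta0=[1.0, 0.0])  # beta0: initial guess
result = odr_fit.run()
result.pprint()  # prints Beta, Beta Std Error, Covariance, etc.
```

The `pprint()` method produces a summary in the same format as the output below; the fitted `result.beta` holds the estimated slope and intercept.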
Output:
Beta: [ 7.62787497 -8.53630181]
Beta Std Error: [0.89306061 3.69444539]
Beta Covariance: [[ 1.52116591 -5.32408057]
 [-5.32408057 26.0323407 ]]
Residual Variance: 0.5243065494144553
Inverse Condition #: 0.18510252155770376
Reason(s) for Halting:
  Sum of squares convergence