Sunday, January 11, 2015
EDIT: Use scipy.linalg over numpy.linalg.
Per my previous post, I mistakenly referenced numpy.linalg and scipy.linalg as if they were the same. Upon looking deeper at the documentation for scipy.linalg, it clearly states the following:
scipy.linalg contains all the functions in numpy.linalg, plus some other more advanced ones not contained in numpy.linalg. So for all my purposes, I will only use scipy.linalg.
Another advantage of using scipy.linalg over numpy.linalg is that it is always compiled with BLAS/LAPACK support, while for numpy this is optional. Therefore, the scipy version might be faster depending on how numpy was installed.
Therefore, unless you don't want to add scipy as a dependency to your numpy program, use scipy.linalg instead of numpy.linalg.
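For example, here's a minimal sketch (the 2x2 system is made up) showing that the same call exists in both namespaces, so switching costs nothing:

```python
import numpy as np
from scipy import linalg  # scipy.linalg: superset of numpy.linalg, always LAPACK-backed

# Made-up 2x2 system just to illustrate the interchangeable calls
A = np.array([[4.0, 2.0],
              [2.0, 3.0]])
b = np.array([1.0, 2.0])

x_np = np.linalg.solve(A, b)   # NumPy's build may or may not use an optimized LAPACK
x_sp = linalg.solve(A, b)      # SciPy is always compiled against BLAS/LAPACK

print(np.allclose(x_np, x_sp))  # True: same answer either way
```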
LAPACK in Python and R
EDIT: See my follow-up post as well!
While porting my dissertation work from R to Python, I need to leverage some of the great features of R, such as its easy-to-use wrappers around LAPACK, specifically the Cholesky decomposition I use to calculate the inverse of my covariance matrix. For those not familiar with LAPACK, it's a free, open-source library of linear algebra routines. I'm talking all of them, and it's included in many open-source scientific packages due to its wide applicability and its free and open nature. The one caveat... it's written in Fortran*.
Now, not to hate on Fortran, but not many people write their software or run their data analysis in it these days. Fortunately for us, some very good computer nerds out there wrote awesome wrappers in R and in Python (through NumPy and SciPy) to access these routines. It's been recommended to me that you use LAPACK libraries optimized for your processor and operating system, and build R (and probably NumPy and SciPy) against those optimized routines. I'll let you read through the R Administration Manual to decide for yourself.
You can find the R discussion on LAPACK routines in the R Extensions Manual here.
As for Python, just check out the documentation for numpy.linalg or scipy.linalg.
Okay, now the real point of this note to myself: Googling for access to LAPACK in Python led me to this awesome blog post:
Linear Solve in Python (NumPy and SciPy)
Evgenii Rudnyi did an awesome tutorial on using the Cholesky decomposition, and I thought I'd pass it along to anybody interested in leveraging these routines.
* I vaguely recall reading somewhere that LAPACK is usually compiled in C after being ported from Fortran using f2c.
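As a concrete illustration (not my dissertation code; the covariance matrix here is made up), this is roughly how scipy.linalg exposes those LAPACK Cholesky routines for inverting a covariance matrix:

```python
import numpy as np
from scipy import linalg

# Hypothetical 3x3 covariance matrix (symmetric positive definite), for illustration only
Sigma = np.array([[2.0, 0.5, 0.3],
                  [0.5, 1.5, 0.2],
                  [0.3, 0.2, 1.0]])

# Cholesky factorization (wraps LAPACK's dpotrf)
c, low = linalg.cho_factor(Sigma)

# Solving against the identity yields the inverse (LAPACK's dpotrs under the hood)
Sigma_inv = linalg.cho_solve((c, low), np.eye(Sigma.shape[0]))

print(np.allclose(Sigma.dot(Sigma_inv), np.eye(3)))  # True
```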
Friday, January 9, 2015
Vectorize your functions in NumPy
One of the features I loved in R was that I could easily apply a scalar function to a whole matrix. Picture this: I have a spatial covariance function that depends only on distance. All I need to do is write the covariance function for a single distance, and then just feed it the distance matrix.
Maybe it will be easier to see some sample code:
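(The original snippet didn't survive the move here, so below is a minimal sketch of the idea; the spherical covariance function, its parameters, and the coordinates are all made up for illustration.)

```python
import numpy as np

# Hypothetical spherical covariance function; the `if` means it only
# accepts scalar distances, which is exactly when np.vectorize helps
def sph_cov(d, sigma2=1.0, phi=2.0):
    if d >= phi:
        return 0.0
    return sigma2 * (1.0 - 1.5 * d / phi + 0.5 * (d / phi) ** 3)

sph_cov_vec = np.vectorize(sph_cov)

coords = np.array([0.0, 1.0, 2.5, 4.0])            # made-up 1-D locations
dist = np.abs(coords[:, None] - coords[None, :])   # pairwise distance matrix
Sigma = sph_cov_vec(dist)                          # whole covariance matrix, no loops
print(Sigma)
```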
Feel free to read a little more here, but it's a great way to avoid writing loops, especially when setting up covariance matrices.
Tuesday, January 6, 2015
Statistics vs. Machine Learning
Ha, yes, that title is just clickbait, although I'm not sure how I'm even soliciting clicks. I'm just as likely to jump into a Bayesian vs. Frequentist debate post. I will only say this: as a Statistician, I fully embrace the Machine Learning field. If we were to draw a Venn diagram of the two fields, I believe the intersection would take up most of the sample space and have a probability measure of at least 95%.
With that said, I just thought I'd post a link that I recently stumbled across. Usually I just put these links under my Computing tab (it's already there), but I think this one warranted a blog post. It's a curated list of Machine Learning Algorithms across a variety of programming languages. It's definitely going to come in handy for me.
So without further delay, welcome to:
Awesome Machine Learning