
September 02, 2011



R is stable? Has he ever looked at the NEWS file for R?

I use R daily. Almost every single .0 release of R has changes to the core that break my packages/scripts. The upcoming 2.14.0 release is doing that by introducing required NAMESPACE files, forcing me to rewrite some code that was originally written to work with both S3 and S4 generics.
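For context on what that change involves: as of R 2.14.0 a NAMESPACE file is mandatory for every package rather than optional. A minimal sketch of one (the package contents here are invented for illustration; `export`, `S3method`, and `importFrom` are the standard directives) might look like:

```r
# NAMESPACE for a hypothetical package
export(my_function)        # make my_function() visible to users of the package
S3method(print, myclass)   # register print.myclass as the S3 print method
importFrom(stats, lm)      # import only the functions the package actually uses
```

Packages that previously relied on everything being on the search path now have to declare their exports and imports explicitly, which is what forces rewrites like the one described above.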

I have sent 100+ bug reports to R-core and package developers over the past 10 years. Way more than with SAS. My most recent bug report to R-core was flatly rejected, probably because it was a very rare bug and would have taken a fair bit of work to fix. Fair enough for a volunteer project, but not the kind of attitude that a commercial product would have.

For mission-critical use, the only way to keep R reliable and stable is to set it up with the necessary packages and then never update/upgrade.

I love R, but if my life depended on R or SAS, I would choose SAS.

SAS has had quite a few bugs over the years as well, to the point where I have had instructions "not to debug SAS." Don't forget the infamous and embarrassing "where/by" bug (where using a where clause with a by statement in SAS 9.1.3 caused one observation per by group to be deleted).

R and SAS are both huge pieces of software. They both have many bugs, and SAS certainly doesn't win in this area in my experience.

I agree that the proprietary vendors face a challenge, but I think it will take longer than 10 years for them to end up like Sun. That's because the migration from Solaris to Linux was a relatively easy one compared to the SAS to R conversion. Companies have thousands of SAS programs to convert to R.

Regarding big data, I don't think virtual memory has anything to do with SAS's advantage. For example, when doing linear regression, simply storing sums of squares and crossproducts is sufficient to solve the problem regardless of the number of observations involved. SAS could analyze billions of records back when mainframe memory was tiny compared to today's desktops. Of course some algorithms require all data be in memory at once, but for those SAS and R face the same challenge.
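The point about sums of squares and crossproducts can be made concrete: ordinary least squares only needs a handful of running totals, so the data can be streamed through in one pass with constant memory. A minimal sketch for simple linear regression (function and variable names are my own, not from any particular package):

```python
# One-pass simple linear regression using only accumulated
# sufficient statistics (sums and crossproducts), so memory use
# is constant no matter how many observations stream by.

def streaming_ols(pairs):
    """Return (intercept, slope) of y = a + b*x from an iterable of (x, y)."""
    n = sx = sy = sxx = sxy = 0.0
    for x, y in pairs:
        n += 1          # count of observations
        sx += x         # sum of x
        sy += y         # sum of y
        sxx += x * x    # sum of squares of x
        sxy += x * y    # crossproduct of x and y
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope (normal equations)
    a = (sy - b * sx) / n                          # intercept
    return a, b

# Works the same whether `pairs` is an in-memory list or a generator
# reading billions of records from disk one at a time.
a, b = streaming_ols((x, 2.0 * x + 1.0) for x in range(1000))
```

This is exactly why SAS could regress over billions of records on memory-starved mainframes: the observations themselves never need to be held in memory at once.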

For anyone interested in seeing how many data analysis tasks are done in SAS, R, SPSS & Stata, see http://r4stats.com.

Bob Muenchen

Hm. This was supposed to be a comment on Steve Miller's site, but somehow it ended up here!


In my experience the R cognoscenti do not like to involve themselves
with mundane matters like “quality control”. Recently, Zhang et al.
2011 published some simulation results indicating serious problems with
the lme4 package. I verified some of the results and posted to the
R list. There was absolutely no response whatsoever.

For comparison I used AD Model Builder which is free software. It got results close to those reported by Zhang et al. for SAS NLMIXED.

I certainly would not use R for any serious mixed model analysis.


The new version of SAS is much better and has fewer bugs. We use it in our company. I choose SAS.

