
August 10, 2017



Sounds interesting!

My scenario: datasets of 100+ GB on a 24 GB Windows machine. I usually use SAS for my analytics, which can process terabytes of data with no problem. Can I switch to R with your packages? What changes to my workflow, environment, and procedures would that imply?

Thank you


dplyrXdf is not available in open-source R; using it requires Microsoft's flavor of R, called Microsoft R Server (MRS for short). It was previously known as Revolution R (hence the blog name) before Microsoft bought the company.

MRS works with any dataset that fits on disk, rather than being RAM-limited like open-source R. It does this using a proprietary data format called xdf ("external data frame"), which is what dplyrXdf is designed to operate on.
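As a rough illustration of the workflow, here is a minimal sketch assuming MRS (with RevoScaleR) and dplyrXdf are installed; the file name `flights.csv` and the column names are hypothetical placeholders for your own data:

```r
library(dplyrXdf)  # loads dplyr and RevoScaleR; requires Microsoft R Server

# Import a large CSV into an on-disk xdf file. The import is done in
# chunks, so the data never has to fit into RAM.
flights_xdf <- rxImport(inData = "flights.csv",
                        outFile = "flights.xdf",
                        overwrite = TRUE)

# Familiar dplyr verbs, executed chunk-wise against the xdf file
result <- flights_xdf %>%
    filter(dep_delay > 0) %>%
    group_by(carrier) %>%
    summarise(mean_delay = mean(dep_delay))
```

The intermediate and final results are themselves stored as xdf files on disk, which is what makes the pipeline workable for datasets far larger than memory.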

If you wish to try MRS, download the developer edition of SQL Server 2016 and play around.
