
August 10, 2017

Comments


Sounds interesting!

My scenario: datasets of 100+ GB on a 24 GB Windows machine. I usually use SAS for my analytics, which can process terabytes of data without a problem. Can I switch to R with your packages? What changes to my workflow, environment, and procedures would that imply?

Thank you

@AS

dplyrXdf is not available in open-source R; using it requires Microsoft's flavor of R, called Microsoft R Server (MRS for short). It was previously known as Revolution R (hence the blog's name) before Microsoft bought the company.

MRS works with any data set that fits on disk, rather than being RAM-limited like open-source R. It does this using a proprietary data format called xdf ("external data frame"), which is what dplyrXdf is designed to operate on.
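To give a sense of the workflow, here is a minimal sketch, assuming an MRS session with the RevoScaleR and dplyrXdf packages available; the file names and flight-data columns below are illustrative only:

library(dplyr)
library(dplyrXdf)

# Import a large CSV once into the on-disk xdf format
flights_xdf <- rxImport(inData = "flights.csv",
                        outFile = "flights.xdf",
                        overwrite = TRUE)

# Familiar dplyr verbs then run chunk-wise against the xdf file,
# so the full data set never has to fit in RAM
delays <- flights_xdf %>%
    filter(dest == "SEA") %>%
    group_by(carrier) %>%
    summarise(mean_delay = mean(arr_delay))

# Pull only the small summary result back into memory
as.data.frame(delays)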

If you wish to try MRS, download the Developer edition of SQL Server 2016 and play around.

The comments to this entry are closed.
