Saving input and output with sp_execute_external_script

Once again I found myself at the point where I needed to capture and save to an external file all the R code that was executed through sp_execute_external_script.

You will soon find out several interesting things. I will start with the following example:

USE [WideWorldImporters];

EXEC sys.sp_execute_external_script
     @language = N'R'
    ,@script = N'
       d <- InputDataSet 
       c <- data.frame(Num_V1 = c(1,2,3))
       OutputDataSet <- c'
    ,@input_data_1 = N'SELECT 1 AS Nmbrs_From_R'

WITH RESULT SETS ((Numbers_From_R INT));

The result is a column called “Numbers_From_R” with three rows, returned from the R data frame. This is very easy and straightforward.


Let's query the dynamic management view sys.dm_exec_query_stats as follows:

SELECT
     QM_ST.[TEXT] AS [Query]
FROM
    sys.dm_exec_query_stats AS DM_QS
    CROSS APPLY sys.dm_exec_sql_text(DM_QS.sql_handle) AS QM_ST
ORDER BY
    DM_QS.last_execution_time DESC;

Surprisingly, I get only the following query returned:

sp_execute_external_script: SELECT 1 AS Nmbrs_From_R

which is far from what was actually executed in the first place!


When using the sys.dm_exec_query_plan dynamic management view to retrieve the executed query plan, I get a similar result: no R code and hardly a trace of the SQL query that was passed to the sp_execute_external_script procedure.

(screenshot: the generated query plan XML opened in SQL Server Management Studio)

Roughly the same result emerges when showing the actual execution plan in SSMS: only the XML-UDX operator is shown.

(screenshot: actual execution plan in SSMS, showing only the XML-UDX operator)

So far, there is very slim possibility of getting any extra information out of the query statistics DMVs or the execution plans.


So I opened SQL Server Profiler, ran the example sp_execute_external_script code, and was finally able to see the actual R code in the profiler:

(screenshot: the captured statement in SQL Server Profiler)

Upon closer look, we can see that the profiler shows the execution of the external procedure wrapped in SET STATISTICS XML ON/OFF commands. So we can store the results from the profiler into a table or a trace file and later filter out the R code!
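Since the captured trace can be saved to a file, it can afterwards be queried with sys.fn_trace_gettable. A minimal sketch, assuming the trace was saved to C:\DataTK\RTrace.trc (the path is purely illustrative):

SELECT TextData
FROM sys.fn_trace_gettable(N'C:\DataTK\RTrace.trc', DEFAULT)
WHERE TextData LIKE N'%sp_execute_external_script%';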


Query Store is a very useful new feature that shipped with the flagship SQL Server 2016. Storing queries and their execution times is needed for later performance analysis. So in this phase, let's just see if we can find the external procedure code in the Query Store.

After executing the R external procedure, I run the following query to check the Query Store (QS):

SELECT
     QSQT.query_sql_text
FROM
    sys.query_store_plan AS QSP
    JOIN sys.query_store_query AS QSQ
      ON QSP.query_id = QSQ.query_id
    JOIN sys.query_store_query_text AS QSQT
      ON QSQ.query_text_id = QSQT.query_text_id;

And the results are, in a way, not surprising at all, since many Query Store statistics are based on the same DMVs. The result for my external procedure is again not informative enough to extract the R code:

(screenshot: Query Store results, again without any R code)

This is something we have already seen a couple of times. And there is no sign of the execution of the R script. In fact, looking at this, it is hard to even tell that this was handed over to the Launchpad.exe external program.


sink() is an R function that redirects R output into an external file. Yet with either of the two following T-SQL executions, I was never able to capture either the results or the R code itself.

In case of results:

EXEC sys.sp_execute_external_script
     @language = N'R'
    ,@script = N'
        sink("C:\\DataTK\\logRSQLsession.txt")
        d <- InputDataSet
        c <- data.frame(Num_V1 = c(1,2,3))
        OutputDataSet <- c
        sink()'
    ,@input_data_1 = N'SELECT 1 AS Nmbrs_From_R'
WITH RESULT SETS ((Numbers_From_R INT));

And in the case of the R code itself:

EXEC sys.sp_execute_external_script
     @language = N'R'
    ,@script = N'
        sink("C:\\DataTK\\logRSQLsession.txt")
        c <- data.frame(Num_V1 = c(1,2,3))
        sink()'
    ,@input_data_1 = N'SELECT 1 AS Nmbrs_From_R';

In both cases the file is created, but that is all it is: an empty file, with no content whatsoever.


save() will store intermediate results into a file for later analysis, or semi-aggregated data used for further calculations, and load() will bring them back. So I have tested it as follows:

EXEC sys.sp_execute_external_script
     @language = N'R'
    ,@script = N'
        c <- data.frame(Num_V1 = c(1,2,3))
        save(c, file="C:\\DataTK\\logRSQLsession3.rda")'
    ,@input_data_1 = N'SELECT 1 AS Nmbrs_From_R';

EXEC sys.sp_execute_external_script
     @language = N'R'
    ,@script = N'
        load(file="C:\\DataTK\\logRSQLsession3.rda")
        OutputDataSet <- c'
    ,@input_data_1 = N'SELECT 1 AS Nmbrs_From_R'
WITH RESULT SETS ((Numbers_From_R INT));



The extensibility log will store information about the session, but it will not store the R code, the R environment, or the data, just session information. Navigate to:

C:\Program Files\Microsoft SQL Server\MSSQL13.MSSQLSERVER\MSSQL\LOG\ExtensibilityLog

to check the content and see whether there is anything useful for your needs.
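To list that folder without leaving SSMS, one option is the undocumented xp_dirtree procedure; a quick sketch:

EXEC master..xp_dirtree
     N'C:\Program Files\Microsoft SQL Server\MSSQL13.MSSQLSERVER\MSSQL\LOG\ExtensibilityLog'
    ,1   -- depth: only the top level
    ,1;  -- include files, not just subfolders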


We are very limited in terms of exporting the executed R code, results, or logs. The same applies to importing any additional code. We have seen that source() is not working, whereas load() for loading *.rda files is. At least something 🙂 There should be more ways to get at this, especially with Rterm or vanilla R, but the idea was to have everything run comfortably from the SSMS environment.
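For the record, this is the shape of the source() attempt referred to above; the script path is purely illustrative:

EXEC sys.sp_execute_external_script
     @language = N'R'
    ,@script = N'source("C:\\DataTK\\external_code.R")';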

As you can see, there are few possibilities to store the R code separately or to store R execution logs in external files. But I presume I haven't exhausted all the possibilities, so there should still be some ways to try to do this.

As always, the code is available at Github.

Happy Rrrrr!

