How can I export Query Store data?



























I have Query Store (QS) running on a SQL Server 2017 instance. It is currently at RTM, with CU13 in test, to be applied to prod in next month's patch window.



While most queries and reports return results quickly with little to no impact, anything I try to look at around waits is problematic. CPU usage climbs from 20 to 80 percent and stays there for minutes until I kill the query. This is a 24/7 production system, so if I really want to look at QS waits I am going to need to do it somewhere else.



The database is 150 GB, with 1000 MB of space allocated for QS. I have a sandbox with 10 GB of space, so if I could get the QS data out, I could play with it there.
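For reference, how much of that 1000 MB allocation QS is actually using can be checked from the source database via `sys.database_query_store_options` (a sketch; consult the 2017 docs for the full column list):

```sql
-- Check Query Store state and space usage (run in the source database)
SELECT actual_state_desc,        -- e.g. READ_WRITE, READ_ONLY, ERROR
       max_storage_size_mb,      -- configured cap (1000 MB here)
       current_storage_size_mb,  -- space currently used by QS
       readonly_reason           -- non-zero if QS fell back to read-only
FROM sys.database_query_store_options;
```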



I looked around and could not find how to do this. The best I found is this sql.sasquatch post from 2016, with a 2016 answer by Erin Stellato:




At present, there is no option to export and/or import Query Store data, but there is a Connect item to vote for: https://connect.microsoft.com/SQLServer/feedback/details/2620017/export-query-store-tables-separately-from-the-database-tables




Note: The link goes to a redirect, "Microsoft Connect Has Been Retired". It looks like the actual link should be https://feedback.azure.com/forums/908035-sql-server/suggestions/32901670-export-query-store-tables-separately-from-the-data



Looking at Microsoft's documentation, I find that almost everything you might use to access the data is a view, stored procedure, or report. I am not seeing a way to just extract all the QS data out of the database.



Examples of direct queries use the views (example by Kendra Little). I have toyed with the idea of just doing a SELECT * from the views and exporting the results to my sandbox, but as I have not found anyone talking about it, I am not sure it is a good idea.



Related




  • Best Practice with the Query Store

  • How Query Store Collects Data

  • Query Store Catalog Views

  • Query Store Stored Procedures

  • Monitoring performance by using the Query Store


Additionally, I would like to keep the Query Store results from pre-CU13 to use as a baseline for comparison after CU13 is applied.



Edit after first answer and edits to same
The recent edits to the answer by jadarnel27 add good info, but I don't care about the user interface; I want to be able to query the data without altering the database or impacting performance. As a secondary goal, I would like to be able to archive the QS data so I can review previous performance (i.e. from before an upgrade, after the old QS data would have been purged).










      sql-server sql-server-2017 query-store






asked 15 hours ago by James Jenkins, edited 10 hours ago
          1 Answer
          First of all, you might be able to get acceptable performance with queries directly against the query store catalog views by updating stats, adding query hints with plan guides, or changing the database compatibility level / CE. See the answers from Forrest and Marian here:



          Never ending Query Store search



          Otherwise, for exporting, one approach would be a simple SELECT...INTO from the query store views to the "sandbox" database. These are the relevant views.



          The basic approach would be like this:



          SELECT * INTO Sandbox.dbo.query_store_runtime_stats FROM sys.query_store_runtime_stats;
          SELECT * INTO Sandbox.dbo.query_store_runtime_stats_interval FROM sys.query_store_runtime_stats_interval;
          SELECT * INTO Sandbox.dbo.query_store_plan FROM sys.query_store_plan;
          SELECT * INTO Sandbox.dbo.query_store_query FROM sys.query_store_query;
          SELECT * INTO Sandbox.dbo.query_store_query_text FROM sys.query_store_query_text;
          SELECT * INTO Sandbox.dbo.query_store_wait_stats FROM sys.query_store_wait_stats;


          The nice thing about this approach is that:




          • you'll only get the data you need (1000 MB)

          • you can add indexes to support your reporting queries, because these are actual tables

          • they won't have the unusual memory scanning behavior that leads to poor performance against the actual views (again because they are actual tables)


            • Note: the SELECT...INTO queries shouldn't drive up CPU like the built-in query store reporting queries, because they won't have the problematic joins that cause repeated access to the in-memory TVFs



          • you can keep different versions of the data (for different CU levels, etc) by changing the table names, or adding a column to the tables that indicates the data and / or version of SQL Server that was used for that import
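That last bullet could look like this (a sketch; the table-name suffix and the extra tagging columns are my own invention):

```sql
-- Tag the export with a capture date and server version,
-- so multiple snapshots (e.g. pre- and post-CU13) can coexist and be compared
SELECT SYSDATETIME() AS capture_date,
       CONVERT(nvarchar(128), SERVERPROPERTY('ProductVersion')) AS sql_version,
       qsrs.*
INTO   Sandbox.dbo.query_store_runtime_stats_preCU13
FROM   sys.query_store_runtime_stats AS qsrs;
```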


          The "con" of this approach is that you can't use the query store user interface. A workaround for that would be to use profiler or extended events to capture the queries being executed by the user interface for the specific reports you need. You could even do this capture in a non-prod environment, as the queries should be the same.





          Warning: this is potentially a really bad idea. There's a reason you can't normally write to these tables. Special thanks to Forrest for mentioning the possibility to me.



          If you really want to be able to use the user interface, you can actually load the base Query Store tables with data while connecting via the DAC. Here's what worked for me.



Reminder: you have to be using a DAC connection to do this; otherwise, you'll get errors related to the sys.plan_persist_* tables not existing.



          USE [master];
          GO
          CREATE DATABASE [Sandbox];
          GO

          USE [YourSourceDatabaseWithTheQueryStoreInfo];
          GO

          BEGIN TRANSACTION;

          INSERT INTO Sandbox.sys.plan_persist_runtime_stats SELECT * FROM sys.plan_persist_runtime_stats;
          INSERT INTO Sandbox.sys.plan_persist_runtime_stats_interval SELECT * FROM sys.plan_persist_runtime_stats_interval;
          INSERT INTO Sandbox.sys.plan_persist_plan SELECT * FROM sys.plan_persist_plan;
          INSERT INTO Sandbox.sys.plan_persist_query SELECT * FROM sys.plan_persist_query;
          INSERT INTO Sandbox.sys.plan_persist_query_text SELECT * FROM sys.plan_persist_query_text;
          INSERT INTO Sandbox.sys.plan_persist_wait_stats SELECT * FROM sys.plan_persist_wait_stats;

          COMMIT TRANSACTION;
          GO

          USE [master];
          GO
          ALTER DATABASE [Sandbox] SET QUERY_STORE = ON (OPERATION_MODE = READ_ONLY);


Note: if you're on SQL Server 2016, you'll need to remove the line about wait stats; that catalog view wasn't added until SQL Server 2017.



          After doing that, I was able to use the Query Store UI in SSMS to view info on the queries from the source database. Neat!



It's important to load the data into the Sandbox database with Query Store off, and then turn Query Store on in read-only mode. Otherwise, QS ended up in an error state, and this was written to the SQL Server error log:




          Error: 12434, Severity: 20, State: 56.

          The Query Store in database Sandbox is invalid, possibly due to schema or catalog inconsistency.




(Screenshot: Query Store in the "Error" state)



          I also noticed that this doesn't work if there are in-memory OLTP (Hekaton) tables in the source database. No matter what I do, Query Store ends up in the "Error" state with this message in the error log:




          Error: 5571, Severity: 16, State: 2.

          Internal FILESTREAM error: failed to access the garbage collection table.




You may be able to work around that by adding a memory-optimized filegroup to the Sandbox database; I haven't tried that yet.
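If you want to try that untested workaround, adding a memory-optimized filegroup looks roughly like this (a sketch; the filegroup name and container path are placeholders):

```sql
-- Hypothetical: give the Sandbox database a memory-optimized filegroup
ALTER DATABASE Sandbox
    ADD FILEGROUP SandboxMO CONTAINS MEMORY_OPTIMIZED_DATA;

-- The FILENAME here is a directory (container), not a file; adjust the path
ALTER DATABASE Sandbox
    ADD FILE (NAME = 'SandboxMO', FILENAME = 'C:\Data\SandboxMO')
    TO FILEGROUP SandboxMO;
```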









                I also noticed that this doesn't work if there are in-memory OLTP (Hekaton) tables in the source database. No matter what I do, Query Store ends up in the "Error" state with this message in the error log:




                Error: 5571, Severity: 16, State: 2.

                Internal FILESTREAM error: failed to access the garbage collection table.




                You may be able to work around that by adding a memory-optimized filegroup to the Sandbox database, I haven't tried that yet.






                share|improve this answer















                First of all, you might be able to get acceptable performance with queries directly against the query store catalog views by updating stats, adding query hints with plan guides, or changing the database compatibility level / CE. See the answers from Forrest and Marian here:



                Never ending Query Store search
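As an illustration of the hints mentioned above, here's a hedged sketch of forcing the legacy cardinality estimator for a single ad hoc query against the catalog views. The `USE HINT` syntax is available from SQL Server 2016 SP1 onward; the query shape itself is just an example, not one of the built-in report queries:

```sql
-- Illustrative query against the Query Store catalog views;
-- the USE HINT forces the legacy CE for just this statement
SELECT qsq.query_id, qsp.plan_id, qsrs.avg_duration
FROM sys.query_store_query AS qsq
JOIN sys.query_store_plan AS qsp
    ON qsp.query_id = qsq.query_id
JOIN sys.query_store_runtime_stats AS qsrs
    ON qsrs.plan_id = qsp.plan_id
OPTION (USE HINT ('FORCE_LEGACY_CARDINALITY_ESTIMATION'));
```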



                Otherwise, for exporting, one approach would be a simple SELECT...INTO from the query store views to the "sandbox" database. These are the relevant views.



                The basic approach would be like this:



                SELECT * INTO Sandbox.dbo.query_store_runtime_stats FROM sys.query_store_runtime_stats;
                SELECT * INTO Sandbox.dbo.query_store_runtime_stats_interval FROM sys.query_store_runtime_stats_interval;
                SELECT * INTO Sandbox.dbo.query_store_plan FROM sys.query_store_plan;
                SELECT * INTO Sandbox.dbo.query_store_query FROM sys.query_store_query;
                SELECT * INTO Sandbox.dbo.query_store_query_text FROM sys.query_store_query_text;
                SELECT * INTO Sandbox.dbo.query_store_wait_stats FROM sys.query_store_wait_stats;


The nice things about this approach are that:




                • you'll only get the data you need (1000 MB)

                • you can add indexes to support your reporting queries, because these are actual tables

                • they won't have the unusual memory scanning behavior that leads to poor performance against the actual views (again because they are actual tables)


                  • Note: the SELECT...INTO queries shouldn't drive up CPU like the built-in query store reporting queries, because they won't have the problematic joins that cause repeated access to the in-memory TVFs



                • you can keep different versions of the data (for different CU levels, etc) by changing the table names, or adding a column to the tables that indicates the data and / or version of SQL Server that was used for that import
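As a sketch of the indexing point above, something like the following would work. The index names and key columns are assumptions based on the typical joins between these views; pick keys that match your actual reporting queries:

```sql
-- Example indexes on the copied tables; key choices are guesses
-- based on the usual join columns between these views
CREATE CLUSTERED INDEX CX_qs_runtime_stats
    ON Sandbox.dbo.query_store_runtime_stats (plan_id, runtime_stats_interval_id);

CREATE CLUSTERED INDEX CX_qs_plan
    ON Sandbox.dbo.query_store_plan (plan_id);

CREATE CLUSTERED INDEX CX_qs_query
    ON Sandbox.dbo.query_store_query (query_id);
```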


                The "con" of this approach is that you can't use the query store user interface. A workaround for that would be to use profiler or extended events to capture the queries being executed by the user interface for the specific reports you need. You could even do this capture in a non-prod environment, as the queries should be the same.
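If you go the capture route, a minimal Extended Events session along these lines should do it. The session name and file target are placeholders, and the filter assumes the reports are being run from SSMS:

```sql
-- Capture completed batches issued by SSMS
-- (which is what runs the Query Store report queries)
CREATE EVENT SESSION [CaptureQSReportQueries] ON SERVER
ADD EVENT sqlserver.sql_batch_completed (
    ACTION (sqlserver.client_app_name, sqlserver.sql_text)
    WHERE ([sqlserver].[like_i_sql_unicode_string](
        [sqlserver].[client_app_name],
        N'Microsoft SQL Server Management Studio%')))
ADD TARGET package0.event_file (SET filename = N'CaptureQSReportQueries');

ALTER EVENT SESSION [CaptureQSReportQueries] ON SERVER STATE = START;
```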





                Warning: this is potentially a really bad idea. There's a reason you can't normally write to these tables. Special thanks to Forrest for mentioning the possibility to me.



                If you really want to be able to use the user interface, you can actually load the base Query Store tables with data while connecting via the DAC. Here's what worked for me.



Reminder: you have to be using a DAC connection (e.g. `sqlcmd -S admin:YourServerName -E`) to do this, otherwise you'll get errors about the sys.plan_persist_* tables not existing



                USE [master];
                GO
                CREATE DATABASE [Sandbox];
                GO

                USE [YourSourceDatabaseWithTheQueryStoreInfo];
                GO

                BEGIN TRANSACTION;

                INSERT INTO Sandbox.sys.plan_persist_runtime_stats SELECT * FROM sys.plan_persist_runtime_stats;
                INSERT INTO Sandbox.sys.plan_persist_runtime_stats_interval SELECT * FROM sys.plan_persist_runtime_stats_interval;
                INSERT INTO Sandbox.sys.plan_persist_plan SELECT * FROM sys.plan_persist_plan;
                INSERT INTO Sandbox.sys.plan_persist_query SELECT * FROM sys.plan_persist_query;
                INSERT INTO Sandbox.sys.plan_persist_query_text SELECT * FROM sys.plan_persist_query_text;
                INSERT INTO Sandbox.sys.plan_persist_wait_stats SELECT * FROM sys.plan_persist_wait_stats;

                COMMIT TRANSACTION;
                GO

                USE [master];
                GO
                ALTER DATABASE [Sandbox] SET QUERY_STORE = ON (OPERATION_MODE = READ_ONLY);


Note: if you're on SQL Server 2016, you'll need to remove the wait stats line - sys.plan_persist_wait_stats wasn't added until SQL Server 2017
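Once the script has run, you can sanity-check the sandbox copy's Query Store state via the documented sys.database_query_store_options catalog view:

```sql
-- Should report READ_ONLY, with no error state
SELECT actual_state_desc, desired_state_desc, readonly_reason
FROM Sandbox.sys.database_query_store_options;
```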



                After doing that, I was able to use the Query Store UI in SSMS to view info on the queries from the source database. Neat!



It's important to load the data into the Sandbox database with Query Store off, and then turn Query Store on in read-only mode. Otherwise QS ends up in an error state, with this written to the SQL Server error log:




                Error: 12434, Severity: 20, State: 56.

                The Query Store in database Sandbox is invalid, possibly due to schema or catalog inconsistency.




[Screenshot: Query Store in the "Error" state]



                I also noticed that this doesn't work if there are in-memory OLTP (Hekaton) tables in the source database. No matter what I do, Query Store ends up in the "Error" state with this message in the error log:




                Error: 5571, Severity: 16, State: 2.

                Internal FILESTREAM error: failed to access the garbage collection table.




You may be able to work around that by adding a memory-optimized filegroup to the Sandbox database; I haven't tried that yet.
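If you want to try that workaround, the general shape would be as follows. This is untested for this scenario, and the filegroup name and file path are placeholders:

```sql
-- Add a memory-optimized filegroup and a container to Sandbox
ALTER DATABASE [Sandbox]
    ADD FILEGROUP [Sandbox_XTP] CONTAINS MEMORY_OPTIMIZED_DATA;

ALTER DATABASE [Sandbox]
    ADD FILE (NAME = N'Sandbox_XTP',
              FILENAME = N'C:\SQLData\Sandbox_XTP')
    TO FILEGROUP [Sandbox_XTP];
```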







edited 12 hours ago

answered 14 hours ago

jadarnel27





























