
How do I intentionally fragment a SQL Server Index?
























I would like to intentionally create bad index conditions on a SQL Server 2017 test database I have, just to better understand these maintenance scripts: SQL Server Index and Statistics Maintenance.



Is there a fast/automatic way to compromise index integrity or increase index fragmentation? Do you know any useful resource I can look at to achieve this?
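Whichever method ends up producing the fragmentation, it helps to measure it before and after running the maintenance scripts. A standard way to do that (not from the question itself, just the usual DMV approach) is:

```sql
-- Report fragmentation for every index in the current database.
-- 'LIMITED' scans only the B-tree parent level, so it is fast.
SELECT OBJECT_NAME(ips.object_id)        AS table_name,
       i.name                            AS index_name,
       ips.index_type_desc,
       ips.avg_fragmentation_in_percent,
       ips.page_count
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON i.object_id = ips.object_id
 AND i.index_id  = ips.index_id
WHERE ips.page_count > 0
ORDER BY ips.avg_fragmentation_in_percent DESC;
```

Running this once before and once after each experiment below shows exactly which indexes the maintenance scripts would pick up.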


























migrated from stackoverflow.com 2 days ago


This question came from our site for professional and enthusiast programmers.




















  • Depending on your definition of "ugly", you may want to mess up the fill factor as well; that won't increase fragmentation, but it will have a degrading effect.

    – scsimon
    2 days ago






  • Do you want this for one index or for all indexes in a database? If you want it for all indexes, then shrink your database: DBCC SHRINKDATABASE ([yourNONProdDB])

    – Kin Shah
    2 days ago











  • All db indexes would be perfect. Thanks @KinShah

    – mororo
    yesterday


















Tags: sql-server, index, clustered-index






asked 2 days ago by mororo; edited 2 days ago by ypercubeᵀᴹ





4 Answers































One quick way I can imagine is creating a table with a UNIQUEIDENTIFIER primary key and inserting lots of random values. This can be done with the following script (note that the CTE must come before the INSERT):



WITH x AS (SELECT n FROM (VALUES (0),(1),(2),(3),(4),(5),(6),(7),(8),(9)) v(n))
INSERT INTO dbo.Tests (Id)
SELECT NEWID()
FROM x AS x1, x AS x2, x AS x3, x AS x4, x AS x5, x AS x6;


Run after CREATE TABLE dbo.Tests (Id UNIQUEIDENTIFIER PRIMARY KEY); this will generate a million rows.



Since NEWID() does not guarantee any ordering, SQL Server has to insert into random spots in the table; that's going to fragment the primary key.





























  • That won't work or at least isn't guaranteed to: SQL Server can and probably will sort the rows (spooling to disk if needed due to the number of rows) before adding them to the index. To force fragmentation you need to perform many individual inserts - in SSMS: INSERT dbo.Tests (Id) SELECT NEWID(); GO 1000000; - this will obviously take more time. See pastebin.com/SVLtiRnP for an example I chucked together for another question. Using variable length rows and updating them at random might more efficiently produce fragmentation?

    – David Spillett
    yesterday
































I wanted to make several "ugly" indexes, so I did the following. It worked well.



-- Create 6 databases to test the index job; each database is about 800 MB with
-- GUID primary keys in each of two tables.
-- DatabasesInParallel design based on https://dba.stackexchange.com/q/9821/21924

-- Drop the previous test (asp_kill_user_connections is a local helper procedure)
USE [master]
exec asp_kill_user_connections [IndexTest_1]
exec asp_kill_user_connections [IndexTest_2]
exec asp_kill_user_connections [IndexTest_3]
exec asp_kill_user_connections [IndexTest_4]
exec asp_kill_user_connections [IndexTest_5]
exec asp_kill_user_connections [IndexTest_6]
GO

DROP DATABASE [IndexTest_1]
GO
DROP DATABASE [IndexTest_2]
GO
DROP DATABASE [IndexTest_3]
GO
DROP DATABASE [IndexTest_4]
GO
DROP DATABASE [IndexTest_5]
GO
DROP DATABASE [IndexTest_6]
GO

-- create [IndexTest_1]
USE [master];
GO

CREATE DATABASE [IndexTest_1];
GO

USE IndexTest_1

SET NOCOUNT ON

CREATE TABLE TestGuidA (Id UNIQUEIDENTIFIER NOT NULL DEFAULT NEWID() PRIMARY KEY,
    SomeDate DATETIME, batchNumber BIGINT, FILLER CHAR(100))

CREATE TABLE TestGuidB (Id UNIQUEIDENTIFIER NOT NULL DEFAULT NEWID() PRIMARY KEY,
    SomeDate DATETIME, batchNumber BIGINT, FILLER CHAR(100))

DECLARE @BatchCounter INT = 1
DECLARE @NumRows INT = 100000

WHILE (@BatchCounter <= 20)
BEGIN
    BEGIN TRAN

    DECLARE @LocalCounter INT = 0

    WHILE (@LocalCounter <= @NumRows)
    BEGIN
        INSERT TestGuidA (SomeDate,batchNumber) VALUES (GETDATE(),@BatchCounter)
        SET @LocalCounter += 1
    END

    SET @LocalCounter = 0

    WHILE (@LocalCounter <= @NumRows)
    BEGIN
        INSERT TestGuidB (SomeDate,batchNumber) VALUES (GETDATE(),@BatchCounter)
        SET @LocalCounter += 1
    END

    SET @BatchCounter += 1
    COMMIT
END

-----------------------------------------------
-- Repeat the create/populate block above for IndexTest_2 through IndexTest_6;
-- only the database name changes.
-----------------------------------------------

use master
DBCC FREEPROCCACHE -- Clear plan cache for the next test.



















































    Index fragmentation usually happens when UPDATE or INSERT operations run against the table.



    If you want to reproduce the issue (index fragmentation) quickly, create an index on your test table with a low fill factor and run heavy UPDATE or INSERT operations against that table.
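A sketch of that suggestion (all table, index, and column names here are made up for illustration). Packing pages completely full and then widening rows in place forces page splits, which show up as fragmentation; `GO <count>` requires SSMS or sqlcmd:

```sql
-- Hypothetical test table; FILLFACTOR = 100 packs pages full, so later
-- row growth has nowhere to go except newly allocated (out-of-order) pages.
CREATE TABLE dbo.FillFactorTest (
    Id      INT IDENTITY(1,1) NOT NULL,
    Payload VARCHAR(900) NOT NULL DEFAULT REPLICATE('x', 100)
);
CREATE CLUSTERED INDEX CX_FillFactorTest
    ON dbo.FillFactorTest (Id) WITH (FILLFACTOR = 100);
GO
INSERT dbo.FillFactorTest DEFAULT VALUES;
GO 50000
-- Widening every row from 100 to 900 bytes splits pages across the index.
UPDATE dbo.FillFactorTest SET Payload = REPLICATE('y', 900);
```

The inverse experiment (rebuilding with a deliberately low FILLFACTOR) does not fragment the index logically, but it does produce the low page density that maintenance scripts also flag.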




















































      You can also use CRYPT_GEN_RANDOM as I did in this answer: Filter Schema In Index Optimize Script.



      You can insert random data into an indexed column to fragment the index, like this:



      -- Fill with random values to create fragmentation
      -- (GO 12800 is an SSMS/sqlcmd batch-repeat directive)
      INSERT INTO [ProdTable] (c1, c2) VALUES (CRYPT_GEN_RANDOM(8000), 'filler');
      GO 12800


      You could also update the data, or convert it to a string instead of a number if that's what you need.



































        BEGIN TRAN

        DECLARE @LocalCounter INT = 0

        WHILE (@LocalCounter <= @NumRows)
        BEGIN
        INSERT TestGuidA (SomeDate,batchNumber) VALUES (GETDATE(),@BatchCounter)
        SET @LocalCounter +=1
        END

        SET @LocalCounter = 0

        WHILE (@LocalCounter <= @NumRows)
        BEGIN
        INSERT TestGuidB (SomeDate,batchNumber) VALUES (GETDATE(),@BatchCounter)
        SET @LocalCounter +=1
        END

        SET @BatchCounter +=1
        COMMIT
        END
        -------------------------------
        USE master
        DBCC FREEPROCCACHE -- Clear plan cache for next test.
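Once the load completes, you can verify how fragmented the new clustered indexes actually are with the standard `sys.dm_db_index_physical_stats` DMV. A minimal sketch (run it in each of the `IndexTest_N` databases created above):

```sql
-- Report logical fragmentation for every index in the current database
SELECT OBJECT_NAME(ips.object_id) AS table_name,
       i.name                     AS index_name,
       ips.avg_fragmentation_in_percent,
       ips.page_count
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON i.object_id = ips.object_id
 AND i.index_id  = ips.index_id
WHERE ips.index_id > 0            -- skip heaps
ORDER BY ips.avg_fragmentation_in_percent DESC;
```

With `NEWID()` primary keys, you should see `avg_fragmentation_in_percent` very high (typically 90%+) on both `TestGuidA` and `TestGuidB`.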





answered 2 days ago by James Jenkins (2,611 reputation; 2 gold, 25 silver, 48 bronze badges)
Index fragmentation usually occurs when UPDATE or INSERT operations run against a table.

If you want to reproduce the issue quickly, create an index on your test table with a low fill factor and then run a heavy UPDATE or INSERT workload against that table. You can work with these scripts.
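A minimal sketch of that idea — all object names here are made up for illustration. Note that while a low fill factor leaves free space in pages (low page density, i.e. internal fragmentation), a fully packed index (`FILLFACTOR = 100`) produces logical fragmentation even faster under random-key inserts, because nearly every mid-index insert forces a page split:

```sql
-- Hypothetical test objects; names are illustrative only
CREATE TABLE dbo.FragDemo (
    Id      INT IDENTITY PRIMARY KEY,
    RandKey UNIQUEIDENTIFIER NOT NULL DEFAULT NEWID(),
    Filler  CHAR(200) NOT NULL DEFAULT 'x'
);

-- FILLFACTOR = 100 packs pages full, so random-key inserts split pages immediately
CREATE NONCLUSTERED INDEX IX_FragDemo_RandKey
    ON dbo.FragDemo (RandKey) WITH (FILLFACTOR = 100);

-- Heavy insert workload: each NEWID() lands at a random point in the index
SET NOCOUNT ON;
DECLARE @i INT = 0;
WHILE @i < 50000
BEGIN
    INSERT dbo.FragDemo DEFAULT VALUES;
    SET @i += 1;
END;
```

Checking `sys.dm_db_index_physical_stats` afterwards should show heavy fragmentation on `IX_FragDemo_RandKey`.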






answered 2 days ago by Shekar Kola (355 reputation; 1 silver, 10 bronze badges)







































You can also use CRYPT_GEN_RANDOM, as I did in this answer: Filter Schema In Index Optimize Script.

You can fragment an index by inserting random data into the indexed numeric column, like this:



                        -- Fill with random integers to create fragmentation
                        INSERT INTO [ProdTable] (c1, c2) VALUES (CRYPT_GEN_RANDOM(8000), 'filler');
                        GO 12800


                        You could also update data or convert to a string instead of a number if that's what you need.
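After running inserts like the above, you can confirm how fragmented the index actually became with the `sys.dm_db_index_physical_stats` DMV (a verification step, not part of the original answer; the table name matches the example above and is otherwise arbitrary):

```sql
-- Check logical fragmentation and page density of the table's indexes
SELECT i.name AS index_name,
       ps.index_type_desc,
       ps.avg_fragmentation_in_percent,    -- logical: out-of-order pages
       ps.avg_page_space_used_in_percent,  -- internal: page density
       ps.page_count
FROM sys.dm_db_index_physical_stats(
         DB_ID(), OBJECT_ID(N'dbo.ProdTable'), NULL, NULL, 'SAMPLED') AS ps
JOIN sys.indexes AS i
  ON i.object_id = ps.object_id
 AND i.index_id  = ps.index_id
WHERE ps.index_level = 0;  -- leaf level only
```

The `'SAMPLED'` (or `'DETAILED'`) scan mode is needed here because `avg_page_space_used_in_percent` is NULL under the default `'LIMITED'` mode.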






                            answered yesterday by Tom V (14.3k reputation, 7 gold badges, 51 silver badges, 80 bronze badges)
































