Logstash dissect with key=value, comma



























I have a pattern of logs containing performance and statistical data. I have configured Logstash to dissect this data as CSV in order to save the values to Elasticsearch.



<1>,www1,3,BISTATS,SCAN,330,712.6,2035,17.3,221.4,656.3



I am using the following Logstash filter and getting the desired results:



grok {
  match => { "Message" => "\A<%{POSINT:priority}>,%{DATA:pan_host},%{DATA:pan_serial_number},%{DATA:pan_type},%{GREEDYDATA:message}\z" }
  overwrite => [ "Message" ]
}
csv {
  separator => ","
  columns => ["pan_scan","pf01","pf02","pf03","kk04","uy05","xd06"]
}


This currently works well for me, as long as the order of the columns doesn't change.



However, I want to make this log file more meaningful by including each column name in the original log, for example: <1>,www1,30000,BISTATS,SCAN,pf01=330,pf02=712.6,pf03=2035,kk04=17.3,uy05=221.4,xd06=656.3



This way I can keep inserting or appending key/value pairs in the middle of the message without corrupting the data. (Using Logstash 5.3.)
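A minimal sketch of how the proposed key=value format could be parsed with the kv filter; the option names below come from the kv filter's documented settings, but the exact source field and split characters are assumptions, not a tested drop-in:

```
kv {
  source => "message"    # assumes the grok-extracted remainder holds the key=value pairs
  field_split => ","     # pairs are separated by commas
  value_split => "="     # keys and values are separated by '='
}
```

With this, each pf01=330 style pair becomes its own field regardless of its position in the line, which is what makes inserting new pairs mid-message safe.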



































  • Instead of grok, which is way too brittle for this, use the csv filter. And for key-value pairs, use the kv filter.

    – baudsp
    Nov 23 '18 at 12:58



























































logstash logstash-grok logstash-configuration






edited Nov 23 '18 at 12:49







Shawn

















asked Nov 23 '18 at 12:43









Shawn













1 Answer






































By using @baudsp's recommendations, I was able to formulate the following. I deleted the csv{} block completely and replaced it with a kv{} block. The kv{} filter automatically created all the key/value fields, leaving me to only mutate{} the fields into floats and integers.



json {
  source => "message"
  remove_field => [ "message", "headers" ]
}
date {
  match => [ "timestamp", "YYYY-MM-dd'T'HH:mm:ss.SSS'Z'" ]
  target => "timestamp"
}
grok {
  match => { "Message" => "\A<%{POSINT:priority}>,%{DATA:pan_host},%{DATA:pan_serial_number},%{DATA:pan_type},%{GREEDYDATA:message}\z" }
  overwrite => [ "Message" ]
}
kv {
  allow_duplicate_values => false
  field_split_pattern => ","
}


Using the above blocks, I am able to insert the key=value pairs anywhere in the message. Thanks again for all the help. I have added a sample code block for anyone trying to accomplish this task.
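The mutate{} step mentioned above is not shown in the answer; this is a hedged sketch of what it might look like, using the mutate filter's convert option. The field names come from the sample log line, and the integer/float split is assumed from the sample values:

```
mutate {
  convert => {
    "pf01" => "integer"   # 330
    "pf02" => "float"     # 712.6
    "pf03" => "integer"   # 2035
    "kk04" => "float"     # 17.3
    "uy05" => "float"     # 221.4
    "xd06" => "float"     # 656.3
  }
}
```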



Note: I am using NLog for logging, which produces JSON output. From the C# code, the format looks like this.



var logger = NLog.LogManager.GetCurrentClassLogger();
logger.ExtendedInfo("<1>,www1,30000,BISTATS,SCAN,pf01=330,pf02=712.6,pf03=2035,kk04=17.3,uy05=221.4,xd06=656.3");







        answered Nov 26 '18 at 6:11









Shawn