Error while creating hive table through Kylin build cube












Hi, I am trying to build a cube with Kylin. The data gets sourced fine from Sqoop, but the next step, creating the Hive tables, fails. Looking at the command being fired, the failure seems odd, as the create statement looks good to me.



I think the issue is with the DOUBLE types, as when I remove them the create statement works fine. Can someone please help?



I am using the stack on AWS EMR, with Kylin 2.5 and Hive 2.3.0.



The error logs and commands are as below.



Command



    hive -e "USE default;
    DROP TABLE IF EXISTS kylin_intermediate_fm_inv_holdings_8a1c33df_d12b_3609_13ee_39e169169368;
    CREATE EXTERNAL TABLE IF NOT EXISTS kylin_intermediate_fm_inv_holdings_8a1c33df_d12b_3609_13ee_39e169169368
    (
    HOLDINGS_STOCK_INVESTOR_ID string
    ,STOCK_INVESTORS_CHANNEL string
    ,STOCK_STOCK_ID string
    ,STOCK_DOMICILE string
    ,STOCK_STOCK_NM string
    ,STOCK_APPROACH string
    ,STOCK_STOCK_TYP string
    ,INVESTOR_ID string
    ,INVESTOR_NM string
    ,INVESTOR_DOMICILE_CNTRY string
    ,CLIENT_NM string
    ,INVESTOR_HOLDINGS_GROSS_ASSETS_USD double(22)
    ,INVESTOR_HOLDINGS_NET_ASSETS_USD double(22)
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
    STORED AS TEXTFILE
    LOCATION 's3://wfg1tst-models/kylin/kylin_metadata/kylin-4ae3b18b-831b-da66-eb8c-7318245c4448/kylin_intermediate_fm_inv_holdings_8a1c33df_d12b_3609_13ee_39e169169368';
    ALTER TABLE kylin_intermediate_fm_inv_holdings_8a1c33df_d12b_3609_13ee_39e169169368 SET TBLPROPERTIES('auto.purge'='true');

    " --hiveconf hive.merge.mapredfiles=false --hiveconf hive.auto.convert.join=true --hiveconf dfs.replication=2 --hiveconf hive.exec.compress.output=true --hiveconf hive.auto.convert.join.noconditionaltask=true --hiveconf mapreduce.job.split.metainfo.maxsize=-1 --hiveconf hive.merge.mapfiles=false --hiveconf hive.auto.convert.join.noconditionaltask.size=100000000 --hiveconf hive.stats.autogather=true


The error is as below



OK
Time taken: 1.315 seconds
OK
Time taken: 0.09 seconds
MismatchedTokenException(334!=347)
at org.antlr.runtime.BaseRecognizer.recoverFromMismatchedToken(BaseRecognizer.java:617)
at org.antlr.runtime.BaseRecognizer.match(BaseRecognizer.java:115)
at org.apache.hadoop.hive.ql.parse.HiveParser.createTableStatement(HiveParser.java:6179)
at org.apache.hadoop.hive.ql.parse.HiveParser.ddlStatement(HiveParser.java:3808)
at org.apache.hadoop.hive.ql.parse.HiveParser.execStatement(HiveParser.java:2382)
at org.apache.hadoop.hive.ql.parse.HiveParser.statement(HiveParser.java:1333)
at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:204)
at org.apache.hadoop.hive.ql.parse.ParseUtils.parse(ParseUtils.java:77)
at org.apache.hadoop.hive.ql.parse.ParseUtils.parse(ParseUtils.java:70)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:468)
at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1316)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1456)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1236)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1226)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:336)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:787)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
FAILED: ParseException line 15:42 mismatched input '(' expecting ) near 'double' in create table statement









hadoop hive kylin














asked Nov 19 at 15:16, edited Nov 19 at 16:24
Gaurav Rawat

2 Answers



















Ohh, I think I finally figured it out. It seems DOUBLE with precision is not supported by Hive. But I think Kylin should take care of this while importing the JDBC metadata into the model.



Will raise an enhancement or bug in Kylin for the same.
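
In the meantime, the two offending columns in the generated DDL would have to come out as bare DOUBLE for the statement to parse. A minimal hand-written check of that shape (the table name below is a throwaway picked for illustration, not anything Kylin generates):

    -- throwaway table, syntax check only: Hive accepts DOUBLE with no precision
    CREATE TABLE IF NOT EXISTS holdings_double_check (
      INVESTOR_HOLDINGS_GROSS_ASSETS_USD double,
      INVESTOR_HOLDINGS_NET_ASSETS_USD double
    );
    DROP TABLE IF EXISTS holdings_double_check;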






answered Nov 19 at 16:27
Gaurav Rawat













I installed the newest stable Hive in my test env, which is version 2.3.4 (the Hive DDL doc says 2.2.0 supports double with precision). I found Hive does not support double with a specific/user-defined precision.



e.g.

• [ERROR] create table haha(ha double 10);
• [ERROR] create table haha(ha double Precision 17);
• [ERROR] create table haha(ha double(22));
• [CORRECT] create table haha(ha double Precision);


Hive CMD:



            hive> use ss;
            OK
            Time taken: 0.015 seconds
            hive> create table haha(ha double 10);
            FAILED: ParseException line 1:28 extraneous input '10' expecting ) near '<EOF>'
            hive> create table haha(ha double Precision 17);
            FAILED: ParseException line 1:38 extraneous input '17' expecting ) near '<EOF>'
            hive> create table haha(ha double(22));
            MismatchedTokenException(334!=347)
            at org.antlr.runtime.BaseRecognizer.recoverFromMismatchedToken(BaseRecognizer.java:617)
            at org.antlr.runtime.BaseRecognizer.match(BaseRecognizer.java:115)
            at org.apache.hadoop.hive.ql.parse.HiveParser.createTableStatement(HiveParser.java:6179)
            at org.apache.hadoop.hive.ql.parse.HiveParser.ddlStatement(HiveParser.java:3808)
            at org.apache.hadoop.hive.ql.parse.HiveParser.execStatement(HiveParser.java:2382)
            at org.apache.hadoop.hive.ql.parse.HiveParser.statement(HiveParser.java:1333)
            at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:208)
            at org.apache.hadoop.hive.ql.parse.ParseUtils.parse(ParseUtils.java:77)
            at org.apache.hadoop.hive.ql.parse.ParseUtils.parse(ParseUtils.java:70)
            at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:468)
            at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1317)
            at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1457)
            at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237)
            at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1227)
            at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233)
            at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184)
            at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403)
            at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821)
            at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
            at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:498)
            at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
            at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
            FAILED: ParseException line 1:27 mismatched input '(' expecting ) near 'double' in create table statement
            hive> create table haha(ha double Precision);
            OK
            Time taken: 0.625 seconds


So, as far as I can tell, Hive does not support such a data type definition. It cannot be resolved by Kylin dev.



If you find any mistake, please let me know. I also found the links below, which may be helpful:




1. https://cwiki.apache.org/confluence/display/hive/LanguageManual+DDL#LanguageManualDDL-CreateTableCreate/Drop/TruncateTable
2. https://issues.apache.org/jira/browse/HIVE-13556
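
As a side note that I did not test in the session above: the Hive numeric type that does accept an explicit precision is DECIMAL, so if a declared precision is really needed, a definition of this shape parses (again just a throwaway table name):

    -- DECIMAL takes (precision, scale); DOUBLE takes no precision at all
    CREATE TABLE decimal_syntax_check (amount DECIMAL(22, 2));
    DROP TABLE decimal_syntax_check;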







answered Nov 27 at 12:00
敏 丞